Merge branch 'master' into valgrind
ankane committed Nov 11, 2024
2 parents e7c7f1e + 0f1832f commit 63efb2a
Showing 38 changed files with 1,026 additions and 449 deletions.
6 changes: 3 additions & 3 deletions .github/workflows/build.yml
@@ -8,9 +8,9 @@ jobs:
matrix:
include:
- ruby: 3.3
os: ubuntu-22.04
os: ubuntu-24.04
env:
LIBTORCH_VERSION: 2.2.1
LIBTORCH_VERSION: 2.5.1
steps:
- uses: actions/checkout@v4
- uses: ruby/setup-ruby@v1
@@ -28,6 +28,6 @@ jobs:
cd ~
wget -q -O libtorch.zip https://download.pytorch.org/libtorch/cpu/libtorch-cxx11-abi-shared-with-deps-$LIBTORCH_VERSION%2Bcpu.zip
unzip -q libtorch.zip
- run: MAKE="make -j$(nproc)" bundle exec rake compile -- --with-torch-dir=$HOME/libtorch
- run: MAKE="make -j$(getconf _NPROCESSORS_ONLN)" bundle exec rake compile -- --with-torch-dir=$HOME/libtorch
- run: sudo apt-get update && sudo apt-get install valgrind
- run: bundle exec rake test:valgrind
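
The workflow's last step runs `bundle exec rake test:valgrind`. For context, here is a minimal sketch of how such a task can be wired up with the `ruby_memcheck` gem listed in the Gemfile; the binary name and test layout below are illustrative, not taken from the repository:

```ruby
# Rakefile (illustrative sketch)
require "rake/testtask"
require "ruby_memcheck"

# Tell ruby_memcheck which native extension to watch for leaks (hypothetical name).
RubyMemcheck.config(binary_name: "ext")

namespace :test do
  # Runs the test suite under Valgrind and reports memory errors from the C++ extension.
  RubyMemcheck::TestTask.new(valgrind: :compile) do |t|
    t.libs << "test"
    t.test_files = FileList["test/**/*_test.rb"]
  end
end
```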
27 changes: 27 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,30 @@
## 0.18.1 (unreleased)

- Improved `inspect` for `Device`
- Fixed equality for `Device`
- Fixed `index` method for `Device` when no index
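
A quick sketch of how these `Device` changes surface in Ruby (output described in the comments is illustrative):

```ruby
require "torch"

a = Torch::Device.new("cuda:0")
b = Torch::Device.new("cuda:0")

a.inspect  # clearer representation showing the device type and index
a == b     # equality now compares the device itself rather than object identity
Torch::Device.new("cpu").index  # returns nil when no index was specified
```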

## 0.18.0 (2024-10-22)

- Updated LibTorch to 2.5.0

## 0.17.1 (2024-08-19)

- Added `persistent` option to `register_buffer` method
- Added `prefix` and `recurse` options to `named_buffers` method
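
A short illustration of the new options (the module and buffer names here are made up):

```ruby
class RunningStats < Torch::NN::Module
  def initialize
    super()
    # Non-persistent buffers are excluded from the module's state_dict.
    register_buffer("mean", Torch.zeros(3), persistent: false)
  end
end

# prefix is prepended to buffer names; recurse controls descent into submodules.
RunningStats.new.named_buffers(prefix: "stats", recurse: true)
```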

## 0.17.0 (2024-07-26)

- Updated LibTorch to 2.4.0
- Added `normalize` method
- Added support for tensor indexing with arrays
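
For example, tensor indexing with arrays works along these lines (values are illustrative):

```ruby
x = Torch.tensor([10, 20, 30, 40])
x[[0, 2]]  # selects positions 0 and 2 => tensor([10, 30])
```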

## 0.16.0 (2024-06-12)

- Updated LibTorch to 2.3.0
- Added `ELU` and `GELU` classes
- Dropped support for Ruby < 3.1
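
A minimal usage sketch for the new activation classes:

```ruby
x = Torch.randn(2, 3)

Torch::NN::GELU.new.call(x)  # applies GELU element-wise
Torch::NN::ELU.new.call(x)   # applies ELU element-wise
```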

## 0.15.0 (2024-02-28)

- Updated LibTorch to 2.2.0
3 changes: 0 additions & 3 deletions Gemfile
@@ -7,6 +7,3 @@ gem "rake-compiler"
gem "minitest", ">= 5"
gem "numo-narray"
gem "ruby_memcheck"

# for examples
gem "torchvision", ">= 0.2", require: false
36 changes: 17 additions & 19 deletions README.md
@@ -14,19 +14,28 @@ Check out:

## Installation

First, [install LibTorch](#libtorch-installation). With Homebrew, it’s part of the PyTorch package:
First, [download LibTorch](https://pytorch.org/get-started/locally/). For Mac arm64, use:

```sh
brew install pytorch
curl -L https://download.pytorch.org/libtorch/cpu/libtorch-macos-arm64-2.5.1.zip > libtorch.zip
unzip -q libtorch.zip
```

Add this line to your application’s Gemfile:
For Linux x86-64, use the `cxx11 ABI` version. For other platforms, build LibTorch from source.

Then run:

```sh
bundle config build.torch-rb --with-torch-dir=/path/to/libtorch
```

And add this line to your application’s Gemfile:

```ruby
gem "torch-rb"
```

It can take 5-10 minutes to compile the extension.
It can take 5-10 minutes to compile the extension. Windows is not currently supported.
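
Once the extension has compiled, a quick smoke test (illustrative) is:

```ruby
require "torch"

x = Torch.tensor([[1, 2], [3, 4]])
x.shape  # => [2, 2]
x.sum    # => tensor(10)
```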

## Getting Started

@@ -398,31 +407,20 @@ Here’s a list of functions to create tensors (descriptions from the [C++ docs]
Torch.zeros(3) # tensor([0, 0, 0])
```

## LibTorch Installation

[Download LibTorch](https://pytorch.org/) (for Linux, use the `cxx11 ABI` version). Then run:

```sh
bundle config build.torch-rb --with-torch-dir=/path/to/libtorch
```
## LibTorch Compatibility

Here’s the list of compatible versions.

Torch.rb | LibTorch
--- | ---
0.18.x | 2.5.x
0.17.x | 2.4.x
0.16.x | 2.3.x
0.15.x | 2.2.x
0.14.x | 2.1.x
0.13.x | 2.0.x
0.12.x | 1.13.x

### Homebrew

You can also use Homebrew.

```sh
brew install pytorch
```

## Performance

Deep learning is significantly faster on a GPU.
12 changes: 6 additions & 6 deletions codegen/generate_functions.rb
@@ -155,10 +155,10 @@ def generate_attach_def(name, type, def_method)
end

ruby_name = "_#{ruby_name}" if ["size", "stride", "random!"].include?(ruby_name)
ruby_name = ruby_name.sub(/\Afft_/, "") if type == "fft"
ruby_name = ruby_name.sub(/\Alinalg_/, "") if type == "linalg"
ruby_name = ruby_name.sub(/\Aspecial_/, "") if type == "special"
ruby_name = ruby_name.sub(/\Asparse_/, "") if type == "sparse"
ruby_name = ruby_name.delete_prefix("fft_") if type == "fft"
ruby_name = ruby_name.delete_prefix("linalg_") if type == "linalg"
ruby_name = ruby_name.delete_prefix("special_") if type == "special"
ruby_name = ruby_name.delete_prefix("sparse_") if type == "sparse"
ruby_name = name if name.start_with?("__")

"rb_#{def_method}(m, \"#{ruby_name}\", #{full_name(name, type)}, -1);"
@@ -216,7 +216,7 @@ def add_dispatch(function, def_method)
out_code = generate_dispatch(function["out"], def_method)
out_index = function["out"].out_index

return "if (_r.isNone(#{out_index})) {
"if (_r.isNone(#{out_index})) {
#{indent(base_code)}
} else {
#{indent(out_code)}
@@ -439,7 +439,7 @@ def generate_function_params(function, params, remove_self)
else
"#{func}Optional"
end
end
end

"_r.#{func}(#{param[:position]})"
end