This repo contains a PyTorch implementation of the high-quality convolution-based rotation introduced in IEEE TIP 1995: "Convolution-Based Interpolation for Fast, High-Quality Rotation of Images" by Michael Unser, Philippe Thévenaz and Leonid Yaroslavsky [paper].
This implementation comes with a rotation method working for 4D and 5D tensors of shape (B,C,H,W) or (B,C,L,H,W). You can try the code by running

```shell
python main.py
```

and compare it to PyTorch's interpolation functions with

```shell
python benchmark.py
```
In your Python code, call the rotation function as follows:

```python
import math

import torch

from torch_rotation import rotate_three_pass  # same function for 4D and 5D tensors!

I = torch.rand(10, 3, 128, 128)  # mock image (could be a mock volume too).
angle = 30 * math.pi / 180  # the angle should be in radians.

I_rot = rotate_three_pass(I, angle)  # By default, FFT-based interpolation is used.
```
Note that for the moment this package supports only the basic rotation about a single axis, where the rotation is parameterized by a single angle $\theta$.

A 2D rotation matrix of angle $\theta$ reads

$$
R(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}.
$$
In practice, applying this transformation is done with 2D warp routines relying
on bilinear or bicubic interpolation, for instance with OpenCV or PyTorch.
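For reference, such a single-pass warp can be sketched with PyTorch's `affine_grid` and `grid_sample`; the helper name `rotate_warp` and its conventions are mine, not part of this package:

```python
import math

import torch
import torch.nn.functional as F


def rotate_warp(img, angle, mode="bilinear"):
    """Rotate a (B, C, H, W) batch by `angle` radians with a single 2D warp."""
    c, s = math.cos(angle), math.sin(angle)
    # affine_grid maps output coordinates to input sampling locations
    # (in normalized [-1, 1] coordinates), so this matrix describes where
    # each output pixel reads from.
    theta = torch.tensor([[c, -s, 0.0], [s, c, 0.0]], dtype=img.dtype)
    theta = theta.unsqueeze(0).expand(img.shape[0], -1, -1)
    grid = F.affine_grid(theta, list(img.shape), align_corners=False)
    # Samples falling outside the frame default to zero padding.
    return F.grid_sample(img, grid, mode=mode, align_corners=False)
```

Passing `mode="bicubic"` gives the sharper (but still lossy) variant that the benchmark below compares against.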
The authors of the paper above remarked that the rotation matrix admits a three-way decomposition into shear matrices:

$$
R(\theta) = \begin{pmatrix} 1 & -\tan(\theta/2) \\ 0 & 1 \end{pmatrix}
\begin{pmatrix} 1 & 0 \\ \sin\theta & 1 \end{pmatrix}
\begin{pmatrix} 1 & -\tan(\theta/2) \\ 0 & 1 \end{pmatrix}.
$$

This converts the 2D warp into three consecutive 1D shears, with no intermediate rescaling. This prevents losing too much detail during the rotation process, and it can be implemented efficiently with row- or column-wise translations.
This method extends naturally to the 3D case by applying the 2D decomposition to rotations about each axis of the volume.
The package can be installed from PyPI with

```shell
pip install torch-rotation
```

or (deprecated) from source with

```shell
python setup.py install
```
An image is worth a thousand words. Below you will find a simple experiment from the TIP paper consisting in rotating an image 16 times by 22.5°, i.e., a full 360° turn in total. The three-pass approach (I used the FFT-based variant) achieves an MSE an order of magnitude lower than bicubic interpolation, a widespread technique for computing sharp rotated images.
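The bicubic half of this experiment can be reproduced with a few lines of plain PyTorch. The sketch below (helper name and test pattern are mine) measures how error accumulates over a full turn of repeated bicubic warps:

```python
import math

import torch
import torch.nn.functional as F


def rotate_bicubic(img, angle):
    """One rotation of a (B, C, H, W) batch via affine_grid + bicubic grid_sample."""
    c, s = math.cos(angle), math.sin(angle)
    theta = torch.tensor([[c, -s, 0.0], [s, c, 0.0]]).repeat(img.shape[0], 1, 1)
    grid = F.affine_grid(theta, list(img.shape), align_corners=False)
    return F.grid_sample(img, grid, mode="bicubic", align_corners=False)


# A smooth test pattern that vanishes near the borders, so zero padding is harmless.
H = W = 128
ys, xs = torch.meshgrid(torch.arange(H), torch.arange(W), indexing="ij")
img = torch.exp(-((ys - 40.0) ** 2 + (xs - 70.0) ** 2) / 200.0)[None, None].float()

out = img
for _ in range(16):
    out = rotate_bicubic(out, 2.0 * math.pi / 16.0)  # 16 x 22.5 degrees = full turn

mse = F.mse_loss(out, img).item()
print(f"MSE after a full turn in 16 bicubic steps: {mse:.2e}")
```

Swapping the loop body for `rotate_three_pass` from this package (and keeping the same MSE measurement) reproduces the comparison shown in the figure.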
Please open an issue to report any bugs.