Torchscript to CoreML conversion skips for loops #1995
We only have experimental support for PyTorch models which have not been created by …
The same issue persists if the PyTorch model …
@the-neural-networker - the following works for me:

```python
import numpy as np
import torch
import torch.nn as nn
import coremltools as ct

def experiment(val: torch.Tensor):
    val = int(val.item())
    result = torch.zeros(val)
    for i in range(val):
        result[i] = i
    return result

class Experiment(nn.Module):
    def forward(self, x):
        return experiment(x)

exp = Experiment().eval()

# Use a tensor as input, not an integer
input_tensor = torch.tensor(100)
traced_model = torch.jit.trace(exp, input_tensor)
y_t = traced_model(input_tensor)

# Specify the input type as ct.TensorType(name="x", shape=(1,))
coreml_exp = ct.convert(
    traced_model,
    source="pytorch",
    inputs=[ct.TensorType(name="x", shape=(1,))],
    convert_to="mlprogram",
)

# Create an input dictionary with the necessary input data
input_data = {"x": np.array([100.0])}

# Make a prediction using the model
coreml_output = coreml_exp.predict(input_data)
coreml_output = list(coreml_output.values())[0]
assert all(coreml_output == y_t.numpy())
```
But won't this fail for other inputs (not 100)? Because technically the `experiment` function is traced, which has a …
This example will give the wrong prediction when the input is not 100. If you want to do something like this in Core ML, I think you'll need to write your own mlprogram. Take a look at our set of MIL ops: …
🐞Description
When TorchScript code containing for loops is converted to Core ML, the loops are skipped and the result computed before the for loop is returned.
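A minimal standalone illustration of why this happens (my own sketch, not from the conversion itself): `torch.jit.trace` records only the ops executed for the example input, so a loop whose trip count depends on `.item()` is unrolled and the count is frozen into the graph.

```python
import torch

def f(n: torch.Tensor):
    total = torch.zeros(1)
    # .item() leaves the graph, so the loop runs in Python at trace
    # time and only its unrolled body is recorded.
    for i in range(int(n.item())):
        total = total + i
    return total

traced = torch.jit.trace(f, torch.tensor(3))

print(traced(torch.tensor(3)))  # tensor([3.])  (0 + 1 + 2)
print(traced(torch.tensor(5)))  # still tensor([3.]): trip count was frozen at 3
```

This is the same effect the converter then inherits: the Core ML model only ever sees the unrolled graph for the trace-time input.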
To Reproduce
System environment (please complete the following information):
Additional Context
The for loop can easily be vectorized, but for simplicity I wanted to show that for loops get skipped during Core ML conversion.
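For completeness, the vectorized form of the loop above could look like this (my own eager-mode sketch; note it still calls `.item()`, so tracing would still freeze the length, but the loop body collapses to a single op):

```python
import torch

def experiment_vectorized(val: torch.Tensor):
    # One arange op replaces the Python loop that filled result[i] = i
    return torch.arange(int(val.item()), dtype=torch.float32)

out = experiment_vectorized(torch.tensor(5))
print(out)  # tensor([0., 1., 2., 3., 4.])
```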