
Using onnx shape inference, some operators don't support shape inference #6307

Open
eezhang123 opened this issue Aug 20, 2024 · 1 comment
Labels: bug, contributions welcome, shape inference


eezhang123 commented Aug 20, 2024

Ask a Question

Describe the question

[Screenshots: inferred model showing unknown dimension values on the Flatten and Expand outputs]

I used the timm repository to generate an ONNX file and then ran ONNX shape inference (infer_shapes). The Flatten and Expand operators show unknown values in their tensor dims, even though the input is static [1, 3, 224, 224].

System information

  • OS Platform and Distribution (e.g. Linux Ubuntu 20.04): Linux Ubuntu 20.04
  • ONNX version (e.g. 1.13): 1.17
  • Python version: 3.10
  • Protobuf version: 5.27

Reproduction instructions

  • Describe the code to reproduce the behavior.
import onnx
import timm
import torch

# Create the model and a static dummy input [1, 3, 224, 224]
model = timm.create_model('vit_base_patch16_224', pretrained=True)
model.eval()
dummy_input = torch.randn(1, 3, 224, 224)

# Export to ONNX
torch.onnx.export(model, dummy_input, 'model.onnx',
                  opset_version=17,
                  input_names=['input'], output_names=['output'])

# Run shape inference and save the inferred model
onnx_model = onnx.load('model.onnx')
inferred_model = onnx.shape_inference.infer_shapes(onnx_model)
onnx.save(inferred_model, "infer_model.onnx")
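For reference, a quick way to list which intermediate tensors still have unknown dims after inference (a minimal sketch; it assumes the infer_model.onnx file produced by the repro above):

# Sketch: print value_info entries whose shape still contains an unknown dim.
import onnx

inferred = onnx.load("infer_model.onnx")
for vi in inferred.graph.value_info:
    dims = vi.type.tensor_type.shape.dim
    # A dim with neither dim_value nor dim_param set is unknown.
    if any(not d.HasField("dim_value") and not d.HasField("dim_param") for d in dims):
        print(vi.name,
              [d.dim_value if d.HasField("dim_value") else (d.dim_param or "?") for d in dims])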

Expected Behaviors

The Expand and Flatten operators should have concrete output shapes, not unknown dim values.

Another question

The Flatten operator is defined for opset versions [21, 13, 11, 9, 1], so why, when I export with opset version 17, is the Flatten operator decomposed into Shape / Slice / Concat / Reshape operators? I think it should remain a single (fused) operator. The Expand operator shows the same behavior: it is decomposed into Mul / Equal / Where base operators. A quick way to confirm what the exporter actually emitted is shown in the sketch below.
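A small sketch (using the model.onnx produced by the repro above) that counts node op types in the exported graph, so you can see whether Flatten/Expand survive or are decomposed into primitive ops:

# Sketch: count node op types in the exported graph.
from collections import Counter
import onnx

graph = onnx.load("model.onnx").graph
print(Counter(node.op_type for node in graph.node))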

@eezhang123 eezhang123 added the bug label Aug 20, 2024
@eezhang123 eezhang123 reopened this Aug 20, 2024
@justinchuby justinchuby added shape inference Issues related to shape inference contributions welcome labels Aug 24, 2024
@gramalingam gramalingam self-assigned this Sep 4, 2024
xadupre (Contributor) commented Sep 9, 2024

This case is tricky because shape inference requires evaluating the nodes that manipulate shapes. Shape inference is implemented in C++, and onnx has no runtime in C++, only in Python. In your particular case, the parameter do_constant_folding is true by default, so this part of the graph should have been folded into a constant. This is probably a bug related to torch.onnx.export.
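One thing that may be worth trying as a workaround (a minimal sketch, not a fix for the exporter; whether it helps depends on the graph): onnx.shape_inference.infer_shapes accepts a data_prop flag that propagates constant data through shape-manipulating ops, which can resolve some of these unknown dims.

# Sketch: re-run shape inference with data propagation enabled.
# data_prop=True lets inference evaluate shape-manipulating nodes
# (Shape/Slice/Concat/...) when their inputs are statically known.
import onnx

model = onnx.load("model.onnx")
inferred = onnx.shape_inference.infer_shapes(model, data_prop=True)
onnx.save(inferred, "infer_model_dataprop.onnx")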
