
Shape inference fails on Concat when first tensor is initialized empty #6276

Open
MatejUrbanQC opened this issue Aug 2, 2024 · 4 comments
Labels
bug contributions welcome shape inference Issues related to shape inference
Milestone

Comments

@MatejUrbanQC

Bug Report

Is the issue related to model conversion?

No

Describe the bug

My graph consists of a Concat node (with axis=0) with two initializer inputs. The first is initialized to [] and the second to [1, 2, 3]. When I run shape inference with strict_mode=True and data_prop=True, it fails with the following error:

Traceback (most recent call last):
  File "/Users/matejurban/bug.py", line 3, in <module>
    onnx.shape_inference.infer_shapes(model, strict_mode=True, data_prop=True)
  File "/Users/matejurban/micromamba/lib/python3.12/site-packages/onnx/shape_inference.py", line 46, in infer_shapes
    inferred_model_str = C.infer_shapes(
                         ^^^^^^^^^^^^^^^
RuntimeError: [ShapeInferenceError] Inference error(s): (op_type:Concat, node name: _this_): [ShapeInferenceError] axis must be in [-rank, rank-1].

System information

  • OS Platform and Distribution (e.g. Linux Ubuntu 20.04): macOS 13.6.7
  • ONNX version (e.g. 1.13): 1.16.1
  • Python version: 3.12.4

Reproduction instructions

import onnx
model = onnx.load('model.onnx')
onnx.shape_inference.infer_shapes(model, strict_mode=True, data_prop=True)

model.onnx.zip

Expected behavior

Shape inference should succeed; the result of the concat should be the tensor [1, 2, 3].


@justinchuby justinchuby added shape inference Issues related to shape inference contributions welcome labels Aug 7, 2024
@justinchuby justinchuby added this to the 1.18 milestone Aug 7, 2024
@Yosshi999
Contributor

I think

int rank = input_data_0->dim_size();
if (axis < -rank || axis >= rank) {
  fail_shape_inference("axis must be in [-rank, rank-1].");
  return false;
}

is suspicious. This validation is not the same as Concat's shape inference.
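The problem with that check, sketched in Python: when the first input's propagated data is empty, its dim_size() is 0, so any axis (including 0) falls outside [-0, -1] and inference fails. A more forgiving validation would skip the strict check for empty data. The helper name and signature below are hypothetical, for illustration only, not the actual fix.

```python
# Hypothetical sketch of a more forgiving axis validation: when the
# propagated data for an input is empty, skip the strict check instead
# of failing shape inference outright.
def axis_is_valid(axis: int, rank: int, is_empty: bool) -> bool:
    if is_empty:
        return True  # nothing to index; defer to the regular shape check
    return -rank <= axis <= rank - 1

print(axis_is_valid(0, 0, True))   # True: empty first input no longer fails
print(axis_is_valid(1, 1, False))  # False: genuinely out of range for rank 1
```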

This is also used by Gather and it is broken:

import numpy as np
import torch
import onnx  # build from commit f22a2ad78c9b8f3bd2bb402bfce2b0079570ecb6
from onnx import TensorProto
from onnx.helper import (
    make_model, make_node, make_tensor, make_graph,
    make_tensor_value_info)

print(np.take(np.array([], np.float32), np.array([], np.int64), axis=0))
# []
print(torch.gather(torch.tensor([]), 0, torch.LongTensor([])))
# tensor([])

model = make_model(
    make_graph(
        [make_node("Gather", ["X", "I"], ["Y"], axis=0)],
        "sample",
        [],
        [make_tensor_value_info("Y", TensorProto.INT64, [0])],
        initializer=[
            make_tensor("X", TensorProto.INT64, [0], np.array([], np.int64)),
            make_tensor("I", TensorProto.INT64, [0], np.array([], np.int64))
        ]),
    opset_imports=[onnx.helper.make_operatorsetid("", 13)])

print("check_model...")
onnx.checker.check_model(model)
print("infer_shapes...")
onnx.shape_inference.infer_shapes(model, strict_mode=True, data_prop=False)
print("infer_shapes w/ data_prop...")
onnx.shape_inference.infer_shapes(model, strict_mode=True, data_prop=True)

Traceback (most recent call last):
  File "/home/avocado/github/onnx/sample.py", line 30, in <module>
    onnx.shape_inference.infer_shapes(model, strict_mode=True, data_prop=True)
  File "/home/avocado/github/onnx/onnx/shape_inference.py", line 46, in infer_shapes
    inferred_model_str = C.infer_shapes(
onnx.onnx_cpp2py_export.shape_inference.InferenceError: [ShapeInferenceError] Inference error(s): (op_type:Gather): [ShapeInferenceError] axis must be in [-rank, rank-1].

@liwuhen

liwuhen commented Nov 21, 2024

I would like to try to solve this issue, can you assign it to me?

@justinchuby
Contributor

@liwuhen Please feel free to create a pull request!

@liwuhen

liwuhen commented Nov 27, 2024

I found that this problem comes from logic that was never written in the source code. I can solve it with simple logic, but the fix touches a fair amount of surrounding code; should we discuss it first? @justinchuby

liwuhen added a commit to liwuhen/onnx that referenced this issue Nov 27, 2024
…alized empty onnx#6276

Signed-off-by: liwuhen <liwuhen5788@gmail.com>
github-merge-queue bot pushed a commit that referenced this issue Dec 11, 2024
### Description
Fix issue #6276 (data propagation
fails on Concat when first tensor is initialized empty).

---------

Signed-off-by: Ganesan Ramalingam <grama@microsoft.com>