
Segmentation fault in shape inference with onnx.defs.OpSchema.set_type_and_shape_inference_function #7508

@ry3s

Description


Bug Report

Is the issue related to model conversion?

Describe the bug

Calling onnx.shape_inference.infer_shapes on a model that uses a custom op whose schema has a Python inference function registered via onnx.defs.OpSchema.set_type_and_shape_inference_function crashes with a segmentation fault.

System information

  • OS Platform and Distribution: Arch Linux
  • ONNX version: 5b5566eaeee97cc5dce34a2bb420ccf8cacf18a8
  • Python version: 3.13.9
  • GCC/Compiler version (if compiling from source): gcc 15.2.1
  • CMake version: 4.2.0

Reproduction instructions

import onnx
from onnx.defs import OpSchema

def create_model() -> onnx.ModelProto:
    # Build a simple model: a custom MySoftmax node (domain com.example) followed by Relu
    input_tensor = onnx.helper.make_tensor_value_info('input', onnx.TensorProto.FLOAT, [1, 3])
    output_tensor = onnx.helper.make_tensor_value_info('output', onnx.TensorProto.FLOAT, [1, 3])
    softmax_node = onnx.helper.make_node(
        'MySoftmax',
        inputs=['input'],
        outputs=['softmax_output'],
        domain='com.example',
        axis=1
    )

    relu_node = onnx.helper.make_node(
        'Relu',
        inputs=['softmax_output'],
        outputs=['output']
    )
    graph = onnx.helper.make_graph(
        nodes=[softmax_node, relu_node],
        name='SimpleGraph',
        inputs=[input_tensor],
        outputs=[output_tensor]
    )
    model = onnx.helper.make_model(graph,
                                   opset_imports=[onnx.helper.make_operatorsetid("", 17),
                                                   onnx.helper.make_operatorsetid("com.example", 1)])
    return model

def infer_mysoftmax(ctx: onnx.shape_inference.InferenceContext) -> None:
    # Inference function for MySoftmax operator
    input_type = ctx.get_input_type(0)
    output_type = ctx.get_output_type(0)
    output_type.tensor_type.elem_type = input_type.tensor_type.elem_type
    output_shape = ctx.get_input_type(0).tensor_type.shape
    output_type.tensor_type.shape.CopyFrom(output_shape)
    ctx.set_output_type(0, output_type)

def register_schema() -> None:
    schema = OpSchema(
        name='MySoftmax',
        domain='com.example',
        since_version=1,
        inputs=[OpSchema.FormalParameter('input', "float", 'Input tensor')],
        outputs=[OpSchema.FormalParameter('output', "float", 'Output tensor')],
        attributes=[
            OpSchema.Attribute(
                name='axis',
                type=OpSchema.AttrType.INT,
                description='Axis along which to compute softmax',
            )
        ],
    )
    # If the following line is commented out, the segmentation fault does not occur.
    schema.set_type_and_shape_inference_function(infer_mysoftmax)
    onnx.defs.register_schema(schema)

def main():
    model = create_model()
    onnx.checker.check_model(model, full_check=True)
    onnx.save(model, "simple_model.onnx")

    register_schema()
    inferred = onnx.shape_inference.infer_shapes(model)
    onnx.save(inferred, "inferred_simple_model.onnx")


if __name__ == "__main__":
    main()
$ python3 sample.py
zsh: segmentation fault (core dumped)  python sample.py

Expected behavior

onnx.shape_inference.infer_shapes should run the registered Python inference function and return the inferred model, or raise a Python exception on failure, rather than crashing with a segmentation fault.

Notes
