Can't run ONNX model via Ortex, input name cannot be empty

I have an ONNX model that I want to load in Elixir.

To do that, I’m using the Ortex library. I can load the model with no problem using the Ortex.load function, which gives me the following model:

#Ortex.Model<
  inputs: [
    {"float_input",
     "Tensor {\n    ty: Float32,\n    dimensions: [\n        2341,\n        92,\n    ],\n}",
     [2341, 92]}
  ]
  outputs: [
    {"output_label",
     "Tensor {\n    ty: Int64,\n    dimensions: [\n        2341,\n    ],\n}",
     [2341]},
    {"output_probability",
     "Sequence(\n    Map {\n        key: Int64,\n        value: Float32,\n    },\n)",
     nil}
  ]>

The issue I’m having is when trying to run this model: it expects "float_input" as the input name, but I don’t know how to pass that in the run function.

If I try to just pass the tensor data I get the following error:

iex(19)> Ortex.run(model, x_test_scaled[0])
** (RuntimeError) Failed to run inference on model: Invalid rank for input: float_input Got: 1 Expected: 2 Please fix either the inputs/outputs or the model.
    (ortex 0.1.9) lib/ortex/model.ex:51: Ortex.Model.run/2

If I try to send something as the float_input I get the following error:

iex(19)> Ortex.run(model, {Nx.tensor(0), x_test_scaled[0]})
** (RuntimeError) Failed to run inference on model: input name cannot be empty
    (ortex 0.1.9) lib/ortex/model.ex:51: Ortex.Model.run/2

I also tried creating an Nx.Container struct and passing that to run:

defmodule MyData do
  @derive {Nx.Container, containers: [:float_input]}

  defstruct [:float_input]
end

iex(11)> Ortex.run(model, %MyData{float_input: x_test_scaled})
** (FunctionClauseError) no function clause matching in anonymous fn/1 in Ortex.Model.run/2    
    
    The following arguments were given to anonymous fn/1 in Ortex.Model.run/2:
    
        # 1
        %MyData{
          float_input: #Nx.Tensor<
            f32[2341][92]
            Ortex.Backend
            [
              [0.19482958316802979, 0.08041504770517349, -1.9489672183990479, 2.886521100997925, -0.4316384792327881, -0.3034050166606903, -0.2510624825954437, -0.08099287003278732, -0.010336779989302158, -0.2686353623867035, -0.2567836046218872, -0.25082194805145264, -0.23275044560432434, -0.24425934255123138, -0.31930288672447205, -0.3787024915218353, -0.27744582295417786, -0.029247770085930824, 0.7984414100646973, -0.47261980175971985, -0.2207702398300171, -0.4064401388168335, -0.06632958352565765, -0.031023602932691574, 0.0, -0.0206768736243248, -0.12976393103599548, -0.0206768736243248, -0.15836477279663086, -0.07027661055326462, -1.154808521270752, -0.034301552921533585, -0.1615263819694519, 1.3659762144088745, -0.2785608470439911, 1.2954195737838745, -0.27967265248298645, 0.35717639327049255, 1.4155744314193726, -0.48122650384902954, -0.6723564863204956, -0.402685284614563, -0.5743894577026367, -0.42710113525390625, -0.08297397941350937, -0.08165843784809113, -0.07546283304691315, -0.09343120455741882, -0.07104019820690155, ...],
              ...
            ]
          >
        }
    
    (ortex 0.1.9) lib/ortex/model.ex:49: anonymous fn/1 in Ortex.Model.run/2
    (elixir 1.16.2) lib/enum.ex:1700: Enum."-map/2-lists^map/1-1-"/2
    (ortex 0.1.9) lib/ortex/model.ex:49: Ortex.Model.run/2

In Python, I can easily run the model using the following code:

onnx_session = onnxruntime.InferenceSession("logistic_regression_model.onnx")

output = onnx_session.run(None, {"float_input": input_data})

What am I doing wrong?

Hey Sezaru, thanks for the detailed outputs! What’s the shape of your x_test_scaled[0], specifically? Seems to me like that first call to Ortex.run should have worked. The Invalid rank for input... Got 1, expected 2 makes me wonder if the input shapes aren’t quite right (i.e., passing a vector when expecting a matrix).
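If the row really is rank 1, adding a batch dimension before calling run might satisfy the rank check. A sketch (untested on your model; `x_test_scaled` and `model` are your variables from above):

```elixir
# Promote the f32[92] row to f32[1][92] so it matches the
# model's expected rank of 2 ([batch, 92]).
row = x_test_scaled[0]          # shape {92}
batched = Nx.new_axis(row, 0)   # shape {1, 92}
Ortex.run(model, batched)
```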

Hey @gregszumel the x_test_scaled variable is the following tensor:

#Nx.Tensor<
  f32[2341][92]
  [
    [0.19482958316802979, 0.08041504770517349, -1.9489672183990479, 2.886521100997925, -0.4316384792327881, -0.3034050166606903, -0.2510624825954437, -0.08099287003278732, -0.010336779989302158, -0.2686353623867035, -0.2567836046218872, -0.25082194805145264, -0.23275044560432434, -0.24425934255123138, -0.31930288672447205, -0.3787024915218353, -0.27744582295417786, -0.029247770085930824, 0.7984414100646973, -0.47261980175971985, -0.2207702398300171, -0.4064401388168335, -0.06632958352565765, -0.031023602932691574, 0.0, -0.0206768736243248, -0.12976393103599548, -0.0206768736243248, -0.15836477279663086, -0.07027661055326462, -1.154808521270752, -0.034301552921533585, -0.1615263819694519, 1.3659762144088745, -0.2785608470439911, 1.2954195737838745, -0.27967265248298645, 0.35717639327049255, 1.4155744314193726, -0.48122650384902954, -0.6723564863204956, -0.402685284614563, -0.5743894577026367, -0.42710113525390625, -0.08297397941350937, -0.08165843784809113, -0.07546283304691315, -0.09343120455741882, -0.07104019820690155, -0.09284767508506775, ...],
    ...
  ]
>

I believe I should be able to pass it directly as the input, but when I try, it panics on the Rust side:

iex(18)> Ortex.run(model, x_test_scaled)
[(ortex 0.1.9) lib/ortex/model.ex:52: Ortex.Model.run/2]
thread '<unnamed>' panicked at src/tensor.rs:206:46:
called `Option::unwrap()` on a `None` value
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
** (ErlangError) Erlang error: :nif_panicked
    (ortex 0.1.9) Ortex.Native.run(#Reference<0.3644578789.793903171.73739>, [#Reference<0.3644578789.793903152.73507>])
    (ortex 0.1.9) lib/ortex/model.ex:55: Ortex.Model.run/2

x_test_scaled[0] is just the first row:

iex(18)> x_test_scaled[0]
#Nx.Tensor<
  f32[92]
  [0.19482958316802979, 0.08041504770517349, -1.9489672183990479, 2.886521100997925, -0.4316384792327881, -0.3034050166606903, -0.2510624825954437, -0.08099287003278732, -0.010336779989302158, -0.2686353623867035, -0.2567836046218872, -0.25082194805145264, -0.23275044560432434, -0.24425934255123138, -0.31930288672447205, -0.3787024915218353, -0.27744582295417786, -0.029247770085930824, 0.7984414100646973, -0.47261980175971985, -0.2207702398300171, -0.4064401388168335, -0.06632958352565765, -0.031023602932691574, 0.0, -0.0206768736243248, -0.12976393103599548, -0.0206768736243248, -0.15836477279663086, -0.07027661055326462, -1.154808521270752, -0.034301552921533585, -0.1615263819694519, 1.3659762144088745, -0.2785608470439911, 1.2954195737838745, -0.27967265248298645, 0.35717639327049255, 1.4155744314193726, -0.48122650384902954, -0.6723564863204956, -0.402685284614563, -0.5743894577026367, -0.42710113525390625, -0.08297397941350937, -0.08165843784809113, -0.07546283304691315, -0.09343120455741882, -0.07104019820690155, -0.09284767508506775, ...]
>

Got it - looks like the call on x_test_scaled runs, but Ortex fails to parse the output.

The problem is in your model’s output: we’ve got {"output_probability", "Sequence(Map {key: Int64, value: Float32,},)", nil}. However, Ortex does not currently support Sequences/Maps; it expects the output to be a list of tensors. That’s why there’s a panic: Ortex is expecting a tensor but finding a sequence/map structure.

Is there an easy way to convert your model’s output to a list of tensors?


I’m using scikit-learn’s LogisticRegression to generate my model in Python and then using sklearn-onnx to convert it to an ONNX model.

I’m gonna take a look to see if I can change that when converting the model.

Thanks for the help so far

I was able to change the output and now the model works fine, thanks @gregszumel !
