I have an ONNX model that I want to load in Elixir.
To do that, I’m using the Ortex library. I can load the model without any problem using the Ortex.load
function, and it gives me the following model:
#Ortex.Model<
  inputs: [
    {"float_input",
     "Tensor {\n ty: Float32,\n dimensions: [\n 2341,\n 92,\n ],\n}",
     [2341, 92]}
  ]
  outputs: [
    {"output_label",
     "Tensor {\n ty: Int64,\n dimensions: [\n 2341,\n ],\n}",
     [2341]},
    {"output_probability",
     "Sequence(\n Map {\n key: Int64,\n value: Float32,\n },\n)",
     nil}
  ]>
The issue I’m having is when trying to run this model. It expects "float_input"
as the input name, but I don’t know how to pass that to the run function.
If I just pass the tensor data, I get the following error:
iex(19)> Ortex.run(model, x_test_scaled[0])
** (RuntimeError) Failed to run inference on model: Invalid rank for input: float_input Got: 1 Expected: 2 Please fix either the inputs/outputs or the model.
(ortex 0.1.9) lib/ortex/model.ex:51: Ortex.Model.run/2
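Reading that error, the model seems to want a rank-2 tensor (batch × features), while x_test_scaled[0] is a rank-1 row of 92 floats. My guess was that adding a leading batch axis might satisfy the shape check, something like this (untested sketch; assumes x_test_scaled[0] is the f32[92] row shown above):

```elixir
# Guess: add a leading batch axis so the input becomes rank 2 ([1, 92])
input = Nx.new_axis(x_test_scaled[0], 0)
Ortex.run(model, input)
```

But even if that fixes the rank, it still doesn’t tell me how the "float_input" name is supposed to be supplied.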
If I try to send something as the float_input, I get the following error:
iex(19)> Ortex.run(model, {Nx.tensor(0), x_test_scaled[0]})
** (RuntimeError) Failed to run inference on model: input name cannot be empty
(ortex 0.1.9) lib/ortex/model.ex:51: Ortex.Model.run/2
I also tried creating an Nx.Container
and passing that to run:
defmodule MyData do
  @derive {Nx.Container, containers: [:float_input]}
  defstruct [:float_input]
end
iex(11)> Ortex.run(model, %MyData{float_input: x_test_scaled})
** (FunctionClauseError) no function clause matching in anonymous fn/1 in Ortex.Model.run/2
The following arguments were given to anonymous fn/1 in Ortex.Model.run/2:
# 1
%MyData{
float_input: #Nx.Tensor<
f32[2341][92]
Ortex.Backend
[
[0.19482958316802979, 0.08041504770517349, -1.9489672183990479, 2.886521100997925, -0.4316384792327881, -0.3034050166606903, -0.2510624825954437, -0.08099287003278732, -0.010336779989302158, -0.2686353623867035, -0.2567836046218872, -0.25082194805145264, -0.23275044560432434, -0.24425934255123138, -0.31930288672447205, -0.3787024915218353, -0.27744582295417786, -0.029247770085930824, 0.7984414100646973, -0.47261980175971985, -0.2207702398300171, -0.4064401388168335, -0.06632958352565765, -0.031023602932691574, 0.0, -0.0206768736243248, -0.12976393103599548, -0.0206768736243248, -0.15836477279663086, -0.07027661055326462, -1.154808521270752, -0.034301552921533585, -0.1615263819694519, 1.3659762144088745, -0.2785608470439911, 1.2954195737838745, -0.27967265248298645, 0.35717639327049255, 1.4155744314193726, -0.48122650384902954, -0.6723564863204956, -0.402685284614563, -0.5743894577026367, -0.42710113525390625, -0.08297397941350937, -0.08165843784809113, -0.07546283304691315, -0.09343120455741882, -0.07104019820690155, ...],
...
]
>
}
(ortex 0.1.9) lib/ortex/model.ex:49: anonymous fn/1 in Ortex.Model.run/2
(elixir 1.16.2) lib/enum.ex:1700: Enum."-map/2-lists^map/1-1-"/2
(ortex 0.1.9) lib/ortex/model.ex:49: Ortex.Model.run/2
In Python, I can easily run the model using the following code:
onnx_session = onnxruntime.InferenceSession("logistic_regression_model.onnx")
output = onnx_session.run(None, {"float_input": input_data})
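What I was hoping for is an Elixir equivalent of that call. Since Ortex doesn’t seem to take a name-keyed map, maybe it matches inputs positionally, in which case passing the full rank-2 tensor might already line up with "float_input". A sketch of what I mean (untested; assumes the outputs come back as a tuple in the order shown in the model inspect):

```elixir
# Guess: if inputs are matched positionally rather than by name,
# the full [2341, 92] tensor may correspond to "float_input"
{output_label, output_probability} = Ortex.run(model, x_test_scaled)
```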
What am I doing wrong?