Axon: How come my predicted values have the same shape as my input tensor?

I am slowly but surely making my way through learning about neural networks with Axon, and I have run into a question. My predicted values have the same shape as my input tensor (60 x 60 x 1), and I am not sure why. I expected the predictions to have the shape {60} (i.e. a single vector with 60 elements, one per sample), but that is not the case.

In case you need it, here is my model:

input = Axon.input("input", shape: {prediction_days, 1})

model =
  input
  |> Axon.lstm(50)
  # lstm/2 returns {output_sequence, state}; keep only the sequence
  |> then(fn {seq, _} -> seq end)
  |> Axon.dropout(rate: 0.2)
  |> Axon.lstm(50)
  |> then(fn {seq, _} -> seq end)
  |> Axon.dropout(rate: 0.2)
  |> Axon.lstm(50)
  |> then(fn {seq, _} -> seq end)
  |> Axon.dropout(rate: 0.2)
  # dense/2 transforms the last axis of whatever it receives
  |> Axon.dense(1, activation: :sigmoid)

And here is my prediction code:

params =
  model
  |> Axon.Loop.trainer(:mean_squared_error, :adam)
  |> Axon.Loop.metric(:mean_absolute_error, "MAE")
  |> Axon.Loop.run(Stream.zip(batched_x_train, batched_y_train), %{}, epochs: 25, compiler: EXLA)

y_pred = Axon.predict(model, params, x_test)

What am I missing here? Thanks!

I am a bit confused as to how you could end up with the same input and output shapes as well. A little tip: on the Axon main branch you can use the display functions to trace shapes through the graph. What does Axon.Display.as_table/2 give you? Usage looks something like:

Axon.Display.as_table(model, x_test) |> IO.puts
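
If you just want the final output shape rather than the whole table, a quick sanity check (a rough sketch that simply reuses the Axon.predict/3 call from your training code) is to run a batch through the trained params and inspect the shape:

y_pred = Axon.predict(model, params, x_test)
# {60, 60, 1} here would be consistent with the dense layer being applied to
# every timestep of the LSTM sequence rather than only to the last one.
IO.inspect(Nx.shape(y_pred))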

Hi, @seanmor5, thanks for the reply. When I run the given code, I get the following error:

** (BadFunctionError) expected a function, got: {:tuple, [#Function<49.132341458/2 in Axon.lstm/4>, #Function<49.132341458/2 in Axon.lstm/4>, #Function<49.132341458/2 in Axon.lstm/4>, #Function<49.132341458/2 in Axon.lstm/4>]}
    (axon 0.2.0) lib/axon/display.ex:215: anonymous fn/2 in Axon.Display.render_parameters/2
    (elixir 1.14.0) lib/enum.ex:1658: Enum."-map/2-lists^map/1-0-"/2
    (axon 0.2.0) lib/axon/display.ex:214: Axon.Display.render_parameters/2
    (axon 0.2.0) lib/axon/display.ex:189: Axon.Display.do_axon_to_rows/6
    (axon 0.2.0) lib/axon/display.ex:86: Axon.Display.axon_to_rows/6
    (axon 0.2.0) lib/axon/display.ex:152: anonymous fn/4 in Axon.Display.do_axon_to_rows/6
    (elixir 1.14.0) lib/enum.ex:1780: Enum."-map_reduce/3-lists^mapfoldl/2-0-"/3

I think this is either due to my use of then/2 in the model, or perhaps because I am running Axon off the GitHub main branch rather than from Hex? I am not sure.

That’s a bug; if you open an issue, I will fix it tonight!

Thanks!

Edit: For future (Google-able) reference, here is the (now closed) GitHub issue: Axon.Display.as_table(model, x_test) gives error when using LSTM models with .then in them · Issue #388 · elixir-nx/axon · GitHub

So, I updated my model to use Nx.squeeze (I actually had to use Nx.gather in the end), but my predictions still seem way off (they almost form a horizontal line in VegaLite). If anyone wants to take a look at my (somewhat messy) code and help explain what’s going on, it is here: livebook-axon-test/testing-axon-and-vegalite.livemd at main · danieljaouen/livebook-axon-test · GitHub
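
For reference, a minimal sketch of that kind of post-processing (not the exact code in the Livebook, which ended up using Nx.gather; it assumes y_pred comes back shaped {60, 60, 1}, that only the final timestep of each sequence is wanted, and that Nx.slice_along_axis/4 is available):

# Drop the trailing size-1 axis: {60, 60, 1} -> {60, 60}
squeezed = Nx.squeeze(y_pred, axes: [2])

# Keep only the final timestep of each 60-step sequence: {60, 60} -> {60}
last_step =
  squeezed
  |> Nx.slice_along_axis(prediction_days - 1, 1, axis: 1)
  |> Nx.squeeze(axes: [1])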

Unless someone can explain it to me, I think I give up for now. Thanks anyway, @seanmor5.