I just picked up the Machine Learning in Elixir book and am already stuck in Chapter 1.
I am working through the first Livebook example, and my results are slightly different from the book's.
My output:

```
#Explorer.DataFrame<
  Polars[150 x 5]
  sepal_length f64 [5.1, 4.9, 4.7, 4.6, 5.0, ...]
```

while the book shows:

```
#Explorer.DataFrame<
  Polars[150 x 5]
  sepal_length float [5.1, 4.9, 4.7, 4.6, 5.0, ...]
```
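I assume the `float` vs `f64` difference just comes from a newer Explorer being pulled in. As a sanity check (my own snippet, not from the book), the resolved versions can be printed in a Livebook cell with something like:

```elixir
# List the versions Mix.install actually resolved
for app <- [:axon, :nx, :explorer, :kino] do
  {app, Application.spec(app, :vsn)}
end
```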
I included the setup code below, as per the book's instructions:
```elixir
Mix.install([
  {:axon, "~> 0.5"},
  {:nx, "~> 0.5"},
  {:explorer, "~> 0.5"},
  {:kino, "~> 0.8"}
])
```
but the version of Axon that actually resolves for me is 0.7, which might be related…
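If version drift is the culprit, one thing I could try is pinning the exact releases the book was written against instead of the `~>` ranges; something like this (the exact patch versions are my guess):

```elixir
Mix.install([
  # Pin exact versions so newer releases aren't pulled in
  {:axon, "== 0.5.1"},
  {:nx, "== 0.5.3"},
  {:explorer, "== 0.5.7"},
  {:kino, "== 0.8.1"}
])
```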
When I try to evaluate this block:
```elixir
trained_model_state =
  model
  |> Axon.Loop.trainer(:categorical_cross_entropy, :sgd)
  |> Axon.Loop.metric(:accuracy)
  |> Axon.Loop.run(data_stream, %{}, iterations: 500, epochs: 10)
```
I get the following warning and error:
```
20:17:36.449 [warning] passing parameter map to initialization is deprecated, use %Axon.ModelState{} instead

Epoch: 0, Batch: 0, accuracy: 0.0250000 loss: 0.0000000

** (ArgumentError) argument at position 3 is not compatible with compiled function template.

%{
  i: #Nx.Tensor<
    s32
  >,
  model_state: #Inspect.Error<
    got Protocol.UndefinedError with message:

        """
        protocol Enumerable not implemented for #Nx.Tensor<
          f32[3]
        > of type Nx.Defn.TemplateDiff (a struct). This protocol is implemented
        for the following type(s): Date.Range, Explorer.Series.Iterator,
        File.Stream, Function, GenEvent.Stream, HashDict, HashSet, IO.Stream,
        Kino.Control, Kino.Input, Kino.JS.Live, List, Map, MapSet, Range,
        Stream, Table.Mapper, Table.Zipper
        """

    while inspecting:

        %{
          data: %{
            "dense_0" => %{
              "bias" => #Nx.Tensor<f32[3]>,
              "kernel" => #Nx.Tensor<f32[4][3]>
            }
          },
          state: %{},
          __struct__: Axon.ModelState,
          parameters: %{"dense_0" => ["bias", "kernel"]},
          frozen_parameters: %{}
        }

    Stacktrace:

        (elixir 1.17.1) lib/enum.ex:1: Enumerable.impl_for!/1
        (elixir 1.17.1) lib/enum.ex:166: Enumerable.reduce/3
        (elixir 1.17.1) lib/enum.ex:4423: Enum.reduce/3
        (axon 0.7.0) lib/axon/model_state.ex:359: anonymous fn/2 in Inspect.Axon.ModelState.get_param_info/1
        (stdlib 6.0) maps.erl:860: :maps.fold_1/4
        (axon 0.7.0) lib/axon/model_state.ex:359: anonymous fn/2 in Inspect.Axon.ModelState.get_param_info/1
        (stdlib 6.0) maps.erl:860: :maps.fold_1/4
        (axon 0.7.0) lib/axon/model_state.ex:320: Inspect.Axon.ModelState.inspect/2
  >,
  optimizer_state: {%{scale: #Nx.Tensor<f32>}},
  loss_scale_state: %{},
  loss:
    <<<<< Expected <<<<<
    #Nx.Tensor<
      f32
    >
    ==========
    #Nx.Tensor<
      f64
    >
    >>>>> Argument >>>>>,
  y_pred: #Nx.Tensor<f64[120][3]>,
  y_true: #Nx.Tensor<u8[120][3]>
}

    (nx 0.9.1) lib/nx/defn.ex:342: anonymous fn/7 in Nx.Defn.compile_flatten/5
    (nx 0.9.1) lib/nx/lazy_container.ex:73: anonymous fn/3 in Nx.LazyContainer.Map.traverse/3
    (elixir 1.17.1) lib/enum.ex:1829: Enum."-map_reduce/3-lists^mapfoldl/2-0-"/3
    (elixir 1.17.1) lib/enum.ex:1829: Enum."-map_reduce/3-lists^mapfoldl/2-0-"/3
    (nx 0.9.1) lib/nx/lazy_container.ex:72: Nx.LazyContainer.Map.traverse/3
    (nx 0.9.1) lib/nx/defn.ex:339: Nx.Defn.compile_flatten/5
    (nx 0.9.1) lib/nx/defn.ex:331: anonymous fn/4 in Nx.Defn.compile/3
    #cell:lze5noxdnhytfee2:5: (file)
```
If we ignore the deprecation warning, I guess the message is telling me that the compiled training step expected an f32 loss tensor but got an f64 one.
Is this due to my version of Axon, or am I missing something else? I have run it both on the book's recommended Elixir/Erlang pairing (Elixir 1.14.3, compiled with Erlang/OTP 25) and on Elixir 1.17.1 (compiled with Erlang/OTP 27).
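In case it helps frame the question: my current theory is that Explorer hands me f64 tensors (its float columns are 64-bit), while the model's parameters are f32, so the loss comes out f64 where the compiled template expects f32. A cast along these lines is my own guess at a workaround, not something from the book (`x_train`/`y_train` stand in for whatever feeds my `data_stream`):

```elixir
# My guess, not the book's code: cast features to f32 so the loss is
# computed in the same precision as the model's f32 parameters.
x_train = Nx.as_type(x_train, :f32)

# Rebuild the stream (shown here as a simple repeating stream; adjust
# to match however data_stream is actually constructed).
data_stream = Stream.repeatedly(fn -> {x_train, y_train} end)
```

Does that sound like the right fix, or is there a proper knob for this in Axon 0.7?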