# Generate spiral data with Nx

Hi folks, I just started my journey into machine learning and decided it would be more fun if I used Elixir.

So I’m following along with the Neural Networks from Scratch book, which implements a NN using just Python and NumPy; in my case, Elixir and Nx. However, I’m stuck because the book has a function that generates some data, and I haven’t been able to reproduce it in Elixir. Can someone help me with that?

Here is the python code:

```python
import numpy as np

# Copyright (c) 2015 Andrej Karpathy
# Source: https://cs231n.github.io/neural-networks-case-study/
def create_data(samples, classes):
    X = np.zeros((samples*classes, 2))
    y = np.zeros(samples*classes, dtype='uint8')
    for class_number in range(classes):
        ix = range(samples*class_number, samples*(class_number+1))
        r = np.linspace(0.0, 1, samples)
        t = np.linspace(class_number*4, (class_number+1)*4, samples) + np.random.randn(samples)*0.2
        X[ix] = np.c_[r*np.sin(t*2.5), r*np.cos(t*2.5)]
        y[ix] = class_number
    return X, y
```
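As a quick sanity check of what the shapes should come out to in the Elixir port, the function returns `samples*classes` points in 2-D plus one label per point (function copied from above, seed added only to make the noise reproducible):

```python
import numpy as np

np.random.seed(0)  # only so repeated runs produce the same noise

def create_data(samples, classes):
    X = np.zeros((samples*classes, 2))
    y = np.zeros(samples*classes, dtype='uint8')
    for class_number in range(classes):
        ix = range(samples*class_number, samples*(class_number+1))
        r = np.linspace(0.0, 1, samples)
        t = np.linspace(class_number*4, (class_number+1)*4, samples) + np.random.randn(samples)*0.2
        X[ix] = np.c_[r*np.sin(t*2.5), r*np.cos(t*2.5)]
        y[ix] = class_number
    return X, y

X, y = create_data(100, 3)
print(X.shape)        # (300, 2): one (x, y) coordinate per point
print(y.shape)        # (300,): one class label per point
print(np.unique(y))   # [0 1 2]
```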

The linspace function is not available in the latest release of Nx, although it is present on the `main` branch, so I’m using that. But I have no idea how to translate the `np.c_` part or how to loop over the classes.


I think `np.c_` translates to either some form of `Nx.concatenate` or `Nx.stack`

For looping you can either vectorize the function altogether or use `while/for` inside a `defn`

edit: We don’t have `for`, but we do have `for`-style generators in `while`.
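For what it’s worth, on two 1-D arrays `np.c_` behaves like stacking along a new trailing axis, which is why `Nx.stack(..., axis: 1)` is the right translation. A quick NumPy check (array names are just for illustration):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# np.c_ turns two 1-D arrays into the columns of a (3, 2) matrix...
via_c = np.c_[a, b]

# ...which is exactly np.stack along axis=1 (Nx.stack(..., axis: 1) in Elixir)
via_stack = np.stack([a, b], axis=1)

print(via_c)
# [[1. 4.]
#  [2. 5.]
#  [3. 6.]]
print(np.array_equal(via_c, via_stack))  # True
```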

Thank you, that helped me out. I kinda managed to get the same results with:

```elixir
def create(samples, classes) do
  # Seed the accumulator with dummy rows, since Nx has no empty tensors;
  # the first row of each result has to be dropped afterwards.
  Enum.reduce(0..(classes - 1), {Nx.broadcast(0, {1, 2}), Nx.broadcast(0, {1, 1})}, fn i, acc ->
    r = linspace(0.0, 1, n: samples)

    # np.random.randn has mean 0 and std 1; scale the noise by 0.2 as in the Python
    t =
      linspace(i * 4, (i + 1) * 4, n: samples)
      |> Nx.add(Nx.random_normal({samples}, 0.0, 1.0, type: :f64) |> Nx.multiply(0.2))

    # np.c_ equivalent: stack the two coordinate vectors as columns
    tensor =
      Nx.concatenate([
        elem(acc, 0),
        Nx.stack(
          [Nx.multiply(r, Nx.sin(Nx.multiply(t, 2.5))), Nx.multiply(r, Nx.cos(Nx.multiply(t, 2.5)))],
          axis: 1
        )
      ])

    c = Nx.concatenate([elem(acc, 1), Nx.broadcast(i, {samples, 1})])

    {tensor, c}
  end)
end
```

I’m facing a major challenge, though: I can’t figure out how to plot this in a graph. It looks like I need to use Explorer, but I’m finding it hard to initialize a dataframe from this multi-dimensional tensor; all the examples use lists.

The second, minor issue is that it looks like there is no way to have an empty tensor, right? I had to initialize the `acc` with a `{1, 2}` tensor so things would work out, but in the final result I ended up with extra data I didn’t need. It was no big deal, but it would be nice if I could reproduce exactly what the Python does.
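For comparison, NumPy does allow zero-length arrays, which is what makes the grow-by-concatenation pattern work there without a dummy seed row (a small illustration, not a recommendation to grow arrays in a loop):

```python
import numpy as np

# NumPy permits a (0, 2) array: zero rows, two columns
empty = np.zeros((0, 2))
print(empty.shape)   # (0, 2)

# Concatenating real rows onto it leaves no leftover dummy data
rows = np.array([[1.0, 2.0], [3.0, 4.0]])
result = np.concatenate([empty, rows])
print(result.shape)  # (2, 2)
```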


You shouldn’t need Explorer for plotting.

`Nx.to_flat_list` might help when feeding the data to VegaLite. This post might provide some inspiration: https://dockyard.com/blog/2022/10/19/creating-polar-plots-in-elixir-part-2

As for the way you are using Enum.reduce, I’d recommend rethinking the code to use `defn` and `while` inside it. You could pre-allocate output tensors with `Nx.broadcast` and then update them in each iteration with `Nx.indexed_put`.
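In NumPy terms, that pre-allocate-and-fill pattern is the slice assignment the original Python code already uses; `Nx.indexed_put` (or `Nx.put_slice`) plays the role of `X[ix] = ...`. A rough analogy, shown with NumPy and dummy fill values:

```python
import numpy as np

samples, classes = 4, 3

# Pre-allocate the outputs once, like Nx.broadcast would
X = np.zeros((samples * classes, 2))
y = np.zeros(samples * classes, dtype="uint8")

for class_number in range(classes):
    ix = range(samples * class_number, samples * (class_number + 1))
    # In Nx, this in-place slice assignment is what Nx.indexed_put replaces
    X[ix] = class_number  # recognizable dummy value, just for the demo
    y[ix] = class_number

print(X.shape, y.shape)  # (12, 2) (12,)
print(y.tolist())        # [0, 0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 2]
```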
