# Matrix operation for "shoelace formula"

I’m not terribly familiar with matrix calculations, but I’d like to start using Nx to see if I can speed some things up. This question isn’t directly about Nx so much as about the appropriate term for this type of calculation in matrix lingo. To wit:

```
input = [{x1, y1}, {x2, y2}, {x3, y3}]

matrix = | x1  x2  x3 |
         | y1  y2  y3 |

output = x1 * y2 + x2 * y3 + x3 * y1 - x2 * y1 - x3 * y2 - x1 * y3
```

Is there a name for that matrix operation?

For whatever it’s worth, the `numpy` implementation of that algorithm doesn’t do anything special:

```
Area = np.abs(np.sum(x[i-1]*y[i] - x[i]*y[i-1]) * 0.5)  # one line of code for the shoelace formula
```
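As quoted, that line depends on an index array `i` defined elsewhere in the surrounding code; a self-contained sketch (with `i` built via `np.arange`, and the sample polygon from the Rosetta Code task as stand-in data) might look like:

```python
import numpy as np

# Sample polygon from the Rosetta Code shoelace task
x = np.array([3.0, 5.0, 12.0, 9.0, 5.0])
y = np.array([4.0, 11.0, 8.0, 5.0, 6.0])

# x[i - 1] wraps around at i = 0, which conveniently closes the polygon
i = np.arange(len(x))
area = np.abs(np.sum(x[i - 1] * y[i] - x[i] * y[i - 1]) * 0.5)
print(area)  # 30.0
```

The wrap-around indexing is what saves it from needing an explicitly duplicated first vertex.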

That looks closely related to a determinant, but those are for square matrices, so I’m not sure what the generalized name might be.

```
| x1  x2  x3 |
| y1  y2  y3 |
```

I wonder if you could just do an element-wise multiplication against a shifted copy of the matrix (columns rotated by one, rows swapped):

```
| x1  x2  x3 | * | y2  y3  y1 | => | x1*y2  x2*y3  x3*y1 |
| y1  y2  y3 |   | x2  x3  x1 |    | y1*x2  y2*x3  y3*x1 |
```

So then you take the sum of the top row and subtract the sum of the bottom row?
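Since numpy already came up, here’s a rough sketch of that idea in numpy terms (only standing in for Nx; `np.roll` builds the column-rotated copy and `[::-1]` swaps the rows):

```python
import numpy as np

# 2 x N matrix: row 0 holds the xs, row 1 the ys (Rosetta Code sample polygon)
m = np.array([[3.0, 5.0, 12.0, 9.0, 5.0],
              [4.0, 11.0, 8.0, 5.0, 6.0]])

# Rotate the columns left by one and swap the rows, so element-wise
# multiplication pairs each x with the *next* y and each y with the next x
shifted = np.roll(m, -1, axis=1)[::-1]
prod = m * shifted    # row 0: x1*y2, x2*y3, ...; row 1: y1*x2, y2*x3, ...
area = abs(prod[0].sum() - prod[1].sum()) / 2
print(area)  # 30.0
```

Summing the top row and subtracting the bottom row’s sum is exactly the signed double-area; the absolute value and halving finish the shoelace formula.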

I’ve been recently goofing around with Nx a bunch, including for AOC solutions, and I don’t claim to have a good handle on it, but I’ve been nerd sniped. This is what I tried in a Livebook. Sample inputs were based on the Rosetta Code link from upthread.

Shorter summary:

• Get to a `{2, N}` tensor via `Nx.transpose`
• Extend it along the axis with `Nx.tile`, compare with `List.duplicate`
• Slice along the same axis to get cells that are offset by one step from the original pairs
• Stack them a few times so that the tensor contains `[[first row, offset second row], [second row, offset first row]]`
• I didn’t quickly find a good way to do this without intermediate variables and Access syntax, but it’s probably quite possible for someone more deft
• Perform pair-wise multiplication along a single axis to get the `x1 * y2...` bits
• Negate the second row of one axis to subtract
• Sum, absolute value, divide by 2, wrapped in a `dbg` to see the shorter steps
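For comparison, the steps above can be sketched in numpy (only an analogue of the Nx pipeline, not the actual Livebook code; `np.tile`, slicing, and `np.stack` stand in for their Nx counterparts):

```python
import numpy as np

# {2, N} layout: row 0 is x, row 1 is y (Rosetta Code sample polygon)
m = np.array([[3.0, 5.0, 12.0, 9.0, 5.0],
              [4.0, 11.0, 8.0, 5.0, 6.0]])
n = m.shape[1]

tiled = np.tile(m, 2)                  # extend along the axis: {2, 2N}
off = tiled[:, 1:n + 1]                # cells offset by one step from the originals
# [[first row, offset second row], [second row, offset first row]]
stacked = np.stack([[m[0], off[1]], [m[1], off[0]]])
prods = stacked[:, 0] * stacked[:, 1]  # pair-wise multiply: x1*y2 ..., y1*x2 ...
prods[1] *= -1                         # negate the second row to subtract
area = np.abs(prods.sum()) / 2
print(area)  # 30.0
```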

Very happy to receive pointers and critique. I’m almost certain some of my steps could be elided, beyond just the places where I left intermediate steps in to illuminate how the data was shifting in the Markdown output. Also, I’ll bet this is far from optimal in terms of allocations.

Pretty much what I arrived at!


Brilliant! The code for the offset is a bit more complex than I expected. I would probably default to using the input lists to generate both the original and the offset as lists of lists and then make tensors of the results.

```
input = [[3, 4], [5, 11], [12, 8], [9, 5], [5, 6]]

offset =
  input
  |> Stream.cycle()
  |> Stream.drop(1)
  |> Stream.take(length(input))
  |> Enum.to_list()

[in_tensor, offset_tensor] =
  Enum.map([input, offset], fn vs -> Nx.tensor(vs, names: [:vertices, :coords]) end)
```
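The same rotate-by-one trick reads naturally in Python too, for comparison (a sketch using plain list slicing instead of a lazy stream; the numpy part just shows the end-to-end area on the same sample input):

```python
import numpy as np

inp = [[3, 4], [5, 11], [12, 8], [9, 5], [5, 6]]
offset = inp[1:] + inp[:1]   # rotate by one, wrapping the first vertex around

a, b = np.array(inp), np.array(offset)
# each vertex crossed with its successor: x_i*y_{i+1} - x_{i+1}*y_i
area = abs(np.sum(a[:, 0] * b[:, 1] - b[:, 0] * a[:, 1])) / 2
print(area)  # 30.0
```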

If you’re fine with doing the transform inside the vanilla Elixir surroundings, it’s tough to beat `Enum.slide`:

```
iex> Enum.slide(1..5, 0, -1)
[2, 3, 4, 5, 1]

iex> Enum.slide(1..5, 0..1, -1)
[3, 4, 5, 1, 2]

iex> Enum.slide(1..5, 2..3, 0)
[3, 4, 1, 2, 5]
```

Nice. I missed that when it was introduced in 1.13.