Difference between JavaScript hex encoding and Elixir

I’m working with an API whose authentication examples are written in JavaScript. Reproducing the logic in Elixir, I get a different string from hex encoding, which seems wrong.


$ node
crypto.createHmac('sha384', 'some_secret').update('1234').digest('hex')
> '1928a6349d1b71bc23e1aa3e143aeecfb0b5c8b5b7eb24c08d08dc5d91f4c1a7c51e9fca851c628fb5697264ff1804e8'
crypto.createHmac('sha384', 'some_secret').update('1234').digest('base64')
> 'GSimNJ0bcbwj4ao+FDruz7C1yLW36yTAjQjcXZH0wafFHp/KhRxij7VpcmT/GATo'


$ iex
iex>:crypto.hmac(:sha384, 'some_secret', '1234') |> Base.hex_encode32(case: :lower, padding: false)
iex> :crypto.hmac(:sha384, 'some_secret', '1234') |> Base.encode64

From the above, you can see that the base64-encoded versions return the same value, but the hex-encoded versions differ. I’m getting a bad-signature error from the API, and I need to find the magic incantation in Elixir to match the JavaScript behaviour.

FYI: you can reproduce both of the above in the standard REPLs; no dependencies are needed.

In JavaScript you are converting it to base16 (that is what 'hex' means there: encoded to base 16).

In Elixir you are encoding to base32, not base16 (`Base.hex_encode32/2` is base32 using the RFC 4648 extended hex alphabet), which is why the results differ. You should be encoding with `Base.encode16/2` instead.


Thanks for the quick response, that solves it. In hindsight I’m not sure whether it should have been obvious.


Honestly, I think the Elixir name is a bit off: the ‘hex’ in the name visually overrides the ‘32’ and makes me think of base16 encoding. ^.^;