Elixir Blog Posts



I spent a little time this week diving into property testing. I used StreamData to property test a Base58Check encoder (that we built in a previous article) against an external, command line oracle. StreamData made the whole process amazingly simple, and I can definitely see myself using property testing much more in the future.
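The oracle approach described above can be sketched roughly like this, assuming the `Base58Check.encode/1` module from the earlier article and a hypothetical `base58check` command-line oracle (both names are illustrative, not from this post):

```elixir
# A minimal property-test sketch: generate arbitrary binaries with StreamData
# and check that our encoder agrees with an external CLI oracle.
defmodule Base58CheckPropertyTest do
  use ExUnit.Case
  use ExUnitProperties

  property "encoder agrees with the command-line oracle" do
    check all bytes <- StreamData.binary(min_length: 1) do
      # Hypothetical oracle invocation; substitute whatever CLI tool you use.
      {oracle_output, 0} = System.cmd("base58check", [Base.encode16(bytes)])
      assert Base58Check.encode(bytes) == String.trim(oracle_output)
    end
  end
end
```

The appeal of an oracle test is that you never have to hand-write expected outputs; the property is simply "our implementation matches the reference".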


10 Elixir Tips Episode #7

Check out Episode 7 of 10 Killer Elixir Tips

Happy Coding…


8 posts were split to a new topic: FP/OCaml/Elixir/Scala


Hey all,

I continue my journey through Mastering Bitcoin. This week I read about using BIP-39 style mnemonics to memorize wallet seeds. The algorithm used to create them seemed like an excellent exercise to train my Elixir muscles. I think I’m starting to get a little too comfortable with binary manipulation in Elixir, haha.

If you’re curious, check out the article: From Bytes to Mnemonic using Elixir. Let me know what you think!
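For readers who haven't clicked through yet, the BIP-39 flow the article walks through can be sketched like this for 128-bit entropy (module and argument names here are illustrative; the real wordlist has 2048 entries):

```elixir
# BIP-39 sketch: append a SHA-256 checksum to the entropy, then read the
# result as a sequence of 11-bit indexes into the wordlist.
defmodule Mnemonic do
  def from_entropy(entropy, wordlist) when bit_size(entropy) == 128 do
    # The checksum is the first (entropy_bits / 32) bits of SHA-256(entropy).
    checksum_size = div(bit_size(entropy), 32)
    <<checksum::bits-size(checksum_size), _::bits>> = :crypto.hash(:sha256, entropy)

    # 128 entropy bits + 4 checksum bits = 132 bits = twelve 11-bit chunks.
    for <<index::11 <- <<entropy::bits, checksum::bits>>>> do
      Enum.at(wordlist, index)
    end
  end
end
```

The binary comprehension over 11-bit chunks is exactly the kind of binary manipulation the post is talking about.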


Some GDPR compliance tips, inspired by the upcoming deadline (May 25th) :laughing:


I would probably generate a lookup table out of function heads of the wordlist at compile time to replace

for <<chunk::11 <- entropy>> do
  Enum.at(wordlist, chunk)
end


@spec lookup_in_wordlist!(pos_integer) :: String.t | no_return
defp lookup_in_wordlist!(chunk)

# or something like that (I would probably replace fetch_env with File.stream!)
for {word, index} <- :bip39 |> Application.fetch_env!(:wordlist) |> Enum.with_index() do
  defp lookup_in_wordlist!(unquote(index)), do: unquote(word)
end

defp lookup_in_wordlist!(unmatched_chunk) do
  raise("No match for chunk: #{unmatched_chunk}")
end

@spec entropy_to_wordlist!(bits) :: [String.t] | no_return
def entropy_to_wordlist!(entropy) do
  for <<chunk::11 <- entropy>> do
    lookup_in_wordlist!(chunk)
  end
end

It also doesn’t load the whole wordlist into memory on every call; the lookup table is compiled in with the rest of the code once.


Great article. I’m a bit hesitant about the encryption part, though …

I usually try to follow https://andre.arko.net/2014/09/20/how-to-safely-store-user-data/


Wow, that’s an awesome idea. I was considering building out a map of index to word to speed up lookup, but I never would have considered programmatically building out function heads to match on indexes. Thanks!


HTTP is message passing, Why I stopped using Plug

An article articulating in more detail why I have been spending time building an alternative to Plug for the web.


Opinionated vs modular web frameworks

I see the reasoning, but I think Plug should be considered more like a pipeline: literally lots of |> calls transforming the request at each step until eventually returning a response. And it is indeed done in its own process, so it still seems conceptually similar?
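To make the pipeline reading concrete, here is a minimal module plug using Plug's real contract (`init/1` and `call/2`); the handler itself is just |> composition over the conn:

```elixir
# A plug is a function (or init/call pair) from a conn to a conn;
# a pipeline of plugs is ordinary function composition.
defmodule HelloPlug do
  import Plug.Conn

  def init(opts), do: opts

  def call(conn, _opts) do
    conn
    |> put_resp_content_type("text/plain")
    |> send_resp(200, "ok")
  end
end
```

Whether that model survives contact with impure operations like body reading is exactly what the thread below debates.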


I had to look into what sort of encryption was mentioned.

Your example has Cipher.encrypt and Cipher.decrypt, but following this it is just plain AES-CBC-128 without any authentication. This means it is open to padding-oracle attacks, which are fast and straightforward to implement.

The article even states:

"Cipher is a popular library for Elixir, based on Erlang’s crypto module. It makes it easy to encrypt and decrypt data"

It may be popular and it may be easy to use, but the way they have documented its usage is not secure in its current form. At the very least they should add a MAC of some sort (and do a constant-time comparison when checking it), or use an authenticated cipher like AES-GCM.
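For anyone wondering what the authenticated alternative looks like, here is a sketch using AES-GCM straight from Erlang's :crypto module (the `crypto_one_time_aead` API, available in OTP 22+); the payload string is just an example:

```elixir
# AES-GCM authenticates the ciphertext with a tag, so tampering is detected
# at decrypt time instead of leaking information through padding errors.
key = :crypto.strong_rand_bytes(16)   # 128-bit key
iv  = :crypto.strong_rand_bytes(12)   # 96-bit nonce; must be unique per key

{ciphertext, tag} =
  :crypto.crypto_one_time_aead(:aes_128_gcm, key, iv, "secret payload", "", true)

# Decryption returns :error if the ciphertext, tag, or associated data
# was altered; otherwise it yields the plaintext.
"secret payload" =
  :crypto.crypto_one_time_aead(:aes_128_gcm, key, iv, ciphertext, "", tag, false)
```

Note the nonce must never be reused with the same key, and both key and nonce need to travel with (or be derivable for) the ciphertext.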

This feels too loose for me. It doesn’t feel like he understands crypto, and there is too much indirection to other posts, which also doesn’t make it super clear what they are doing. It is very light on details. It may be that he understands all of it, but he clearly doesn’t communicate it well.

Keyless encryption. What they are doing is deriving the encryption key from user data. The post doesn’t discuss what key-derivation algorithm they use, or what sort of crypto protocol actually protects the data.

Pubkey encryption. This is never used to protect bulk “data” as such; it is way too slow for that. Instead it is used in hybrid crypto protocols, or simply to exchange symmetric keys with each other. The section sort of mentions this, but public-key encryption is there to exchange keys without giving away a secret key, not to encrypt data.

Plain old encryption. Well, this is symmetric crypto, which is the only practical way to do bulk encryption, and it is used in virtually every crypto protocol. But he is too light on the details, such as how to securely exchange keys (if needed) and which algorithms to use or avoid, and he doesn’t mention authentication at all. I don’t even know what “plain old encryption” means.


My biggest problem with Plug is the places where you can’t consider it a transformation of a request.

Reading the body is an impure action and is unreliable if done multiple times. I had a real-world issue where we were using message signing to authenticate requests. Our authentication plug read the body (needed to check the signature), and then the Parser plug broke because it tried to read the body (directly from the socket) again. We had to rewrite the parser, which is hardly a good example of composability.
If the request (conn) were just a data structure the model would be fine, but that is not the case.
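For what it's worth, current Plug does offer a workaround for this exact scenario: Plug.Parsers accepts a `:body_reader` option, so you can cache the raw body on first read rather than hitting the socket twice. A sketch (the module name is illustrative):

```elixir
# A body reader that stashes the raw body in conn.private so a later plug
# (e.g. a signature check) can see the same bytes the parser consumed.
defmodule CacheBodyReader do
  def read_body(conn, opts) do
    {:ok, body, conn} = Plug.Conn.read_body(conn, opts)
    conn = Plug.Conn.put_private(conn, :raw_body, body)
    {:ok, body, conn}
  end
end

# Wired into the pipeline:
# plug Plug.Parsers,
#   parsers: [:json],
#   json_decoder: Jason,
#   body_reader: {CacheBodyReader, :read_body, []}
```

It works, though it rather proves the point above: the conn is not just a data structure, and you have to opt in to making body reads repeatable.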


Thanks for spotting this! I noticed you opened an issue on GitHub as well — let’s take the discussion there.

For anyone interested: https://github.com/rubencaro/cipher/issues/18


Benchee, a slow query, ecto, Postgres, explain analyze, a gripping story!



Hah, good read. ^.^


Which tool are you using now? How do you recommend getting started with it?


We’re using PryIn, but there are multiple options. AppSignal is out there and also looks nice; it crashed our app once in the early days, but they fixed it. The Adopting Elixir book also mentions WombatOAM and Scout. There’s also a Skylight agent, but work on that sadly seems to have somewhat frozen.

@OvermindDL1 :wave: thanks :grin:


There is also https://github.com/deadtrickster/prometheus.erl for prometheus.


I was focusing on SaaS solutions, sorry, I should have mentioned that. If you go beyond SaaS there are a bunch more: there’s Exometer, InfluxDB, and many others…


Things got a little weird last week. I used the Bitcoin BIP-39 mnemonic generator I built a few weeks ago to build a “mnemonic haiku miner”. The basic idea was to create a stream that generates BIP-39 mnemonics and filters out those that don’t meet the structural requirements of a haiku.

The Haiku.is_valid? function is probably a good opportunity for code golfing. I’m sure there are better solutions out there.

Let me know what you think!
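The mining loop described above can be sketched as a lazy stream, assuming a `Mnemonic.generate/0` that produces a random mnemonic and a `Haiku.is_valid?/1` that checks the 5-7-5 syllable structure (both names stand in for the code from the articles):

```elixir
# Generate mnemonics forever, keep only the ones that scan as a haiku,
# and take the first hit. Streams keep the whole thing lazy, so nothing
# is generated until Enum.take/2 starts pulling.
Stream.repeatedly(&Mnemonic.generate/0)
|> Stream.filter(&Haiku.is_valid?/1)
|> Enum.take(1)
```

Because valid haikus are rare, almost all the work happens in the filter step, which is what makes it feel like "mining".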