Elixir gzcompress string

Is there any way to use gzcompress on strings and put them in the database?

1 Like

NB! EDIT: this isn’t pure gzip - see Nobbz’ answer below.

large_string = "Very large string"
binary_compressed = :erlang.term_to_binary(large_string, [:compressed])
binary_uncompressed = :erlang.term_to_binary(large_string)

You can check with byte_size/1 that the compression worked:
byte_size binary_compressed
byte_size binary_uncompressed

decode:
decoded = :erlang.binary_to_term binary_compressed

NOTE: the string must be very long for the compression to kick in (or use large_string = Enum.to_list(1..1_000_000)), otherwise the byte sizes will be identical. :erlang.binary_to_term/1 figures out automatically whether the binary is compressed or not.
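To see the note above in action, here is a minimal sketch using a long, repetitive string (the repeated text is just an illustration):

```elixir
# Repetitive data compresses well; a short string would not shrink at all.
large_string = String.duplicate("Very large string. ", 1_000)

binary_compressed = :erlang.term_to_binary(large_string, [:compressed])
binary_uncompressed = :erlang.term_to_binary(large_string)

# The compressed form is a small fraction of the uncompressed one
true = byte_size(binary_compressed) < byte_size(binary_uncompressed)

# binary_to_term detects the compression header on its own
^large_string = :erlang.binary_to_term(binary_compressed)
```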

If you want to build this into Ecto, you can take a look at https://github.com/danielberkompas/cloak for inspiration.
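For a rough idea of what such an integration could look like, here is a sketch of a custom Ecto type that compresses on the way in and decompresses on the way out. The module name is made up, it assumes Ecto 3’s `Ecto.Type` behaviour, and it is not Cloak’s API - just an illustration:

```elixir
# Hypothetical custom type; stores gzip-compressed bytes in a :binary column.
defmodule MyApp.GzippedString do
  use Ecto.Type

  # Underlying database type: raw bytes
  def type, do: :binary

  # Accept plain strings from changesets
  def cast(string) when is_binary(string), do: {:ok, string}
  def cast(_), do: :error

  # Compress on the way into the database
  def dump(string) when is_binary(string), do: {:ok, :zlib.gzip(string)}
  def dump(_), do: :error

  # Decompress on the way out
  def load(compressed) when is_binary(compressed), do: {:ok, :zlib.gunzip(compressed)}
end
```

In a schema you would then write something like `field :body, MyApp.GzippedString`.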

2 Likes

This is not what was asked for. The result of :erlang.term_to_binary/* is not a plain gzip-compressed binary; it carries extra meta-information needed to convert it back into Erlang terms. This is the so-called “External Term Format”.

I’d rather suggest looking at the :zlib module, especially :zlib.gzip/1 and :zlib.gunzip/1.
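A minimal round trip with those two functions (the sample payload is made up):

```elixir
payload = ~s({"name": "example", "values": [1, 2, 3]})

# :zlib.gzip/1 produces a standard gzip stream, so the stored value can
# also be decompressed by any other gzip-aware tool or language.
compressed = :zlib.gzip(payload)

# and back again
^payload = :zlib.gunzip(compressed)
```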

5 Likes

Also, as I recall, PostgreSQL already compresses string fields?

1 Like

Kind of, yes. However, I’ve heard it’s good to compress larger JSON objects before storing them.

1 Like