Is there any way to use gzcompress on strings and put them in the database?
NB! EDIT: this isn't pure gzip, see NobbZ's answer below.
large_string = "Very large string"
binary_compressed = :erlang.term_to_binary(large_string, [:compressed])
binary_uncompressed = :erlang.term_to_binary(large_string)
You can check via byte_size/1 that the compression worked:
decoded = :erlang.binary_to_term(binary_compressed)
NOTE: the string must be fairly long for the compression to kick in (or use large_string = Enum.to_list(1..1_000_000)); otherwise the byte sizes will be identical. :erlang.binary_to_term/1 figures out automatically whether the binary is compressed or not.
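To make the size difference visible, here is a small sketch of the comparison described above (the repeated test string is my own choice, just to give the compressor something repetitive to work with):

```elixir
# A long, repetitive string compresses well in External Term Format.
large_string = String.duplicate("Very large string ", 1_000)

binary_compressed = :erlang.term_to_binary(large_string, [:compressed])
binary_uncompressed = :erlang.term_to_binary(large_string)

# The compressed form is much smaller for repetitive data...
true = byte_size(binary_compressed) < byte_size(binary_uncompressed)

# ...and binary_to_term detects the compression automatically,
# so decoding needs no extra options.
^large_string = :erlang.binary_to_term(binary_compressed)
```

Either binary can be written to a bytea/blob column as-is.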
If you want to build this into Ecto, you can take a look at https://github.com/danielberkompas/cloak for inspiration.
This is not what was asked for. The result of
:erlang.term_to_binary/* is not a plain gzip-compressed binary; it carries meta-information needed to convert it back into Erlang terms. This is the so-called "External Term Format".
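For actual gzip output, a sketch using Erlang's :zlib module (usable directly from Elixir) would look like this. Note, as an aside not from the thread: PHP's gzcompress emits the zlib/deflate format, which corresponds to :zlib.compress/1, while :zlib.gzip/1 produces a standard gzip stream:

```elixir
large_string = String.duplicate("Very large string ", 1_000)

# Standard gzip stream, readable by any gzip tool:
gzipped = :zlib.gzip(large_string)
^large_string = :zlib.gunzip(gzipped)

# zlib/deflate format (the format PHP's gzcompress produces):
deflated = :zlib.compress(large_string)
^large_string = :zlib.uncompress(deflated)
```

Unlike binary_to_term, the zlib functions store no metadata about the original term, so they only make sense for plain binaries/strings.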
Also, as I recall, PostgreSQL already compresses string fields?
Yes, kind of. However, I've heard it's still worthwhile to compress larger JSON objects before storing them.