I’m responding to a request with:
Plug.Conn.send_resp(conn, 200, Jason.encode_to_iodata!(large_map))
where large_map has ~100K elements (each element is small, no big strings or similar). However, when the response is actually sent, the memory usage of the app increases by around 100-150 MB.
My question is: what is the best way to handle this kind of response? I can think of partial content, multipart, or streaming, or maybe I'm missing something?
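To make the streaming option concrete, here is a minimal sketch of a chunked response with Plug. The module name, the helper name, and the chunk size of 1_000 entries are all my own assumptions, not anything from this thread; the idea is just that encoding the map in slices avoids materialising the full iodata at once:

```elixir
defmodule MyApp.StreamJSON do
  # Hypothetical helper: stream large_map as a single JSON object,
  # encoding and sending it in slices of 1_000 entries each.
  import Plug.Conn

  def send_large_map(conn, large_map) do
    conn =
      conn
      |> put_resp_content_type("application/json")
      |> send_chunked(200)

    {:ok, conn} = chunk(conn, "{")

    conn =
      large_map
      |> Stream.chunk_every(1_000)
      |> Stream.with_index()
      |> Enum.reduce(conn, fn {entries, idx}, conn ->
        # Encode each {key, value} pair separately; only this slice
        # exists as iodata at any point in time.
        body =
          entries
          |> Enum.map(fn {k, v} ->
            [Jason.encode_to_iodata!(to_string(k)), ":", Jason.encode_to_iodata!(v)]
          end)
          |> Enum.intersperse(",")

        prefix = if idx == 0, do: "", else: ","
        {:ok, conn} = chunk(conn, [prefix, body])
        conn
      end)

    {:ok, conn} = chunk(conn, "}")
    conn
  end
end
```

Chunked transfer also lets the HTTP client start consuming before encoding finishes, at the cost of not knowing Content-Length up front.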
What is the size of the serialised JSON? Did you try using other JSON libraries?
Previous discussion: Trying to figure out why 2.3 MB JSON binary allocates 35 MB of heap
The size is ~6 MB:
iex(15)> IO.inspect(IO.iodata_length(Jason.encode_to_iodata!(large_map)))
6477874
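For what it's worth, the encoding overhead can be isolated from the delivery overhead in iex by comparing total VM memory around the encode call. This is my own rough sketch (the map below is a stand-in for large_map, and :erlang.memory(:total) is noisy since other processes allocate too):

```elixir
# Build a map comparable to the one in the question: 100K small entries.
large_map = Map.new(1..100_000, fn i -> {Integer.to_string(i), i} end)

before = :erlang.memory(:total)
iodata = Jason.encode_to_iodata!(large_map)
after_enc = :erlang.memory(:total)

IO.puts("encoded size: #{IO.iodata_length(iodata)} bytes")
# Rough figure only; includes allocations from other processes.
IO.puts("memory delta: #{after_enc - before} bytes")
```

If the delta here is small but the app still grows by 100-150 MB when the response goes out, that would point at delivery rather than encoding.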
I was using Poison but switched to Jason because I saw benchmarks stating it consumed at least half the memory, yet memory usage remained more or less the same in this case. This leads me to think the problem is not the encoding but the actual payload delivery. I'm not sure whether Plug processes the payload once more before it is actually sent, and whether that contributes to the increase.
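One thing that might be worth ruling out (an assumption on my part, not something confirmed in this thread) is that the growth is simply garbage the request process hasn't collected yet; binaries built up during encoding can keep memory alive until a GC runs. A quick check in iex:

```elixir
# Compare this process's memory before and after forcing a collection.
{:memory, before} = Process.info(self(), :memory)
:erlang.garbage_collect()
{:memory, after_gc} = Process.info(self(), :memory)
IO.puts("process memory: #{before} -> #{after_gc} bytes")
```

If memory drops sharply after the forced GC, the growth was transient garbage rather than something Plug is holding on to.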
Thanks for the reference, I'll take a look.