@NobbZ jackpot!
As a matter of fact, I was using Jaxon for parsing, since the files were pretty big. But I just tested with Jason and strings: :copy, and the issue does not occur anymore. I suspect this was a misuse of Jaxon.stream on my side, but I am completely satisfied with the Jason results.
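For reference, this is roughly the Jason call I mean (the file name is just a placeholder, not my real data):

```elixir
# Roughly the Jason call described above ("data.json" is a placeholder path).
json = File.read!("data.json")

# strings: :copy makes Jason copy each decoded string instead of returning
# sub-binaries that keep the whole input binary alive until they are collected.
decoded = Jason.decode!(json, strings: :copy)
```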
Now, when the process finished, there was something like XXXMB of leftover memory per iteration. Anyway, what I am searching for is some kind of stream parser, and it seems the code below is a misuse, because memory spikes immediately to ~1.5GB and then drops sequentially with each chunk (not like with Jaxon, as stated above).
That is probably because of Enum.map: depending on what it does exactly, it will force the full stream.
Also, if you keep references to binaries in that Enum.map instead of using :binary.copy/1 as suggested, you might keep all the lines in memory until the sub-references get collected.
But overall, this is hard to say without knowing how you process the chunks…
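Just as a rough sketch (I am guessing at the shape of your data; the file name and process_chunk/1 are made up), a fully lazy pipeline could look something like this:

```elixir
# Hypothetical lazy pipeline over newline-delimited JSON; names are made up.
"big.ndjson"
|> File.stream!()
# Stream.map keeps this lazy; an Enum.map here would realise every line in
# memory before any of them is processed.
|> Stream.map(&Jason.decode!(&1, strings: :copy))
# If you keep only small parts of each decoded chunk, copy them (e.g. with
# :binary.copy/1) so they stop referencing the larger line binaries.
|> Stream.each(&process_chunk/1)
|> Stream.run()
```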