Plug.Parsers.RequestTooLargeError when uploading large files

Hi, how can I upload large files? It throws Plug.Parsers.RequestTooLargeError even though I added :length in Plug.Parsers.

# endpoint.ex
plug Plug.Parsers,
  parsers: [
    {:multipart, length: 100_000_000}
  ],
  length: 100_000_000,
  pass: ["*/*"],
  json_decoder: Phoenix.json_library()

My HEEx template form has multipart set and file_input/3 has the multiple attribute. I'm uploading 3 files, none larger than 5 MB each, and it throws this:

Plug.Parsers.RequestTooLargeError at POST /upload
the request is too large. If you are willing to process larger requests, please give a :length to Plug.Parsers
351      {:next, conn} ->
352        reduce(conn, rest, type, subtype, params, pass, query_string_length, validate_utf8)
354      {:error, :too_large, _conn} ->
355        raise RequestTooLargeError
356    end
357  end
359  defp reduce(conn, [], type, subtype, _params, pass, query_string_length, validate_utf8) do
360    if accepted_mime?(type, subtype, pass) do

I also tried uploading a single 20 MB file and it still throws the same error. What did I miss?

Hi! Your :length option is in the wrong location. Here is what works for me:

plug Plug.Parsers,
  parsers: [:urlencoded, {:multipart, validate_utf8: false}, :json],
  pass: ["*/*"],
  json_decoder: Poison,
  length: 25_000_000

For something smallish it does make sense to raise the limits, but for larger uploads (imo starting around 30–50 MB) I'd suggest streamed uploads (e.g. LiveView uploads) or chunked uploads (e.g. the tus protocol).
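To illustrate the streamed option: with LiveView uploads the file travels over the WebSocket in chunks, so Plug.Parsers limits don't apply at all. A minimal sketch, assuming Phoenix LiveView; the :files name, module name, limits, and the priv/uploads destination are all placeholders:

```elixir
defmodule MyAppWeb.UploadLive do
  use Phoenix.LiveView

  def mount(_params, _session, socket) do
    {:ok,
     socket
     |> allow_upload(:files,
       accept: :any,
       max_entries: 3,
       # Per-file limit enforced by LiveView itself; the endpoint's
       # Plug.Parsers :length is never involved for these uploads.
       max_file_size: 100_000_000
     )}
  end

  def handle_event("validate", _params, socket), do: {:noreply, socket}

  def handle_event("save", _params, socket) do
    consume_uploaded_entries(socket, :files, fn %{path: path}, entry ->
      # Copy the temp file somewhere permanent once the upload completes.
      dest = Path.join("priv/uploads", entry.client_name)
      File.cp!(path, dest)
      {:ok, dest}
    end)

    {:noreply, socket}
  end
end
```

The template side then uses live_file_input with the @uploads.files assign instead of a plain file_input/3.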


Or just upload directly to the storage system.

Even with a dedicated storage system there’s benefit in chunked uploads though – especially around the ability to reupload on failures.


Same thing, it still throws the error.

Also, in the hexdocs for Plug.Parsers it says:

However, the above will increase the maximum length of all request types. If you want to increase the limit only for multipart requests (which is typically the ones used for file uploads), you can do:

plug Plug.Parsers,
  parsers: [
    {:multipart, length: 20_000_000} # Increase to 20MB max upload
  ]

What's the tus protocol? Can you guide me through that part, please?

And Tus – tus v0.1.3!

It does not seem to be maintained. The GitHub repo 404s for me.

I guess I'll try the LiveView chunked uploads. I'm currently stuck at the part that uses external uploads (S3), following the docs. Right now a simple form action works fine (ex_aws, ex_aws_s3).

Do you use SSL? Does it work over HTTP?

I had something similar, and I solved it by using LiveView uploads. Here is the topic: Large file uploads with ssl on Phoenix 1.6

One really important parameter is chunk_size. Increasing the value works well for large files (around 4 GB); I saw a real speed increase (more than double).
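For context, chunk_size on allow_upload controls how many bytes of the file each WebSocket message carries (the default is 64_000). A hypothetical bump for multi-GB files; the :file name and the exact values here are placeholders, not tuned recommendations:

```elixir
socket
|> allow_upload(:file,
  accept: :any,
  max_file_size: 4_000_000_000,
  # Larger chunks mean fewer round trips per file, which is where the
  # speedup on big uploads comes from.
  chunk_size: 640_000
)
```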

Yes, it's working fine over HTTP, using ex_aws/ex_aws_s3.

Wow, good to hear you solved it with LiveView. Do you mean this repo? You mentioned it in your thread, and it looks like you're using waffle.

Yes, but waffle is not the reason…
