Encoding multipart/form-data HTTP body

I don’t think Finch supports building an actual multipart body. This is a task on its own.

What I am doing is using Tesla to help me out with that task.

First, build the multipart body using Tesla.Multipart:

  multipart =
    Multipart.new()
    |> Multipart.add_content_type_param("charset=utf-8")
    |> add_fields(form_data)
    |> add_files(file_parts)

  mp_headers = Multipart.headers(multipart)
  stream = Multipart.body(multipart)
  headers = ... # your other headers, optional
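Note that `add_fields/2` and `add_files/2` above are the poster’s own helpers, not part of Tesla.Multipart. A minimal sketch of what they might look like, assuming `form_data` is a map of field names to values and `file_parts` is a list of file paths:

```elixir
defmodule MultipartHelpers do
  # Hypothetical helpers wrapping Tesla.Multipart's real API
  # (Multipart.add_field/3 and Multipart.add_file/3).
  alias Tesla.Multipart

  # Add each form field as a plain multipart part.
  def add_fields(multipart, form_data) do
    Enum.reduce(form_data, multipart, fn {name, value}, mp ->
      Multipart.add_field(mp, to_string(name), to_string(value))
    end)
  end

  # Add each file by path; :detect_content_type asks Tesla to guess
  # the part's content type from the file extension.
  def add_files(multipart, file_parts) do
    Enum.reduce(file_parts, multipart, fn path, mp ->
      Multipart.add_file(mp, path, detect_content_type: true)
    end)
  end
end
```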

And then you can either POST it as a streaming-body request, once this PR is merged: https://github.com/keathley/finch/pull/107

  :post
  |> Finch.build(url, headers ++ mp_headers, {:stream, stream})
  |> Finch.request(YourFinch)

Or, you can POST it in a non-streaming fashion (this should work on the current release/master):

  :post
  |> Finch.build(url, headers ++ mp_headers, stream |> Enum.join())
  |> Finch.request(YourFinch)

I have tested the normal one, and it works fine.

Do you think the streaming one could have any performance impact?

So the intent behind creating the streaming one is that I have to send multipart requests that contain files, and these files can be many megabytes in size. If I do N of these requests at the same time, I could easily end up exhausting my memory quota.

So the intent was not to make it more performant, but to keep it from crashing the production server.

I actually have no idea how streaming performance compares to non-streaming. It’d be a good test to do.


Okay, I have the Phoenix dashboard working.

I am going to be in production without streaming by the end of the day again.

Then can you tell me which place to look at in the Phoenix dashboard?

[screenshot of the dashboard]

With HTTPoison and Hackney it’s like this.

I can show you both, with stream and without stream, by tomorrow.

I have a use case of sending something like 800 * 2 requests, per second, or per 5 minutes and per 10 minutes.

You more likely want to look at the charts in Observer, which show you memory usage over time, since memory should be cleaned up once the file contents are no longer needed.

But you will see the memory usage here in the “Binary” section as well, and in fact you probably already see it if you are making a lot of these requests and/or they take a long time to send.
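If you just want a quick number without the dashboard, you can also sample binary memory directly from IEx; `:erlang.memory/1` is built into OTP, no extra dependencies needed (the upload step below is a placeholder):

```elixir
# Sample BEAM binary memory before and after a batch of uploads.
# :erlang.memory(:binary) returns the bytes currently allocated for binaries,
# which is where large file contents held in memory will show up.
before_bytes = :erlang.memory(:binary)

# ... run your uploads here ...

after_bytes = :erlang.memory(:binary)
IO.puts("Binary memory delta: #{after_bytes - before_bytes} bytes")
```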

Hi,

Hackney supports sending a file like this:


HTTPoison.post(url, {:file, file_path}, headers)

Can we send a file in Mint or Finch?

I have tried Tesla’s multipart for this, but it adds a multipart content type, whereas my headers are application/octet-stream.

Which thing do you want to do? Because you can do both:

a) send a multipart request with form data and/or more than one file in it. This is as if you submitted a form on a web site; you need Tesla.Multipart + Finch for that

b) send a single file, as octet-stream; you need just Finch on its own for that

?
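For option (b), a hedged sketch of what that might look like. `MyFinch`, `url`, and `path` are placeholders here, and the `{:stream, _}` body shape needs a Finch version with streaming-request support (the PR linked earlier in the thread); with an older Finch you’d pass the binary body instead:

```elixir
defmodule OctetStreamUpload do
  # Sketch: send a single file as application/octet-stream using Finch alone.
  # finch_name is the name your Finch pool was started under.
  def send_file(url, path, finch_name) do
    headers = [{"content-type", "application/octet-stream"}]

    # Stream the file in 64 KiB chunks instead of loading it all into memory.
    body = File.stream!(path, [], 64 * 1024)

    :post
    |> Finch.build(url, headers, {:stream, body})
    |> Finch.request(finch_name)
  end
end
```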

Yeah, I sorted it. I just did File.read! and put it in the body.

It was like the binary data of a curl request, and I sent the binary as octet-stream. It worked like a charm; I was just trying the difficult way first, and it cost me 3 hours. I should have tried the simple one first.

It looks kind of dangerous to read all the content into memory. If you have large files (it looks like you do, since you’ve mentioned uploading via file streaming), it could consume a lot of memory.

All the cases you’ve described are supported by HTTPoison, but for some reason you don’t want to use it. I can’t really understand why.

There are detailed reasons above for not using HTTPoison and hackney, and many more around the internet.

I want to use Finch.

And I just read the file content and send it as the body in Finch.

That is the issue I’ve mentioned. This approach will consume more memory, especially if you have 1 GB files: you have to allocate 1 GB of memory just to send a file.

I would be really thankful if you could provide a couple of examples. I’m using HTTPoison in all of my projects and have been happy with it. Am I missing something important?

I am sending the binary data read with

File.read!

Do you think there could be a better way of doing it?

As I want the headers to be octet-stream and to send a file; in the past it was just

{:file, path}

Hi, as I described above, basic auth is not working for me.

In the past, with HTTPoison, it was going in the hackney opts, but now I have added it to the headers and it’s giving me a closed error. Can you share how you are doing basic auth?
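For what it’s worth, Finch has no hackney-style basic-auth option: basic auth is just an `Authorization` header you build yourself with `Base.encode64/1`. A minimal sketch (`user`, `pass`, `url`, and `MyFinch` are placeholders):

```elixir
# Basic auth with Finch: encode "user:password" in Base64 and send it as an
# Authorization header. No special request option is needed, unlike hackney.
user = "alice"
pass = "secret"
auth_header = {"authorization", "Basic " <> Base.encode64("#{user}:#{pass}")}

# Then include it with your other headers, e.g.:
# Finch.build(:get, url, [auth_header]) |> Finch.request(MyFinch)
IO.inspect(auth_header)
```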

I needed to construct multipart messages recently, but also didn’t want to include Tesla in my dependencies, so I built a separate library for message construction. It should cover most use cases for multipart/form-data requests, but very open to suggestions to improve the API and functionality.


A huge thank you for putting multipart together. I’m using the PSPDFKit API in an application where I was already using Finch and it helped me enormously.
