put_assoc in many_to_many crashes the server

Hi all!

I'm facing a common situation where I want to add records to a many_to_many association. Let's say I want to add a lot of comments to a post:

schema "post" do
  many_to_many(:comments, Comment, join_through: "cms_posts_comments")
  # ...
end

def comment_post(post_id, body) do
  with {:ok, post} <- find_content(Post, post_id),
       {:ok, comment} <- create_comment(%{body: body}) do
    post_with_comments = Repo.preload(post, :comments)

    post_with_comments
    |> Ecto.Changeset.change()
    |> Ecto.Changeset.put_assoc(:comments, post_with_comments.comments ++ [comment])
    |> Repo.update()
  else
    {:error, reason} ->
      {:error, reason}
  end
end

It works, but it seems put_assoc is not the right tool for this. I wrote a seed to create 10k comments on the same post, but the server crashed at about 3300 records:

def random(count \\ 10_000) do
  for _u <- 1..count do
    CMS.comment_post(21, "fake comment")
  end
end
After about 10 minutes the server crashed:

[error] Postgrex.Protocol (#PID<0.261.0>) disconnected: ** (DBConnection.ConnectionError) client #PID<0.73.0> timed out because it checked out the connection for longer than 15000ms

So, is there a better way to do this?
How should one add, update, and delete data in a many_to_many?

See also: Many-to-many associations in Phoenix and Ecto

You can try inserting directly into the join table cms_posts_comments by using Repo.insert_all.
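The reason put_assoc degrades here is that it replaces the whole association: each call preloads all existing comments and rewrites the full set, so inserting the Nth comment does O(N) work and the seed as a whole is quadratic. Inserting the join row directly is O(1) per comment. A minimal sketch, assuming the join table has `post_id` and `comment_id` columns (column names not shown in the original schema):

```elixir
def comment_post(post_id, body) do
  with {:ok, post} <- find_content(Post, post_id),
       {:ok, comment} <- create_comment(%{body: body}) do
    # Insert a single row into the join table instead of
    # rewriting the whole :comments association via put_assoc.
    Repo.insert_all("cms_posts_comments", [
      [post_id: post.id, comment_id: comment.id]
    ])

    {:ok, comment}
  end
end
```

Note that `insert_all` bypasses changesets, autogenerated timestamps, and validations, so it suits exactly this kind of bulk/join-table write.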

Thanks!
Repo.insert_all solves it nicely, except for the hard-coded join-table name.

I think there is a missing macro here, say add_assoc?
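There is no built-in add_assoc, but the hard-coded table name can be avoided by reading the association metadata off the schema. A sketch of such a hypothetical helper (not part of Ecto), assuming Ecto's `__schema__(:association, name)` reflection, which returns an `Ecto.Association.ManyToMany` struct carrying `join_through` and `join_keys`:

```elixir
# Hypothetical `add_assoc/3`: inserts one join-table row for a
# many_to_many association, deriving the table and key names
# from the owner schema's reflection metadata.
def add_assoc(%schema{} = owner, assoc_name, related) do
  %Ecto.Association.ManyToMany{
    join_through: join_through,
    # e.g. [post_id: :id, comment_id: :id]
    join_keys: [{owner_join_key, owner_key}, {related_join_key, related_key}]
  } = schema.__schema__(:association, assoc_name)

  Repo.insert_all(join_through, [
    [
      {owner_join_key, Map.fetch!(owner, owner_key)},
      {related_join_key, Map.fetch!(related, related_key)}
    ]
  ])
end
```

Usage would be `add_assoc(post, :comments, comment)`, with no table name in sight.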

That is because your timeout is set too short for the data load you are running. Always set the timeout based on the amount of data you are submitting. The default 15 seconds is meant for "standard" short queries, where hitting 15s usually indicates a major bug. Raise the timeout to whatever is appropriate for your workload.
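For a one-off heavy operation you can pass the timeout per call rather than changing the global config; Ecto repo functions accept a `:timeout` option in milliseconds (or `:infinity`). A sketch against the original update:

```elixir
# Allow this single heavy update 60s instead of the default 15_000 ms.
post_with_comments
|> Ecto.Changeset.change()
|> Ecto.Changeset.put_assoc(:comments, post_with_comments.comments ++ [comment])
|> Repo.update(timeout: 60_000)
```

The timeout can also be raised repo-wide in config (`config :my_app, MyApp.Repo, timeout: 60_000`), but the per-call option keeps the short default as a safety net for everything else.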

I think I misunderstood the usage of put_assoc, see ecto/issues
