How to get a JSON result as raw binary from a Postgres query

I would like the resulting JSON from a Postgrex query to not be decoded by Jason.

Instead, let it be returned as a binary/string and sent straight out as the response, avoiding the decode coming from the DB and the re-encode when sending to the client.

The code below does not work. Currently I'm sending it by using

Jason.encode!(List.flatten(result.rows))

query = "select array_to_json(array_agg(customers)) from customers"
{:ok, result} = Ecto.Adapters.SQL.query(Repo, query, [])

conn
|> put_resp_content_type("application/json")
|> send_resp(200, List.flatten(hd(result.rows)))

Have you tried casting in the DB? Like:

query = "select array_to_json(array_agg(customers))::text from customers"
{:ok, result} = Repo.query(query, [])

conn
|> put_resp_content_type("application/json")
|> send_resp(200, List.flatten(hd(result.rows)))

Wow, that was easy.

Just to share some benchmarks: returning 1000 customer records went from

[debug] QUERY OK db=5.3ms decode=0.1ms queue=0.3ms
select * from customers
[info] Send 200 in 31ms

down to

[debug] QUERY OK db=7.7ms queue=0.2ms
select array_to_json(array_agg(customers))::text from customers
[info] Send 200 in 8ms

Thank you :smile:


Also, you don’t need the List.flatten(hd(result.rows)) part. The body passed to send_resp can be iodata, which, among other things, means a nested list of binaries. send_resp(200, result.rows) should work just fine.

Also, using SELECT json_agg(customers)::text FROM customers might improve things on the DB side.
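
A minimal sketch of those two suggestions combined (assuming the same Repo and customers table as above, and that the table is non-empty):

def index(conn, _params) do
  # json_agg builds the JSON array in the DB; ::text keeps Postgrex from decoding it
  # (note: json_agg over zero rows yields NULL, so this assumes a non-empty table)
  result = Repo.query!("SELECT json_agg(customers)::text FROM customers", [])

  conn
  |> put_resp_content_type("application/json")
  # result.rows is [[json_text]], a nested list of binaries, i.e. valid iodata
  |> send_resp(200, result.rows)
end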


Haha, you should see the difference it makes in Ruby on Rails; 4x speedup is piddling :wink:

Anyway, I actually have a PostgresUtils library that I’ll probably release someday when I’m happy with it. Some of the functions just take a “regular” query and return JSON. As you can see from the way you got JSON for your query, you could wrap any arbitrary query in the same functions to get JSON back, so it’s pretty easy to make a utility function that takes a pgconn, a query, and an arg list, and returns JSON. When building a straight JSON API, an ORM just really doesn’t add much…
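
As a rough illustration only (the module and function names here are made up, not from that library), such a utility might look something like:

defmodule PostgresUtilsSketch do
  # Hypothetical helper: wraps an arbitrary row-returning query so Postgres
  # aggregates the result into a single JSON text value, then runs it over a
  # plain Postgrex connection. The query string itself is trusted code; only
  # the args are parameterized.
  def query_json(pgconn, query, args \\ []) do
    sql = "SELECT json_agg(q)::text FROM (#{query}) AS q"
    %Postgrex.Result{rows: [[json]]} = Postgrex.query!(pgconn, sql, args)
    json
  end
end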


Taking @michalmuskala’s, @sribe’s, and @hauleth’s advice and looking around the interwebs, I came up with the examples below. Hope it helps others.

# customer_controller.ex
def index(conn, _params) do
  query = "SELECT json_agg(customers)::text FROM customers"
  result = Repo.query!(query, [])

  conn
  |> put_resp_content_type("application/json")
  |> send_resp(200, ["{\"list\":", result.rows, "}"])
end

A more complex example: return an invoice as a JSON object with customer details and the list of items bought

# invoice_controller.ex
def show(conn, %{"id" => id}) do
  query = """
  SELECT row_to_json(invoice)::text
  FROM (
    SELECT *,
    (
      SELECT row_to_json(c)
      FROM (
        SELECT *
        FROM customers
        WHERE invoices.customer_id = customers.id
      ) c
    ) AS customer,
    (
      SELECT json_agg(i)
      FROM (
        SELECT id, product_id, qty, rate, total
        FROM invoice_details
        WHERE invoice_id = $1::bigint
      ) i
    ) AS items
    FROM invoices
    WHERE id = $1::bigint
  ) invoice;
  """

  # the id from the URL params is a string; the $1::bigint parameter expects an integer
  result = Repo.query!(query, [String.to_integer(id)])

  conn
  |> put_resp_content_type("application/json")
  |> send_resp(200, ["{\"data\":", result.rows, "}"])
end

Not to be a downer here, but I would do that in the application view instead of the DB. Otherwise the logic can be quite hard to follow, especially for new developers or people less familiar with SQL.

@hauleth Totally agree; this can be confusing for beginners and new developers.

However, for things like data-warehousing projects that require ad-hoc queries/reports to explore a database and present something quickly to users or a front end, this technique can be quite useful.

Personally, I feel comfortable writing SQL queries. :sweat_smile:


No doubt there. However, remember that you can kill the DB with a malicious SELECT. And parsing the data in Elixir and sending it onwards shouldn’t add much overhead, especially if you use HTTP streams or sockets.

I would not agree; keep the code in the place where it belongs, i.e. the database should return the data in as processed a format as possible given the abilities of SQL (which is made for data transformation), and it can do that far more efficiently than about anything you could do after receiving it. Talking your datastore’s language is just as important in such designs as knowing the backend language or knowing your front-end interfaces.

Well put… I would only add that the code sample posted would need some refactoring before being put into production. It illustrates the technique perfectly well for a post here, but the transformation of a query into a JSON-returning one should be separated from the guts of the query, so that the query itself stays easy to read.
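
For example, a minimal sketch of that separation, reusing the invoice query from above but with made-up helper names and the nested customer/items subqueries left out:

# invoice_controller.ex (sketch)
# Hypothetical refactoring: the business query stays plain SQL,
# and a small helper turns any row-returning query into a JSON-text query.
defp to_json_query(sql), do: "SELECT row_to_json(q)::text FROM (#{sql}) AS q"

defp invoice_query do
  "SELECT * FROM invoices WHERE id = $1::bigint"
end

def show(conn, %{"id" => id}) do
  # assumes the invoice exists; otherwise result.rows is empty
  result = Repo.query!(to_json_query(invoice_query()), [String.to_integer(id)])

  conn
  |> put_resp_content_type("application/json")
  |> send_resp(200, ["{\"data\":", result.rows, "}"])
end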


I realize this technique has issues when trying to return a JSON response via a WebSocket channel.

For example, I have a channel with the following handler

def handle_in("customers", %{}, socket) do
  query = "SELECT json_agg(customers)::text FROM customers"
  result = Repo.query!(query, [])

  {:reply, {:ok, %{customers: result.rows}}, socket}
end

I think the reply gets encoded again, so I have to JSON.parse(payload.response.customers) on the client end.

Not sure if there is a way to control how the channel encodes the reply?

So there’s no way to avoid what you’re talking about without digging into the guts of Phoenix. It’s not particularly hard to hack something in, but I’m not satisfied that what I’ve done is clean enough for release. Essentially, you add a function clause for, say, {:json, myjson} tuples, so that if you pass one in instead of a map it doesn’t re-encode your data but splices the JSON string directly into the iodata that will be sent.

If you’re using Jason, you can use Jason.Fragment.new(iodata), which was designed especially for that use case. The docs are… a bit lacking :sweat_smile:.

This basically implements exactly what I described in my blog post about doing something similar in Poison.


Thank you @michalmuskala. Yup that works flawlessly :clap:

def handle_in("customers", %{}, socket) do
  query = "SELECT json_agg(customers)::text FROM customers"
  result = Repo.query!(query, [])

  {:reply, {:ok, %{customers: Jason.Fragment.new(result.rows)}}, socket}
end

Yes. My hack is not a lot of code, nor complicated, but it hardwires an assumption of JSON into Phoenix at an inappropriate place. Jason.Fragment.xxx is an appropriate API for decoupling that so that encoders for other formats could be hooked in.
