Also, you don’t need the List.flatten(hd(result.rows)) part. The body passed to send_resp can be iodata, which, among other things, can be a nested list of binaries. send_resp(200, result.rows) should work just fine.
Also using SELECT json_agg(customers)::text FROM customers might improve things on the DB side.
Haha, you should see the difference it makes in Ruby on Rails; 4x speedup is piddling
Anyway I actually have a PostgresUtils library that I’ll probably release someday when I’m happy with it. Some of the functions just take a “regular” query and return JSON. You can see from the way you got that JSON for your query that you could wrap any arbitrary query in the same functions to get JSON back, thus it’s pretty easy to make a utility function that takes a pgconn, query, arg list, and returns JSON. When building a straight JSON API, an ORM just really doesn’t add much…
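A hedged sketch of what such a utility could look like; `PgJson`, `wrap/1`, and `json_query/3` are names invented here for illustration, not from any released library:

```elixir
defmodule PgJson do
  @doc """
  Wraps an arbitrary SQL query so Postgres aggregates the whole result
  set into a single JSON array. COALESCE covers the empty-result case,
  since json_agg over zero rows yields NULL rather than '[]'.
  """
  def wrap(sql), do: "SELECT COALESCE(json_agg(q), '[]')::text FROM (#{sql}) q"

  @doc "Runs the wrapped query on the given Ecto repo and returns the JSON string."
  def json_query(repo, sql, args \\ []) do
    %{rows: [[json]]} = repo.query!(wrap(sql), args)
    json
  end
end

# Usage (assuming an Ecto repo module named Repo):
#   PgJson.json_query(Repo, "SELECT id, name FROM customers", [])
```

Because the wrapping happens in `wrap/1`, the inner query stays ordinary SQL and can be tested or reused on its own.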
Taking @michalmuskala’s, @sribe’s, and @hauleth’s advice and looking around the interwebs, I came up with the examples below.
Hope it helps others.
# customer_controller.ex
def index(conn, _params) do
  query = "SELECT json_agg(customers)::text FROM customers"
  result = Repo.query!(query, [])

  conn
  |> put_resp_content_type("application/json")
  |> send_resp(200, ["{\"list\":", result.rows, "}"])
end
A more complex example: return an invoice as a JSON object with customer details and the list of items bought.
# invoice_controller.ex
def show(conn, %{"id" => id}) do
  query = """
  SELECT row_to_json(invoice)::text
  FROM (
    SELECT *,
      (
        SELECT row_to_json(c)
        FROM (
          SELECT *
          FROM customers
          WHERE invoices.customer_id = customers.id
        ) c
      ) AS customer,
      (
        SELECT json_agg(i)
        FROM (
          SELECT id, product_id, qty, rate, total
          FROM invoice_details
          WHERE invoice_id = $1::bigint
        ) i
      ) AS items
    FROM invoices
    WHERE id = $1::bigint
  ) invoice;
  """

  result = Repo.query!(query, [id])

  conn
  |> put_resp_content_type("application/json")
  |> send_resp(200, ["{\"data\":", result.rows, "}"])
end
Not to be a downer here, but I would do that in the application view instead of in the DB. The logic can be quite hard to follow otherwise, especially for new developers or people less familiar with SQL.
@hauleth Totally agree; this can be confusing for beginners and new developers.
However, for data-warehousing projects that require ad-hoc queries and reports to explore a database
and present something to users or a front-end quickly, this technique can be quite useful.
Personally, I feel comfortable writing SQL queries.
No doubt there. However, remember that you can kill the DB with a malicious SELECT. And parsing the data in Elixir and sending it onward shouldn’t add much overhead, especially if you use HTTP streams or sockets.
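For reference, a hedged sketch of the streaming idea mentioned above, using Plug’s `send_chunked/2` and `chunk/2` together with `Repo.stream/1` (which must run inside a transaction). It assumes a Phoenix controller, an Ecto repo named `Repo`, and a `Customer` schema; all of those names are illustrative:

```elixir
def index(conn, _params) do
  conn =
    conn
    |> put_resp_content_type("application/json")
    |> send_chunked(200)

  {:ok, conn} =
    Repo.transaction(fn ->
      # Open the JSON array, then push each row out as its own chunk.
      {:ok, conn} = chunk(conn, "[")

      conn =
        Repo.stream(Customer)
        |> Stream.map(&Jason.encode!(Map.take(&1, [:id, :name])))
        |> Stream.intersperse(",")
        |> Enum.reduce(conn, fn part, conn ->
          {:ok, conn} = chunk(conn, part)
          conn
        end)

      {:ok, conn} = chunk(conn, "]")
      conn
    end)

  conn
end
```

This keeps memory flat regardless of result-set size, at the cost of doing the JSON encoding in Elixir rather than in Postgres.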
I would not agree. Keep the code in the proper place, where it belongs: i.e. the database should return the data in as processed a form as the abilities of SQL (a language built for data transformation) allow, since it can run that transformation far more efficiently than just about anything you could do after receiving the data. Talking your datastore’s language is just as important in such designs as knowing the backend language or your front-end interfaces.
Well put… I would only add that the code sample posted would need some refactoring before going into production. It illustrates the technique perfectly well for a post here, but the transform of a query into a JSON-returning one should be separated from the guts of the query, so that the query itself stays easy to read.
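A minimal sketch of that separation; `JsonSql` and `as_json_object/1` are names made up here for illustration:

```elixir
defmodule JsonSql do
  @doc "Wraps a plain SQL query so Postgres returns its single row as one JSON object."
  def as_json_object(sql), do: "SELECT row_to_json(q)::text FROM (#{sql}) q"
end

# The invoice query from the earlier post then stays plain SQL, readable
# on its own, and the JSON transform is applied at the call site:
#
#   result = Repo.query!(JsonSql.as_json_object(invoice_sql), [id])
```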
So there’s no way to avoid what you’re talking about without digging into the guts of Phoenix. It’s not particularly hard to hack something in, but I’m not satisfied that what I’ve done is clean enough for release. Essentially, you can add a function clause for, say, {:json, myjson}, so that if you pass that in instead of a map, Phoenix doesn’t re-encode your data but splices the JSON string directly into the iodata that will be sent.
Yes. My hack is not a lot of code, nor complicated, but it hardwires an assumption of JSON into Phoenix at an inappropriate place. Jason.Fragment.xxx is an appropriate API for decoupling that so that encoders for other formats could be hooked in.
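For anyone following along: Jason does ship this decoupling point. `Jason.Fragment.new/1` wraps already-encoded JSON so the encoder splices it into the output iodata instead of re-encoding it (assuming the Jason package is available):

```elixir
# `raw` is JSON we already have as a string, e.g. straight from Postgres.
raw = ~s({"id":1,"name":"Ada"})

# The fragment is emitted verbatim inside the surrounding document,
# not escaped or re-encoded.
Jason.encode!(%{data: Jason.Fragment.new(raw)})
# => "{\"data\":{\"id\":1,\"name\":\"Ada\"}}"
```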