Large Object feature of Postgres

Has anyone ever tried out the Large Object feature of Postgres?
I have a blob column in my table and was wondering what data type to use; that's when I came across `:bytea` and this Large Object feature. Since `:bytea` is quite memory-hungry, as explained in the Postgres documentation, I was thinking about using Large Objects instead. If anyone knows more about it, please help me out.


Hey @shijith.k, I know it’s been five years, but I thought I might still ask if you found a solution. I was looking into it, but I didn’t find any way to execute the large object functions on the client side.

Found this on the Postgrex issue tracker:

The different behavior of lo_import seems like a possible problem, depending on exactly which functions you were hoping to call.

I saw that too, and it sort of works, but those functions only operate on the server machine where the database runs, and I need them to work on the client where Elixir is running. It should be possible according to Postgres' docs, but the client library has to support the large-object API, and Postgrex doesn't seem to.
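One workaround that avoids the client-side API entirely: Postgres also exposes server-side SQL functions (`lo_from_bytea`, `lo_get`) that any client, including Postgrex, can call through ordinary queries. A minimal, hypothetical sketch along those lines (the module and function names here are made up for illustration; only the SQL functions are real):

```elixir
defmodule LargeObjectSketch do
  # Create a large object on the server from a binary held by the client.
  # lo_from_bytea(0, data) picks a fresh OID and returns it.
  def import_binary(conn, binary) when is_binary(binary) do
    %Postgrex.Result{rows: [[oid]]} =
      Postgrex.query!(conn, "SELECT lo_from_bytea(0, $1)", [binary])

    {:ok, oid}
  end

  # Read the whole large object back to the client as a binary.
  def export_binary(conn, oid) do
    %Postgrex.Result{rows: [[data]]} =
      Postgrex.query!(conn, "SELECT lo_get($1)", [oid])

    {:ok, data}
  end
end
```

Note the trade-off: this buffers the entire object in memory on both ends, unlike the streaming `lo_read`/`lo_write` protocol; `lo_get` also takes optional offset/length arguments, so chunked reads should be possible by looping over ranges.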

I just published a new library for working with large objects in PostgreSQL: `pg_large_objects` on Hex.

This makes it easy to stream data into/out of Postgres, e.g.:

# Stream enumerable into large object
{:ok, object_id} =
  "/tmp/recording.mov"
  |> File.stream!()
  |> Repo.import_large_object()

# Stream data of large object into Collectable
stream = File.stream!("/tmp/recording.mov")
:ok = Repo.export_large_object(object_id, into: stream)
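One thing worth remembering with large objects in general: they are not reference-counted, so they persist until explicitly removed. The server-side `lo_unlink` function can be called through plain SQL from any client; a hedged sketch, assuming `conn` is a Postgrex connection and `object_id` is the OID from the import above (the module name is made up for illustration):

```elixir
defmodule UnlinkSketch do
  # Delete a large object by OID via the server-side lo_unlink function.
  # lo_unlink returns 1 on success and raises for an unknown OID.
  def delete_large_object(conn, object_id) do
    %Postgrex.Result{rows: [[1]]} =
      Postgrex.query!(conn, "SELECT lo_unlink($1)", [object_id])

    :ok
  end
end
```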