Hey all, not sure if I have this in the right section.
I have a fairly complicated Elixir/Phoenix project currently running using Postgres to store most of the data.
Users come to the site and can post listings, i.e. products they want to sell. The products can be from a number of categories (books, furniture, cars, etc.), meaning different attributes are necessary for different categories.
I’ve essentially hit a point where it’s very difficult to represent the data I need in an RDBMS and I’m looking for something more flexible. I know there are plenty of tools out there (MongoDB, Elasticsearch, Cassandra, Redis, etc.) but I don’t know which one will work best for my use case.
I need a tool flexible enough to model the unique product attributes for different categories. It needs to be fast and scalable. I’m just hoping somebody can point me in the right direction.
You can still use Postgres but add a JSONB (basically a Map) column where needed. With the help of Ecto’s fragment macro you can query keys inside the JSONB/Map column as you see fit. If you give a few examples of your difficulties, maybe we can help.
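A rough sketch of what that looks like, assuming a hypothetical `Product` schema with a `:map` field called `attributes` backed by a jsonb column (all names are just examples):

```elixir
import Ecto.Query

# `->>` extracts a jsonb key as text, so you can filter on
# category-specific attributes without a dedicated column.
red_books =
  from p in MyApp.Product,
    where: p.category == "book",
    where: fragment("?->>'color' = ?", p.attributes, "red")

# MyApp.Repo.all(red_books)
```

Ecto passes the fragment through to Postgres verbatim, so any jsonb operator works here.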
Thanks for the reply!
Basically my problem is this: different product categories have different attributes.
Ex. books vs. cars.
Books have attributes like:
- page count
Cars have a completely different set (make, mileage, and so on).
Both will share common attributes like: product title, description, price, etc.
My solution thus far has been to oversimplify the database so it only stores what is necessary for the product listing, basically leaving it up to the user to be as descriptive as possible when writing about their product. This is no longer an option.
So this is what I would do:
- All common attributes become strictly-typed DB columns.
- Everything else lives in a generic JSONB column, f.ex. called `attributes`.
Check the Ecto.Schema docs on `:map` fields (stored as JSONB on Postgres) for more details.
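A minimal sketch of that layout (module, table, and field names are just examples):

```elixir
defmodule MyApp.Product do
  use Ecto.Schema

  schema "products" do
    # Common, well-known attributes get strictly-typed columns
    field :title, :string
    field :description, :string
    field :price, :decimal
    field :category, :string

    # Category-specific attributes go in a single :map field,
    # which Ecto stores as jsonb on Postgres
    field :attributes, :map, default: %{}

    timestamps()
  end
end
```

A book listing would then carry `%{"page_count" => 320}` in `attributes`, a car listing `%{"make" => "Toyota", "mileage" => 90_000}`, with no schema change needed between categories.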
This looks excellent and is a much simpler solution than I was expecting to implement.
I’ll give it a go and let you know how it goes!
I’m not sure of the performance of the Postgres+JSONB solution when searching. For example, if you need queries like “get me all products where color is red, then refine further by other attribute/value pairs,” I’d think a document store like Solr or Elasticsearch would fare better at that kind of faceted query.
Very likely they’d be faster, yeah. But I’d still stick with Postgres, because document stores abandon any and all typing information, and that becomes a headache just a week of development later.
Using Postgres gives you the advantage of strictly typing all your data that has a well-known shape, plus a lot of abilities around unstructured and loosely-typed data.
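One of those abilities: if JSONB lookups get slow, Postgres can index the whole column with GIN. A sketch of an Ecto migration, assuming the jsonb column is called `attributes` on a `products` table (names hypothetical):

```elixir
defmodule MyApp.Repo.Migrations.IndexProductAttributes do
  use Ecto.Migration

  def change do
    # A GIN index accelerates jsonb containment (@>) and
    # existence (?) operators on the whole attributes column
    create index(:products, [:attributes], using: "GIN")
  end
end
```

With that in place, a containment query such as `fragment("? @> ?", p.attributes, ^%{color: "red"})` can use the index instead of scanning every row.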
Search is different from storage. I would certainly not run an e-commerce solution on Elasticsearch; I might use it for the search and analytics parts, should I need those down the line.