Gettext Backed by DB (ecto)

Hi,

I have a project requirement to provide an interface that lets users translate the text on a page manually.
I am wondering whether it is possible to use the gettext library and configure it to load translations from the DB instead of .po files, so that I can provide a simple web interface that allows users to translate text.

Best
Dev.

No, AFAIK it is not possible. What you can do instead is provide your own helper that localises a field from the DB. For example, if you store text in PostgreSQL and use the hstore extension to localise fields, you can do:

# Returns the value stored for the current Gettext locale, e.g. "en" or "fr".
def localise_db(field) do
  field[Gettext.get_locale()]
end

And then in your views call localise_db(article.title).
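For context, here is a minimal sketch of what the surrounding pieces could look like, assuming the translations are kept in a map column (the schema and module names are hypothetical; Ecto's :map type maps to jsonb, and using the hstore extension instead would need a dedicated Ecto type):

defmodule MyApp.Article do
  use Ecto.Schema

  schema "articles" do
    # e.g. %{"en" => "Hello", "fr" => "Bonjour"}
    field :title, :map
  end
end

defmodule MyAppWeb.LocaleHelpers do
  # Picks the value for the current Gettext locale out of the stored map.
  def localise_db(field) when is_map(field) do
    field[Gettext.get_locale()]
  end
end

In a template that would be <%= localise_db(@article.title) %>.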

2 Likes

If you want something geared more toward full DB-backed translations, rather than just gettext-style strings (which you can handle with a simple DB lookup and maybe a cache, as sketched below), then trans might be what you want as well.
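For the gettext-style case, a rough sketch of that "simple DB lookup and maybe a cache" idea could look like this (all names here are hypothetical; it memoises lookups in ETS so repeated renders don't hit the database):

defmodule MyApp.DbTranslations do
  import Ecto.Query

  @table :db_translations_cache

  # Call once at application start to create the cache table.
  def init_cache do
    :ets.new(@table, [:named_table, :public, read_concurrency: true])
  end

  # Returns the stored translation for {locale, key}, falling back to default.
  def t(locale, key, default \\ nil) do
    case :ets.lookup(@table, {locale, key}) do
      [{_, text}] ->
        text

      [] ->
        text =
          MyApp.Repo.one(
            from tr in "translations",
              where: tr.locale == ^locale and tr.key == ^key,
              select: tr.text
          ) || default

        :ets.insert(@table, {{locale, key}, text})
        text
    end
  end
end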

3 Likes

The hstore approach and the trans project are interesting, but am I wrong in my understanding that the application retrieves all the translations from the DB and only then filters out the one for the desired locale? That seems really inefficient to me, and the more locales you add the worse it becomes…?

I’m interested because I’m wondering about this kind of thing now too. Wouldn’t the most efficient option be to duplicate columns for every locale? No joins, no fat around the filtered and retrieved data. But then the queries become a pain to write… and so do the schemas.
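For comparison, the duplicated-columns idea would look roughly like this (hypothetical names): nothing extra is fetched per row, but every new locale means a migration and wider schemas and queries:

defmodule MyApp.Page do
  use Ecto.Schema

  schema "pages" do
    # One column per locale and field: the query selects exactly what it
    # needs, but adding a locale touches every schema and query.
    field :title_en, :string
    field :title_fr, :string
    field :body_en, :string
    field :body_fr, :string
  end
end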

Thank you @hauleth and @OvermindDL1 for your information and recommendations.
The trans project looks really interesting.
I think the poeditor service could also be a solution for this, but it's a paid service.
I will look into both poeditor and trans and end up picking one of them.

Trans stores the translations in a translations field (basically a map, backed by jsonb in Postgres) that you add to your model's schema; it holds the translations of the fields you need translated. So it's not an extra DB query: the translations come back with the rest of the model.

Then you simply call Trans with the fetched model (which already contains all translations), the field to translate, and the locale.
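A hedged sketch of what that looks like in practice (the exact trans API has changed between releases, so treat this as an approximation and check the docs for your version):

defmodule MyApp.Article do
  use Ecto.Schema
  use Trans, translates: [:title, :body]

  schema "articles" do
    field :title, :string
    field :body, :string
    # The jsonb container fetched together with the row, e.g.
    # %{"fr" => %{"title" => "...", "body" => "..."}}
    field :translations, :map
  end
end

article = MyApp.Repo.get!(MyApp.Article, 1)

# Returns the French title, falling back to the original field when there
# is no translation for that locale.
Trans.Translator.translate(article, :title, :fr)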

1 Like

I understood that everything is fetched in one query, avoiding joins, which is nice. But for every translatable field it will fetch a map containing all the possible translations, only to filter out the desired one afterwards. Right? If yes, it's convenient indeed, but what a waste of bandwidth. If no, it means that Postgres can filter within the fields it knows contain maps/JSON: that would be neat, and I would like to be sure before hopping on that wagon.

Postgres can certainly select only parts of json fields. I’m not sure if Trans uses it though.
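For example, with Ecto you can reach into the jsonb column inside the query itself via a fragment, so only the wanted locale ever leaves the database (the column and field names below are just for illustration):

import Ecto.Query

from a in MyApp.Article,
  select: %{
    id: a.id,
    # jsonb: translations -> 'fr' ->> 'title' returns just the French title.
    title: fragment("?->'fr'->>'title'", a.translations)
  }

In real code you would interpolate the locale as a query parameter rather than hard-coding 'fr'.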

2 Likes

Oh nice, I'm going to look into it now, since that makes this approach perfect for storing translations. I'll test and inspect the SQL generated by Trans when I have time to check!

Edit:
https://www.postgresql.org/docs/9.3/functions-json.html ☀️
https://www.compose.com/articles/faster-operations-with-the-jsonb-data-type-in-postgresql/ 😄

2 Likes

It does: it adds a translated macro that can be used in queries, anywhere you need to reference the specific field, so in wheres, selects, etc.
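A hedged example of that macro in a query (the macro name comes from the post above; the import and argument order are from my reading of the trans docs and may differ between versions):

import Ecto.Query
import Trans.QueryBuilder

# Find articles whose French title matches a given value, filtering inside
# the jsonb column rather than in Elixir.
from a in MyApp.Article,
  where: translated(MyApp.Article, a.title, :fr) == "Un titre",
  select: a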