Auto-fill a Postgres database with SQL Server data

I'm rephrasing my question from yesterday because it was not very clear. I want to automatically copy the data stored in SQL Server into Postgres, so that the data in Postgres always matches the data in SQL Server: if data is updated in SQL Server, it should also be updated in Postgres.

Postgres backs the UI, while SQL Server is read-only and is only used for data retrieval.

My table structure is the same in Postgres and SQL Server:
id, description, price, title, views, inserted_at and updated_at.
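
The corresponding Ecto schema looks roughly like this (module and table names taken from the output below; `timestamps()` provides the inserted_at/updated_at columns):

```elixir
defmodule SyncDb.Catalog.Product do
  use Ecto.Schema

  schema "products" do
    field :description, :string
    field :price, :decimal
    field :title, :string
    field :views, :integer

    # generates the inserted_at and updated_at columns
    timestamps()
  end
end
```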

This is the format of the data I receive from SQL Server; how can I store it in Postgres?

QUERY OK source="products" db=3.0ms queue=4.8ms idle=1400.1ms
SELECT p0.[id], p0.[description], p0.[price], p0.[title], p0.[views], p0.[inserted_at], p0.[updated_at] FROM [products] AS p0 []
[
  %SyncDb.Catalog.Product{
    __meta__: #Ecto.Schema.Metadata<:loaded, "products">,
    description: "taille haute",
    id: 2,
    inserted_at: ~N[2022-03-22 00:00:00],
    price: #Decimal<2.000000>,
    title: "pantalon",
    updated_at: ~N[2022-03-23 00:00:00],
    views: 1
  },
  # ... identical "pantalon" entries with ids 3 through 11 ...
  %SyncDb.Catalog.Product{
    __meta__: #Ecto.Schema.Metadata<:loaded, "products">,
    description: "Taille Xl",
    id: 13,
    inserted_at: ~N[2022-03-23 00:00:00],
    price: #Decimal<2.000000>,
    title: "t-shirt",
    updated_at: ~N[2022-03-24 00:00:00],
    views: 1
  },
  %SyncDb.Catalog.Product{
    __meta__: #Ecto.Schema.Metadata<:loaded, "products">,
    description: "Taille X",
    id: 14,
    inserted_at: ~N[2022-03-23 00:00:00],
    price: #Decimal<3.000000>,
    title: "short",
    updated_at: ~N[2022-03-24 00:00:00],
    views: 1
  }
]

Can someone help me? I don't see how to make this work.

Thanks! :smiley: :slight_smile: :pray:

Hmmm, the question is about syncing data between two databases; I don't think this is a problem to be solved exclusively with Elixir. Change Data Capture tools like Debezium (https://debezium.io/) would normally be the answer here: if you could use Debezium, you could capture changes from SQL Server and write them to Postgres. But considering you only have read-only access to SQL Server, that probably wouldn't work.

In the past, what I have done is build a data pump (inspired by Elastic's guide "Ingest data from a relational database into Elasticsearch Service") which periodically reads from the source DB (SQL Server), limited to rows newer than the last updated_at/inserted_at it has seen, and writes the data to the sink DB (PostgreSQL in this case).
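
A minimal sketch of such a pump, assuming two hypothetical repos `Source.Repo` (SQL Server) and `Sink.Repo` (Postgres) that both map the same `Product` schema:

```elixir
defmodule SyncDb.ProductPump do
  use GenServer
  import Ecto.Query

  alias SyncDb.Catalog.Product

  @interval :timer.minutes(5)
  @fields [:id, :title, :description, :price, :views, :inserted_at, :updated_at]

  def start_link(opts) do
    GenServer.start_link(__MODULE__, opts, name: __MODULE__)
  end

  @impl true
  def init(_opts) do
    schedule()
    # Start from the epoch so the first run copies everything.
    {:ok, %{last_synced_at: ~N[1970-01-01 00:00:00]}}
  end

  @impl true
  def handle_info(:sync, %{last_synced_at: since} = state) do
    # Only fetch rows the source has touched since the last run.
    rows =
      Source.Repo.all(
        from p in Product,
          where: p.updated_at > ^since,
          select: map(p, ^@fields)
      )

    newest =
      case rows do
        [] ->
          since

        rows ->
          # Upsert so rows that changed in the source overwrite the sink copies.
          Sink.Repo.insert_all(Product, rows,
            on_conflict: {:replace_all_except, [:id]},
            conflict_target: :id
          )

          rows |> Enum.map(& &1.updated_at) |> Enum.max(NaiveDateTime)
      end

    schedule()
    {:noreply, %{state | last_synced_at: newest}}
  end

  defp schedule, do: Process.send_after(self(), :sync, @interval)
end
```

Note that polling by `updated_at` only picks up rows the source actually touches; rows deleted in the source would need separate handling.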

The program I'm building is written in Elixir. Postgres is the database that stores the application's data, and SQL Server is only used as a source whose data is read and imported into the Postgres database.

Did you consider Postgres' foreign data wrapper functionality? With that you wouldn't need to sync the data in the first place.
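
A rough sketch of that setup as an Ecto migration, assuming the third-party tds_fdw extension is installed on the Postgres host (the server name, credentials and table name below are placeholders):

```elixir
defmodule SyncDb.Repo.Migrations.AddMssqlForeignTable do
  use Ecto.Migration

  def up do
    # tds_fdw is a third-party foreign data wrapper for SQL Server;
    # it must already be installed on the Postgres server.
    execute "CREATE EXTENSION IF NOT EXISTS tds_fdw"

    execute """
    CREATE SERVER mssql_server
      FOREIGN DATA WRAPPER tds_fdw
      OPTIONS (servername 'mssql.example.com', port '1433', database 'shop')
    """

    execute """
    CREATE USER MAPPING FOR CURRENT_USER
      SERVER mssql_server
      OPTIONS (username 'reader', password 'secret')
    """

    # Queries against this table are forwarded to SQL Server at query time,
    # so there is no copy of the data to keep in sync.
    execute """
    CREATE FOREIGN TABLE products_remote (
      id integer,
      description text,
      price numeric,
      title text,
      views integer,
      inserted_at timestamp,
      updated_at timestamp
    ) SERVER mssql_server OPTIONS (table_name 'dbo.products')
    """
  end

  def down do
    execute "DROP FOREIGN TABLE products_remote"
    execute "DROP USER MAPPING FOR CURRENT_USER SERVER mssql_server"
    execute "DROP SERVER mssql_server"
    execute "DROP EXTENSION tds_fdw"
  end
end
```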

Can you explain a bit more, please? I don't understand it very well.

Thanks everybody, I've solved the problem. It came from the data that I wanted to store.

How did you solve it exactly? Might be useful for future readers.

Hey, I'm really sorry for the late reply.

The solution I found looks like this:

import Ecto.Query

# IDs already present in the sink database (MRepo)
existing_ids = MRepo.all(from p in Product, select: p.id)

# Rows in the source database (Repo) that the sink doesn't have yet
missing_rows =
  Repo.all(
    from p in Product,
      where: p.id not in ^existing_ids,
      select: %{
        id: p.id,
        title: p.title,
        description: p.description,
        price: p.price,
        views: p.views,
        inserted_at: p.inserted_at,
        updated_at: p.updated_at
      }
  )

MRepo.insert_all(Product, missing_rows)
MRepo.all(Product)
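
A caveat for future readers: this copies only rows whose ids are missing from the sink, so rows that are later updated in SQL Server are never refreshed in Postgres. Selecting all source rows and upserting them on the primary key would handle updates as well; a sketch with the same repo names:

```elixir
import Ecto.Query

# Fetch every source row as a plain map suitable for insert_all.
rows =
  Repo.all(
    from p in Product,
      select: map(p, [:id, :title, :description, :price, :views, :inserted_at, :updated_at])
  )

# Insert new rows; on an id conflict, overwrite every column except the id.
MRepo.insert_all(Product, rows,
  on_conflict: {:replace_all_except, [:id]},
  conflict_target: :id
)
```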