Automating data transfer from production to local database in Elixir/Phoenix backend

Description: I am working on an Elixir/Phoenix backend project and need to automate fetching data as a SQL dump file from the production environment and loading it into my local database. I am looking for a way to achieve this by running commands in my terminal.

Can anyone give me brief steps, with an example, of how I can achieve this?

Not before you say where the production DB dumps will be coming from, for a start.

@dimitarvp from Amazon AWS

Okay but there are still a number of ways people use AWS.

What have you tried?


@dimitarvp AWS is used to store the data, so my production data lives on an AWS instance.
My requirement is to fetch all the data from production as a SQL dump and then restore that dump file to my local PostgreSQL database. How can I do this, e.g. with a script that I can run from a command and that fulfils my need?

This is never going to get anywhere if you don’t give us any details. :smiley:

What kind of AWS DB are you using? RDS?

@dimitarvp Yes, right. RDS.

Have you tried their docs? I found the following page on the first try:

@dimitarvp I guess creating a DB snapshot is not required for this task. A DB snapshot is a point-in-time copy of your database and is used for backup and recovery purposes.
My requirement is to fetch all of the data from production and then load it into my local database, so I need a script that fulfils the requirement when run from a command.
I tried it this way and created a fetch-data.sh:

#!/bin/bash

# Install PostgreSQL client tools (only needed if not already installed)
# sudo apt-get install postgresql-client

# Production database credentials
production_db_host="xxx.rds.amazonaws.com"
production_db_name="xxx"
production_db_user="postgres"
production_db_password="xxx"

# Local database credentials
local_db_host="localhost"
local_db_name="xxxx"
local_db_user="postgres"
local_db_password="xxx"

# Dump file name
dump_file="dump.sql"

# Fetch the data dump from the production database
PGPASSWORD="$production_db_password" "C:/Program Files/PostgreSQL/13/bin/pg_dump.exe" --host="$production_db_host" --port=5432 --username="$production_db_user" --dbname="$production_db_name" > "$dump_file"

# Transfer the dump file to the local machine
scp -P 2222 "$dump_file" "$local_db_user@$local_db_host:~"

# Restore the dump file to the local database
PGPASSWORD="$local_db_password" "C:/Program Files/PostgreSQL/13/bin/pg_restore.exe" --host="$local_db_host" --port=5432 --username="$local_db_user" --dbname="$local_db_name" --verbose "$dump_file"

# Remove the dump file
rm "$dump_file"

Then I ran this command in the terminal:
bash fetch-data.sh

but I am getting these errors:

ssh: connect to host localhost port 2222: Connection refused
scp: Connection closed
pg_restore: error: input file does not appear to be a valid archive
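For what it's worth, the last error line has a separate cause from the scp failure: pg_restore only understands the archive formats that pg_dump produces with --format=custom/directory/tar, while `pg_dump ... > dump.sql` writes plain SQL, hence "input file does not appear to be a valid archive". The dump format and the restore tool have to match. A sketch of the two valid pairings (connection flags omitted; database names are placeholders):

```shell
# Each dump format has a matching restore tool:
#
#   plain SQL:      pg_dump --dbname=mydb > dump.sql
#                   psql --dbname=mydb_dev -f dump.sql
#
#   custom archive: pg_dump --format=custom --dbname=mydb --file=dump.custom
#                   pg_restore --dbname=mydb_dev dump.custom
#
# Recorded as variables so the pairing is explicit:
restore_tool_for_plain_sql="psql"
restore_tool_for_custom_archive="pg_restore"
echo "plain SQL -> $restore_tool_for_plain_sql, custom archive -> $restore_tool_for_custom_archive"
```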

Hey @belwalshubham, fundamentally the issues you are having right now don’t really have anything to do with Elixir or Phoenix. I’m going to move this post to a more general programming area for now.

Your scp command is a bit confusing. You’re trying to copy a local file to… localhost? Which is the same place, right? That is the command that is failing.


I’m guessing he’s trying to copy a file from the proper Windows filesystem to the WSL sandbox.

But even with that he still doesn’t need scp.

In that case wouldn’t it be easier to just pg_dump from WSL in the first place?

Oh yes, absolutely.

But if that’s not an option for some reason then you can still copy files directly from your host Windows OS to WSL, if I remember correctly.
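Putting the thread's suggestions together, here is a sketch of a corrected fetch-data.sh. It assumes the script is run from inside WSL (or any Linux shell on the same machine as the local database, so no scp hop is needed) and that the PostgreSQL client tools are on the PATH; all hosts, database names, and passwords are placeholders carried over from the original script. It dumps in custom format so that pg_restore can actually read the file, and it refuses to do anything while the placeholder credentials are still in place:

```shell
#!/usr/bin/env bash
set -u

# --- Placeholder credentials: replace before use ------------------------
production_db_host="xxx.rds.amazonaws.com"   # placeholder
production_db_name="xxx"                     # placeholder
production_db_user="postgres"
production_db_password="xxx"                 # placeholder

local_db_host="localhost"
local_db_name="xxxx"                         # placeholder
local_db_user="postgres"
local_db_password="xxx"                      # placeholder

# Custom-format archive: the format pg_restore expects.
dump_file="dump.custom"

if [ "$production_db_host" = "xxx.rds.amazonaws.com" ]; then
  # Safety guard so this sketch does nothing until real values are set.
  echo "fetch-data.sh: placeholder credentials detected; fill them in first"
else
  # Dump straight from RDS in custom format.
  PGPASSWORD="$production_db_password" pg_dump \
    --host="$production_db_host" --port=5432 \
    --username="$production_db_user" --dbname="$production_db_name" \
    --format=custom --file="$dump_file"

  # No scp step: this script already runs on the machine that hosts
  # the local database (e.g. inside WSL), so the file is already here.

  # Restore; --clean --if-exists drops existing objects first, and
  # --no-owner avoids errors when production role names differ locally.
  PGPASSWORD="$local_db_password" pg_restore \
    --host="$local_db_host" --port=5432 \
    --username="$local_db_user" --dbname="$local_db_name" \
    --clean --if-exists --no-owner --verbose "$dump_file"

  rm "$dump_file"
fi
```

Run it with `bash fetch-data.sh` once the real values are in place. Alternatively, keep the plain-SQL dump from the original script and restore it with `psql -f dump.sql` instead of pg_restore; what matters is that the dump format and the restore tool match.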