Using Elixir to compress and FTP & code cleaning - Elixir

The task is to:

  1. SSH to a Linux server.
  2. Stop a service.
  3. Using tar, create a tar.gz file from a directory (30GB max).
  4. Upload that file to an FTP server.
  5. Start the stopped service.
  6. Run the whole process every Sunday.
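For step 6, one way to trigger the job every Sunday from inside the app is the quantum scheduler library. This is only a sketch: the dependency, the schedule time, and the `EvercamMedia.Scheduler` module are my assumptions; the app and backup module names come from the post.

```elixir
# mix.exs — add the (assumed) quantum dependency
defp deps do
  [{:quantum, "~> 3.0"}]
end

# config/config.exs — "0 2 * * 0" means every Sunday at 02:00
config :evercam_media, EvercamMedia.Scheduler,
  jobs: [
    {"0 2 * * 0", {EvercamMedia.BackUpSeaweedfsFiler, :start, []}}
  ]

# lib/evercam_media/scheduler.ex — the scheduler module, which must also
# be added to the application's supervision tree
defmodule EvercamMedia.Scheduler do
  use Quantum, otp_app: :evercam_media
end
```

A plain cron entry on the server would do the same job without the extra dependency.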

I have covered pretty much all steps and failure cases in the code, assuming that the service is running and that the directory to tar exists.

I am using https://github.com/rubencaro/sshex to SSH to the server and run all the commands.

Questions:

  1. How can I make the pipeline wait until the tar process has completed, and only then run the next command? For a 30GB directory, the tar alone takes more than 10 minutes. Tasks 3 and 4 are the most time consuming, and I want the pipeline to wait until those commands are done before proceeding to the next.
  2. Is this the best way to tar a big directory (32GB max)?
  3. Is this the best way to do FTP on Linux?
  4. Is this the best overall solution? I mean, Elixir is great, but is running all these steps through an Elixir Task a good idea?
  5. Can you suggest some notes and changes to make the code better?

UPDATE:
While doing the tar, I get this error from the SSH connection:

{:error, "Timeout. Did not receive data for 5000ms."}

Full module:

defmodule EvercamMedia.BackUpSeaweedfsFiler do
  require Logger

  @ftp_domain Application.get_env(:evercam_media, :ftp_domain)
  @ftp_username Application.get_env(:evercam_media, :ftp_username)
  @ftp_password Application.get_env(:evercam_media, :ftp_password)
  @filer_server Application.get_env(:evercam_media, :filer_server)
  @filer_username Application.get_env(:evercam_media, :filer_username)
  @filer_password Application.get_env(:evercam_media, :filer_password)
  @stop_seaweedfs_command "systemctl stop seaweedfs.service"
  @start_seaweedfs_command "systemctl start seaweedfs.service"

  # Module attributes are evaluated once at compile time, which would freeze
  # the backup date at build time. Build the date-dependent commands per run.
  defp zip_date, do: Calendar.Strftime.strftime(Calendar.DateTime.now_utc, "%Y%m%d") |> elem(1)
  defp zip_filer_directory, do: "tar -czf #{zip_date()}.tar.gz /storage/filer"
  defp copy_to_ftp, do: "curl -T #{zip_date()}.tar.gz #{@ftp_domain} --user #{@ftp_username}:#{@ftp_password}"

  def start do
    connect_to_server(@filer_server, @filer_username, @filer_password)
    |> connected?
    |> grant_for_backup()
  end

  defp grant_for_backup({:exit, true}), do: :noop
  defp grant_for_backup(connected) do
    connected
    |> run_command_on_server(@stop_seaweedfs_command)
    |> response_is(connected)
    |> run_command_on_server(zip_filer_directory())
    |> response_is(connected)
    |> run_command_on_server(copy_to_ftp())
    |> response_is(connected)
    |> run_command_on_server(@start_seaweedfs_command)
    |> completed?
  end

  defp connected?({:ok, connected}), do: connected
  defp connected?({:halt, true, message}) do
    Logger.info("Process stopped due to: " <> message)
    send_email_for_backup("Process stopped due to: " <> message)
    {:exit, true}
  end

  defp completed?({:halt, reason}) do
    Logger.info("Process stopped due to: " <> reason)
    send_email_for_backup("Process stopped due to: " <> reason)
  end
  defp completed?(_conn) do
    Logger.info "Process has been completed."
    send_email_for_backup("Process has been completed.")
  end

  # If a previous step already halted, skip the command and propagate the
  # failure so that completed? logs and emails it exactly once. (Returning
  # :noop here would crash the next response_is/2 call with a
  # FunctionClauseError.)
  defp run_command_on_server({:halt, reason}, _command), do: {:halt, reason}
  defp run_command_on_server(connected, command) do
    connected
    |> SSHEx.run(command)
  end

  defp response_is({:ok, _res, _}, connected), do: connected
  defp response_is({:halt, reason}, _connected), do: {:halt, reason}
  defp response_is({:error, reason}, _connected), do: {:halt, reason}

  defp connect_to_server(ip, username, password, tries \\ 1)
  defp connect_to_server(_ip, _username, _password, 3), do: {:halt, true, "Not possible to make connection with server."}
  defp connect_to_server(ip, username, password, tries) do
    SSHEx.connect(ip: ip, user: username, password: password)
    |> case do
      {:error, :nxdomain} -> {:halt, true, "IP doesn't seem correct."}
      {:error, :timeout} -> connect_to_server(ip, username, password, tries + 1)
      {:error, reason} -> {:halt, true, reason}
      {:ok, connected} -> {:ok, connected}
    end
  end

  defp send_email_for_backup(message) do
    EvercamMedia.UserMailer.seaweedfs_filer_backup(message)
  end
end

Thanks in advance.


For that specific error I think you may want to look into SSH keepalive. That will ensure that something is always being sent down the pipe so that the connection doesn’t die.

Although for me personally I would probably just run a bash script on the remote server to do the heavy lifting.


I’d even do this via cron on the remote, without any tools connecting from another computer.

I’d probably use a shell script too, since it’s just so simple to do, but if you want to use the BEAM VM, then a few things:

Upload that file to an FTP server.

Uhh, if you are uploading to the same server that you are SSH’d to, then why not use the SSH file transfer service (SFTP) to send the file over that connection instead?
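A hedged sketch of that idea: Erlang/OTP ships an SFTP client (`:ssh_sftp`), so the same SSH credentials can move the archive without FTP. The host, credentials, and file paths below are made up for illustration.

```elixir
# Fetch the remote archive over SFTP using OTP's built-in :ssh_sftp.
:ok = :ssh.start()

{:ok, channel, _conn} =
  :ssh_sftp.start_channel('filer.example.com',
    user: 'backup', password: 'secret', silently_accept_hosts: true)

# Stream in 1 MiB chunks instead of loading a ~30GB file into memory.
{:ok, remote} = :ssh_sftp.open(channel, '/root/backup.tar.gz', [:read, :binary])
{:ok, local} = File.open("backup.tar.gz", [:write, :binary])

Stream.repeatedly(fn -> :ssh_sftp.read(channel, remote, 1_048_576) end)
|> Stream.take_while(&match?({:ok, _}, &1))
|> Enum.each(fn {:ok, chunk} -> IO.binwrite(local, chunk) end)

:ok = File.close(local)
:ok = :ssh_sftp.close(channel, remote)
:ok = :ssh_sftp.stop_channel(channel)
```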

How can I make the pipeline wait until the tar process has completed, and only then run the next command? For a 30GB directory, the tar alone takes more than 10 minutes. Tasks 3 and 4 are the most time consuming, and I want the pipeline to wait until those commands are done before proceeding to the next.

Ping it every once in a while, or set a keepalive or so, and make sure to set the timeout high enough, considering you are dealing with a giant file.
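Concretely, the `{:error, "Timeout. Did not receive data for 5000ms."}` above is sshex's default 5,000 ms exec timeout expiring. Per the sshex README, `SSHEx.run/3` and `SSHEx.cmd!/3` accept per-call `channel_timeout` and `exec_timeout` options, so you can size them for the job; the 30-minute figure below is just an example:

```elixir
# Helper building sshex options sized for a long-running remote command.
defmodule BackupTimeouts do
  @minutes 30

  # Options accepted by SSHEx.run/3 and SSHEx.cmd!/3 (defaults are 5_000 ms).
  def long_run_opts do
    timeout = @minutes * 60_000
    [channel_timeout: timeout, exec_timeout: timeout]
  end
end

# Usage (conn is an established SSHEx connection):
#   SSHEx.run(conn, "tar -czf backup.tar.gz /storage/filer",
#     BackupTimeouts.long_run_opts())
```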

Not really; rsync would be better, though if the files are at all compressible then just zipping or tar/gzipping them would be far better.

FTP has SO many issues everywhere, as well as no real security whatsoever, that I would never ever recommend it; plus, if you already have an SSH connection, just use the file transfer service on it.

Is it the best solution to do that? I mean Elixir is great but is doing all these steps through an Elixir Task is good?

Elixir is fine, but honestly I’d just use a shell script, as it would only be a couple of lines long (and rsync work is much easier there).
