Advice please: how to best automate deployment to many different VPS

Hi all, I need some sage advice on my small dev-ops situation.

I develop and support a niche administrative tool for a few university departments. Standard stuff, coding-wise: Elixir, Postgres, React. Unfortunately, every department insists on hosting the app themselves (student data privacy, yada yada) and to that end they all provide me with a Linux VM I can SSH into. Of course, every server is a little different in how it integrates with the respective university network; some run Red Hat (I run Ubuntu), etc., so I dread every upgrade.

I am looking for suggestions because I am sick and tired of my current deployment process, which includes:

  • Having to build separate distillery releases for each server, just because they have slightly different names and host URLs in config.exs,
  • Having to build some of them within docker (on my painfully slow laptop) just to prevent glibc errors on the Red Hat machines,
  • Having to then manually upload each build package, ssh in, restart, etc., on every single server.

I’ve toyed with the thought of setting up a uniform docker image on each server and targeting that with a single build, and potentially using automation tools (Ansible?) to create some shortcuts for myself. At the same time, I’m wary of creating a container-hell time sink.

Has anyone here dealt with a similar situation? How have you solved it? What are some solutions I should look into, what are some tools I should avoid? If I don’t need communication between multiple nodes, are there still reasons against running one (or multiple) Elixir apps within a docker container?

Thank you! :slight_smile:

Couple of ideas:

1/ “Having to build separate distillery releases for each server, just because they have slightly different names and host URLs in the config.exs,”
Why? You can set REPLACE_OS_VARS=true and just use environment variables in the config for these.
2/ Yes, use Ansible or a similar tool to build the release with distillery, then send the final tar.gz to the servers. Each server should have a config file it can read from, for example an environment file loaded by systemd. Or just a simple shell script. Or however you do it.
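To illustrate point 1/: with distillery you can put `${VAR}` placeholders in your compiled config and let the release boot script substitute them from the environment at startup. A minimal sketch, where `APP_HOST` and `my_app`/`MyAppWeb` are made-up names standing in for your own:

```elixir
# config/prod.exs: with REPLACE_OS_VARS=true set in the environment,
# distillery's boot script replaces "${...}" strings in the generated
# sys.config at startup, so one release serves every server.
use Mix.Config

config :my_app, MyAppWeb.Endpoint,
  url: [host: "${APP_HOST}", port: 443],
  server: true
```

Then a single build works for every department, started with something like `REPLACE_OS_VARS=true APP_HOST=dept1.example.edu bin/my_app foreground`, or with the variables set in each server's systemd unit.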
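And for point 2/, a stripped-down playbook along those lines. This is a sketch, not a drop-in solution: the inventory group, tarball path, deploy directory, and service name are all hypothetical, and it assumes each host already has a systemd unit for the app:

```yaml
# deploy.yml: push one locally built release tarball to every server
# and restart the service. Run with: ansible-playbook -i hosts deploy.yml
- hosts: app_servers
  vars:
    release_tar: _build/prod/rel/my_app/releases/0.1.0/my_app.tar.gz
    deploy_dir: /opt/my_app
  tasks:
    - name: Ensure the deploy directory exists
      become: true
      file:
        path: "{{ deploy_dir }}"
        state: directory

    - name: Upload and unpack the release tarball
      become: true
      unarchive:
        src: "{{ release_tar }}"
        dest: "{{ deploy_dir }}"

    - name: Restart the app
      become: true
      systemd:
        name: my_app
        state: restarted
```

The nice part is that per-server differences live in the inventory and in each host's environment file, not in your build.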

Using docker/containers can be a bit of a time sink, but it will be muuuuch easier in the long run if you have full access to each machine and can run docker. It means you can run in a mixed-distro environment and not have to build for each one.
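A lighter middle ground is to containerize only the build, not the runtime: compile the release in a container whose base matches your oldest target distro, so the resulting tarball links against a glibc old enough for the Red Hat boxes. A rough sketch, with the base image and app name as placeholders (installing Erlang/Elixir on CentOS, e.g. via the Erlang Solutions repo or asdf, is elided):

```dockerfile
# Build the release against a RHEL-compatible glibc so the same tarball
# runs on the Red Hat VMs. This is a sketch, not a tested image.
FROM centos:7

RUN yum install -y git make gcc && yum clean all
# ... install Erlang and Elixir here ...

ENV MIX_ENV=prod
WORKDIR /build
COPY . .

# "mix release" is distillery's task name here; in newer distillery
# versions it is "mix distillery.release".
RUN mix deps.get --only prod && mix release
# Then extract the tarball, e.g. with docker cp or a bind-mounted volume.
```

That way the servers themselves stay container-free and you only pay the docker cost on your build machine.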


Hah, my job at my college is to combine all these distinct systems into one cohesive whole, so that things like student phone numbers scattered all around are actually in sync and everything pulls from the central Banner Oracle database for class access and permissions and all. :slight_smile:

However, I manage my own Red Hat server that they all access (via various DNS names), so I'm not sure about that… Maybe edeliver, which I keep hearing about, can handle this?