Shipping Phoenix with Python in one Docker container for Machine Learning

Howdy all :wave:

I’d like to ship a Phoenix application that can call Python libraries. There are a few threads on this topic; however, packaging all the frameworks/libraries into one Docker container does not (in general) appear to be common, for “reasons” of course.

In any case, I’d like to do so. Just as an Elixir release creates a single self-contained bundle, I “think” PyInstaller can do the same thing for a Python app/library.

Ideally, the final stage of my Dockerfile would copy the Elixir release and the Python bundle into the final container, so my Phoenix app could call the Python app (via ports or HTTP).

Can this be done?
Has anyone done this?
Are there any gotchas I should know about? (aside from multi-app complexity in one container)
Are there Dockerfiles that people recommend for doing this?

Thank you for any feedback


I don’t know how easy it is to copy a Python environment (compared to copying an Elixir release), but you can do it with a multi-stage Docker build:

  • build python part in python stage
  • build elixir part in elixir stage
  • in the final stage, install runtime dependencies (e.g. SSL, any dynamically linked libraries) and copy both the Python and Elixir parts from their respective stages.
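
A minimal sketch of such a multi-stage Dockerfile, assuming PyInstaller is used to bundle the Python side. All image tags, directory names, and app names (`py_app`, `my_phoenix_app`, `main.py`) are placeholders, not a tested recipe:

```dockerfile
# -- Python stage: bundle the Python app into one executable with PyInstaller
FROM python:3.11-slim AS python_build
WORKDIR /py
COPY py_app/ .
RUN pip install pyinstaller -r requirements.txt \
 && pyinstaller --onefile main.py            # output lands in /py/dist/main

# -- Elixir stage: build the Phoenix release
FROM elixir:1.15-slim AS elixir_build
WORKDIR /app
COPY my_phoenix_app/ .
ENV MIX_ENV=prod
RUN mix local.hex --force && mix local.rebar --force \
 && mix deps.get --only prod \
 && mix release

# -- Final stage: runtime dependencies plus both artifacts
FROM debian:bookworm-slim
RUN apt-get update \
 && apt-get install -y --no-install-recommends openssl ca-certificates libstdc++6 \
 && rm -rf /var/lib/apt/lists/*
COPY --from=elixir_build /app/_build/prod/rel/my_phoenix_app /opt/my_phoenix_app
COPY --from=python_build /py/dist/main /opt/py_app/main
CMD ["/opt/my_phoenix_app/bin/my_phoenix_app", "start"]
```

One caveat with PyInstaller `--onefile` binaries: they are built against the glibc of the build image, so the Python build stage and the final stage should use compatible base distributions.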

How do you want to make them communicate? If you want to use HTTP (e.g. from the Phoenix app to a Python web server), then it’s more “conventional” to have separate containers — for example, Kubernetes makes it easy to run two containers on the same network by putting them in one pod.


Either ports or HTTP. I’d rather have one container to keep things “simpler”. I’m also not sure Python can just be copied over like an Elixir release in the final stage of a Docker build :thinking:
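
If you go the ports route, the Python side just needs to speak a simple protocol over stdin/stdout that an Elixir `Port` can drive. Here's a minimal sketch of a line-oriented JSON loop; the `"op"`/`"args"` request shape is invented for illustration, not any standard:

```python
import json
import sys


def handle(line):
    """Parse one JSON request line and return a JSON reply string.

    The {"op": ..., "args": ...} shape is a made-up protocol for
    this example; use whatever framing your Elixir side expects.
    """
    req = json.loads(line)
    if req.get("op") == "sum":
        return json.dumps({"result": sum(req["args"])})
    return json.dumps({"error": "unknown op"})


def main():
    # An Elixir Port (opened with the :line packet option) sends one
    # request per line on stdin; we write one reply per line on stdout.
    for line in sys.stdin:
        line = line.strip()
        if not line:
            continue
        sys.stdout.write(handle(line) + "\n")
        sys.stdout.flush()  # flush so the Port sees each reply immediately


# When bundled with PyInstaller as the port endpoint, run the loop:
#   if __name__ == "__main__":
#       main()
```

On the Elixir side you would open the bundled binary with `Port.open({:spawn_executable, path}, [:binary, {:line, 4096}])` and exchange those JSON lines.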