Adding a file to an existing zip

Hoping to tap into the community for some advice. Is there a way to add a file to an existing zip archive?

This seems like maybe it’s not supported by the Erlang :zip module:

Changing a zip archive is not supported. To add or remove a file from an archive, the whole archive must be recreated.

But maybe someone knows of another library or other algorithm (bzip, gzip, ?) that might support this more easily.
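For reference, the recreate-the-whole-archive approach that the Erlang docs describe can be done entirely in Elixir with `:zip`. This is a minimal sketch, assuming the archive is small enough to hold in memory; the file names are hypothetical, and note that `:zip` works with charlists rather than strings:

```elixir
# Set up an example archive (a stand-in for your existing zip):
{:ok, _} = :zip.create(~c"zipped.zip", [{~c"foo", "foo contents"}])

# :zip can't modify an archive in place, so "adding" a file means
# reading every entry into memory and rewriting the whole archive.
# Entries come back as {name_charlist, binary} tuples:
{:ok, entries} = :zip.extract(~c"zipped.zip", [:memory])

# Append the new file and recreate the archive:
new_entries = entries ++ [{~c"extra_file", "some more stuff"}]
{:ok, _path} = :zip.create(~c"zipped.zip", new_entries)
```

For large archives this gets expensive, since every existing entry is decompressed and recompressed on each "add" — which is presumably why the docs call it unsupported rather than offering an append API.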

It’s relevant that these zip files may be stored in S3, so maybe it would be ok to do this via a Lambda (this article looks like it’s poking at that idea: Zip files from Amazon S3 without saving them locally | tech, life & music).

Thanks for any ideas!


You could resort to calling zip in a separate process:

iex(1)> for file <- ["foo", "bar", "baz"], do: File.write(file, file)
[:ok, :ok, :ok]
iex(2)> System.cmd("zip", ["zipped.zip", "foo", "bar", "baz"])
{"  adding: foo (stored 0%)\n  adding: bar (stored 0%)\n  adding: baz (stored 0%)\n",
 0}
iex(3)> File.write("extra_file", "some more stuff")
:ok
iex(4)> System.cmd("zip", ["zipped.zip", "extra_file"])
{"  adding: extra_file (stored 0%)\n", 0}
iex(5)>
BREAK: (a)bort (A)bort with dump (c)ontinue (p)roc info (i)nfo
       (l)oaded (v)ersion (k)ill (D)b-tables (d)istribution
^C~ $ mkdir example
~ $ mv zipped.zip example/
~ $ cd example/
~/example $ unzip zipped.zip
Archive:  zipped.zip
 extracting: foo
 extracting: bar
 extracting: baz
 extracting: extra_file

Thank you! I think this is the easiest solution. I was trying to figure out how to do this in S3, and nope… that’s pushing the river. S3 may be good for some things, but simple it is not, and juggling lots of small files and zipping them up is far easier with a local volume and good old *NIX commands.

I guess this stems from expecting a filesystem, but S3 is not a filesystem. It’s a database for files. You wouldn’t try to zip up files in e.g. postgres.


Yeah… it’s sort of a database. I’m finding it painful to do searches or to limit the number of results S3 returns, and ultimately those headaches have been useful in pushing us to research a better solution for a few particular pieces of our app. Live and learn.