Intellectual Property and Elixir

Elixir noob here. I am contemplating climbing the Elixir learning curve to gain some security for an existing product, and I would like to entertain this discussion to decide whether I should bother with Elixir, attempt some other platform, or do nothing at all. From my brief dip into the water, I’ve learned that Elixir-based software can be distributed as BEAM files that do not contain debug symbols. Stop me here if I have that all wrong.

Background: I already have a NodeJS+Express+Socket.IO app that serves as a cloud-based, for-profit service. I am considering Elixir as a complete port of the codebase to a still for-profit system that allows self-hosting. In the absence of this port, I would be distributing a minified JS bundle, probably from webpack.

As a basis for comparing Elixir to JS in this respect, I’d like to use a classic compiled language, such as C, as a benchmark.

On a scale of 1-10, where 1 is the system distributed as C source code and makefiles, and 10 is the system distributed as a runnable binary built from that C source, place both the minified NodeJS bundle and the distributed Elixir beam on that scale with respect to the following two scenarios. I hope that makes sense.
Scenario 1) Defeating the system’s monetization efforts. An analog would be to hack the MS Windows distribution to defeat its license enforcement so I could enjoy free Windows for life (sic).

Scenario 2) Completely lifting the IP, rebranding it, and selling it as one’s own. An analog would be to hack the MS Windows distribution, reverse engineer it, rebuild it as my own OS, and either sell it or give it away to all.

I am aware of the issue that observable network traffic poses. Please answer on the merits of the hackability of the distributed files, not the hackability of the overall system. My intent with this post is to gauge the risk/reward of porting a working system from JS to Elixir.

Thank you

TBH it doesn’t matter: the protection doesn’t need to be bulletproof, it just needs to be sufficient to establish “you know you’re messing with this” enough to get the lawyers going.

IIRC the Github Enterprise distribution uses something silly like a fixed XOR for this.


BEAM bytecode should be on par with Java bytecode in terms of reverse-engineering possibilities. So I would rate it higher than minified JS and a tad lower than a stripped native binary. By default, mix release will strip debug_info and documentation.
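As a sketch of what stripping debug_info changes in practice: OTP’s :beam_lib can report whether a compiled .beam binary still carries the abstract code that decompilers reconstruct source from. The BeamCheck module and debug_info?/1 helper below are names I made up for illustration:

```elixir
defmodule BeamCheck do
  # Returns true if the beam binary still contains reconstructable
  # abstract code, i.e. debug_info was not stripped.
  def debug_info?(beam_binary) do
    case :beam_lib.chunks(beam_binary, [:abstract_code]) do
      {:ok, {_mod, [abstract_code: {:raw_abstract_v1, _forms}]}} -> true
      {:ok, {_mod, [abstract_code: :no_abstract_code]}} -> false
      {:error, :beam_lib, _reason} -> false
    end
  end
end

# Compiling in memory keeps debug info by default, so this module's
# source could be reconstructed from the resulting binary:
[{_mod, beam}] = Code.compile_string("defmodule Demo do\n  def hi, do: :ok\nend")
BeamCheck.debug_info?(beam)
```

A .beam taken from a default mix release should come back as :no_abstract_code here, which is roughly what puts it above minified JS on your scale.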


I would agree with @al2o3cr, protecting intellectual property is probably better left to lawyers than engineers.

What is a “fixed XOR”? Do you mean they run an XOR with a well-known binary against the source or the binaries of their program to see if that returns the same result as with their own source/binaries?

The Ruby files on disk are XORed with a fixed “key” that’s something silly like “sssshhh this is a secret” and then “decrypted” when loaded. The point is to discourage casual modification / browsing without imposing a lot of runtime overhead; the mechanism isn’t designed to be “unbreakable” since it’s fundamentally shipping a lock along with the key…
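A minimal sketch of that kind of scheme, assuming a fixed key (the key and function name below are invented, not GitHub’s actual ones). XOR is its own inverse, so the same function both “encrypts” and “decrypts”, and anyone holding the shipped key can undo it:

```ruby
# Fixed-key XOR "obfuscation": symmetric, so applying it twice is a no-op.
KEY = "not-a-real-secret".bytes # hypothetical key, shipped alongside the files

def xor_with_key(data, key = KEY)
  data.bytes.each_with_index.map { |b, i| b ^ key[i % key.length] }.pack("C*")
end

obfuscated = xor_with_key("puts 'hello'") # unreadable on disk
original   = xor_with_key(obfuscated)     # the same call recovers the source
```

Since the loader must carry the key to run the code at all, this only deters casual browsing, exactly as described above.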