What role can Elixir play in reclaiming or building a better internet?

@Garrison’s comment in another thread reminded me of this post by Joe:

With the big five exerting more control than ever, new (AI) players entering the fray, and *increasingly draconian internet laws being passed all over the world, more and more people want to take back control of the internet or find an alternative - one with the original promise of the net intact and fit for a modern free world.

*details/articles

So with the call for a new or freer internet at an all-time high, what kind of role do you think Elixir could play in its creation, from a technical standpoint?

Maybe:

  • Peer-to-peer networks?
  • Local mesh networks?
  • Blockchain based domains with decentralized DNS systems?
  • Decentralised VPNs?
  • IPFS (InterPlanetary File System)?
  • Pluggable Transports?
  • Something else?

Is a free and open web important to you?

Can you think of ways in which Elixir can help?

5 Likes

The Elixir community is most definitely missing out on “web 3”. I understand many people don’t like crypto, and the usual languages used in this space are Golang, Java, Rust & C#. Elixir can play in the C#/Golang/Java arena and can help in some respects on the Rust side of things.

For example, in Ethereum there is something called client diversity. The aim is that if you have the same protocol & VM programmed in different programming languages, and one implementation turns out to have a glaring flaw, like an RCE, mitigation is trivial, because control of the consensus is spread out between multiple implementations in different languages. I always thought an Elixir implementation would show the world how great Elixir is at running network applications.

Another point: some people in the EVM community felt the “standard” Golang implementation, go-ethereum, was slow at certain tasks and attempted the herculean task of rewriting everything in Rust to satisfy their high-performance checklist. I think it took them ~3 years to get to 1.0, and I always wondered: if the standard implementation had been written in a BEAM language, with the high-performance parts brought in as Rust NIFs, how much shorter would their time to market have been?

I think Elixir is perfect for the crypto/decentralized space. Granted, Ethereum is on the slower end of things with ~12s block times, and you have something like Solana that builds a tx in ~200ms, but again, how much of the code could have been simplified by handing all the networking to the BEAM and using Rust NIFs for the heavy lifting? A lot of decentralized protocols have slow build times.
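
To make that split concrete, here’s a minimal sketch, assuming the Rustler library and hypothetical module, application, and crate names: lightweight BEAM processes handle peer connections, while a Rust NIF does the CPU-heavy verification.

```elixir
defmodule MyNode.Native do
  # Rustler compiles the Rust crate and replaces these stubs with the
  # native implementations when the module is loaded.
  use Rustler, otp_app: :my_node, crate: "my_node_native"

  # Raises only if the NIF failed to load. For long-running work the Rust
  # side would be flagged to run on a dirty scheduler.
  def verify_block(_block_binary), do: :erlang.nif_error(:nif_not_loaded)
end

defmodule MyNode.Peer do
  # One lightweight process per peer; a slow or misbehaving peer never
  # blocks the rest of the node.
  use GenServer
  require Logger

  def start_link(socket), do: GenServer.start_link(__MODULE__, socket)

  @impl true
  def init(socket), do: {:ok, socket}

  @impl true
  def handle_info({:tcp, _socket, block}, socket) do
    # Networking, supervision and back-pressure stay on the BEAM; the
    # CPU-bound verification is delegated to Rust.
    case MyNode.Native.verify_block(block) do
      :ok -> Logger.info("accepted block")
      {:error, reason} -> Logger.warning("rejected block: #{inspect(reason)}")
    end

    {:noreply, socket}
  end
end
```

On the Rust side, the function would be exported with Rustler’s `#[rustler::nif]` attribute (with a DirtyCpu schedule for anything slow) so the normal scheduler threads never stall.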

3 Likes

Decentralized technologies are a very interesting topic. Here are some loose thoughts off the top of my head.

Urbit

  • Bespoke decentralized personal server platform that allows users to own and control their data, functioning as their own server in a peer-to-peer network
  • I like it in principle, but I strongly believe it won’t gain widespread adoption, for many reasons.
  • Ethereum is essential to its identity/ownership system, which I see as a weak point of the design.

LBRY

  • Free-speech-focused alternative to YouTube, censorship-resistant content publishing, monetization through their cryptocurrency “LBC”.
  • I liked both the design and implementation, unfortunately leadership made some suboptimal legal decisions.
  • In 2022, a U.S. District Judge ruled that LBC was an unregistered security. As a result, LBRY Inc. announced its shutdown in 2023.
  • Perhaps Elixir’s Membrane Framework could be used in this space.

Bonfire Networks

  • It will be the best Fediverse platform, and it’s built entirely with Elixir (including LiveView for the frontend)
  • It could be a good choice for communities. Unfortunately I don’t ever see widespread adoption, mainly due to the limitations of the ActivityPub Protocol.
  • Worth noting: when I did a deep dive on the project at its inception a few years ago, I vaguely remember them getting EU grants for developing censorship tools…

AT Protocol

bitchat

  • Earlier this year I was looking for a Bluetooth chat app and every option was insufficient. Glad to see I wasn’t the only one: Jack Dorsey, the founder of Twitter, announced he was working on his own just 3 months ago.
  • Bitchat is a decentralized peer-to-peer messaging app with dual transport architecture: local Bluetooth mesh networks for offline communication and internet-based Nostr protocol.
  • Nostr flips the current internet paradigm of “dumb client/smart server” to “smart client/dumb server”, by using relays and public-key cryptography. Nostr empowers users to control their own data.
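
Since Nostr events are just signed JSON exchanged with relays over WebSockets, talking to a relay from Elixir is straightforward. Here’s a rough sketch of a NIP-01 subscription, assuming the WebSockex and Jason libraries plus a hypothetical module name and relay URL:

```elixir
defmodule NostrRelayClient do
  use WebSockex

  # Connect to a relay, e.g. NostrRelayClient.start_link("wss://relay.example.com")
  def start_link(relay_url) do
    WebSockex.start_link(relay_url, __MODULE__, %{})
  end

  # Ask the relay for kind-1 (text note) events authored by the given pubkey.
  # NIP-01 request shape: ["REQ", <subscription_id>, <filter>]
  def subscribe(client, pubkey_hex) do
    req = Jason.encode!(["REQ", "sub-1", %{"authors" => [pubkey_hex], "kinds" => [1]}])
    WebSockex.send_frame(client, {:text, req})
  end

  # Relays push back ["EVENT", <subscription_id>, <event>] frames.
  @impl true
  def handle_frame({:text, msg}, state) do
    case Jason.decode!(msg) do
      ["EVENT", _sub_id, event] -> IO.inspect(event["content"], label: "note")
      other -> IO.inspect(other, label: "relay message")
    end

    {:ok, state}
  end
end
```

Publishing events additionally requires hashing and Schnorr-signing them over secp256k1, which is where an existing crypto library (or a small NIF) would come in.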

SimpleX Chat

  • A private and encrypted messenger without any user IDs (not even random ones)
  • I strongly believe it to be the leader in the private messaging space.
  • You can already reliably use it to talk to non-tech-savvy friends and relatives.
  • In 2024 it got around $1 million investment, led by Jack Dorsey. Around the same time the cryptographic design got independently reviewed.
  • Broadly, instead of user IDs, the fundamental building blocks are anonymous pairwise connections between contacts. Messages then travel through a mix network in some ways similar to the Tor network. Metadata exposure is minimized.
  • It’s built in Haskell. :jack_o_lantern:
  • I imagine integrating the protocol in Elixir applications would work well.

I hope this was on-topic enough :slight_smile:

3 Likes

I believe the future of a free internet lies with personal web applications. A personal web application is self-hosted, often self-developed, has few power users other than the owner, yet still provides public access. Elixir is easier to get started with (Phoenix generators), easier for reusing other people’s code (hex.pm), and easier to host (self-contained releases, fly.io); it is a natural fit for developing your personal web applications.
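
For the “easier to host” part, a self-contained release is just a little configuration in mix.exs. A minimal sketch with an illustrative app name:

```elixir
# mix.exs - after this, `MIX_ENV=prod mix release` builds a self-contained
# tarball (BEAM runtime included) that can be copied to a cheap VPS or
# deployed to fly.io.
defmodule MyPersonalApp.MixProject do
  use Mix.Project

  def project do
    [
      app: :my_personal_app,
      version: "0.1.0",
      elixir: "~> 1.17",
      releases: [
        my_personal_app: [
          include_executables_for: [:unix],
          applications: [runtime_tools: :permanent]
        ]
      ]
    ]
  end

  def application do
    [extra_applications: [:logger]]
  end
end
```

Running `_build/prod/rel/my_personal_app/bin/my_personal_app start` on the server is then pretty much all the deployment a personal app needs.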

4 Likes

At that point, the question becomes why you need a web application in the first place.

There are only a few use cases that make web applications a necessity: collaboration, social sharing, publication. Everything else could be better solved by desktop or mobile applications with local data storage, plus an optional (and generic) data sync service (cloud storage) to make the data available on multiple devices.

1 Like

Ideally yes, but the sad state of cross-platform desktop development and the crazy gatekeeping of the mobile platforms make them less viable.

Which is exactly the thing I want to avoid. Less dependency is my goal.

2 Likes

Right, less dependency is good. If the main storage location is offline on your machine, you can opt for a self-hosted storage service, or just use a NAS or external storage; you might reach for Syncthing, or rclone, or none of that. My point is, not everything needs to be a web application, and I don’t want to host a fleet of self-hosted services. Your point about cross-platform desktop development and closed-off mobile ecosystems is very valid, of course.

2 Likes

I agree it is a bit of a hammer-looking-for-a-nail situation, and self-hosting is a commitment not to be taken lightly. However, I don’t see any alternative:

  • I want to avoid the big five
  • I got burned by smaller SaaS providers pivoting or going out of business several times

Elixir is the best language for self-hosted web applications, so it is a good hammer for me.

3 Likes

This is an exciting topic I meant to talk about in my original post!

inkandswitch.com/keyhive

  • Keyhive is a project exploring local-first access control. It aims to provide a firm basis for secure collaboration, similar to the guarantees of private chat but for any local-first application.

I highly recommend that anyone interested read the whole linked article; I can’t really do it justice. Here are some pretty pictures from the notebook :slight_smile:

3 Likes

You can’t really fix something with tech that most people don’t see as broken. There is a reason people flock to the biggest sites, and it’s not because the technology for self-hosting smaller sites and forums doesn’t exist.

When I was younger I modded a lot of forums, and there was a pattern you’d see on most of them: at some point a group of people would get annoyed at the modding or some other rules and declare that they were leaving to start their own forum with better rules. They’d post a bunch about it, then disappear for a few days to chat between themselves in their new forum. No one would follow, though, because of switching costs and because the conversations and people were already where they were, and so after a week or two the people that left would inevitably come back.

We’re seeing the same thing now with the push for web3 and distributed web apps. The % of people that care about this stuff is a rounding error, so their sites are dead, and thus when people do decide to go have a look they see a wasteland and bounce quickly.

5 Likes

That’s absolutely correct, save for several exceptions. Calendars work exactly the way the self-hosting people propose. I host my calendar, G hosts yours, and yet they are both fully functional, because of a handy interchange format. Oh, wait, even mail works that way.

The issue, in my humble opinion, is not that people are stuck on their Facebooks, but that the interchange is broken (intentionally by the bigcorps, but also because self-hosters are not always smart enough to understand what you’ve just said). The Fediverse tries to fix it. To some extent it has even succeeded, and one can reach Bluesky fellows from self-hosted Mastodon instances.

But in general, unfortunately, nobody cares. Can I connect my blog (supporting Fediverse shenanigans) to this forum? Nah. Why? Because nobody gave it a shot.

That said, I think we need working interchange before a self-hosted world. Then the migration of some of us happens, and there might (or might not) be a snowball effect.

4 Likes

I mean, even in your best-case situations, where there is a legacy of interchange and generally followed protocols, almost every user in the world chooses to use one of 3 providers.

Mastodon tried to take advantage of the Elon stuff and failed, because no one wants a distributed system like that. Lots of people tried it and went back to Twitter.

Bluesky is only doing better because, even with interoperability (is it on the fediverse? I don’t actually use any of those sites to know), most people believe it to be just a single site/source, and if you mentioned the fediverse they’d probably see it as a negative, as they did with Mastodon.

It’s a social issue, not a technical one, and the only ones who want it are generally those already chatting on smaller forums.

1 Like

I think it ① didn’t fail and ② not because people do not want a distributed system (many have never heard of the term anyway). If Twitter had allowed interchange, everyone who migrated would have stayed.

Mastodon didn’t conquer Twitter not because of any tech, but because it was ten years behind.

The actual social issue is that people like crowded places. If it came to the fediverse, people would tolerate distribution, especially taking into account that 99% of them have zero clue about such a thing. After all, GitLab took a bite of the pie, despite suffering from the very same disease.

I think you miss the point in general. Self-hosting people do not talk about isolation. On the contrary, they talk about erasing the boundaries between web presences (which is what the fediverse is all about), letting regular people use whatever they want without even knowing whether it lives in Mountain View or under their own table.

Right now a self-hoster is a pariah; it’s like wearing a velvet fedora on a beach. If we were transparently welcomed to the “web,” and if joining were as easy as installing Elixir with asdf on the local computer, many would choose self-hosted.

1 Like

It didn’t fail because people specifically don’t want distributed systems, that’s true; the reality is that, UX-wise, distributed systems are just worse.

GitLab took a bite of the pie through the enterprise; the number of open-source projects hosted on GitLab is still tiny, even if it has a large foothold in private repo management.

I think you severely overestimate how many people are even slightly interested in that. Usenet was massive and distributed, but also kind of awful because of it, so people moved to forums, which used to be huge - I used to follow dozens. Then, with the rise of Digg, Reddit and Facebook, every one of those communities chose to migrate to centralised spaces.

Distributed is basically always worse because of the coordination overhead. It’s why cryptocurrencies still suck to use for day-to-day stuff 16 years after Bitcoin launched / 10 years after Eth. It’s why every web3 solution is “this normal thing, but worse, but that’s OK because it’s distributed - even though we do actually have central coordination servers, because if it were properly distributed it would be unusable”.

1 Like

Citation needed. In my experience it’s the opposite, save for the aforementioned interchange.

In what? Nobody is interested in “distributed” or any other tech. Everybody is interested in actually owning their content. English is not my mother tongue, but I delude myself that I can express thoughts in it without additional supervision. If I mean “𝘥𝘰 𝘯𝘰𝘵 𝘨𝘪𝘷𝘦 𝘢 𝘴𝘩𝘪𝘵” I want it to stay that way, without third-party censorship shyly introducing asterisks when I type it without fancy artifice involving UTF-8. FWIW, this forum does exactly that.

This is valid for real-time stuff only. You know what? Nothing must be real-time in the regular user’s web experience, except weather forecasts maybe.

1 Like

Citation needed. In my experience it’s the opposite, save for the aforementioned interchange.

Can you name any distributed system that has a better UX than a centralised competitor, or that doesn’t depend on a centralised system to make it usable? Even email would be useless without the centralised DNS system.

Everybody is interested in actually owning their content.

They aren’t though, or they wouldn’t move to centralised systems. People care about others seeing and reacting to their content, not owning it.

This is valid for real-time stuff only. You know what? Nothing must be real-time in the regular user’s web experience, except weather forecasts maybe.

Even in non-realtime stuff, the overhead means you have to spend more time accounting for the coordination issues and less on improving or adding case-specific functionality, which makes them worse.

1 Like

Would you mind not thinking in black and white? I never suggested ditching centralization and/or centralized systems completely.

DNS must be global by definition. I understand that. I am not an RMS-like freak. Mails themselves, though, might be held on my HDD without any issue. Have you ever been blocked by your mail provider? I have. Yeah, with all those PDF attachments and whatnot buried six feet under their datacenter.

Tertium non datur? You don’t want to work from a yacht sailing the Caribbean, because if you wanted to, you would? That’s not a syllogism, though. The pros of the centralized web now outweigh the cons of not owning the content. That does not mean people are not angry with centralization.

Really? Have you ever tried the UX of basically any centralized web product created in the last decade? Google Search Console? AWS? Facebook? It’s not about better/worse. It’s all about addiction and lack of choice.

Sure. Mastodon (both bare web and mobile.)

1 Like

I believe Elixir still needs more abstractions, even in its current state. For example, we often hear that Elixir is great for scaling, but a newcomer to the language can easily get lost in the multitude of tasks and concepts.

One successful example of high-level abstraction and reducing complexity could be the Broadway library. In my opinion, it might be a better starting point before moving on to more complex roles.
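
As an illustration, a Broadway pipeline hides producers, back-pressure, batching, and fault handling behind a couple of callbacks. A minimal sketch with a hypothetical module name, using Broadway’s built-in DummyProducer:

```elixir
defmodule MyApp.IngestPipeline do
  use Broadway

  alias Broadway.Message

  def start_link(_opts) do
    Broadway.start_link(__MODULE__,
      name: __MODULE__,
      # DummyProducer is Broadway's test producer; a real pipeline would
      # plug in an SQS, RabbitMQ, or Kafka producer here instead.
      producer: [module: {Broadway.DummyProducer, []}, concurrency: 1],
      processors: [default: [concurrency: 10]]
    )
  end

  @impl true
  def handle_message(_processor, %Message{} = message, _context) do
    # Each message is handled in its own process; crashes are caught and
    # the message is marked as failed by Broadway, not by hand-written code.
    Message.update_data(message, &String.upcase/1)
  end
end
```

In tests, `Broadway.test_message(MyApp.IngestPipeline, "hello")` pushes data through the pipeline, which keeps the learning surface small compared to wiring up GenStage by hand.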

1 Like

You don’t want to work from a yacht sailing the Caribbean, because if you wanted to, you would?

It would be like if I were working from the yacht and decided to go buy a house and move there instead, because I wanted to be around other people and not just a handful of other people on their boats. Which (sidenote), as someone who seriously considered buying a yacht and living in the Caribbean about 15 years back, is actually a pretty common result after a year, which is why there are tons of boats for sale around the area rotting away.

Sure. Mastodon (both bare web and mobile.)

You literally said a few posts ago that Mastodon lost because of how far behind it was. The UX for Mastodon is already worse because you first have to decide on a server, so you have decision cost from the start; then you have to hope the server you chose is connected to everyone else you’re interested in, or run your own to control that, which is definitely not a better experience.

You didn’t get the analogy, which was built on the very same reasoning as your “They aren’t though, or they wouldn’t move to centralised systems.” My bad. Analogies are hard; I should not have used one.

I never said it lost on UX, though. That’s the point you try hard to ignore. UX is nothing. Nobody cares about UX. I bet 100 to 1 your bank’s web app provides a UX worse than one could ever imagine, and yet you are still using it. I do, at least.

For the value that a service provides, users will choke down its UX.