Portability: Rust vs Go Comparison


Isn’t Rust significantly more portable than Go? That’s what I thought, anyway. :face_with_raised_eyebrow:


Not very educated on Rust, sorry. But can it cross-compile to every system it supports, like Go can? And does it support more systems than Go? Last time I checked it didn’t, but that was a while ago.


Right now Go’s supported OS/architecture pairs are:


  • android arm
  • darwin 386
  • darwin amd64
  • darwin arm
  • darwin arm64
  • dragonfly amd64
  • freebsd 386
  • freebsd amd64
  • freebsd arm
  • linux 386
  • linux amd64
  • linux arm
  • linux arm64
  • linux ppc64
  • linux ppc64le
  • linux mips
  • linux mipsle
  • linux mips64
  • linux mips64le
  • linux s390x
  • netbsd 386
  • netbsd amd64
  • netbsd arm
  • openbsd 386
  • openbsd amd64
  • openbsd arm
  • plan9 386
  • plan9 amd64
  • solaris amd64
  • windows 386
  • windows amd64

Rust’s list of supported platforms is too long to reproduce here; see it in the docs instead:

Rust has a few tiers of support. In essence, tier 1 platforms have active tests on every single commit as well as full compiler support (i.e. the compiler itself can run on them too). Tier 2 platforms have full support, but automated tests aren’t always run on every commit; they are run on releases and release candidates and so forth, and the compiler/cargo will run on some of these platforms, but not all. Just tier 1 and tier 2 together are significantly larger than Go’s list of supported platforms for its output code alone (and Go’s compiler won’t run on everything in that list either). Then you start getting into tier 2.5 and tier 3, which are not built or tested automatically but are available; generally these are very niche systems like Windows XP and older, MIPS variants, Cortex chips, and so forth. Rust can even compile to GPUs and WebAssembly natively as part of tier 2.

Go, by its very design, will never be able to run on the amount of hardware that Rust can. Its design and GC mean the language is not powerful enough to do so without imposing limitations that would break just about every library in the ecosystem (like getting rid of the GC or dynamic memory allocation when compiling to embedded systems). Rust, in contrast, is designed with a tiered standard library, so you can keep some guarantees even on highly restricted microcontrollers, or you can rip the standard library out completely if you wish and compile a binary that is mere kilobytes in size.

And yes, Rust can cross-compile with ease; it’s generally as simple as cargo build --target=blah-bleep-blorp. Linking goes through a platform linker, so sometimes you need a special one, e.g. for Windows when you want to bind to MSVC libraries instead of the open ones (although the cross tool automates most of that with Docker images). In general, though, you can just use rustup/cargo to grab the standard library for the platform you want and compile with it. And then, of course, you have the horror that is Apple: you ‘can’ cross-compile to Apple systems, but it’s a little (well, more than a little) slice of horror, as cross-compiling to Apple products always is. ^.^;

I’ve PR’d to a few Golang projects so far, but I just can’t get past how wordy, verbose, and unreadable the language is, so I haven’t really put in the effort to learn it extensively. Rust has its rough edges in syntax, but it is fairly comprehensive and significantly more readable (until you start getting into some hairy lifetime declarations, but those are rare, and they’ve saved my butt so many times, unlike Go, which lets you just interface{}-pass everything all horribly).


Sure, I kind of agree, but these tiers of support make things more complex than they should be, and they don’t inspire much confidence, in me at least. If you tell me “it can compile on platform X” then I don’t expect any exceptions to that. Looking at C and C++'s histories clearly shows that when you start making such fragmented guarantees, things spiral out of control. You end up with the committee throwing their hands in the air and having 50+ places in the docs saying “undefined behaviour, sorry”.

However, Go might not guarantee everything on every platform either (I haven’t checked in detail).

In my eyes that’s a good thing and not a weakness. Managing complexity and fragmented promises is a constant mental uphill battle, and many people’s experiences in IT – mine included – demonstrate that somebody inevitably makes a mistake that a ton of other people are then stuck with almost forever. Why not stick to opinionated languages like Elixir and Go that remove a huge chunk of this mental overhead right from the start? That’s a clear win. We’re not machines, we’re humans, and mistakes are a given. The less possible error surface the better! (This also goes in favor of static typing, of course – which Go, strictly speaking, doesn’t have.) “Less possibility for mistakes” should be a universal “shall we try this tech?” criterion IMO.

That being said, I am glad Rust aims to displace C from microcontrollers. It’s about damn time it got competition there. I would be even more glad if languages like Go and OCaml can get so hyper-optimized that they can work there as well!

Another thing: I am not well-informed, but I’ve heard from people that Rust is already suffering from C++'s syndrome of “here, have 20 ways to achieve the same thing”, and to me that’s a total deal breaker. I understand the language has to be powerful in order to compile to many platforms, but again, I am never going back to a language where you have 50 implementations of something as basic as linked lists, stacks, and queues, before even getting to maps or radix trees.

I understand mine is not a technical argument per se; I’m just sharing what I find jarring about the language. I was in C and C++ for just a few short years about 17 years ago, and I am never, ever going back to anything that even remotely has their set of problems.

Golang is verbose, no arguing about it. But with proper editor/IDE snippet support it becomes alright to work with. Linting and vetting are another thing, though; I can’t argue with the observation that OCaml, for example, seems a bit easier to grok and give a cursory eyeball check for potential problems (Elixir as well, IMO).

As for safety, yep – Rust and OCaml seem like the clear winners over Go.

Just because the compiler itself may not run on a platform doesn’t mean the programs it cross-compiles for that platform don’t work with full support; quite often it’s just that the platform lacks primitives a compiler needs, like a filesystem or dynamic memory. :slight_smile:

If something is in tier 1 or tier 2, you can be assured it will work, or it is a major bug; and to that end, those are all quite well-used platforms. In the absolute worst case you can compile Rust to bare machine code without using any stdlib, and as long as LLVM can compile to that architecture, Rust likely can too. Rust does not have undefined behaviour in safe code, hence why, if you stick to safe code, you can be assured the program will work as you programmed it if it compiles; anything else is a major bug in the compiler itself and should be reported.

Heh, that’s the thing about it: using a GC was a major mistake and prevents Go from being used in a lot of areas, but it’s too far in to change now; hence the issue. ^.^

Ah, but this is a great example of why both Elixir and Go are not very opinionated, or rather are opinionated in the wrong ways. Especially with Go, the ‘opinionated’ nature of it has to do with design and nothing to do with safety or maintainability, and this is why every project I’ve PR’d to so far has interface{}-explosions everywhere, with crashes half the time (hence the PRs). Even Kubernetes is a prime example of Go being such a limited language that they actually built a new typing system on top of it just to make it manageable.

In comparison, Rust is extremely opinionated, but its opinions exist to force you to write code that actually works. It doesn’t care much about your design or what you want to write, only that what you write won’t crash because of poor coding; a crash is only possible through explicit, not implicit, decisions by the programmer. Go’s error surface is utterly ginormous compared to Rust’s. Elixir’s is as well, though that’s primarily owed to its dynamic typing, and the OTP helps buffer that a lot.

Eh, both Go and OCaml have GCs, so they will never be suitable for microcontrollers regardless of the optimizations they have. At best, if you could pre-allocate all needed memory beforehand as a static chunk and keep everything tightly constrained to it, that would help, but you are still losing efficiency on memory handling, and it will eat extra memory overall. OCaml is significantly safer than Go; programs you write in it are like Rust in that you can be pretty sure they are stable (Rust is more stable, though, I have to admit), but the GC adds significant overhead that cannot be ignored at small scales.

If you have a language that can actually choose, in detail, the machine code it outputs, then you have a language that can do absolutely everything in some form. That doesn’t mean choice is bad, and the Rust team has been very good at choosing the best-working libraries of a given ecosystem to bring into the stdlib. But if Rust got a built-in, say, good transaction-based memory synchronization model across threads, would you really want that to mean the Actor model (Elixir’s model) should be disallowed? (As an aside, Rust has a fantastic actor library called Actix.) Choice is good.

I do agree, though, that there should generally be one obvious choice for a given purpose. In Rust that was fuzzier in the past due to how new it was, but most areas are very well settled now. Even just recently, a multitude of Rust game engines have been coalescing around the gfx-hal library for the rendering backend, and it has thus become The Standard for the GPU interface layer. (gfx-hal is really cool, actually: it supports Metal, Vulkan, DX12, and OpenGL (mostly; WebGL still needs more work), and it exposes a Vulkan-like API that works with them all very efficiently. It’s kind of like bgfx in the C++ world, but far better made and very widely used now.) :slight_smile:

But nah, the standard library has all the usual containers, very well optimized. Of course there are use-case-specific optimized versions as libraries, like statically sized vectors and so forth, but in general the stdlib has what you need.

C++ is a… special case. Its standard library did not have the advantage of looking at others at the time to know what was good design, which is why third-party libraries tend to do better than the standard one in almost every way. Rust, like Go, does not have that issue.

Proper editor/IDE support does not make a four-deep nested loop full of interface{} ‘cast-and-pray’-ing any cleaner, though (yes, that was one of the PRs I had to fix up, because that code had a bug :fearful:). ^.^;


@AstonJ You could split my and @OvermindDL1’s comments here into something probably called “Language portability: Go vs. Rust vs. others” or something.


I agree with everything you said. I have some personal preferences to keep sharing with you though.

I am not disputing Rust’s benefits. I am just saying that if it’s not your 1st language then it’s probably gonna be a struggle to get used to that model. I admit I am looking at that solely from my point of view, as a programmer who is no longer very keen on learning several new languages or frameworks every year. So if anything has better tech-marketing (if that term makes sense) I’m much more likely to pick it up than if it has a lot of if-s and but-s in terms of where and how it’s supported. That’s all I meant; I am not disputing the merits, which seem quite obvious and good.

Eh, that’s assuming Golang’s creators even entertained the idea of ever competing with C in the microcontrollers space. It’s a fair choice if they decided they didn’t want to bother.

Sure. I have to admit I am starting to itch for static typing, and that’s why I’ll pick up OCaml this year; it’s a done deal. (I’ve also done some reading to understand the inferred-types mechanic, and that to get highly optimized machine code you should try not to make your functions generic, but rather very type-specific, a notion I very much agree with.)

As for Golang + K8s, I’ve heard the story and it’s quite absurd. Additionally, K8s seems to try and emulate OTP just on multi-machine & multi-region level which is admirable but it leads to a lot of duplicated effort. Oh well, it’s their time they are using, right? :102:

See, that’s why I said I am not informed well. I looked at it a year and something ago and wasn’t satisfied with that aspect. I am very glad they are rectifying this! It is often a make-or-break sign for the language’s future. Imagine if we had 5 competing Ecto-like libraries; many people wouldn’t take Elixir itself seriously if that was the case. That was my main point here and it seems we’re in agreement, plus that the Rust community is trying to converge and gravitate towards agreed-upon-libraries for more things than before. This bodes very well for it!

That’s true of course. Sympathies that you had to go through that! :101:

That’s a really huge discussion, but overall I disagree: C++ was IMO borne out of the necessity to commercialize this whole computer programming thing, and to do it fast. It also accidentally clicked with the lowest common denominator of programmers back then and was thus well-positioned in terms of marketing – at the time, anyway. Its success is more a combination of accident plus corporate pushes than any technical merit or proficiency at all.

I don’t have hard proof for those claims, just impressions from my work with it and the comments of 45+ year old, very hardcore programmers who, 17 years ago, were saying over their coffee break: “if there was something FP-like, with fewer undefined-behaviour parts in the spec than C++, I’d take it right away, even if it meant I had to rewrite 2 million lines of code to use it”. And these were people who wrote libraries running on 20+ different embedded boards in an afternoon. These were people who couldn’t rely on if (ptr == NULL) because on certain boards 0x00000000 was a valid addressable memory address. Etc.

C++ didn’t have the benefit of 20+ years of experience with something before it that made as many mistakes as it did. That’s true. But I’d go on to claim that the C++ committee and community didn’t even want to admit they were making mistakes for the longest time. Most of the C++ forum discussions I’ve seen ended with “you are doing it wrong” or “get good, noob”. I suspect this is largely unchanged even today.

My point with this is – I really do hope Rust doesn’t end up the same way. With what you said you are giving me hope that things will be better this time around.


Cool, I like hearing that. :slight_smile:

Rust is an interesting combination of Haskell/OCaml typing, without the whole-program inference (the heck?! function signatures must be written out), and with added ownership typing (ooo, nice) to get rid of the GC, thus cementing it in a place to take on C/C++ itself. :slight_smile:

Unsure if they do, but I know some people have tried it: they had to stop deallocating memory entirely, had to use pools for everything, had to do so much just to make sure the GC never accidentally collected. It was not pretty… ^.^;

One really irritating thing about Go’s creators is that one phrase one of them said, which essentially boiled down to: newbie programmers are stupid and not capable, so they need something basic. At least that is how I and most people have read into it, and the language really seems to reflect that with its lack of capabilities… >.>

OCaml is a fantastic language to learn HM typing on, and it’s a great Python replacement (unless you need some specialized python-only library, but you’d be surprised what OCaml’s opam has). :slight_smile:

I find it sad because they said they were initially pulled in by Go’s promises but those promises hadn’t been fulfilled and they ended up having to essentially rewrite huge swaths of it to get it to work…

I actually very much like their whole RFC process; Elixir could really use something like that. For every newly proposed feature, different alternatives are looked at, discussed, and debated to see if it is generically useful, or at least can be made so with as easy an interface as they can manage. It’s nice to just read through those RFCs. :slight_smile:

I still find it horrifying that Go’s own standard library returns interface{} types so often… >.>
That’s like C/C++ returning void*s; well, Rust doesn’t even have such a horrifying equivalent without dropping into unsafe code… >.>
All for stuff that a variant/ADT would have solved with SO much less work… Blegh…

Mmm, I don’t really think C++ was borne out of necessity; C had already been around, and C++ was just a layer on top of it with more optional features. You could actually make decently safe programs in C++ compared to C, thanks to its greater typing abilities, though using them was still optional and a lot of people didn’t, thus not gaining the usefulness of it. Well-written, standard C++ is almost as safe as Rust, with similar patterns, while being even more succinct and readable than Rust; you just don’t have the compiler saving you (by yelling loudly at stupid stuff) in a lot of the areas where Rust does. ^.^;

That sounds like Rust now… ^.^

For note, though: if (ptr == NULL) is still entirely safe on a system where 0 is a valid memory location, because NULL gets defined to whatever the null-pointer sentinel value should be for the system being compiled for (which is often 1 on systems where 0 is valid, because 1 is not addressable on most of those systems for alignment reasons, though there are many possibilities depending on the architecture). :slight_smile:

The committee was a bit of a joke back in the day, although the past two decades have been fixing that far more quickly, and the last 10 years have been really active in deprecating things, which quite frankly amazes me. ^.^;

I’ve seen a variety of languages that could potentially compete with C/C++ but I never saw them actually having a chance. Rust is the first language I’ve ever seen that I actually do believe has a chance in offsetting C/C++ as King for a huge variety of reasons, though that is still a few-odd years out. :slight_smile:


Likewise, the test !ptr works on those systems, as I’m sure you know; just throwing it in for completeness, since multiple times I’ve seen people who didn’t know better argue that kind of test is not safe because NULL is not always 0.

As long as ptr is typed as a pointer it should; otherwise, good luck. ^.^

Personally I prefer using if (ptr == nullptr), as it causes a type failure if ptr is not a pointer (unlike == NULL), and it always does the right thing on each arch; though then again, I like using types very strongly, even in C++. :slight_smile:


well, yeah, nullptr is both awesome and decades overdue


As every Eastern European will tell you, democracy is hardly the most optimal managerial system. :021: I quite like how Elixir is managed.

RE: everything else, yep, completely agreed!

Well, in Rust the core team always has the final say; they just listen to everything that was discussed, tested, and debated, and thankfully they almost always give reasons for their final choice too (almost always: their recent website is, well, a horror, and the whole process around it was a horror…).

The most optimal managerial system would of course be a specialized advanced machine intellect. All hail our machine overlords! ^.^

Absolutely. That will be the day we stop pretending we don’t suck legendarily at management and ruling other people – and humanity will become happier as a result.


Yes, exactly! Singularitists and Trans-Humanists unite! Lol. ^.^

I’m all for saving on resources by just uploading all humans into a virtual world where they can do anything, and building up a Matrioshka brain around the sun for the computing power. Within the current limits of physics as we understand them, we could fully simulate a human brain and its I/O with about 12W of power (or, with expected heat losses, anywhere from 16-24W) at realtime speed, which is SO substantially less than what our bodies require. Load copies of yourself into fully automated spaceships to send around the cosmos, or just let the AI manage it and send all the data back for us to play with. Humanity could use more power to super-accelerate our time. We could become and do anything in our little virtualities, harnessing the power of stars to create fully distinct human civilizations (light-speed communication between stars, when we are operating at accelerated time, would be so monumentally slow compared to us that other such virtualities may as well not exist to one another). We could do and experience and think of anything we can imagine (and probably a lot more, given an external driving MI to manage it all).

Even if we didn’t bring humanity into virtualities, the human brain is still fully a classical machine, it has no quantum operating parts, thus no randomness, so given an accurate simulation and the same inputs and outputs you could still simulate your own brain a million times over with slight variations to determine what you want ahead of when you want it, and an MI could deliver whatever you want to you right when you want it, or even a hair of a millisecond before you want it ‘in the real world’.

With sufficient computing power (like a matrioshka brain, which we have the technology to start building right now) and a guiding general MI (it doesn’t even need to be self-aware, just needs to be able to process enough inputs with enough relationships to be able to determine pre-defined outputs) then humanity could live in the very definition of a paradise of their own creation.

If this subject continues then this will be yet another thread to be branched out. ^.^
