The problem with Rust
Its cargo cult is the fusion of all of computing’s worst nightmares. Let's break that down and see why it’s a problem.
Today should be an absolute treat. We’re going to go through piece-by-piece this rant by William Woodruff: Weird architectures weren’t supported to begin with.
Here’s the basic context: the Rust project has a management problem in that it cannot support as many machines as C does. The author is a member of Rust’s Cargo Cult – a term used to refer to Rust’s ideological adherents and propagators of the Truth About Memory Safety and So On.
While this rant exists seemingly alone, floating about the Web as an anecdote, it garnered much notoriety for unifying the emotional succour and ideological deference shared by many Cargo Cultists. It’s a great window into how most passionate Rust programmers think about the world of software.
There’s been a decent amount of drama/debate in the open source community about support recently, originating primarily from pyca/cryptography’s decision to use Rust for some ASN.1 parsing routines.
To summarize the situation: building the latest pyca/cryptography release from scratch now requires a Rust toolchain. The only current Rust toolchain is built on LLVM, which supports a (relatively) limited set of architectures. Rust further whittles this set down into support tiers, with some targets not receiving automated testing (tier 2) or official builds (tier 3).
By contrast, upstream GCC supports a somewhat larger set of architectures. But C, cancer that it is, finds its way onto every architecture with or without GCC (or LLVM’s) help, and thereby bootstraps everything else.
Program packagers and distributors (frequently separate from project maintainers themselves) are very used to C’s universal presence. They’re so used to it that they’ve built generic mechanisms for putting entire distributions onto new architectures with only a single assumption: the presence of a serviceable C compiler.
This is the heart of the conflict: Rust (and many other modern, safe languages) use LLVM for its relative simplicity, but LLVM does not support either native or cross-compilation to many less popular (read: niche) architectures. Package managers are increasingly finding that one of their oldest assumptions can be easily violated, and they’re not happy about that.
Up to this point, William is pretty fairly laying out the introductory context and facts of the matter. It’s a fair assessment, but then he says this:
But here’s the problem: it’s a bad assumption. The fact that it’s the default represents an unmitigated security, reliability, and reproducibility disaster.
It’s a bold claim for sure, and the emphasis in these block quotes certainly isn’t mine. If you’re a Cargo Cultist you already know exactly where he’s going with this.
Imagine, for a moment, that you’re a maintainer of a popular project.
Everything has gone right for you: you have happy users, an active development base, and maybe even corporate sponsors. You’ve also got a CI/CD pipeline that produces canonical releases of your project on tested architectures; you treat any issues with uses of those releases as a bug in the project itself, since you’ve taken responsibility for packaging it.
I’m going to point out that this is an asinine presumption to make, but whatever. It’s a silly hypothetical.
Because your project is popular, others also distribute it: Linux distributions, third-party package managers, and corporations seeking to deploy their own controlled builds. These others have slightly different needs and setups and, to varying degrees, will:
Build your project with slightly (or completely) different versions of dependencies
Build your project with slightly (or completely) different optimization flags and other potentially ABI-breaking options
Distribute your project with insecure or outright broken defaults
Disable important security features because other parts of their ecosystem haven’t caught up
Patch your project or its build to make it “work” (read: compile and not crash immediately) with completely new dependencies, compilers, toolchains, architectures, and environmental constraints
You don’t know about any of the above until the bug reports start rolling in: users will report bugs that have already been fixed, bugs that you explicitly document as caused by unsupported configurations, bugs that don’t make any sense whatsoever.
This sounds like a completely normal day in the seat of any popular project maintainer. Life sucks sometimes, but you carry on, y’know? I know he’s going for fire and brimstone here, but so far I’m unfazed. Maybe that’s just me.
You struggle to debug your users’ reports, since you don’t have access to the niche hardware, environments, or corporate systems that they’re running on. You slowly burn out under an unending deluge of already fixed bugs that never seem to make it to your users. Your user base is unhappy, and you start to wonder why you’re putting all this effort into project maintenance in the first place. Open source was supposed to be fun!
There’s a vital conflation going on here. Most people do not use this niche hardware, because obviously it is niche, yet most people somehow end up unhappy about this. I don’t know, are open source communities toxic like that? I haven’t kept up, maybe they are. That sounds pretty unreasonable though.
What’s the point of this spiel? It’s precisely what happened to pyca/cryptography: nobody asked them whether it was a good idea to try to run their code on HPPA, much less System/390; some packagers just went ahead and did it, and are frustrated that it no longer works. People just assumed that it would, because there is still a norm that everything flows from C, and that any host with a halfway-functional C compiler should have the entire open source ecosystem at its disposal.
Yeah, I can see why they’re frustrated. They were relying on a cryptography project that was written in C, and the project leads made a command decision, with apparently no real consultation with, or regard for, the greater reality of their users, to switch to Rust, because they believe it’s their prerogative as security-conscious programmers. That was a pretty foolish move from the project leads then, right?
Security-sensitive software, particularly software written in unsafe languages, is never secure in its own right.
The tantalising tautology is tautologically tautological. Snooze.
The security of a program is a function of its own design and testing, as well as the design, testing, and basic correctness of its underlying platform: everything from the userspace, to the kernel, to the compilers themselves.
This is a really big point here! You know what? I disagree! I don’t think correctness of code is the end-all-be-all. And I don’t think testing matters much ultimately either.
One of the core ideas about C is that managing the correctness of data matters way more than the correctness of the code. This idea rests on a simple conjecture: malformed data is far more confounding to the mind of a programmer than a mistyped program is. It messes with our ability to model our code far more when we’re forced to think about how it might handle some weird forms of data, or even malicious attempts to break the code by using malformed data. It’s hard on our brains. A lot harder than it is when we misuse a programming pattern, or misplace a loop counter somewhere in a function. Our eyes do most of the work at catching those. But the data? Beats me, man. Human mush go mush mush.
The whole idea of memory safety was botched in the research paper that birthed it, by the way. It conceives of memory regions not as mere data stores but as regions that have the ability to affect each other through interdependency, and claims to present some abstract noninterference principle. I don’t know every detail, but I read enough of the abstract to concur with Yarvin about this one: computer science research papers are bullshit.
That isn’t to say that any particular niche target is full of bugs; only to say that it’s a greater likelihood for niche targets in general. Nobody is regularly testing the mountain of userspace code that implicitly forms an operating contract with arbitrary programs on these platforms.
This might sound like a stupid question, but I think it’s an important one: if Rust is so inherently safe, what’s all this testing for? Is it not actually that safe at all without testing? That sounds hellishly precarious. Not to mention dishonest about the language.
As someone who likes C: this is all C’s fault. Really.
Oh Jesus, man. Let’s hear it.
There’s no standard way to write tests for C.
You need a standard for writing tests?
There’s no standard way to build C programs.
Invoking the compiler with arguments it specifies isn’t enough?
There’s no standard way to distribute C programs.
Sure there is! It’s called the World Wide Web. Ever heard of it?
All of these points lean heavily on the word “standard”. Ever wondered what that really means? Well, I have some bad news for you. Standards are unicorns. To quote Peter Welch, “there are more standards than there are things your computer can actually do.”
It’s also really disingenuous, because it’s saying all of the biggest problems C faces aren’t technical, but political. A lack of standardisation is a political problem! You’re not trying to create some kind of code that doesn’t exist, you’re trying to coordinate people to do something the same way! What on Earth were you expecting out of programmers in the 1980s? A government?
There’s no such thing as truly cross-platform C.
This one is worse than disingenuous, it’s actually wrong. Remember, in this rant of his, we’re defending the programming language whose adoption into projects is actively breaking real world machines and causing project managers to complain. We’re steelmanning the language that supports fewer platforms rather than more, not just as a matter of fact but also as a matter of principle. If the wording reminiscent of a certain Scotsman didn’t give off the scent right away about this one, there you go.
The C abstract machine […]
I’m going to ellipse here just to point out he’s not criticising the programming language itself here at all. He’s criticising an abstract machine specified by ISO, and pointing out that within these abstract confines you cannot do a lot of things that you would need to do.
It’s a really common way that Cargo Cultists strawman C and C++. Instead of talking about the implemented language that literally everyone in the real world uses, where all of these concerns are assuaged and then some, they instead attack the theoretical language that literally no one in the real world uses. Of course, they pit this theoretical language against their real language, which has no ISO standards track and is in active, constant development, and is therefore a huge moving target. I don’t think I could be more disingenuous and uncharitable to C if I wanted to be.
By contemporary programming language standards, these are conspicuous gaps in functionality: we’ve long since learned to bake testing, building, distribution, and sound abstract machine semantics into the standard tooling for languages (and language design itself).
I missed the part where these things became established fact in the field. Sure, they’re typical and normal to do in the industry, but it’s a really dangerous thing to grasp a norm and turn it into a dogma. We shouldn’t be doing that.
I’m going to be kind to both of us here and skip over most of his purported solution. As far as I know it’s really benign, uncontroversial tidbits about proliferating standards and telling everyone else how to do their jobs. This guy is clearly a bureaucrat.
There is one edict he issues from his podium that I cannot stand, however: “Give up on weird ISAs and platforms.” This is not only wrong, it’s horrifyingly destructive to the creative fabric of software engineering and I’m going to explain why. (Emphasis, again, not mine.)
I put this one last because it’s flippant, but it’s maybe the most important one: outside of hobbyists playing with weird architectures for fun (and accepting the overwhelming likelihood that most projects won’t immediately work for them), open source groups should not be unconditionally supporting the ecosystem for a large corporation’s hardware and/or platforms.
This view of weird ISAs is absolutely infantilising. And since he is, to reference Yarvin’s post linked earlier, a bureaucrat, the political value to people like him is obvious. He carries on with handwaved sympathies to project leaders and package maintainers, not as fellow programmers, but as fellow bureaucrats, who have a shared goal of easier pencil pushing methods for tomorrow.
Let me ask, then: do you know about the Epiphany-V? It’s a 1024-core RISC processor developed by Adapteva with funding from DARPA. Guess what? It outperformed its contemporary Nvidia Pascal architecture in energy efficiency by a factor of 2:1. Considering Nvidia boasted so publicly and spent several billion dollars touting Pascal’s energy efficiency, I think everyone could agree that’s a pretty huge deal, right? Okay, so this isn’t just fun hobby stuff, it’s also serious business. Guess what else? Rust won’t be coming to it any time soon. It’s a weird ISA, which people like William want to cast away for the political pain they cause people like him. But beyond that, it’s a genuinely challenging architecture to program at all. It’s performant as hell, but it lacks virtual memory, among other things. I guess the best things really do come at a price after all. No free lunches in theoretical computer science.
It is disturbing to me that a programming language has spawned this culture of sorts that treat engineering issues as political problems. It’s not surprising, though, not to me at least. The last few years have proven quite dry in terms of innovation, meanwhile the university continues its onward death march of impact-making and grant-proposing completely unabated. It seems logical to think such forces coalesce and reinforce their inherently political nature, slowly gaining more and more self-awareness about what their true purpose is.
As a language, Rust isn’t all or even mostly bad. Its core selling feature may be erroneous, and it does indeed remain totally unproven in the wild, but it has some other nice stuff like a concise async syntax, I’m told. There are lessons yet to be learned, as always.
But the problem here is beyond the language. Rust is the language of choice for an overtly political Cargo Cult that has no head and views itself as sovereign over the whole landscape of systems programming. As far as they’re concerned, they always arrive at a fair and equitable academic consensus about the Truth of any technical matter, and anyone who dissents without going through their sanctioned channels of “discussion” is an ignoramus at best and a dangerous heretic at worst.
This academic carbon copy of the American national security apparatus is as horrifying as it is dangerous to the well-being of computer science, which wasn’t doing very well since the 1960s to begin with. Any programmer with a genuine sense of compassion for their craft has a duty to disassociate from these people as much as possible, for the sake of what they do. These shepherds will presume you to be of their flock unless you make it abundantly clear that you are not.