Rust's Safety Guarantee Is Now a Political Argument, and It's Working
Adhithya (Adhi) Ravishankar - 12 Apr, 2026
The most consequential thing that happened to Rust in 2025 had nothing to do with the language itself. It happened in Washington, in Brussels, and in the security teams of companies whose core business has nothing to do with programming languages.
Memory safety vulnerabilities — buffer overflows, use-after-free bugs, null pointer dereferences — have been responsible for roughly 70% of critical security vulnerabilities in major software projects for years. Microsoft has reported this about Windows. Google has reported it about Chrome and Android. The NSA has published guidance on it. This has been known for a long time. What changed in 2025 is that government agencies and regulators started treating it as a policy problem with a technical solution, and the technical solution they named was memory-safe languages.
The Regulatory Push
The US Cybersecurity and Infrastructure Security Agency (CISA) released a roadmap for memory-safe software that went further than previous guidance. Rather than recommending that organizations “consider” memory-safe languages, it set timelines and called out specific sectors — critical infrastructure, federal contractors, healthcare IT — where migration away from memory-unsafe languages should be treated as a priority, not a long-term aspiration.
The EU’s Cyber Resilience Act, which came into force and began affecting product compliance requirements, embedded software security standards that made memory safety a documentable and auditable requirement for certain classes of products. Hardware vendors, automotive suppliers, and industrial equipment manufacturers selling into European markets found themselves needing to either justify the use of C/C++ with extensive tooling and process controls, or explain their migration path.
None of this mandated Rust specifically. But Rust is currently the only language that combines systems-programming performance with the memory safety guarantees these frameworks are asking for. Go is memory-safe, but its garbage collector rules out most of the bare-metal and embedded use cases. C++ with hardened profiles and sanitizers can partially address the issue, but it requires extensive tooling investment and does not provide the same compile-time guarantees.
What Adoption Actually Looks Like
The Rust adoption story of 2025 was less about new greenfield projects and more about the seams where Rust started appearing inside existing systems.
In the Linux kernel, the trajectory shifted from experimental to structural. The initial Rust support in 6.1 was celebrated but limited. Through 2024 and into 2025, the pace of Rust additions to the kernel increased — device drivers, filesystem components, and networking code — and the kernel mailing list’s tone shifted from debating whether Rust belonged to debating how to grow the pool of Rust kernel contributors. Linus Torvalds, who was skeptical early, has been publicly supportive of the direction.
In browsers, Mozilla's use of Rust in Firefox components predates the current moment, but Chrome's adoption of Rust for new security-sensitive code became substantial. The browser security argument is overwhelming: sandbox escapes and renderer vulnerabilities are disproportionately memory safety bugs, and both major browser engines have now committed real resources to writing new attack-surface code in Rust.
In cloud infrastructure, AWS, Google Cloud, and Azure all have production Rust code at various layers of their stacks. AWS’s Firecracker microVM — the technology underlying Lambda and Fargate — has been written in Rust since its introduction and has become a reference example for what safe systems code looks like in production at scale.
In embedded and automotive, this is where the regulatory pressure was most directly felt. The Rust Embedded Working Group formalized support for a wider range of microcontroller targets, and automotive suppliers working toward ISO 26262 compliance started evaluating Rust’s safety profile seriously. The toolchain for safety-critical Rust work is still maturing, but the trajectory is clear.
The Borrow Checker Is the Point
There’s a recurring debate in software communities about whether Rust’s compile-time safety guarantees are worth the learning curve. This debate often frames the borrow checker as an obstacle — something you fight through to get to writing real code.
The security argument reframes this entirely. The borrow checker is not an obstacle. It is a compile-time check that rules out the most common class of critical security vulnerabilities (use-after-free, double-free, data races in safe code), with zero runtime overhead. You are not fighting the compiler. The compiler is preventing you from writing the category of bugs that accounts for roughly 70% of your CVE history.
Once this framing settles in, the learning curve doesn’t feel different — it’s still steep — but the motivation to climb it changes. You’re not learning an idiosyncratic type system for its own sake. You’re adopting the only mainstream language that can prove, before your code ships, that a specific class of catastrophic bugs isn’t in it.
The Talent Gap Is Real and Getting Expensive
The main constraint on Rust adoption in 2025 was not technical or organizational — it was people. There are not enough experienced Rust engineers, and the supply is not growing as fast as demand.
Senior Rust engineers — people who’ve shipped production Rust, understand the async ecosystem, have opinions about which async runtime to use and why — are expensive and hard to find. Teams that want to adopt Rust are frequently slowed not by the language itself but by the cost of building internal expertise from scratch.
This is being addressed at the supply side: university CS programs started incorporating Rust into systems programming courses, and corporate training programs for Rust have proliferated. But there’s a multi-year lag between training investment and production-ready engineers, and in the meantime the demand side is moving fast.
If you’re a C or C++ systems programmer and you haven’t learned Rust yet, your situation is analogous to a Java programmer in 2010 who hadn’t looked at the JVM alternatives yet. The window where that’s a comfortable position is closing.
The Long Game
Rust is not going to replace C and C++ in existing codebases in the near term. The installed base of C and C++ is measured in billions of lines, and rewriting working code for its own sake is not how engineering organizations allocate resources.
What Rust is doing — and what 2025 accelerated — is becoming the default choice for new systems code. When a team needs to write something new that would previously have been written in C++, Rust is now the answer they have to argue against rather than argue for. That inversion matters enormously for where the language is in ten years.
The safety argument won. The political argument is now carrying it forward into domains where technical merit alone would have moved things slowly. The combination of those two things is why 2025 was Rust’s inflection point.