Every programming language is, at its core, a written argument. An argument that the languages which came before failed at something important — something worth spending years of your life to fix.
Sometimes the failure is performance. Sometimes it’s safety. Sometimes it’s the sheer weight of complexity that accumulates in a language over decades. And sometimes a language is born because the world has changed — new hardware, new workloads, new problems — and the old tools just don’t fit anymore.
The period from 2009 to 2024 produced an unusually dense cluster of new languages, many of which are now shaping mainstream software development. Here’s what each one was arguing against, and whether their argument held up.
Go (2009) — The Argument Against Build Time
The problem: C++ was eating Google alive.
In 2007, Robert Griesemer, Rob Pike, and Ken Thompson were sitting through a 45-minute C++ compilation at Google. By most accounts, that wait — not any single design flaw, but the accumulated friction of working with a large C++ codebase at scale — is what finally prompted them to start sketching a new language on a whiteboard.
Go was designed around a specific set of frustrations: C++ build times were unbearable, dependency management was a mess, and concurrent programming required expertise most developers didn’t have. Go’s response was radical simplicity. No inheritance. No generics (initially). A built-in concurrency model — goroutines and channels — based on Tony Hoare’s CSP formalism from 1978.
Goroutines start with just 2KB of stack (compared to ~1MB for a Java or OS thread), which means you can run tens of thousands of them on a single machine. The scheduler multiplexes them onto OS threads automatically.
Go reached 1.0 in March 2012 and has since become the dominant language for cloud infrastructure — Docker, Kubernetes, Terraform, and most of the tooling that runs the modern internet are written in Go.
What it got right: Build speed, operational simplicity, concurrency at scale. What it left behind: Generics (added in Go 1.18, 2022, a decade after launch), more expressive type systems.
Kotlin (2011) — The Argument Against Verbosity
The problem: Java required too much ceremony for too little result.
JetBrains — the company that makes IntelliJ IDEA, the IDE used by most Java developers — had a unique vantage point. They saw every pain point Java developers encountered, every boilerplate pattern typed thousands of times, every NullPointerException filed. They also depended on the JVM ecosystem too heavily to abandon it.
The first public announcement of Kotlin came at the JVM Language Summit on July 19, 2011. The first git commit was November 8, 2010. The design goals were precise: statically typed like Java, 100% interoperable with existing Java code, but with null safety built into the type system from day one, data classes in a single line, extension functions, and a concise syntax that didn’t require semicolons or public static void main.
Kotlin 1.0 shipped in February 2016. In May 2017, Google announced Kotlin as an officially supported language for Android development. By 2019, Google declared it the preferred language for Android. Today over 95% of the top 1,000 Android apps use Kotlin.
What it got right: Smooth migration path from Java, null safety, conciseness, multiplatform ambitions.
What the argument was: You shouldn’t need Optional<T> boilerplate and NPE guarding in every method — the compiler should handle it.
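A minimal sketch of that argument (names are illustrative, not from any real codebase): nullable and non-nullable are distinct types, so the compiler rejects unguarded access, and a data class replaces dozens of lines of Java boilerplate.

```kotlin
// One line generates equals, hashCode, toString, and copy.
// String? (nullable) is a different type from String (non-null).
data class User(val name: String, val email: String?)

fun emailDomain(user: User): String {
    // user.email.substringAfter("@")   // compile error: email may be null
    return user.email?.substringAfter("@") // safe call: null short-circuits...
        ?: "no email"                      // ...and the Elvis operator supplies a default
}

fun main() {
    println(emailDomain(User("Ada", "ada@example.com"))) // example.com
    println(emailDomain(User("Bob", null)))              // no email
}
```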
Elixir (2011) — The Argument Against Ruby’s Concurrency
The problem: Ruby was beautiful but couldn’t scale.
José Valim was a core contributor to Ruby on Rails — one of the most productive web frameworks ever built — and he loved the language. But Ruby’s Global VM Lock (GVL) meant that true parallelism was essentially impossible. One thread ran at a time. For the kind of real-time, concurrent applications the web was increasingly demanding, Ruby ran out of road.
Valim’s insight was to look at Erlang. Erlang had been solving massively concurrent, fault-tolerant distributed systems problems since 1986 at Ericsson — it powered telecom switches with nine-nines (99.9999999%) availability. The BEAM virtual machine, which Erlang ran on, was exceptional. But Erlang’s syntax was notoriously unfriendly.
Elixir runs on the BEAM VM and compiles to the same bytecode as Erlang. Every Elixir process is isolated with its own heap — a crash in one process cannot corrupt another. The garbage collector runs per-process, not globally, meaning GC pauses in one process don’t affect the rest of the system. WhatsApp, which handled 900 million users on a small engineering team, ran on Erlang/BEAM infrastructure.
Elixir added a modern, Ruby-inspired syntax, a thriving package ecosystem (Hex), the Phoenix web framework, and metaprogramming via macros — all on top of a runtime that had already proven itself for 25 years.
What it got right: Fault tolerance, concurrency, a production-proven runtime, developer ergonomics. What the argument was: You shouldn’t have to choose between pleasant syntax and genuine concurrency.
Swift (2014) — The Argument Against Objective-C’s Age
The problem: Objective-C was essentially unchanged since the early 1980s.
Objective-C was a thin object-oriented layer added to C in 1983. By 2010, Apple’s primary application language had manual memory management, verbose syntax with bracket-heavy method calls, no namespaces, no generics, and a nil-messaging behavior that silently swallowed errors instead of surfacing them. Writing safe iOS code required expertise and discipline that many developers lacked.
Chris Lattner, who had joined Apple in 2005 and built the LLVM compiler infrastructure, began working on Swift in 2010 as a personal project. It was secret inside Apple for nearly four years. When it was unveiled at WWDC on June 2, 2014, the audience reaction was described as stunned — no one had known it was coming.
Swift had optionals (no more silent nil), type inference, value semantics for structs, closures, generics, and a playground environment for interactive development. It used Automatic Reference Counting (ARC) — the compiler inserts memory management calls, so there’s no garbage collector pause, but also no manual free. Swift was open-sourced in December 2015.
What it got right: Safety, expressiveness, ARC memory model, and a clear migration story from Objective-C. What the argument was: Thirty years of Objective-C debt should not be the price of writing Apple software.
Rust (2015) — The Argument Against Unsafe Systems Programming
The problem: C and C++ were responsible for a catastrophic proportion of security vulnerabilities.
Graydon Hoare started Rust in 2006 as a personal project while working at Mozilla. Mozilla began sponsoring it in 2009, motivated by the Firefox browser — millions of lines of C++ handling untrusted web content, with memory safety bugs regularly becoming security exploits.
Rust’s central innovation is the borrow checker: a compile-time static analysis system that proves memory safety without a garbage collector. In safe Rust, use-after-free, null pointer dereferences, buffer overflows, and data races are impossible — not just warned against, but impossible to compile. This is not a runtime check; it is a mathematical proof performed before your program runs.
Rust 1.0 shipped in May 2015. The consequences have been significant: a 2019 Microsoft study found ~70% of their CVEs were memory safety bugs, and a similar analysis at Google found ~70% of severe Chrome security bugs had the same root cause. In 2022, the U.S. NSA published guidance recommending memory-safe languages including Rust. In 2024, the White House Office of the National Cyber Director published a report urging a shift to memory-safe languages for new systems software, naming Rust among them.
Rust has been the most admired/loved language in the Stack Overflow Developer Survey every year from 2016 to 2024 — nine consecutive years.
What it got right: Memory safety with zero runtime overhead, the borrow checker as a provable guarantee. What the argument was: Safety and performance should not be a choice you have to make.
Zig (2016) — The Argument Against Hidden Complexity
The problem: C had accumulated decades of undefined behavior, preprocessor macros, and invisible overhead.
Andrew Kelley started Zig in 2016 with three goals stated up front: robust, optimal (C-equivalent performance as the natural default), and reusable. Crucially, Zig puts safety in the passenger seat, not the driver's seat — runtime safety checks are on in debug and safe builds, and you can turn them off when you need maximum performance.
Zig's argument against C is not about safety in the Rust sense. It is about honesty. In Zig, there is no hidden control flow, no hidden memory allocations, and no preprocessor — macros are replaced by comptime, where ordinary Zig code runs at compile time. Safety-checked build modes turn many of C's undefined behaviors into detectable errors that crash with a stack trace instead of silently corrupting memory. And if a Zig function can fail, that must be visible in the function signature and at the call site.
Zig's toolchain doubles as a C compiler (zig cc) that can cross-compile to almost any target, and the language is the foundation of Bun, the fast JavaScript runtime, which has attracted significant commercial backing on the strength of its performance.
What it got right: Compile-time code execution (comptime), honest resource management, C interoperability. What the argument was: C shouldn’t need a 500-page standard describing what behavior is “undefined.”
Carbon (2022) — The Argument For Migration
The problem: C++ isn’t going away, but it needs a successor that existing codebases can migrate to.
Google engineer Chandler Carruth unveiled Carbon at the CppNorth conference in Toronto in July 2022. Carbon’s argument is different from Rust’s — it is not trying to win on safety or start fresh. It is trying to give the world’s billions of lines of C++ code a path forward.
Carbon is designed for full bidirectional interoperability with C++: you can call Carbon from C++ and C++ from Carbon without hand-written wrappers or translation layers. The pitch is incremental migration: take a large C++ codebase, start writing new modules in Carbon, and migrate existing files over time.
As of early 2026, Carbon remains experimental. No production deployments. The language’s viability depends entirely on whether Google and the wider C++ community decide the migration story is worth backing.
What it got right: Learning from Rust’s adoption friction, taking interoperability seriously as a first-class concern. What remains unproven: Whether the C++ community will coalesce around it.
Mojo (2023) — The Argument For AI Performance
The problem: Python dominates AI/ML but is too slow for production inference.
Chris Lattner — who built LLVM, created Swift, and co-founded the AI infrastructure company Modular — announced Mojo in May 2023. The pitch was direct: Python has won the AI/ML ecosystem, but Python's performance is a constant source of pain. NumPy, PyTorch, and TensorFlow are Python APIs backed by C/C++/CUDA kernels precisely because Python itself cannot do the math fast enough.
Mojo aims to be a superset of Python — the goal is that valid Python code is also valid Mojo code — with progressive performance optimization available through type annotations and a systems programming layer. It is built on MLIR (Multi-Level Intermediate Representation) rather than LLVM directly, enabling code generation targeting GPUs, TPUs, and custom AI accelerators in addition to CPUs.
fast.ai described Mojo as potentially “the biggest programming language advance in decades.” Early benchmarks showed Mojo achieving performance competitive with C for certain numerical workloads.
What it got right: Targeting the specific pain point of AI/ML developers, progressive performance model. What remains unproven: Long-term ecosystem adoption and whether it can displace the existing Python + C++ hybrid model.
Gleam (2024) — The Argument For BEAM Type Safety
The problem: Erlang and Elixir’s dynamic typing meant entire categories of errors only appeared at runtime.
Gleam was created by Louis Pilfold and reached version 1.0 on March 4, 2024. It runs on the Erlang BEAM VM — the same runtime that powers WhatsApp and Discord’s backend — and compiles to the same bytecode as Erlang and Elixir. But Gleam adds something neither of those languages has: a static ML-family type system.
In Gleam, type errors are caught at compile time. You cannot pass the wrong type of value to a function, and you cannot forget to handle a case in a pattern match. The result is the runtime reliability of the BEAM VM — proven over 35+ years of production use — combined with compile-time type safety.
Gleam was named the second most “admired” language in the Stack Overflow Developer Survey 2025 — in its first year of appearing in the survey. That is an extraordinary reception for a 1.0 language.
What it got right: Standing on proven infrastructure (BEAM) while adding a genuine capability (type safety) that the ecosystem lacked. What the argument was: You shouldn’t have to choose between the BEAM VM’s reliability and a type-checked program.
Verse (2023) — The Argument For Massively Concurrent Worlds
The problem: No existing language was designed for billions of concurrent users sharing a persistent world.
Epic Games announced Verse in March 2023 as the scripting language for Fortnite’s Unreal Editor for Fortnite (UEFN) platform. The language was designed with Simon Peyton Jones — one of the architects of Haskell and the GHC compiler — as a research collaborator.
Verse is a functional-logic language with software transactional memory at its core. The design goal is explicitly stated in Epic’s documentation: a language that scales to “billions of concurrent users in a shared world.” Traditional game scripting languages assume a single-player context or a small multiplayer session. Verse is designed for something categorically different — persistent shared spaces where millions of players interact simultaneously.
Whether Verse achieves its stated ambitions at full scale remains to be seen. But it is one of the few new languages designed from scratch for a problem that did not previously exist.
The Pattern
Looking across these ten languages, a pattern emerges. Every successful new language made an argument about something the existing ecosystem got fundamentally wrong — and that argument was specific and verifiable, not just “our syntax is nicer.”
Go argued that build time was destroying developer productivity at scale. Rust argued that memory safety and performance were not a trade-off. Swift argued that a thirty-year-old language shouldn't be the price of building Apple software. Gleam argued that proven runtime reliability and compile-time type safety should coexist.
The languages that have struggled — Carbon and Mojo are still proving themselves — are fighting on harder terrain: not pointing at an obvious failure in existing languages, but trying to improve on something that mostly works for people who have already adapted to its limitations.
The next breakthrough language will probably argue against something that seems fine to most developers right now. It always does.
Want to run code in any of these languages? CodeArchaeology has Hello World guides for Go, Rust, Kotlin, Elixir, and Swift — each with Docker examples so you can run them today without installing anything.