AI is just the latest monoculture
There’s growing anxiety that AI will be the death of innovation. The argument goes like this: LLMs are trained on existing code, so they recommend existing tools, so people use existing tools, so future LLMs are trained on even more of the same. New programming languages, libraries, and frameworks will face an impossible bootstrapping problem. How do you gain adoption when the coding assistant everyone uses has never seen your syntax?
It’s a reasonable concern. But I think it misses something important. LLMs aren’t introducing lock-in to software. They’re inheriting it. The feedback loop that people are worried about has been running for fifty years.
We optimize for familiarity, not quality
Look at the tools we celebrate as modern breakthroughs. Bun is a faster Node.js that runs the same code. Zig cc is LLVM with better cross-compilation. uv is a faster pip that installs the same packages. These are genuine improvements, and uv specifically has made my life as a Python developer significantly better. But notice what they have in common: none of them asks you to learn anything new. They compete on speed and reliability while carefully preserving the interfaces we already know.
This isn’t a criticism. It’s a survival strategy. Tools that demand new mental models face an uphill battle that most don’t survive. The path of least resistance is to make your new thing look exactly like the old thing. And that was true long before anyone was worried about training data.
The lock-in runs deep
The C ABI
Every language that wants to participate in systems programming eventually implements extern “C”. Rust does it. Go does it. Zig does it. Odin does it. Not because the C calling convention is well-designed, but because it’s the only calling convention that everything else understands.
The C ABI wasn’t created to be a universal standard. It just got there first. Now it constrains what languages can efficiently express at their boundaries. Features that don’t map cleanly onto C’s model (tagged unions, multiple return values, non-nullable pointers) require awkward workarounds or simply get abandoned at the FFI layer. C++ carries decades of baggage specifically because it needed to interoperate with C. Until WebAssembly matures, there’s no real alternative. You speak C’s ABI, you spin up a web server and communicate through REST or gRPC, or you don’t talk to anyone.
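To make that concrete, here’s a minimal Rust sketch of the kind of flattening the C boundary forces. The names (ParseResult, parse_i64) are invented for illustration; the pattern, collapsing a tagged union into a status code plus an out-parameter, is exactly the workaround described above.

```rust
// A natural Rust API: a tagged union that is either a value or an error code.
pub enum ParseResult {
    Ok(i64),
    Err(u32),
}

fn parse_internal(input: &str) -> ParseResult {
    input
        .trim()
        .parse::<i64>()
        .map(ParseResult::Ok)
        .unwrap_or(ParseResult::Err(1))
}

// The C ABI has no notion of tagged unions or non-nullable pointers, so the
// exported function flattens the enum into a status code plus an out-parameter
// and has to defend against null on both arguments.
#[no_mangle]
pub unsafe extern "C" fn parse_i64(input: *const std::os::raw::c_char, out: *mut i64) -> u32 {
    if input.is_null() || out.is_null() {
        return 2; // null is always representable on the C side
    }
    let s = match unsafe { std::ffi::CStr::from_ptr(input) }.to_str() {
        Ok(s) => s,
        Err(_) => return 3, // not valid UTF-8
    };
    match parse_internal(s) {
        ParseResult::Ok(value) => {
            unsafe { *out = value };
            0 // success
        }
        ParseResult::Err(code) => code,
    }
}
```

From C, the signature is just uint32_t parse_i64(const char *input, int64_t *out); the enum’s shape survives only as a convention the caller has to know about.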
Curly braces everywhere
When Brendan Eich built JavaScript in ten days, management gave him an explicit instruction: make it look like Java. Not because Java’s syntax was suited to a dynamic scripting language, but because Java was familiar. And Java looked like C++ because C++ looked like C.
Follow the chain forward. Every mainstream language of the last thirty years uses curly braces for blocks. Java, JavaScript, C#, Go, Rust, Swift, Kotlin, TypeScript. The languages that tried something different (Python’s significant whitespace, Ruby’s end keywords, Lisp’s parentheses) either carved out niches or stayed marginal.
This pressure is so strong that the Racket team, when designing their new language Rhombus, deliberately abandoned s-expression syntax. From their documentation:
Rhombus is a new language that is built on Racket. It offers the same kind of language extensibility as Racket itself, but using traditional (infix) notation.
Even the people building a language specifically to showcase the power of Lisp-style macros felt they had to wrap it in familiar clothing. The syntax isn’t better. It’s just what people already know.
The x86/ARM/NVIDIA chokepoints
For decades, x86 dominated servers and desktops while ARM dominated mobile. Alternatives existed. MIPS, PowerPC, SPARC, Itanium. The weight of existing software, compilers, toolchains, and developer familiarity crushed all of them.
RISC-V is the current challenger, and look at how it’s positioning itself: as a drop-in replacement that slots into existing workflows, not as a fundamentally different way of thinking about processors. The pitch is “you won’t have to change anything.” That’s the only pitch that works.
The same dynamic plays out in GPUs. NVIDIA’s dominance isn’t just about hardware quality. It’s about CUDA. Try starting a new GPU company and see how far you get when every machine learning framework assumes CUDA exists. Intel is trying. It’s not going well. AMD has a chance, but ROCm has only recently started to mature into a serious alternative to CUDA.
The browser is a bad operating system we’re stuck with
The web was designed for linked documents. JavaScript was bolted on as an afterthought. CSS was meant for styling academic papers. Somehow this accidental stack became the universal application platform.
Every attempt to replace it failed. Flash is dead. Silverlight is dead. Java applets are dead. Native app stores exist but haven’t displaced the web. So now we build complex applications by fighting against a document viewer, papering over its limitations with thousands of npm packages.
WebAssembly was supposed to change things. But notice what actually happened: it runs alongside JavaScript rather than replacing it. You still can’t escape the DOM, and a Wasm module can’t even touch it directly; every interaction with the page goes through JavaScript glue. The gravitational pull of existing infrastructure is too strong.
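The dependence is easy to see from the Rust side of a Wasm module. This is a hand-written sketch, not the output of any particular toolchain, and js_set_text is a made-up name for a JavaScript glue function; the point is that the module can’t reach the page except through imports the host chooses to provide.

```rust
// Sketch of a module targeting wasm32-unknown-unknown. It can compute whatever
// it likes, but it has no view of the document: anything it wants to display
// must round-trip through a function the JavaScript host supplies at
// instantiation time.
extern "C" {
    // Hypothetical import, implemented in JS glue (say, it writes to a <div>).
    fn js_set_text(ptr: *const u8, len: usize);
}

#[no_mangle]
pub extern "C" fn greet() {
    let msg = b"Hello from WebAssembly";
    // Hand the bytes back to JavaScript, which owns the DOM.
    unsafe { js_set_text(msg.as_ptr(), msg.len()) };
}
```

Toolchains like wasm-bindgen generate this glue automatically, but the shape doesn’t change: JavaScript stays in the loop for every interaction with the page.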
So what about AI?
I could keep going. Git’s specific model of version control won, and now everyone thinks in commits and branches. Unix file permissions from the 1970s still shape what’s possible on modern systems. QWERTY persists even though its layout was a set of tradeoffs for typewriter hardware that no longer exists.
The point is this: lock-in isn’t something that might happen if we’re not careful with AI. It’s the default state of computing. It’s what happens whenever switching costs accumulate faster than the benefits of switching.
Will LLMs accelerate this? Probably. A coding assistant that’s never seen your new language is a real barrier to adoption. But it’s a barrier stacked on top of dozens of existing barriers. The monoculture was already here. AI is just learning to speak it fluently.
Call To Action 📣
Hi 👋 my name is Diego Crespo and I like to talk about technology, niche programming languages, and AI. I have a Twitter, Mastodon, and Threads if you’d like to follow me on other social media platforms. If you liked the article, consider liking and subscribing. And if you haven’t, why not check out another article of mine listed below? Thank you for reading and giving me a little of your valuable time. A.M.D.G


