Reports of code's death are greatly exaggerated

114 points - today at 11:09 AM

Comments

lateforwork today at 6:47 PM
Chris Lattner, creator of the Swift programming language, recently took a look at a compiler written entirely by Claude. Lattner found nothing innovative in the AI-generated code [1]. And this is why humans will be needed to advance the state of the art.

AI tends to accept conventional wisdom. Because of this, it struggles with genuine critical thinking and cannot independently advance the state of the art.

AI systems are trained on vast bodies of human work and generate answers near the center of existing thought. A human might occasionally step back and question conventional wisdom, but AI systems do not do this on their own. They align with consensus rather than challenge it. As a result, they cannot independently push knowledge forward. Humans can innovate with help from AI, but AI still requires human direction.

You can prod AI systems to think critically, but they tend to revert to the mean. When a conversation moves away from consensus thinking, you can feel the system pulling back toward the safe middle.

As Apple’s “Think Different” campaign in the late 90s put it: the people crazy enough to think they can change the world are the ones who do—the misfits, the rebels, the troublemakers, the round pegs in square holes, the ones who see things differently. AI is none of that. AI is a conformist. That is its strength, and that is its weakness.

[1] https://www.modular.com/blog/the-claude-c-compiler-what-it-r...

neversupervised today at 8:35 PM
The author's intuition is still calibrated to the past, even though he talks about the future. He doesn't have an intuition for the future. All code will be AI-generated. There's no way to compete with the AI. And whatever new downsides this brings will be solved in ways we aren't fully anticipating. But the solution is not to walk back vibe coding. You have to be blind to believe that most code won't be vibe-coded very soon.
randcraw today at 8:31 PM
Krouse points to a great article by Simon Willison, who proposes that the killer role for vibe coding (hopefully) will be to make code better, not just faster.

By generating prototypes based on different design models, each end product can be assessed against specific criteria like code readability, reliability, or fault tolerance, and then quickly and repeatedly revised to serve those ends better. No longer would the victory dance of vibe coding be simply "It ran!" or "Look how quickly I built it!".

pacman128 today at 5:52 PM
In a chatbot-coding world, how do we ever progress to new technologies? The AI has been trained on numerous people's previous work. If there is no prior art for, say, a new language or framework, the AI models will struggle. How will the vast amounts of new training data they require ever be generated if there is not a critical mass of developers?
picafrost today at 6:39 PM
So much of society's intellectual talent has been allocated toward software. Many of our smartest are working on ad-tech, surveillance, or squeezing as much attention out of our neighbors as possible.

Maybe the current allocation of technical talent is a market failure and disruption to coding could be a forcing function for reallocation.

pjmlp today at 8:26 PM
This is coping. With tools like Boomi, n8n, Langflow, and similar, there are plenty of automated tasks that can already be configured, and that's it.
_pdp_ today at 8:33 PM
Remember Deep Thought, the greatest computer ever built that spent 7.5 million years computing the Answer to the Ultimate Question of Life, the Universe, and Everything? The answer was 42, perfectly correct, utterly useless because nobody understood the question they were asking.

That's what happens when you hand everything to a machine without understanding the problem yourself.

AI can give you correct answers all day long, but if you don't understand what you're building, you'll end up just like Deep Thought's audience, staring at 42 and wondering what to do with it.

True understanding is indistinguishable from doing.

soumyaskartha today at 5:34 PM
Every few years something is going to kill code, and here we are. The job changes; it does not disappear.
flitzofolov today at 6:45 PM
r0ml's third law states: “Any distributed system based on exchanging data will be replaced by a system based on exchanging programs.”

I believe the same pattern is inevitable for these higher-level abstractions and interfaces for generating computer instructions. The language used must ultimately conform to a rigid syntax and produce a deterministic result, a.k.a. "code".

Source: https://www.youtube.com/watch?v=h5fmhYc4U-Y

woeirua today at 7:15 PM
The argument here seems to be “you need AGI to write good code. Good code is required for… reasons. AGI is far away. Therefore code is not dead.”

First, I disagree that good code is required in any sense. We have decades of experience proving that bad code can be wildly successful.

Second, has the author not seen the METR plot? We went from "LLMs can write a function" to "agents can write working compilers" in less than a year. Anyone who thinks AGI is far away deserves to be blindsided.

idopmstuff today at 5:40 PM
I don't know that people are saying code is dead (at least not the ones who have even a vague understanding of AI's role) - it's more that humans are moving up a level of abstraction in their inputs. Rather than writing code, they can write specs in English and have AI write the code, much in the same way that humans moved from writing assembly to writing higher-level code.

But of course writing code directly will always maintain the benefit of specificity. If you want to write instructions to a computer that are completely unambiguous, code will always be more useful than English. There are probably a lot of cases where you could write an instruction unambiguously in English, but it'd end up being much longer because English is much less precise than any coding language.
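
For example (a toy illustration in Python): the English instruction "apply a 10% discount to every price and round each result to the nearest cent" still has to spell out the rounding rule and the order of operations, while the equivalent line of code is exact by construction:

    # Toy example: the code leaves no ambiguity about what "10% off"
    # means or how rounding is applied to each item.
    prices = [19.99, 5.25, 100.00]
    discounted = [round(p * 0.9, 2) for p in prices]
    print(discounted)  # [17.99, 4.72, 90.0]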

I think we'll see the same in photo and video editing as AI gets better at that. If I need to make a change to a photo, I'll be able to ask a computer, and it'll be able to do it. But if I need the change to be pixel-perfect, it'll be much more efficient to just do it in Photoshop than to describe the change in English.

But much like with photo editing, there'll be a lot of cases where you just don't need a high enough level of specificity to use a coding language. I build tools for myself using AI, and as long as they do what I expect them to do, they're fine. Code's probably not the best, but that just doesn't matter for my case.

(There are of course also issues of code quality, tech debt, etc., but I think that as AI gets better and better over the next few years, it'll be able to write reliable, secure, production-grade code better than humans anyway.)

deadbabe today at 5:22 PM
My problem is that while I know “code” isn’t going away, everyone seems to believe it is, and that’s influencing how we work.

I have not really found anything that shakes these people down to their core. Any argument or example is handwaved away by claims that better use of agents or advanced models will solve these “temporary” setbacks. How do you crack them? Especially upper management.

gedy today at 6:00 PM
When I started my professional life in the 90s, we used Visual J++ (Java), and I remember all the damn code it generated to do UIs...

I remember being aghast at all the incomprehensible code and "do not modify" comments - and also at some of the devs who were like "isn't this great?".

I remember bailing out ASAP to another company where we wrote Java Swing, and being so happy that we could write UIs directly, with a lot less code to understand. I'm feeling the same vibe these days with the "isn't it great?". Not really!

rglover today at 6:12 PM
It's only dead to those who are ignorant of what it takes to build and run real systems that don't tip over all the time (or leak data, embroil you in extortion, etc.). That will piss some people off, but it's worth considering if you don't want to perma-railroad yourself. Many seem to be so blinded by the glitz, glamour, and dollar signs that they don't realize they're actively destroying their future prospects and reputation by getting all emo about a non-deterministic printer.

Valuable? Yep. World changing? Absolutely. The domain of people who haven't the slightest clue what they're doing? Not unless you enjoy lighting money on fire.

rvz today at 11:44 AM
From "code" to "no-code" to "vibe coding" and back to "code".

What you are seeing here is that many are attempting to take shortcuts to building production-grade, maintainable software with AI, and are now realizing that they have built their software on terrible architecture, only to throw it away and rewrite it, with no one truly understanding the code or able to explain it.

We have a term for that already and it is called "comprehension debt". [0]

With the rise of over-reliance on agents, you will see "engineers" who are unable to explain technical decisions and who will admit to having zero knowledge of what the agent has done.

This is exactly what is happening to engineers at AWS, with Kiro causing outages [1] and AWS now requiring engineers to manually review AI changes [2] (which slows them down even with AI).

[0] https://addyosmani.com/blog/comprehension-debt/

[1] https://www.theguardian.com/technology/2026/feb/20/amazon-cl...

[2] https://www.ft.com/content/7cab4ec7-4712-4137-b602-119a44f77...

erichocean today at 5:36 PM
> If you know of any other snippet of code that can master all that complexity as beautifully, I'd love to see it.

Electric Clojure: https://electric.hyperfiddle.net/fiddle/electric-tutorial.tw...

cratermoon today at 7:16 PM
I can't tell if the author's "when we get AGI" is sarcasm or genuine.
cratermoon today at 6:26 PM
Yet again we can pull out Edsger W. Dijkstra's 1978 article, "On the foolishness of 'natural language programming'"

"In order to make machines significantly easier to use, it has been proposed (to try) to design machines that we could instruct in our native tongues. this would, admittedly, make the machines much more complicated, but, it was argued, by letting the machine carry a larger share of the burden, life would become easier for us. It sounds sensible provided you blame the obligation to use a formal symbolism as the source of your difficulties. But is the argument valid? I doubt."

developic today at 11:10 AM
What is this