What is happening to writing? Cognitive debt, Claude Code, the space around AI

81 points - today at 2:59 PM

Comments

apsurd today at 9:39 PM
Axios got traction because it heavily condensed news into more scannable content for the Twitter, Insta, TikTok crowd.

So AI is this on massive steroids. It is unsettling, but it seems there's a recurring need to point out that, across the board, many of the "it's because of AI" things were already happening. "Post-truth" is the one I'm most interested in.

AI condenses it all on a surreal and unsettling timeline. But humans are still humans.

And to me, that means I will continue to seek out and pay for good writing like The Atlantic. Btw, I've enjoyed listening to articles via their auto-generated NOA AI voice thing.

Additionally, not all writing serves the same purpose. The article makes these sweeping claims about "all of writing". Gets clicks, I guess, but to the point: most of why and what people read serves some immediate, functional need. Like work, like some way to make money, indirectly. Some hack. Some fast-forwarding to "the point". No wonder AI is taking over that job.

And then there's creative expression and connection. And yes, I know AI is taking over all the creative industries too. What I'm saying is we've always been separating "the masses" from those who "appreciate real art".

Same story.

ericdykstra today at 11:03 PM
I won't ever put my name on something written by an LLM, and I will blacklist any site or person I see doing it. If I want to read LLM output, I can prompt it myself; subjecting me to it and passing it off as your own is disrespectful.

As the author says, there will certainly be a number of people who decide to play with LLM games or whatever, and content farms will get even more generic while having fewer writing errors, but I don't think that the age of communicating thought, person to person, through text is "over".

dtf today at 9:42 PM
"Is Claude Code junk food, though? ... although I have barely written a line of code on my own, the cognitive work of learning the architecture — developing a new epistemological framework for “how developers think” — feels real."

Might this also apply to learning about writing? If I have barely written a line of prose on my own, but spent a year generating a large corpus of it aided by these fabulous machines, might I also come to understand "how writers think"?

I love the later description of writing as a "special, irreplaceable form of thinking forged from solitary perception and [enormous amounts of] labor", where “style isn’t something you apply later; it’s embedded in your perception" (according to Amis). Could such a statement ever apply to something as crass as software development?

AstroBen today at 10:22 PM
This type of cadence.

You know the one.

Choppy. Fast. Saying nothing at all.

It's not just boring and disjointed. It's full-on slop via human-adjacent mimicry.

Let’s get very clear, very grounded, and very unsentimental for a moment.

The contrast to good writing is brutal, and not in a poetic way. In a teeth-on-edge, stomach-dropping way. The dissonance is violent.

Here's the raw truth:

It’s not wisdom. It’s not professional. It’s not even particularly original.

You are very right to be angry. Brands picking soulless drivel over real human creatives.

And now we finish with a pseudo-deep confirmation of your bias.

---

Before long everyone will be used to it and it'll evoke the same eugh response

Sometimes standing out or quality writing doesn't actually matter. Let AI do that part.

pawelduda today at 9:30 PM
About the article referenced in the beginning: the sentiment presented in it honestly sounds like the AI version of cryptocurrency euphoria just as the bubble burst. "You are not ready for what's going to happen to the economy", "crypto will replace tradfi, experts agree". The article is sitting at almost 100M views after just a week and has strong FOMO vibes. To be honest, I'm conflicted about believing that, because I've been using AI and, compared to crypto, it doesn't just feel like magic, it also does magic. However, I can't help but think of this parallel and the possibility that the AI bubble could somehow be starting to stall/regress right now. The only problem is that I just don't see how such a scenario would play out, given how good and useful these tools are.

ayoung5555 today at 10:02 PM
As much as the general public seems to be turning against AI, people only seem to care when they're aware it's AI. Those of us who deliberately pay attention to it are better tuned to identify LLM-speak and generated slop.

Most human writing isn't good. Take LinkedIn, for example. It didn't suddenly become bad because of LLM-slop posts - humans pioneered its now-ubiquitous style. And now even when something is human-written, we're already seeing humans absorb linguistic patterns common to LLM writing. That said, I'm confident slop from any platform with user-generated content will eventually fade away from my feeds because the algorithms will pick up on that as a signal. (edit to add from my feeds)

What concerns me most is that there's absolutely no way this isn't detrimental to students. While AI can be a tool in STEM, I'm hearing from teachers among family and friends that everything students write is from an LLM.

Leaning on AI to write code I'd otherwise write myself might be a slight net negative on my ability to write future code - but brains are elastic enough that I could close an n-month gap in n/2 months' time or something.

From middle school to university, students are doing everything for the first time, and there's no recovering habits or memories that never formed in the first place. They made the ACT easier 2 years ago (reduced # of questions) and in the US the average score has set a new record low every year since then. Not only is there no clear path to improvement, there's an even clearer path to things getting worse.

submeta today at 10:04 PM
I wonder whether we will see a shift back toward human-generated, organic content, writing that is not perfectly polished or exhaustively articulated. For an LLM, it is effortless to smooth every edge and fully flesh out every thought. For humans, it is not.

After two years of reading increasing amounts of LLM-generated text, I find myself appreciating something different: concise, slightly rough writing that is not optimized to perfection, but clearly written by another human being.

1necornbuilder today at 10:21 PM
The "cognitive debt" framing resonates, but from an unexpected direction. I'm not a developer. I've never written a line of code. I built enterprise software, a live computer vision system monitoring industrial cranes, deployed on Google Cloud Run, generating six figures in contracts, entirely by chatting with Claude. No IDE, no terminal muscle memory to lose.

For me, there is no cognitive debt in the code. There's no ground truth I'm losing touch with, because I never had it. The ground truth I bring is domain knowledge: fifteen years of understanding what an industrial operator actually needs to see on a screen at 3am. What Breen describes as "junk food", the dopamine hit of watching Claude build a new feature, is, for domain experts like me, the first time in history we could participate in building at all. The gap that existed wasn't "developer loses touch with code." It was "person closest to the problem could never build the solution."

But his core point about writing holds, even here. The thinking that produces good software requirements, the careful articulation of what needs to be built and why, remains irreducibly human. My most important contributions to my own codebase aren't commits. They're the precise questions I ask. Maybe cognitive debt is domain-specific. Developers accumulate it. Domain experts spend it.

kittikitti today at 10:13 PM
I think people hate AI-generated writing more than they like human-curated writing. At the same time, I find that people like AI content more than my writing. I write, comment, and blog in many different places, and I notice that my AI-generated content does much better in terms of engagement. I'm not a writer, I code, so it might be that my writing is not professional. My hand-written code, on the other hand, still edges out the AI's.

We need to value human content more. I find that many real people eventually get banned while the bots are always forced to follow rules. The Dead Internet hypothesis sounds more inevitable under these conditions.

Indeed we all now have a neuron that fires every time we sense AI content. However, maybe we need to train another neuron that activates when content is genuine.

rnakle today at 10:33 PM
That is a shallow piece of the new genre: I am a concerned academic who nevertheless uses these new tools to create vibe-coded slop and has to tell the world about it.

Everything is inevitable but my own job is secure. Have I already told you how concerned I am?

No novelty. No intellectual challenge. No spirit. Just AI advertisements! /s

jongjong today at 9:07 PM
I agree with the assessment that pure writing (by a human) is over. Content is going to matter a lot more.

It's going to be tough for fiction authors to break through. Sadly, I don't think the average consumer has sufficiently good taste to tell when something is genuinely novel. People often prefer the carefully formulated familiar garbage over the creative gems; this was true before AI and, IMO, will continue to be true after AI. This is not just about writing; it's about art in general.

There will be a subset of people who can see through the form to the substance, and those will be able to identify non-AI work, but they will continue to be a minority. The masses will happily consume the slop. The masses have poor taste, and they're more interested in "comfort food" ideas than in actually novel ideas. Novelty just doesn't do it for them. Most people are not curious; new ideas don't interest them. These people will live and breathe AI slop, and they will feel uncomfortable if presented with new material, even if wrapped in a layer of AI (e.g. human-written core ideas, rewritten by AI).

I feel like that about most books, music, and pop culture in general; it was slop and it will continue to be slop... It was the same basic ideas about elves, dragons, wizards, orcs, kings, queens, etc., just reorganized and mashed together under different overarching storylines ("a difficult journey", "epic battles") with different wording.

Most people don't understand the difference between pure AI-generated content (seeded by a small human input) and human-generated content which was rewritten by AI (seeded by a large human input) because most people don't care about and never cared about substance. Their entire lives may be about form over substance.

davtyan1202 today at 9:38 PM
As we move further into a world where data exfiltration is becoming more sophisticated, local-first processing isn't just a luxury—it’s a necessity. Hardware is finally powerful enough to handle what used to require a massive backend infrastructure.