Training students to prove they're not robots is pushing them to use more AI
132 points - today at 7:01 PM
How about going back to the old system where, apart from experimental lab work, nothing is graded until the end of the term?
All weekly assignments should just be considered prep for one exam at the end of the term where the student has an opportunity to demonstrate mastery of the course's subject matter. They can prepare as they wish, use AI, and even cheat on the homework, but there will be a revelation at the end of the term.
That final test can be proctored, monitored, audited to ensure that whatever words are used are indeed the student's own words. The resulting grade depends on that, and that alone.
The approach of continuous assessment, which to me always seemed suspect and ripe for abuse, was completely broken by the AI tools that are now available.
If your school uses software to detect AI writing, that's a problem with the quality of your school. The people choosing that software are too stupid to be running a school. The software isn't going to get any better.
I've noticed I write a lot differently because of combative online arguments. I have a problem.
So much of my communication is directed to people who don't want to hear me or understand me. So I've become very punchy and repetitive, trying to hammer home ideas that people are either unable or unwilling to understand.
I need to find ways to talk to people who want to hear and understand me.
It's hard to find other people who actually want to hear and understand though. People have different interests, and even when people appear to be working towards the same goal, they often aren't; like a boss who just won't understand the bad news, because it's easier to ignore the problem.
Honestly, I lean towards shaming educators who do that. If you can't detect the whiff of LLM with your own senses, then it has been used properly and shouldn't be faulted. If that premise invalidates your assignment, change the assignment. It's not as if you're assigning this work to test the basic mechanics of writing (grammar, sentence/paragraph structure, parallelism, whatever) — I mean, how much of that did you consciously try to teach? My recollection is, not an awful lot; and I can only imagine it's gotten worse since I was in K-12 (and I went to pretty darn good K-12).
It turns out to be built into the training data. The diffusion model just doesn't have many references of naked people not embedded in porn tropes, so it autocompletes porn.
Online moderation of generated images has the same weird incentive. Since real people seldom film themselves having sex, a naked person not having sex is a red flag for a possible real person, and gets moderated more strongly.
So in the new world, well written sentences are a handicap and nudity is generally accompanied by an exchange of fluids.
I think grading in general can be stymying for students' motivation and creative drives.
In my experience educators no longer use AI detectors given the risk of false positives. But some work is obviously lazy AI content. When that happens, educators talk to the student to see if they understand what they wrote.
Teachers cope by assigning more in-person writing, oral presentations, and defenses of what’s been written.
If you think about it, the pre-AI computing generation is itself anomalous for having ubiquitous access to efficient human-only writing tools. We probably wrote more than previous generations. Early Internet / blogging culture bears this out.
If teachers now abdicate this judgment to software, students should be allowed to abdicate their duties to a computer as well.
These kinds of things are novel to us and deserve skepticism, but they become just the world we live in to them.
But, the article's focus on writing "worse" for AI detectors misses what is important. Trying to distinguish humans from machines does not develop student capability. In fact, it's a fleeting technique because AI writing styles will vary and improve over time.
This reminds me a bit of that. AI writing is—in many ways—objectively very good, but that doesn’t matter if no one thinks you wrote it. AI writing is boring exactly because it is consistent and like any art form people want to see something original.
Glad to see some schools and teachers teach how to use them well, rather than ban them outright.
On a side note: the fixed-pattern essay thing seems to be an American invention, or at least popularized by the American education system.
As soon as someone yells "witch" you cannot prove you're not one, and I've even had people put my handwritten comments through "AI detector" websites that "proved" they were AI (they weren't). It literally just highlighted two popular English phrases.
LLMs were trained on sites like HN and Reddit, so now if you write like a HN or Reddit commentator, you sound like AI...
This will likely be valuable for AI skills too.
The schools simply don’t have the flexibility, agility, or frankly it seems motivation to adapt to what has already happened.
The ship has sailed; essay writing is no longer a viable form of assessment.
The idea to try to build a reliable AI detector is asinine, and fundamentally misunderstands how any of this works now, let alone the very obvious trend-lines.
Stop with the lazy half-baked solutions, get your head out of the sand, rethink the whole curriculum. This is an emergency, we needed to be urgently attending to this years ago.
Didn't this self-censorship process start decades ago? There are certain answers expected in academia; arguing for anything else would get you in trouble. Not using “devoid” seems like a pretty minor inconvenience.
For me the biggest wtf is why students are still expected to write graded essays, and why we keep up the make-believe that it's somehow a useful and applicable skill.
Teachers are being hamstrung on curriculum. The districts enter into contracts that require the use of certain programs for certain amounts of time. We've known for decades (if not a century) that direct instruction works [1] but you can't sell devices, platforms and consulting services that way.
We're literally at the point in education we were at in the 1950s, when the health benefits of nicotine in your T-zone were lighting up the airwaves.
And generative AI means it's all but impossible to have take-home writing assignments. But hey, this is another opportunity to sell AI- or cheating-detection software that's often just an em-dash detector [2].
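A caricature of what that kind of "detection" amounts to, sketched in Python (the weights, threshold, and phrase list are invented for illustration and not taken from any real product):

```python
# A toy "AI detector" that does little more than count em-dashes and a
# few stock phrases -- roughly the level of sophistication the comment
# above is complaining about. All numbers here are arbitrary.

STOCK_PHRASES = ("delve into", "in today's fast-paced world", "it's important to note")

def naive_ai_score(text: str) -> float:
    """Return a fake 'AI probability' based purely on surface features."""
    em_dashes = text.count("\u2014")  # count literal em-dash characters
    phrase_hits = sum(p in text.lower() for p in STOCK_PHRASES)
    # Arbitrary weighting: every em-dash or stock phrase bumps the score.
    return min(1.0, 0.25 * em_dashes + 0.3 * phrase_hits)

human = "I wrote this quickly, no frills."
botish = "Let's delve into the topic \u2014 it's important to note the nuances."
print(naive_ai_score(human), naive_ai_score(botish))
```

A human who naturally writes with em-dashes and common transitions gets flagged; a model prompted to avoid them sails through, which is exactly why surface-feature detectors can't work.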
We have a generation that gets to college quite possibly never having read an entire book, thanks to social promotion through the grades and the constant distraction of electronic devices in classrooms. I don't even necessarily blame the parents entirely, either, because we've constructed a society where two people need five jobs to make ends meet.
And while all this is going on, we have a coordinated and well-funded effort to defund public education and move government funds to private schools, justified by a public education system that's failing because we defunded it. This is usually backed up by some baloney study showing that charter schools produce better results, which really comes down to charter schools being able to be selective with enrollments while public schools cannot be. Plus we mingle special-education kids into public classrooms because those programs got defunded too.
And really that's just a bunch of already-affluent people who want a tax break for something they were going to do anyway: send their kids to private schools so they don't have to mingle with the poors and aren't taught inconvenient things like human reproduction, critical thinking, and self-determination.
And after all of that we just end up teaching kids how to pass standardized tests.
[1]: https://marginalrevolution.com/marginalrevolution/2018/02/di...
[2]: https://medium.com/@brentcsutoras/the-em-dash-dilemma-how-a-...