Anthropic does a somewhat similar thing. If you visit their ToS (the one for Max/Pro plans) from a European IP address, they replace one section with this:
> Non-commercial use only. You agree not to use our Services for any commercial or business purposes and we (and our Providers) have no liability to you for any loss of profit, loss of business, business interruption, or loss of business opportunity.
It's funny that a plan called "Pro" cannot be used professionally.
Lawyers are playing Calvinball again. I have no idea why the law finds this kind of argumentation compelling. "I clearly intentionally deceived, but I stashed some bullshit legalese into a document no one will read so my deception is completely OK."
owenm today at 5:53 PM
As far as I can tell, this is only for the free personal plan, not any of the business offerings (i.e. not Copilot for M365), and GitHub Copilot is under a separate set of terms.
"These Terms don't apply to Microsoft 365 Copilot apps or services unless that specific app or service says that these Terms apply."
Think of Copilot being a suite of different products under the same overall banner and it starts to make (a bit) more sense.
jeffwask today at 3:02 PM
I can hear the lawyers huddled around a conference table rolling the bones and chanting the sacred words to come up with that "get out of trouble free" card. It told your son he had terminal cancer and should kill himself... sorry, it clearly says for Entertainment Purposes only.
lateforwork today at 7:05 PM
Go to https://www.copilot.com/ and ask a question. You'll see from the answers that it is indeed for entertainment only. It is ridiculously behind ChatGPT, and I don't know how that can happen since Microsoft has access to the same models.
nunez today at 5:34 PM
FYI: This is only for the "Cortana replacement" Copilot, not the other Copilots. This language doesn't appear in GitHub Copilot's Consumer Agreement, for example.
sgbeal today at 3:32 PM
The section titled
> IMPORTANT DISCLOSURES & WARNINGS
tells us:
> You may stop using Copilot at any time.
That's an odd thing to include in a ToS.
chrisjj today at 7:38 PM
These terms too are pretty entertaining.
> we can't promise that any Copilot's Responses won't infringe someone else's rights (like their copyrights, trademarks, or rights of privacy) or defame them.
> You agree to indemnify us and hold us harmless (including our affiliates, employees and any other agents) from and against any claims, losses, and expenses (including attorneys' fees) arising from or relating to your use of Copilot
Raed667 today at 3:43 PM
a blanket "entertainment only" disclaimer likely wouldn't survive scrutiny for a product actively/relentlessly marketed as a productivity tool
yoyohello13 today at 3:40 PM
I've been reading Jurassic Park recently. Hammond's monologue about expensive technology only being fundable via Entertainment seems very relevant.
polyamid23 today at 7:07 PM
They should tell copilot then!
It seems to disagree.
- "Are you for entertainment purposes only?"
- "Not at all - unless you want me to be. The short version: I'm not 'for entertainment only.'"
Edit:
OK, I see it's legal framing to avoid liability, but can they really do that via the ToS while letting the tool itself claim otherwise?
i-e-b today at 7:32 PM
"Don't use bots or scrapers"
Says the bot based on scraped data
kklisura today at 5:40 PM
> Other people may send similar Prompts as yours, and they could get the same, similar, or different Responses and Creations.
This is why I'm skeptical about this whole AI coding thing...
cartoonfoxes today at 8:22 PM
And yet my licensed Visual Studio 2026 that's supposed to be for serious-face commercial development is rife with Copilot.
> You may stop using Copilot at any time.
But how? Microsoft has shoved it into so many products that I don't see how it's possible without dropping them altogether.
anshumankmr today at 5:11 PM
If it is for entertainment purposes only, why am I not laughing when I use it?
Smalltalker-80 today at 6:11 PM
Cool, I'm going to put this disclaimer in my work email signature.
So I'm never accountable for any mistakes.
LurkandComment today at 3:53 PM
A year ago, when I bought a new laptop with 365 and Copilot integrated, I thought they would make better use of AI and its integration. I can't think of a single time I actually used it, and I cancelled any subscription associated with it. On the other hand, I use ChatGPT all the time.
giancarlostoro today at 3:34 PM
How does this affect Copilot in VS 2022 / VS 2026? Because this is kind of insulting to a professional. I really wish Microsoft would learn to name things correctly. There's Copilot the ChatGPT-like service, then there's Copilot for Visual Studio which is not the same as far as I can tell.
nerdjon today at 3:36 PM
Can I get this on a sticker to pass out whenever anyone tries to shove Copilot down my throat at work?
Maybe a shirt, could sell it on the Microsoft store even. Now that would be entertainment.
_trampeltier today at 5:12 PM
Just this afternoon, I read a bit through Adobe's EULA and saw that most of Adobe's software is not allowed to be used by children. I guess most of today's software isn't allowed for children because of all the user tracking and spying.
snu today at 5:48 PM
Hilariously, immediately after I read this, my boss sends a global message to us reminding us that we 'need to be trying to integrate copilot into our jobs.'
osmsucks today at 6:24 PM
To us, all the profit. To you, all the risk.
wxw today at 3:20 PM
> Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don't rely on Copilot for important advice. Use Copilot at your own risk.
> We don't own Your Content, but we may use Your Content to operate Copilot and improve it. By using Copilot, you grant us permission to use Your Content, which means we can copy, distribute, transmit, publicly display, publicly perform, edit, translate, and reformat it, and we can give those same rights to others who work on our behalf.
lol
ar0 today at 3:17 PM
To be clear, this is only for the standalone Copilot chat or app and website, not for the "Copilot" services integrated into Office 365 etc.
jmugan today at 5:29 PM
I thought the title was a joke until I actually read the thing.
tasuki today at 7:47 PM
Well, I'm entertained!
monegator today at 3:28 PM
> Copilot may include advertising
oytis today at 6:03 PM
I might be alone with this, but I don't find it very entertaining.
soupfordummies today at 5:20 PM
Worth noting that this is in the terms of use as of October 2025. This isn't "new".
staticautomatic today at 3:48 PM
Guys, they're just disclaiming warranties. Relax.
ibejoeb today at 5:55 PM
They'll unironically relaunch it as Xbox Copilot tomorrow...
jrochkind1 today at 4:14 PM
No way that holds up in court when they are marketing it for things other than entertainment.
hn_acc1 today at 5:57 PM
Do not taunt Happy Copilot Ball.
maieuticagent today at 3:27 PM
They're just trying to pick up that Disney deal (Clippy rhymes with Mickey)
I like the way that when AI does something good, of course the people who built it should make a lot of money, but when it does something bad, no one is responsible.
pseudosavant today at 6:37 PM
I can't help but be reminded of Joe Pesci in Goodfellas:
"Funny how? I mean, funny like I'm a clown? I amuse you? I make you laugh? I'm here to fuckin' amuse you? What do you mean funny, funny how?"
mihaaly today at 6:36 PM
My employer does not allow me to use software with an entertainment function on company hardware.
Now what?! Do I have to uninstall Windows?
OrvalWintermute today at 6:24 PM
One of the most toxic TOS I have ever had the misfortune of reading.
Simulacra today at 3:12 PM
If it's for entertainment purposes only then why is it being shoved down our throats at every opportunity???
classified today at 5:16 PM
So they finally admit that it's just a toy? Where does that leave all the mega-"productive" developers?
j45 today at 4:51 PM
Non-exact software will be causing sleepless nights for non-exact legal writers.
caycep today at 5:17 PM
I should ask it to produce an image of Satya Nadella in Maximus garb yelling "are you not entertained?!"
ashleyn today at 4:07 PM
Ah yes, the new "for tobacco use only" of tech.
anthk today at 4:20 PM
I told you so, dear LLM evangelists.
catlikesshrimp today at 6:00 PM
The ownership section is hilarious (tl;dr: your content is not ours, but we can do anything you could do with it, except be liable)
"We don't own Your Content... By using Copilot, you grant us permission to use Your Content, which means we can copy, distribute, transmit, publicly display, publicly perform, edit, translate, and reformat it, and we can give those same rights to others who work on our behalf."
Handy-Man today at 2:30 PM
Seems fine to me for the consumer facing product terms lol