Microsoft: Copilot is for entertainment purposes only

343 points - today at 2:25 PM

Source

Comments

wowoc today at 5:12 PM
Anthropic does a somewhat similar thing. If you visit their ToS (the one for Max/Pro plans) from a European IP address, they replace one section with this:

Non-commercial use only. You agree not to use our Services for any commercial or business purposes and we (and our Providers) have no liability to you for any loss of profit, loss of business, business interruption, or loss of business opportunity.

It's funny that a plan called "Pro" cannot be used professionally.

https://www.anthropic.com/legal/consumer-terms

everdrive today at 3:28 PM
Lawyers are playing Calvinball again. I have no idea why the law finds this kind of argumentation compelling. "I clearly intentionally deceived, but I stashed some bullshit legalese into a document no one will read so my deception is completely OK."
owenm today at 5:53 PM
As far as I can tell, this is only for the free personal plan, not any of the business offerings (i.e. not Copilot for M365), and GitHub Copilot is under a separate set of terms.

"These Terms don't apply to Microsoft 365 Copilot apps or services unless that specific app or service says that these Terms apply."

Think of Copilot being a suite of different products under the same overall banner and it starts to make (a bit) more sense.

jeffwask today at 3:02 PM
I can hear the lawyers huddled around a conference table rolling the bones and chanting the sacred words to come up with that "get out of trouble free" card. It told your son he had terminal cancer and should kill himself... sorry, it clearly says for Entertainment Purposes only.
lateforwork today at 7:05 PM
Go to https://www.copilot.com/ and ask a question. You'll see from the answers that it is indeed for entertainment only. It is ridiculously behind ChatGPT, and I don't know how that can happen since Microsoft has access to the same models.
nunez today at 5:34 PM
FYI: This is only for the "Cortana replacement" Copilot, not the other Copilots. This language doesn't appear in GitHub Copilot's Consumer Agreement, for example.
sgbeal today at 3:32 PM
The section titled

> IMPORTANT DISCLOSURES & WARNINGS

Tells us:

> You may stop using Copilot at any time.

That's an odd thing to include in a ToS.

chrisjj today at 7:38 PM
These terms too are pretty entertaining.

we can’t promise that any Copilot’s Responses won’t infringe someone else’s rights (like their copyrights, trademarks, or rights of privacy) or defame them.

You agree to indemnify us and hold us harmless (including our affiliates, employees and any other agents) from and against any claims, losses, and expenses (including attorneys' fees) arising from or relating to your use of Copilot

Raed667 today at 3:43 PM
a blanket "entertainment only" disclaimer likely wouldn't survive scrutiny for a product actively/relentlessly marketed as a productivity tool
yoyohello13 today at 3:40 PM
I've been reading Jurassic Park recently. Hammond's monologue about expensive technology only being fundable via Entertainment seems very relevant.
polyamid23 today at 7:07 PM
They should tell copilot then! It seems to disagree.

- 'Are you for entertainment purposes only?'

- 'Not at all — unless you want me to be. The short version: I'm not "for entertainment only."'

Edit: Ok I see it is legal framing to not be held liable, but can they just do that via ToS and let the tool itself promote something else?

i-e-b today at 7:32 PM
"Don’t use bots or scrapers"

Says the bot based on scraped data

kklisura today at 5:40 PM
> Other people may send similar Prompts as yours, and they could get the same, similar, or different Responses and Creations.

This is why I'm skeptical about all this AI coding thing...

cartoonfoxes today at 8:22 PM
And yet my licensed Visual Studio 2026 that's supposed to be for serious-face commercial development is rife with Copilot.

> You may stop using Copilot at any time.

But how? Microsoft has shoved it into so many products that I don't see how it's possible, without dropping them altogether.

anshumankmr today at 5:11 PM
If it is for entertainment purposes only, why am I not laughing when I use it?
Smalltalker-80 today at 6:11 PM
Cool, I'm going to put this disclaimer in my work email signature. So I'm never accountable for any mistakes.
LurkandComment today at 3:53 PM
A year ago, when I bought a new laptop with 365 and Copilot integrated, I thought they would make better use of AI and its integration. I can't think of a time I actually used it, and I cancelled any subscription associated with it. On the other hand, I use ChatGPT all the time.
giancarlostoro today at 3:34 PM
How does this affect Copilot in VS 2022 / VS 2026? Because this is kind of insulting to a professional. I really wish Microsoft would learn to name things correctly. There's Copilot the ChatGPT-like service, then there's Copilot for Visual Studio which is not the same as far as I can tell.
nerdjon today at 3:36 PM
Can I get this on a sticker to pass out anyone tries to shove copilot down my throat at work?

Maybe a shirt, could sell it on the Microsoft store even. Now that would be entertainment.

_trampeltier today at 5:12 PM
Just this afternoon I read a bit through Adobe's EULA, and I saw that most of Adobe's software is not allowed to be used by children. I guess most of today's software isn't allowed for children because of all the user tracking and spying.
snu today at 5:48 PM
Hilariously, immediately after I read this, my boss sends a global message to us reminding us that we 'need to be trying to integrate copilot into our jobs.'
osmsucks today at 6:24 PM
To us, all the profit. To you, all the risk.
wxw today at 3:20 PM
> Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.

> We don’t own Your Content, but we may use Your Content to operate Copilot and improve it. By using Copilot, you grant us permission to use Your Content, which means we can copy, distribute, transmit, publicly display, publicly perform, edit, translate, and reformat it, and we can give those same rights to others who work on our behalf.

lol

ar0 today at 3:17 PM
To be clear, this is only for the standalone Copilot chat app and website, not for the "Copilot" services integrated into Office 365 etc.
jmugan today at 5:29 PM
I thought the title was a joke until I actually read the thing.
tasuki today at 7:47 PM
Well, I'm entertained!
monegator today at 3:28 PM
> Copilot may include advertising
oytis today at 6:03 PM
I might be alone with this, but I don't find it very entertaining.
soupfordummies today at 5:20 PM
Worth noting that this is in the terms of use as of October 2025. This isn't "new".
staticautomatic today at 3:48 PM
Guys they're just disclaiming warranties relax
ibejoeb today at 5:55 PM
They'll unironically relaunch it as Xbox Copilot tomorrow...
jrochkind1 today at 4:14 PM
No way that holds up in court when they are marketing it for things other than entertainment.
hn_acc1 today at 5:57 PM
Do not taunt Happy Copilot Ball.
maieuticagent today at 3:27 PM
They're just trying to pick up that Disney deal (Clippy rhymes with Mickey)
OfirMarom today at 6:40 PM
That one line is…doing A LOT of legal work.
ortusdux today at 3:12 PM
It worked for Fox News
tech_ken today at 4:53 PM
Another bingo square for that 'AI is gambling' post (https://news.ycombinator.com/item?id=47428541)
ratelimitsteve today at 3:23 PM
i like the way that when AI does something good, of course the people who built it should make a lot of money, but when it does something bad, no one is responsible
pseudosavant today at 6:37 PM
I can't help but be reminded of Joe Pesci in Goodfellas:

"Funny how? I mean, funny like I’m a clown? I amuse you? I make you laugh? I’m here to fuckin’ amuse you? What do you mean funny, funny how?"

mihaaly today at 6:36 PM
My employer does not allow me to use software with an entertainment function on company hardware.

Now what?! Do I have to uninstall Windows?

OrvalWintermute today at 6:24 PM
One of the most toxic TOS I have ever had the misfortune of reading.
Simulacra today at 3:12 PM
If it's for entertainment purposes only then why is it being shoved down our throats at every opportunity???
classified today at 5:16 PM
So they finally admit that it's just a toy? Where does that leave all the mega-"productive" developers?
j45 today at 4:51 PM
Non-exact software will be causing sleepless nights for non-exact legal writers.
caycep today at 5:17 PM
I should ask it to produce an image of Satya Nadella in Maximus garb yelling "are you not entertained?!"
ashleyn today at 4:07 PM
Ah yes, the new "for tobacco use only" of tech.
anthk today at 4:20 PM
I told you so, dear LLM evangelists.
catlikesshrimp today at 6:00 PM
The ownership section is hilarious (tldr your content is not ours, but we can do anything you could do with it except being liable)

"We don’t own Your Content... By using Copilot, you grant us permission to use Your Content, which means we can copy, distribute, transmit, publicly display, publicly perform, edit, translate, and reformat it, and we can give those same rights to others who work on our behalf."

Handy-Man today at 2:30 PM
Seems fine to me for the consumer facing product terms lol