Even Microsoft know Copilot can’t be trusted • The Register
A recent surge of interest in Microsoft’s Terms of Use for Copilot is a reminder that AI helpers are really just a bit of fun.
Although the document for Copilot for Individuals was last updated in late 2025, it recently attracted fresh attention from netizens. It includes this gem: “Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.”
Regular readers of The Register won’t be shocked by Microsoft’s warning that Copilot gets things wrong and should not be relied on. The company itself has long acknowledged the assistant’s limitations. During the London leg of its AI tour, for example, every demonstration of Copilot wizardry came with a warning that the tool could not be fully trusted and that human verification was required.
The same applies to any other AI assistant: they can be useful, but their output still needs checking, particularly on anything consequential like medical advice or an investment plan.
As one commenter on Hacker News pointed out: “Anthropic does a somewhat similar thing. If you visit their ToS (the one for Max/Pro plans) from a European IP address, they replace one section with this: Non-commercial use only. You agree not to use our Services for any commercial or business purposes and we (and our Providers) have no liability to you for any loss of profit, loss of business, business interruption, or loss of business opportunity.” (The Register checked this from a US and a European IP and can confirm this is the case.)
The commenter added: “It’s funny that a plan called ‘Pro’ cannot be used professionally.”
As for Copilot’s Terms of Use, they may not be new, but the attention is useful for two reasons. It is a reminder to read the text users so often click through, and it underlines that chatbots such as Copilot are neither companions nor dependable sources of advice.
Instead, they are error-prone tools that can be helpful one moment and confidently wrong the next. Some in the tech industry may market AI assistants as though they put a genius in every laptop, but Microsoft’s own warning is rather less grand: “It can make mistakes, and it may not work as intended.”
Copilot for Individuals may be for entertainment purposes only. Microsoft 365 Copilot, meanwhile, can be just as inaccurate, only with fewer laughs. ®