Copilot caveat jars with Microsoft’s AI push

Microsoft has found itself defending an awkward contradiction at the heart of its artificial intelligence strategy after wording in its Copilot terms of use described the service as “for entertainment purposes only” and warned users that it can make mistakes and should not be relied on for important advice. The language, published in Microsoft’s consumer Copilot terms, surfaced just as the company was broadening the tool’s role across consumer software, workplace products and agent-based AI services.

The clause is unusually blunt for a product that Microsoft has spent months presenting as a serious productivity layer for search, writing, research and office work. In the terms, Microsoft says Copilot “can make mistakes, and it may not work as intended”, adding that users proceed at their own risk. The same section also says Microsoft makes no warranty or representation about Copilot and places responsibility on users if they publish or share outputs publicly.

That wording drew fresh scrutiny because it appeared out of step with Microsoft’s commercial messaging. On March 30, Microsoft unveiled new upgrades to Copilot aimed at improving enterprise usefulness, including features that let multiple AI models work within the same workflow and compare outputs side by side. Microsoft executive Nicole Herskowitz told Reuters the approach was meant to curb hallucinations, improve quality and lift productivity, underscoring how central reliability has become to the company’s pitch.

Microsoft has since moved to contain the fallout. A company spokesperson said the “entertainment purposes” phrasing was legacy language dating from Copilot’s earlier life as a Bing search companion and no longer reflected how the product is used today. According to that statement, the clause will be changed in the next update to the terms. That explanation matters because Copilot has evolved from a consumer chat assistant into a wider family of products spanning Windows, Microsoft 365, developer tools and specialist business workflows.

The episode also highlights a broader tension running through the generative AI market. Providers are racing to persuade companies and consumers that AI assistants can save time and raise output, while legal teams continue to write terms designed to limit liability if those systems produce false, harmful or infringing material. Microsoft’s Copilot terms make that balance especially visible. They state that outputs may be wrong, that the service may fail to work as intended, and that users remain responsible for how responses are used or distributed.

Microsoft’s own corporate narrative has leaned the other way. During its fiscal second-quarter earnings call in January, the company said Microsoft 365 Copilot’s accuracy and latency, powered by what it called Work IQ, were “unmatched” and said usage intensity had risen sharply. Earlier, Microsoft said its first-party family of Copilots had surpassed 150 million monthly active users across work, coding, security, science, health and consumer use cases, showing how strategically important the brand has become.

For customers, the practical issue is less the wording itself than what it says about acceptable reliance on AI systems. Across the industry, providers warn that outputs are probabilistic and require human review, especially in high-stakes fields. Reuters reported last week that Microsoft’s latest Copilot upgrades are explicitly designed to keep hallucinations in check by having one model review another. Separate reporting on the economics of generative AI has also pointed to hallucinations as a stubborn structural weakness rather than a passing bug, increasing the pressure on vendors not to promise more than the technology can yet guarantee.

The terms issue is also notable because Copilot users are bound by more than one agreement. Business Insider reported that the broader Microsoft Services Agreement, which users also accept, does not use the same “entertainment purposes” wording in its AI-related sections. That distinction may matter legally, but it did little to calm criticism once the clause spread online, largely because the phrase sounded at odds with Microsoft’s attempt to position Copilot as a paid assistant for serious work.