Microsoft Warns Copilot AI Should Not Be Used For Professional Advice

Written by Matt Milano

Microsoft has updated its Microsoft Services Agreement, warning users they should not rely on Copilot AI for professional advice.

Artificial intelligence is the hottest trend in tech, with users across industries trying to push the boundaries of what it can do. In some cases, this has resulted in embarrassing issues when AI models have provided false—even illegal—advice.

Microsoft’s latest terms make clear that users should not put too much stock in results produced by Copilot AI:

Assistive AI. AI services are not designed, intended, or to be used as substitutes for professional advice.

Companies and organizations have continued to struggle with issues pertaining to the advice that AI chatbots and models provide. For example, New York City’s MyCity AI chatbot drew criticism for giving bad advice, including suggestions that were illegal. Examples include advising business owners that they could take workers’ tips, telling landlords they could engage in income-based discrimination, and saying stores are not required to accept cash.

Microsoft is clearly working to temper people’s expectations, at least from a legal perspective.
