Source: The Register
Microsoft's terms of service classify Copilot as unsuitable for consequential decisions, a legal hedge that exposes the gap between confident marketing and what the company will defend in court. The disclaimer amounts to an admission that the system hallucinates, contradicts itself, and produces unreliable outputs at scale. Yet Microsoft continues positioning it as a productivity layer across enterprise workflows. AI vendors are operating in a liminal space: shipping systems too unreliable for them to accept liability, while customers treat those systems as legitimate decision-support tools anyway.