AI Law - International Review of Artificial Intelligence Law
G. Giappichelli Editore

20/11/2024 - AI Hallucinations Cause False Criminal Accusations

Topic: News - Criminal Law

Source: ABC News

The article highlights the growing issue of AI hallucinations, in which AI tools such as chatbots and language models generate false information about individuals. In several high-profile cases, AI systems have falsely described people as criminals, exposing their developers to potential defamation claims. For example, a German journalist discovered that Microsoft's AI tool, Copilot, had falsely described him as a child molester and con man, conflating crimes he had reported on with his own identity.

The article discusses the legal implications of these AI-generated inaccuracies, including the difficulty of holding AI developers accountable for false information produced by their systems. Some affected individuals have considered suing AI companies for defamation, but the legal process can be lengthy and costly. The article also highlights the challenge of correcting AI hallucinations: once false information has been learned by a model, it often cannot easily be removed.