Law
From ABA Journal, 1 day ago
Sanctions ramping up in cases involving AI hallucinations
Monetary sanctions against attorneys for AI-generated hallucinations in case documents are increasing as courts take these issues more seriously.
"Any exposure of source code or system-level logic is significant, because it shows how controls are implemented. In AI systems, that layer is especially critical. The orchestration, prompts, and workflows effectively define how the system operates. If those are exposed, it can make it easier to identify weaknesses or manipulate outcomes."
Four terabytes of data have reportedly been stolen, including database records and source code. The allegedly stolen data, published on a leak site, includes Slack messages, internal ticketing data, and videos of conversations between Mercor's AI systems and contractors.
AI on the attackers' side excels at three things: speed, scale, and sophistication. As a result, the time between a successful intrusion and the actual theft of data has dropped sharply over the past three years, from an average of nine days then to one day now. In the fastest case documented by Palo Alto Networks, exfiltration took just 72 minutes.
The savings disappear the moment you hit real-world complexity: disparate data sources and messy inputs, ambiguous situations without clear rule sets, or any domain where the rules aren't already obvious. And someone still has to write all those rules.