Four terabytes of data have reportedly been stolen from Mercor, including database records and source code. The allegedly stolen data has been published on a leak site and includes Slack information, internal ticketing data, and videos of conversations between Mercor's AI systems and contractors.
Victims of deepfake image abuse have called for stronger protection against AI-generated explicit images as the law criminalising the creation of non-consensual intimate images comes into effect. Campaigners from Stop Image-Based Abuse delivered a petition with more than 73,000 signatures to Downing Street, urging the government to introduce civil routes to justice, such as takedown orders compelling platforms and device makers to remove abusive imagery.