Jon Belcher calls for action on deepfakes and data misuse in New Law Journal

In the 6th March edition of New Law Journal, Data Protection and Information Governance partner Jon Belcher argued that data protection law already provides the tools to tackle intimate image abuse, and that it is time for those with the power to act to do so. His article highlighted that:

· AI-generated deepfakes of real people amount to personal data processing, bringing them squarely within the scope of UK data protection law.

· The Information Commissioner’s Office (ICO) already has the powers to act against platforms enabling non-consensual image abuse. The issue is one of enforcement, not a lack of legislation.

A condensed summary of Jon’s article follows.

What prompted the Grok controversy?

In late 2025, Grok users prompted the AI to produce millions of sexualised deepfakes within days, including thousands involving children. Although the AI's parent company (Elon Musk's X) restricted these features after a backlash, similar tools remained easily accessible. X has argued that users, not the platform, are responsible.

Despite public outcry, regulatory responses have been sluggish. The ICO issued only a brief statement saying it had contacted X; Ofcom and the European Commission launched investigations that remain ongoing. Criminal law already covers child sexual abuse material and the UK government has accelerated a new offence for those creating non-consensual “purported intimate images” of adults – but data protection law already provides a route to resolution.

Where does the law stand on data and deepfakes?

The use of a real person’s image to create a deepfake constitutes processing of personal data. While individual users may fall outside data protection rules when acting in a purely personal capacity, AI companies do not. Individuals can object to processing, complain to the ICO or pursue compensation in court – though litigation may be costly and complex.

Jon concludes that while the regulator has historically been reluctant to act, the deepfake issue presents a clear-cut case for decisive intervention. Data protection law already provides the tools; regulators need to use them more quickly and robustly.