Met complaints over image-based sexual abuse have more than doubled in the last five years

Bryony Gooch | Wed, April 29, 2026 at 4:38 PM UTC

Reports of image-based sexual abuse and explicit deepfakes to the Metropolitan Police have more than doubled in the last five years, new figures show, as the Home Office promised to crack down on offenders.

The Metropolitan Police recorded a 120 per cent rise in complaints about non-consensual intimate image abuse (NCII), according to FOI data obtained by online safety provider Verifymy.

The safety provider warned that the rapid rise of AI-powered “nudification” tools, used to remove people’s clothes in photos, was making it easier to create and share realistic, sexually explicit images without consent.

Some 1,766 NCII complaints were made last year in Greater London, a 17 per cent rise from 1,523 the year before, and more than double the 805 recorded in 2020, according to Metropolitan Police complaint data seen by The Independent.

These findings come months after Ofcom launched an investigation into the Grok AI chatbot on X being used to create and share sexual deepfakes of real people, including children.

The Internet Watch Foundation said just last week that in 2025 there had been a more than 260-fold increase in AI-generated child sexual abuse videos from the year before.

Video models, nudification apps, subscription platforms and agentic AI systems were enabling offenders to produce and distribute illegal content at scale, the watchdog warned, allowing them to manipulate images of real children and simulate explicit chats with child characters.

Social media platforms will have to remove any non-consensual images reported within 48 hours, under the new Crime and Policing Bill, which is currently in the final stages of legislation. Those that don’t risk hefty fines or having their services blocked in the UK. Nudification tools used for AI deepfakes will be banned under the new rules.

The Crime and Policing Bill is in the final stages of legislation (AFP/Getty)

Victims of NCII will have up to three years to report the crime, up from the current six months.

While the Crime and Policing Bill looks set to crack down on NCII, experts are warning that platforms must do more to tackle the growing harm.

Emma Robert-Tissot, head of partnerships at Verifymy, said: “In an age of hyper-realistic image generation, everyone should have control over how their identity is used and represented online. Consent management that supports this is no longer a technical consideration, it is a fundamental right.

“While content moderation plays an important role, it cannot identify all forms of non-consensual intimate image abuse, particularly as synthetic content becomes more advanced. Platforms must therefore take a more holistic approach - combining prevention, consent and detection - to effectively tackle this growing harm.”

Victims of NCII will have up to three years to report the crime under new legislation (PA Wire)

Commenting on the FOI findings, a Met Police spokesperson said that tech firms needed to crack down on methods of NCII offending.

“Non‑consensual intimate image (NCII) abuse can have a devastating and lasting impact on victims. The online world is changing rapidly, and reporting of this type of offending has increased significantly over the past five years,” they said.

“We continue to strengthen our response to tech-enabled abuse by bolstering specialist teams and investing in new technology. This includes technology that allows officers to review large volumes of messaging and an NCII toolkit providing vital information on what this abuse looks like and its impact on victims to enable police to improve their response.

“While using technologies and working alongside our safeguarding partners to provide support for victims, we continue our call to tech firms to design out these methods of offending.”

A government spokesperson said: “Sharing or creating intimate images without consent is a vile crime and we are taking immediate action to tackle this growing issue.

“We have made the creation of intimate images without consent a crime with up to six months in prison and we are banning AI tools which generate deepfake sexual images of people without consent, with developers and suppliers facing up to three years in prison.”

Source: “AOL Breaking”

We do not use cookies and do not collect personal data. Just news.