Tech firms will have 48 hours to remove abusive images under new law


Richard Morris, Technology reporter


Under a proposed UK law, tech platforms would have 48 hours to remove intimate images that have been shared without consent.

The government said tackling intimate image abuse should be treated with the same severity as child sexual abuse material (CSAM) and terrorist content.

Failure to abide by the rules could result in companies being fined up to 10% of their global sales or having their services blocked in the UK.

Prime Minister Sir Keir Starmer said it is part of an “ongoing battle” with platform providers on behalf of victims.

Janaya Walker, interim director of the End Violence Against Women Coalition, said the move “rightly places the responsibility on tech companies to act.”

The proposals are being made through an amendment to the Crime and Policing Bill, which is making its way through the House of Lords.

Under the plans, victims would only have to flag an image once, rather than contact different platforms separately.

Tech companies would have to block the images from being re-uploaded once they have been taken down.

The proposal would also give internet service providers guidance on blocking access to sites hosting illegal content, with the aim of targeting rogue websites that currently fall outside the reach of the Online Safety Act.

Women, girls and LGBT people are disproportionately affected by intimate image abuse (IIA).

A government report in July 2025 found young men and boys were largely targeted for financial sexual extortion – sometimes referred to as “sextortion” – where a victim is asked to pay money to keep intimate images from being shared online.

A Parliamentary report published in May 2025 highlighted a 20.9% increase in reports of intimate image abuse in 2024.

Speaking on BBC Breakfast, the prime minister said the rule would mean a victim of intimate image abuse “doesn’t have to do a sort of whack-a-mole chasing wherever this image is next going up”.

He noted that tech companies are “already under that duty when it comes to terrorist material so it can be done. It’s a known mechanism,” adding that “we need to pursue this with the same vigour.”

Sir Keir said the law would be enforced through fines and other measures yet to be determined, via a “combination of oversight bodies in relation to what’s online and then it will be a criminal matter”.

He said he did not think this would include prison sentences for tech bosses.

Technology Secretary Liz Kendall said: “The days of tech firms having a free pass are over… no woman should have to chase platform after platform, waiting days for an image to come down”.

The announcement comes after the government’s standoff with X in January, when the AI tool Grok was used to generate images of real women wearing very little clothing.

This eventually led to the function being removed for users.
