The UK government announced on Tuesday that creating and sharing sexually explicit “deepfakes” will become a criminal offense, part of a broader effort to tackle the spread of such images, which overwhelmingly target women and girls.
Deepfakes are videos, images, or audio clips manipulated with artificial intelligence to appear real, and the technology can be used to insert a person’s likeness into pornographic content. While publishing intimate images without consent, known as revenge porn, became a crime in 2015, existing law does not cover faked images.
According to the UK-based Revenge Porn Helpline, incidents of image-based abuse using deepfakes have risen by over 400% since 2017.
Under the new law, those found creating or sharing sexually explicit deepfakes could face prosecution. The justice ministry emphasized that there is no justification for producing these images without consent.
The previous Conservative government, which was succeeded by the Labour Party in July, had proposed similar plans to make deepfakes a criminal offense, with penalties including fines and imprisonment.
The government said further details of the new offense would be set out in due course.
In addition, the government will introduce new crimes related to taking intimate images without consent and installing equipment for such purposes. Convictions could lead to up to two years in prison.
“Such degrading behavior must not be normalized,” said Victims Minister Alex Davies-Jones.
Technology Minister Margaret Jones stated that tech platforms hosting abusive content would face stricter oversight and penalties.
“Intimate-image abuse is a national emergency that is causing significant, lasting harm to women and girls who lose control over their digital identity due to online misogyny,” said campaigner Jess Davies.
The new offenses will form part of the government’s Crime and Policing Bill, which is set to be introduced to parliament, although no date has been confirmed.