US Resident Faces 70 Years in Prison for Creating AI-Generated Pornography
Steven Anderegg, a 42-year-old resident of Holmen, Wisconsin, has been arrested for creating, distributing, and possessing child sexual abuse material (CSAM) generated using artificial intelligence. Anderegg allegedly used the popular text-to-image model Stable Diffusion to create sexually explicit images involving minors.
According to a statement from the US Department of Justice, “Many of these images depicted nude or partially clothed minors, lewdly displaying or touching their genitals, or engaging in sexual acts with adult men.” Evidence collected from Anderegg’s devices shows that he intentionally created sexualized images of prepubescent children using explicit text prompts.
Anderegg also communicated with a 15-year-old, to whom he sent sexually explicit material via Instagram. He explained to the teenager how he created the CSAM and what text prompts he used. Anderegg was identified after a “cyber tip” reached the National Center for Missing and Exploited Children (NCMEC), following a report from Instagram about his account distributing these images.
“A federal grand jury in the Western District of Wisconsin returned an indictment on May 15, charging Anderegg with creating, distributing, and possessing obscene visual depictions of minors engaged in sexually explicit conduct, as well as transmitting obscene material to a minor under 16,” the statement said.
If convicted, Anderegg faces up to 70 years in prison, with a mandatory minimum of five years.
“Technology may change, but our commitment to protecting children will remain steadfast,” said Deputy Attorney General Lisa Monaco. “The Department of Justice will aggressively pursue those who produce and distribute child sexual abuse material (CSAM), regardless of how it is created. Simply put, CSAM created with AI is still CSAM, and we will hold accountable those who use AI to create obscene, abusive, and increasingly photorealistic images of children,” Monaco concluded.