The Minnesota House passed bill HF1606, a ban on "nudification" apps that use artificial intelligence to undress individuals in nonconsensual deepfakes, including child sexual abuse material.
Rep. Jessica Hanson, a member of the Minnesota Democratic-Farmer-Labor party and sponsor of HF1606, said this nudification technology "empowered and enabled pedophiles and sexual predators around the globe."
"It has harmed children who are made victims by their cruel peers, women who are made victims by men they have trusted for decades," Hanson said, citing how predators profit off these harmful images by sharing them on the dark web and social media, or exchanging them for money.
HF1606, titled "Nudification technology access prohibited," passed with a near-unanimous vote. The single outlier in the 132-1 vote was Republican Rep. Drew Roach, according to The Deepdive.
Roach called this material disseminated using nudification technology "disgusting" and "vile" but criticized the bill for not addressing the "root cause," arguing that those with technical skills could still generate harmful content without relying on the specified tools.
"We're going to attack a software, a manufacturer … instead of the perpetrators of these crimes," he said. "If we want to prevent this from happening in the future, we should go after those perpetrators with the full force of the law."
Hanson argues that content creation is the root cause. She also argues that the state does go after perpetrators but "the law is not strong enough to catch a lot of them."
This debate is happening as lawmakers across the country, and the world, deploy different legal theories to mitigate the harms caused by AI-generated nonconsensual deepfakes and child sexual abuse material.
In March, three Tennessee teenagers filed a class action lawsuit against Elon Musk's xAI alleging Grok's "Spicy Mode" was weaponized against them. In the same month, two Pennsylvania teens were sentenced on felony counts of sexual abuse of children after creating deepfake child pornography using images of their classmates.
While many states have concentrated on criminalizing the distribution of explicit deepfakes or nonconsensual imagery, Minnesota's approach aims to stop such content before it proliferates by targeting the operators of nudification platforms themselves.
Hanson described HF1606 as a way to address "victims' intimate trauma and an AI feature used without the victim's consent to cause harm to the victim's well-being to create violent, dangerous and traumatizing material."
Editor's note: If you or someone you know has experienced AI-facilitated abuse and is in need of assistance, please visit RAINN.org. The National Sexual Assault Hotline (1-800-656-4673) is available 24/7.