Nudification deepfake tech could be banned in MN under new proposed bill
DFL lawmaker on deepfake pornography bill
In 2025, Sen. Erin Maye Quade (DFL-Apple Valley) spoke on her anti-nudification proposal to protect Minnesotans from non-consensual, AI-generated pornographic images.
ST. PAUL, Minn. (FOX 9) - Minnesota lawmakers are considering new legislation aimed at stopping the predatory use of an app or website to alter an image or video to reveal another person’s private parts – a process known as "nudification." The bill would prohibit the creation of these "deepfakes" while also paving the way for lawsuits against those who create them.
Nudification deepfakes in Minnesota
What we know:
HF1606, sponsored by Rep. Jessica Hanson (DFL-Burnsville), would prohibit someone from accessing, downloading or using a website, app or software to nudify an image or video, or to do so on behalf of someone else.
It would also ban advertisements or promotions of products that can nudify images or videos, while allowing for lawsuits against people who created the deepfakes.
The backstory:
Last legislative session, Minnesota lawmakers proposed a bill that also sought to use fines to curb the use and availability of nudification technology in the state.
Like the current proposal, it prohibited use of the technology and allowed anyone injured by it to file a civil lawsuit for damages of no less than $500,000 for each unlawful access, download or use.
However, the effort didn't quite make it to the finish line. After being rolled into a larger collection of bills, known as the public safety/judiciary omnibus, it failed to pass out of the House and Senate conference committee.
Nudification law eyed by Minnesota lawmakers
A bipartisan Minnesota bill would make it illegal for apps and websites to allow anyone to "nudify" photos and videos.
Previously, a bill approved in 2023 created penalties for those who engage in the creation and spread of "deepfakes" – videos and images that have been digitally created or altered with AI, often to spread political misinformation or pornography.
Under that law, a person found guilty can face up to five years in prison, or a civil penalty of up to $10,000 for each instance. Texas, California and Virginia are among the other states with similar laws restricting the use of deepfakes.
The backstory:
In 2025, Congress passed the Take It Down Act, which criminalizes the non-consensual publication of intimate images, including "deepfakes," but it stopped short of penalizing their creation.
The Source: Information provided by previous FOX 9 reporting.