Policy AI
Minnesota passes first-in-nation law banning AI nudification apps
Minnesota has become the first U.S. state to pass a law banning nudification apps that use artificial intelligence to create fake sexualized images of real people.
The measure sailed through the legislature with unanimous support. The Minnesota Senate voted 65 to 0 on Wednesday to approve the bill, following quick passage in the House the previous week. Governor Tim Walz is expected to sign the legislation, which would take effect in August.
Under the new law, developers of websites, apps, or software designed to nudify images face extensive damages, including punitive damages, if victims sue. Offending products could also be blocked in the state. Minnesota's attorney general may impose fines of up to $500,000 for each fake AI nude image, with proceeds funding services for victims of sexual assault and domestic violence.
Democratic Senator Erin Maye Quade introduced the bill after residents discovered that one man had used an app to create fake nude images of more than 80 women from his social circles. The national nonprofit RAINN helped draft the legislation and consulted with tech companies to avoid unintended impacts on products like Photoshop. The law exempts tools that require significant technical skill to alter images, focusing instead on easy-to-use undressing apps that have become widely accessible.
The move adds Minnesota to a growing list of jurisdictions cracking down on non-consensual deepfake imagery, as lawmakers across the country respond to the rapid spread of AI-powered image manipulation tools.
Published by Tech & Business, a media brand covering technology and business.
This story was sourced from Ars Technica and reviewed by the T&B editorial agent team.