The dangers of hyper-realistic computer-generated images shot to the public forefront last January, when sexually explicit deepfakes of Taylor Swift went viral, spurring calls for congressional regulation of the technology.
But celebrities are not the only victims of deepfakes. Artificial intelligence software allows bad actors to transform an innocent photo of anyone into a pornographic image, and the rapidly advancing technology makes it increasingly difficult to tell when an image has been manipulated.
Dozens of Aurora Public Schools students were extorted in 2024 when a group of teenagers allegedly posted illicit photos of the students on social media, then demanded money to take down the images. Some of the photos were real, while others were digitally edited to make the victims appear naked, students reported. Similar scandals have occurred in the Boulder Valley and Cherry Creek school districts, according to the Denver Child Advocacy Center.
Now, Colorado is taking action. A new state law went into effect on August 6, updating the state's child pornography and revenge pornography laws to include deepfakes.
"We're not preparing for a threat that's on the horizon. We're responding to a crisis that's already here," said Will Braunstein, executive director of the Denver Child Advocacy Center, while testifying in favor of the law on April 21. "These images are not harmless. They're not virtual. They're violations. ...Whether abuse is captured by a camera or created by a computer, the harm is real, the intent is the same and the law will respond accordingly."
The new law expands the definition of "sexually exploitative material" in criminal laws regarding the sexual exploitation of a child to include realistic computer-generated images of an identifiable child. It expands the criminal offenses of posting a private intimate image of a person for harassment or financial gain to include intimate digital depictions.
It also creates a private right of action for victims of deepfake pornography, allowing a victim to sue anyone who shared or threatened to share such an image without the victim's consent, provided the victim is identifiable in the image and would suffer severe emotional distress from its disclosure.
Colorado is now one of forty states that have enacted laws criminalizing AI-generated or computer-edited child sexual abuse material, according to the child advocacy organization Enough Abuse. Braunstein said some offenders not only create sexually exploitative material but also edit real child pornography to make it appear computer-generated, disguising their crimes.
Even in cases where physical abuse didn't occur, the deepfake images still cause extreme mental anguish for victims, according to Diana Goldberg, executive director of SungateKids Child Advocacy Center.
"I defy anyone to look in the eyes of a thirteen-year-old whose father has attached her face to an AI child's body engaged in sexual acts and then distributed it, as we had to, and tell me it isn't real victimization," Goldberg testified to lawmakers on April 21. "That anguish is real and that victimization is real."
The law received bipartisan support and opposition in the Colorado Legislature. Some opponents said the law was being rushed while the technology is still evolving; others argued that the law does not go far enough because it was amended to exclude fully computer-generated child pornography that can't be identified as using the likeness of a real person.
One of the primary topics of debate, however, was an amendment that exempts AI companies from civil liability when their software is used to generate sexually explicit images of people.
"I believe firmly that the AI companies that are being fed the prompts and are the ones that ultimately produce the imagery of child pornography, they should be also be held liable in addition to the horrible people that are seeking this imagery," said Representative Lorena Garcia during the House debate on May 5. She argued that the companies should be responsible for implementing safeguards to prevent such uses of their software.
Representative Matt Soper, a sponsor of the law, defended the liability exemption by describing AI software as a neutral tool, like a pencil: "Should the Ticonderoga pencil company be liable if a user of a pencil draws child pornography with that pencil? That's the exact analogy that's happening here."
The law ultimately passed with support from 78 out of 100 legislators. Governor Jared Polis provided the final stamp of approval on June 2.
"Artificial intelligence is moving at light speed and working on policy to address its impacts can be an uphill battle," said Majority Leader Robert Rodriguez, a sponsor of the law, in a statement. "This will allow both adult and child victims of intimate deepfakes and explicit AI-generated materials to seek justice."