
Pornographic Deepfakes in Schools


The New York Times recently reported on another case of a public school dealing with students who used deepfake and image-rewriting technology to create nude and sexual images of other students. The story focused on a school in New Jersey where an incident had occurred five months earlier. The school and school district were slow to figure out what to do. Their instincts were good in that they tried to protect the girls, who had effectively been abused and bullied by these actions. But it didn't occur to the school to call the police, even though the creation and distribution of child pornography is illegal.

The argument tends to be, “Well, this is new technology. We don't know what to do.” But it's not that new, and quite truthfully, the law is clear that producing these images of children is simply illegal. When someone in your school is doing something illegal, you must tell the police.

If a child stabs another child in school, you wouldn't say, “We didn't know we had to report it to the police. We got the wounded child to the hospital, suspended the other one for two days, and told everyone not to bring knives into school anymore.”  

One of the problems here is that we’re not all that comfortable discussing the kinds of harm these systems produce. This kind of technology is being used to manipulate images not just of children but of women in general. There are literally 10,000 sites where you can do this. There's outrage when offensive images of Taylor Swift are generated, and then everybody forgets about it. But it's an ongoing problem. People are being hurt, and even though this is an area that makes us uncomfortable, we need to take steps to stop it.

It's not a matter of regulating the technology itself, because it's everywhere and incredibly useful. Photo manipulation is built into our smartphone cameras and desktops and has changed the nature of photography. The issue is the dissemination of the materials that are produced. It doesn’t matter what the technology is: if people were doing this with Photoshop and disseminating the images, the same rules should apply. It's about the output and the result. We sometimes get so focused on the technology that we forget to focus on the harm.

Kristian Hammond
Bill and Cathy Osborn Professor of Computer Science
Director of the Center for Advancing Safety of Machine Intelligence (CASMI)
Director of the Master of Science in Artificial Intelligence (MSAI) Program
