
Teens who made AI deepfake porn of classmates sentenced to just 60 hours of community service

The case raises concerns about legal precedents and the need for policy reforms.

Photo Credit: iStock

Two teenagers in Pennsylvania have been sentenced on felony counts of sexual abuse of children after creating deepfake child pornography using images of their classmates, according to USA Today.

What's happening?

Between October 2023 and May 2024, two boys generated 347 fake pornographic images and videos featuring 60 girls from their school and surrounding area; all but one of the victims were under the age of 18. One victim was just 12 years old in a photo the offenders used to create a sexually explicit deepfake.

During victim impact statements in the court proceedings, many of the young female victims described falling grades, nightmares, anxiety, panic attacks, depression, and PTSD following the deepfake incident.

"I never imagined school yearbook photos would be used for your own satisfaction," a victim said to one of the perpetrators. "Your actions affect me every day."

On March 25, the two male former students were convicted in juvenile court and received two years of probation. They were also ordered to pay restitution and are required to complete 60 hours of community service for manufacturing child sexual abuse material. They will not be put in a juvenile detention center. 

At the end of their two-year probation period, they will be eligible to have their records expunged.

What is the reaction to this case?

Victims and their families expressed outrage over the boys' future eligibility to expunge their records. Many of the victims are concerned that these photos will continue to resurface throughout their lives, such as when applying to colleges or jobs, or even when meeting potential partners.


What is being done about deepfake crimes?

On May 19, 2025, the Take It Down Act, the first federal legislation of its kind in the U.S., was signed into law to criminalize the distribution of nonconsensual intimate imagery.

In the U.S., 45 states have enacted laws criminalizing AI-generated child sexual abuse material. Despite this nationwide effort, many state laws lack the specifics needed to effectively regulate the creation and distribution of nonconsensual deepfake nudes. And in several states, AI- or computer-generated images are not covered by existing child pornography statutes.


Schools across the country are struggling with the growing problem of deepfake abuse. Researchers warn that school policies, legal recourse, and education lag far behind while the prevalence of deepfake non-consensual intimate imagery skyrockets.

In late 2024, the Pennsylvania Legislature amended its laws to specifically include and define AI child pornography as child sexual abuse material.

You can check your state's statutes on the criminalization of AI-generated child sexual abuse material here. If you believe or know that someone shared an intimate image or video of you, visit the Cyber Civil Rights Initiative's Safety Center.

