Florida teens arrested for creating ‘deepfake’ AI nude photos of classmates

Two Florida middle school students were arrested in December and charged with third-degree felonies for allegedly creating deepfake nude photos of their classmates, according to a report from Wired. Citing a police report, Wired says the two boys, ages 13 and 14, are accused of using an unnamed “artificial intelligence application” to generate explicit images of other students “between the ages of 12 and 13.”

They were charged with third-degree felonies under a 2022 Florida law that makes it a crime to distribute deepfake sexually explicit images without the victim’s consent. The arrests and charges appear to be the first of their kind in the United States related to the sharing of AI-generated nudity.

After a student at Pinecrest Cove Academy in Miami, Florida, was suspended on December 6, local media reported on the incident, and the case was reported to the Miami-Dade Police Department. According to Wired, the boys were arrested on December 22.

The production of AI-generated nudes and other explicit images of children by minors has become an increasingly common problem in school districts across the country. But other than the Florida incident, to our knowledge, none of those cases has resulted in an arrest. There is currently no federal law addressing non-consensual deepfake nudes, leaving states to grapple on their own with how generative AI intersects with issues such as child sexual abuse material, non-consensual deepfakes, and revenge porn.

Last fall, President Joe Biden issued an executive order on artificial intelligence that directed agencies to report on banning the use of generative AI to create child sexual abuse material. Congress has yet to pass a law addressing deepfake porn, but that could change soon: both the Senate and the House of Representatives introduced legislation this week called the DEFIANCE Act of 2024, an effort that appears to have bipartisan support.

While nearly all states now have laws against revenge porn, only a few have passed laws targeting AI-generated sexually explicit images, and to varying degrees. Victims in states without such protections have also turned to lawsuits; for example, a New Jersey teen is suing a classmate for sharing fake AI nudes.

The Los Angeles Times recently reported that the Beverly Hills Police Department is investigating a case in which students allegedly shared images that “used the students’ actual faces over AI-generated nudity.” But because the state’s law against “unlawful possession of obscene material knowing that it depicts a person under 18 engaging in or simulating sexual conduct” does not explicitly mention AI-generated images, the article notes it is unclear whether the case constitutes a crime.

The local school district voted Friday to expel five students involved in the incident, the Los Angeles Times reported.

