This harmful trend is so much more than a ‘youthful transgression’

Last fall, after boys at a high school near my home in New Jersey used artificial intelligence to create nude deepfakes of their female classmates, some local parents viewed the episode as a “youthful transgression,” according to The Wall Street Journal.

The description is delusional and dangerous, as the problem has become an “epidemic” for teen girls in the US, according to The New York Times.

At schools across several states, boys have created and circulated nude deepfake images of their female classmates, according to The Times. Some of the recent victims at a Beverly Hills middle school were just 12 and 13 years old.

It’s critical for our society to recognize pornographic deepfakes for what they are: a form of violence against women and girls that must be stopped with education, changes in user behavior and legislation.

As I write in my new book, when intimate images of a woman go public, that puts her at risk of sexual assault, depression and suicide. It makes it harder for her to find a job. And it makes it harder for her to date. In other words, it can be life-destroying.

But these kinds of episodes are likely only to become more common. Tools for creating images with artificial intelligence are now widely available online. And the vast majority of deepfakes circulating online are pornographic, according to experts.

Educators and parents need to teach young people what a grotesque form of violence nude deepfakes are, so that no one can again use the excuse of just experimenting and having some fun when creating them. Schools and parents should also clearly lay out the punishments kids will face if they create them, and the severity should be commensurate with the unthinkable harm deepfakes do to victims. I'd also like to see education sessions run for adults in places like libraries and community centers, to teach them about the potential consequences of deepfakes and how to spot them, along with other kinds of misinformation.

This kind of broad public education would hopefully cause people to think twice about engaging with nude deepfakes at all through things like clicks, likes and shares. As I explain in my book, when we see someone share something inappropriate, it’s important to resist the temptation to clap back by commenting on their post, since comments send a signal to social networks that people like and want to see more similar content. But, if you know the person offline, you should have a conversation with them about why their posts are unacceptable in order to help socially stigmatize the practice.

Lawmakers must also step in. In January, a bipartisan group of members of Congress introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, known as the DEFIANCE Act, which would allow victims to sue those who create deepfakes of them when the perpetrator knows, or recklessly disregards, that the victim did not consent. The bill is an imperfect solution because it is often hard to identify perpetrators, especially if law enforcement doesn't take the incident seriously and investigate. But it could still act as a powerful deterrent. Congress needs to pass this legislation now.

Last year, President Joe Biden secured voluntary commitments from major AI companies to watermark AI-generated content, and major tech companies are now working together to develop standards for doing so. This needs to be required by law. In September, Nebraska Republican Sen. Pete Ricketts introduced a bill that would require AI-generated content to be watermarked, but it doesn’t go far enough. Watermarks on AI content can be easily removed, so it’s also important to prohibit the removal of such tags.

Americans need to wake up to an important fact: Nude deepfakes destroy women's lives. There's nothing innocent about creating or engaging with them. Schools, parents and libraries need to educate people about why it's unacceptable to make or share them. We need to make clear to people we know who engage with them that it's shameful. And lawmakers need to require watermarks on AI-generated content and allow victims to sue the perpetrators. Our society needs to take this new form of violence against women and girls seriously.
