Is There Finally Hope for Victims of Deepfake Tech?

By V. Venesulia Carr | Tuesday, 25 June 2024 12:31 PM EDT

Sen. Ted Cruz, R-Texas, introduced the Take It Down Act, bipartisan legislation requiring social media sites to remove nonconsensual explicit images and making their publication a federal crime.

On June 18, Cruz, the ranking member of the U.S. Senate Commerce Committee, led the charge to criminalize the publication of nonconsensual intimate images (NCII), including deepfakes and generative artificial intelligence (AI) content.

An acronym for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks, the Take It Down Act is designed to protect and empower victims of nonconsensual intimate image abuse.

If passed, the act would be a victory for victims, women, and girls across the nation, as the legislation would make publishing NCII online a federal offense.

In my June 2023 article, "Lawmakers Must Rush to Protect Us from Deepfakes," I warned of the implications of deepfake technology, with celebrities like Bella Thorne also expressing concern over violations against noncelebrities.

In October, a police investigation reportedly commenced over AI-generated pornographic images of female students at a New Jersey high school. The illicit images were generated and circulated by male classmates, inciting outrage from parents and concerned citizens.

In a public service announcement released last year, the FBI highlighted the atrocious trend of manipulating web-sourced images — or those requested from victims — into sexually themed deepfakes. It also acknowledged the "significant challenges" in removing deepfake content from the web. 

Sen. Ted Cruz and his colleagues answered this challenge with the Take It Down Act.  

Websites containing user-generated content would be required to have a notification and takedown process, ensuring that victims of NCII are afforded a straightforward process for illicit content removal.

With enforcement by the Federal Trade Commission, the act requires websites, including social media, to have procedures for removing NCII within 48 hours of a victim's request. 

The act also places the onus on web platforms to ensure that content on their sites does not contain NCII.

"Social media platforms and those that distribute revenge porn need to be held accountable. Our bill will make sure that even computer-generated deep fakes will not be allowed to stay online. I am proud to stand with my colleagues to help stop this sickening practice that has become far too common," Sen. Shelley Capito, R-W.Va., reportedly said in her official statement.

In addition to criminalizing the publication of NCII in interstate commerce, the act makes it unlawful to knowingly publish NCII to the web, including deepfake images and videos of real people. It also states that a victim's consent to create authentic images does not constitute consent to publish them. 

"In recent years, we've witnessed a stunning increase in exploitative sexual material online, largely due to bad actors taking advantage of newer technologies like generative artificial intelligence. Many women and girls are forever harmed by these crimes, having to live with being victimized again and again," Sen. Cruz is quoted as saying in a press release regarding the act.

"By creating a level playing field at the federal level and putting the responsibility on websites to have in place procedures to remove these images, our bill will protect and empower all victims of this heinous crime," Sen. Cruz reportedly added.

In a published statement, Sen. Ted Budd, R-N.C., is quoted as saying, "The shocking growth in online sexual exploitation and blackmailing requires a national response.

"The Take it Down Act builds on existing federal law, accounts for the growth in technologies that make it easier to create fake images, and establishes a requirement for websites to respond to victims and take down explicit material. I am proud to join my colleagues in the bipartisan effort to protect Americans from this growing crime and to bring those who perpetrate it to justice."

Sen. Todd Young, R-Ind., reportedly referred to the Act as "a sensible step to protect Americans." 

"We are increasingly seeing instances where generative AI is used to create exploitative images of an individual based on a clothed image. This bipartisan bill builds on existing federal law to protect Americans, particularly young women, from harmful deepfakes and establishes a requirement for websites to take down this type of explicit and disturbing material." 

When responding to the case involving the New Jersey high school girls victimized by deepfaked NCII, Shelley Brindle, Westfield, New Jersey's first female mayor, reportedly said, "To be in a situation where you see young girls traumatized at a vulnerable stage of their lives is hard to witness."

By introducing this legislation, Sen. Cruz and his colleagues acknowledge this trauma and the profound psychological, emotional, and reputational harm caused by the malicious use of deepfake technology. They are also safeguarding against it.

This is much-needed legal protection against NCII and can't be signed into law soon enough.

V. Venesulia Carr is a former United States Marine, CEO of Vicar Group, LLC, and host of "Down to Business with V.," a television show focused on cyberawareness and cybersafety. She is a speaker, consultant and news commentator providing insight on technology, cybersecurity, fraud mitigation, national security and military affairs. Read more of her reports here.

© 2024 Newsmax. All rights reserved.

