PHILADELPHIA — A South Jersey woman turned to our Investigative Team after she said she was the target of a malicious online attack.
She said the perpetrator created sexually explicit images of her by using artificial intelligence, or AI, and then distributed the photos to friends.
She shared her story as a warning to others.
“I wanted to know, like, who was making them? Like, obviously, what was the source? What was the reason?” said Alyssa Rosa.
Rosa said she learned of the pornographic images with her likeness after a woman contacted her on social media.
She said the woman told her she found them on her boyfriend’s phone, and she tracked down Rosa to make her aware.
“That kind of content never existed of me before, and now it does, and it’s completely without my consent,” said Rosa. “I was mad. I was mad.”
Rosa said she channeled that anger into action.
She learned the images were likely being created by a man she’d befriended on a social dating app — the same man, she told our Investigative Team, who also had access to her Facebook photos.
“He would comment on my photos, like, say, ‘thank you,’ ‘you’re so beautiful.’ Comment on photos with my son, like, you know, ‘he’s so handsome.’”
Most worrisome to Rosa is that she hasn’t seen most of the content allegedly out there, since the woman agreed to share only a portion of the photos and text messages with her.
But Rosa said those sexually explicit images were likely created by altering her real photos.
“One thing that really stuck with me when in the screenshot she sent me is that he said, ‘I made so many clips of what that ***** would do.’ Like, it’s disgusting. Like, how dare you?”
U.S. Representative Madeleine Dean of Pennsylvania has introduced the bipartisan NO FAKES Act, which would help protect victims of “deepfakes.”
“AI is moving so fast, sometimes for very good outcomes and sometimes for very tragic. We have to put the guardrails in place,” said Dean. “It gives a property right to you and to me, to our voice and likeness.”
Deepfakes circulating on the web are increasing quickly. Dean said laws are needed to punish those who create or distribute them.
Two other pieces of bipartisan legislation moving through Congress would also require images to be removed and give law enforcement more teeth when going after the creators.
“The SHIELD Act creates a new criminal offense for someone who knowingly mails or distributes intimate visual depictions,” added Dean.
The legislative push comes after recent high-profile cases, including one earlier this year involving pop star Taylor Swift, in which AI-generated pornographic images were distributed on X and went viral.
Rosa said she feels violated. She hopes the person who did this destroys the content, and she hopes laws are put into place to protect victims.
“That’s just way too much power for someone to have access to my likeness and do whatever they want,” she added.
A new Pennsylvania law taking effect later this month will make it illegal to create this kind of AI-generated sexually explicit material. House Bill 125, which just passed, prohibits AI from being used to generate child sexual abuse images.
And last week, the United States Senate passed the Take It Down Act, which would force social media companies to remove sexually exploitative images, including deepfakes, of any person within 48 hours of being notified by a victim. That bill still needs to pass the House of Representatives.
Copyright © 2024 WPVI-TV. All Rights Reserved.