In 2023, more deepfake abuse videos were shared than in every other year in history combined, according to an analysis by independent researcher Genevieve Oh. What used to take skillful, tech-savvy experts hours to Photoshop can now be whipped up at a moment’s notice with the help of an app. Some deepfake websites even offer tutorials on how to create AI pornography.
What happens if we don’t get this under control? It will further blur the lines between what’s real and what’s not — as politics become more and more polarized. What will happen when voters can’t separate truth from lies? And what are the stakes? As we get closer to the presidential election, democracy itself could be at risk. And, as Ocasio-Cortez points out in our conversation, it’s about much more than imaginary images.
“It’s so important to me that people understand that this is not just a form of interpersonal violence, it’s not just about the harm that’s done to the victim,” she says about nonconsensual deepfake porn. She puts down her spoon and leans forward. “Because this technology threatens to do it at scale — this is about class subjugation. It’s a subjugation of entire people. And then when you do intersect that with abortion, when you do intersect that with debates over bodily autonomy, when you are able to actively subjugate all women in society on a scale of millions, at once digitally, it’s a direct connection [with] taking their rights away.”
Friendly reminder we’ve had photoshop for decades. Legislation can’t keep up with technology and trying to do so will almost always come at the cost of constitutional rights. Like freedom of expression.
If I want to photoshop a dick on Trump’s face, nobody should be allowed to tell me no. It’s not fucking “interpersonal violence”.
Photoshopping a dick onto Trump’s face is 100% protected expression. Producing a photoreal deepfake of him balls deep in Lindsey Graham’s ass while Mitch McConnell can be seen holding the camera in a mirror wearing a ballgag and cuck strap then posting it online either without context or trying to pass it off as real is a problem.
Oddly specific. Donnie, is that you, trying to get ahead of a video that’s about to leak?
About to leak? Somebody hasn’t been paying attention to the news.
The problem isn’t just scale, it’s ease of use. Photoshop took time and skill, and it was usually still pretty apparent that a photo had been manipulated; anyone willing to search enough could often find the original elements and debunk it. Now AI gives near-flawless editing skills to every single person, massively upping the likelihood that a complete fabrication, with unique elements untraceable to any original photo, can cause serious harm.
Remember that pope in the puffy jacket photo? It had telltale signs of AI generation, but it still fooled an insane number of people. Now make the photo abusive, and with a small amount of work, erase the AI flaws. Then release it at an opportune time for the bad actor (I would bet a lot that we’ll see some of this as the election nears. A truly groundbreaking “October surprise”). What’s that old saying? “A lie will travel halfway around the world before the truth has a chance to pull its boots on”?
You’re right that it’s hard for legislation to keep up with technology. But that’s because technology companies are insanely rich and can endlessly lobby. And we have corrupt as fuck legislators. We could keep up with technology. But the system is broken in favor of those who want zero oversight. And it breaks further every time one of them is successful. Regulating massive companies to hobble their ability to cause lasting damage should not be mentioned alongside terms like “freedom of expression.” Yes, the power to create these images is technically in the hands of the people feeding the AI the prompt, but restricting the abilities of a company to hand dangerous tools to anyone and everyone isn’t the same thing as restricting people’s right to create. I think that’s a dangerous way of thinking.
Any judge or jury wanting to blame victims will still find plenty of leeway in this law.
The same ones that put guys on the sex offender registry for peeing outdoors?
It’s too late at this point IMO, you can make AI generated porn on your PC… How exactly are they going to stop it?
“The legislation amends the Violence Against Women Act so that people can sue those who produce, distribute, or receive the deepfake pornography, if they “knew or recklessly disregarded” that the victim did not consent to those images.”–The article.
I read the article… amending a law doesn’t make the problem go away.
Maybe if more attention had been given to the politicians talking about this half a decade ago (instead of focusing on AOC, who honestly realized this issue way too late), something more meaningful could have been done.
That wasn’t the point of the article though. It isn’t either/or.
It isn’t either/or.
kind of, this is like doing a blame game for climate change but 30 years in the future.
im sure dealing with it before it became so ubiquitous would have been easier.
My issue with the topic is that everyone targets the wrong thing and just jumps on the media hysteria. They are not going to be able to stop the production or distribution of deepfakes, and imo they shouldn’t, because they’re basically just an advanced form of photo & video editing that has existed for decades, and it did not bother anyone until “AI” became a media scapegoat. What they should bother to enforce is the illegitimate use of such material, for things like blackmail, bullying, disinformation etc. Some neckbeards wanking one out to a clearly marked deepfake porn video isn’t really going to harm the person depicted. Using such a video to smear or blackmail them by claiming it is real, on the other hand, is. And this type of bullying has been going on for decades through classical photo & video manipulation, and again, it did not bother anyone until now. By focusing on this idiotic media hype instead of the real issue, we basically make sure it keeps on happening, just like with climate change.
alright you have a point, although AI made this process infinitely simpler. if you have a remotely sufficient computer and some knowledge of how to operate the tools, you can do that in a few hours of actual work.
im surprised there aren’t more deepfakes.
It’s usually quite a hassle to set those tools up, especially if you don’t have much technical knowledge. A lot of the more resource-heavy tasks are also not really possible on a home computer and require big servers with multiple GPUs and absurd amounts of VRAM, or very specific APUs, but those are still very early. The majority of what you can do at home is typically limited to generating pictures, and even there it takes quite a bit of time if you want really high-quality stuff. For a lot of more complex tasks you’re simply resource-limited. And that time is just for the actual generation process; getting good results, and getting to the point where you can, is another lengthy process that many people underestimate. It’s not a magic button, because those models are pretty damn stupid actually.
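For a rough sense of why VRAM is the bottleneck for home setups, here’s a back-of-envelope sketch. The function name and all model sizes and byte counts below are illustrative assumptions, not figures from the thread, and this counts only the weights (real usage adds activations, text encoders, etc.):

```python
# Back-of-envelope check: do a model's weights alone fit in VRAM?
# Rough rule of thumb only -- real usage needs extra memory for
# activations, schedulers, auxiliary models, and so on.

def weights_vram_gb(num_params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate VRAM needed just to hold the weights, in GiB.

    bytes_per_param: 4 for fp32, 2 for fp16/bf16, 1 for int8.
    """
    return num_params_billion * 1e9 * bytes_per_param / (1024 ** 3)

# A ~1B-parameter image model in fp16 fits easily on a consumer GPU...
small = weights_vram_gb(1.0)    # ~1.9 GiB
# ...while a ~70B-parameter model in fp16 does not
# (typical consumer cards have 8-24 GiB).
large = weights_vram_gb(70.0)   # ~130 GiB

print(f"1B fp16:  {small:.1f} GiB")
print(f"70B fp16: {large:.1f} GiB")
```

The gap between those two numbers is roughly the gap between “generating pictures at home” and the bigger tasks the comment says need multi-GPU servers.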
people can sue those who produce, distribute, or receive the deepfake pornography
So can I send someone deepfake porn and then sue them?
And they could counter sue you for distributing it to them and you’d probably lose
It has to be possession with intent to disclose (meaning to share). Uploading something to a lemmy instance should do it.
Might be better to use a proxy to send it.
The law creates a new kind of intellectual property, so one would expect the enforcement problems to be similar to copyright. However, there are some big differences.
One is that the minimum damages are 150k USD + attorney’s fees/costs. That’s going to unleash quite some entrepreneurial zeal.
To be on the hook, “possession with intent to distribute” is enough if one “recklessly disregards” that a depicted individual did not consent. E.g. if you come across nudes of some celebrity on your lemmy instance, you’d better delete them immediately; assuming the celebrity consented to the images being shared sounds like “reckless disregard” to me. If it’s just some random person, then it’s less of a problem.
This definitely will make some people quite a lot of money.
By making gay AI generated porn of republicans so they call it the devil and start a witch hunt.
If that started happening I wonder how fast the laws would change.
The real problem is that people automatically believe what they see online, no matter how ridiculous or outrageous, rather than thinking about probability and provenance and supporting evidence and all that stuff.
Unfortunately, this problem is not likely to be solved any time soon, since we’ve had more than a quarter-century now (since the advent of image editing software) to work on it. Hell, even further back than that, a certain percentage of the population could be fooled into believing in UFOs by a blurry black-and-white photograph of pie plates suspended from fishing line. We’re never gonna fix this.
The problem also goes both ways though. Not only does it create fake things but it makes it much easier to discredit real things by just claiming that it’s deep faked.
… it’s the scale.
we’ve had photograph manipulation since the photograph. we’ve not had the ease and scale we’re about to have. and it’s not the same.
anyone can open the box at the corner and mess with a traffic light. and has been able to since we had them. now give me the ability to mess with all the traffic lights in a city.
the difference is scale.
Good. If you don’t own your body and reputation, you own nothing. There is a reason we have laws in place to protect people from false accusations. And since it is entirely believable that any given person has sex, we can’t rely on a “no reasonable person would believe it’s real” exemption.
You have never owned your reputation.
And - while you sort of own your body - you have never owned depictions of your property (that someone else made with their labor).
If you are wondering what I mean by “sort of owning your body”: You are not allowed to sell it whole or in parts (ie organs). If you try to destroy or damage it, most governments will interfere. In fact, governments provide assistance to maintain that particular piece of property.
This ownership-centric view is simply dystopian.
What you can do is sue for copyright infringement (e.g. if they use IP you own in their model, like pictures from a blog) or defamation (false accusations).
But you’re right, you don’t own your likeness. I can go take a picture of anyone I want and sell it without any issues, provided they’re “in public” at the time. If I take enough to train an AI model, yeah, I could use it to make new images. But if I use those images to claim something untrue that’s also damaging, they can sue me.
I wish we had more ownership of our bodies though. Suicide should be a right (and doctor assisted suicide should be legal), consensual prostitution should be legal, etc. I’m less interested in selling organs though, just due to the completely coercive nature of it.
I wish we had more ownership of our bodies though.
Do you think this ownership view might be connected to the state of health care in the US? Me, I would balk at being asked to pay to maintain someone’s else’s property. If they can’t afford it, they should sell it. That’s not the attitude I have toward the human body, though.
No, the healthcare issues are complex and involve a lot of corruption and inertia, not beliefs around body ownership. In fact, I’d argue it’s quite the opposite, Americans in general aren’t in favor of bodily ownership, so things like doctor assisted suicide are generally restricted or outright banned. There is a lot of pearl clutching though.
My personal perspective is that as long as there’s proper consent, individuals can do what they want with their bodies. But my barrier for “proper consent” is pretty high. Something like prostitution is pretty straightforward with minimal surprises, but selling organs requires pretty in depth knowledge about long term consequences of the surgery and loss of the organ. However, both have a high risk of coercion, so there needs to be rules in place.
But the pearl clutchers just say no to anything that sounds distasteful.
Just because you own something does not mean there are no rules. I can own property that doesn’t mean I get to light it in fire, or dump chemicals on it that cause an environmental nightmare, or kill a protected animal, or fish a river flowing through it without a license, or run a meth lab on it…
You own your body and image. 5th amendment. Not having absolute unlimited power over both doesn’t change who the owner is. Frankly this type of black and white thinking is lolitarian logic.
It’s disappointing that AOC supports this capitalist law. This law is not against harassment. The DEFIANCE act creates a new kind of intellectual property.
This is the best summary I could come up with:
“It’s not a question of mental strength or fortitude — this is about neuroscience and our biology.” She tells me about scientific reports she’s read about how it’s difficult for our brains to separate visceral images on a phone from reality, even if we know they are fake.
Ocasio-Cortez is one of the most visible politicians in the country right now — and she’s a young Latina woman up for reelection in 2024, which means she’s on the front lines of a disturbing, unpredictable era of being a public figure.
She recently co-published a paper for UNESCO with research assistant Dhanya Lakshmi on ways generative AI will exacerbate what is referred to in the industry as technology-facilitated gender-based violence.
Mary Anne Franks, a legal scholar specializing in free speech and online harassment, says it’s entirely possible to craft legislation that prohibits harmful and false information without infringing on the First Amendment.
The legislation amends the Violence Against Women Act so that people can sue those who produce, distribute, or receive the deepfake pornography, if they “knew or recklessly disregarded” that the victim did not consent to those images.
“Congress is spearheading this much-needed initiative to address gaps in our legal framework against digital exploitation,” says a spokesperson for Mace, who recently introduced her own bill criminalizing deepfake porn.
The original article contains 4,747 words, the summary contains 218 words. Saved 95%. I’m a bot and I’m open source!