- cross-posted to:
- technology@beehaw.org
Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.
While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has taken down several of these ads previously, many ads that explicitly invited users to create nudes, and some of the accounts buying them, were still up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.
My body is not inherently for your sexual stimulation. Downloading my picture does not give you the right to turn it into porn.
Did you miss what this post is about? In this scenario it’s literally not your body.
There is nothing stopping anyone from using it on my body. Seriously, get a fucking grip.
Do you have nudes out there? Because if not, then yes, that would stop people. The AI can’t magically reveal what’s actually under your clothes.
You’ve lost this argument when you don’t even know what the argument is about.
The problem isn’t what’s actually under your clothes, because only you or people you choose would know that; the problem is that it appears to be what’s actually under your clothes. What do you think people should do, say “That’s not what I actually look like naked, this is what I actually look like naked” or something?
Literally yes. As this becomes a more prevalent, widespread issue, eventually we’re going to reach a point where seeing a nude of someone is effectively meaningless, as it’s just as likely that it’s fake as it is real.
This is just a transitional phase. It’s going to be rough for sure, especially with how puritan and judgmental our culture is, but my point stands.
Scroll back up.
Entirely screw off with this gaslighting BS.
That’s their pitch, it’s their way of advertising to people. The AI isn’t literally psychic. All the AI is doing is guessing, using a database of thousands, tens of thousands, hundreds of thousands of naked bodies, and trying to fill in the blanks based on what it thinks yours probably looks like.
The AI isn’t magic, it doesn’t have the ability to somehow reveal what you look like without knowing. It’s the equivalent of really good photoshop effectively.
You haven’t explained anything I didn’t already know. Of course it’s not psychic nor 100% accurate to real life.
But this is flat-out bullshit and wildly disingenuous. You’re entirely ignoring the fact that when it’s posted, no one but the creator of the deepfake knows it’s fake. Everyone else just sees a nude. You are playing semantics while ignoring the actual harm.
At this point, I just see you as a troll who isn’t interested in any sort of good faith discussion.
I’m not trying to be disingenuous, it genuinely sounded like you didn’t realize, that’s my bad.
Imo as these become widespread, we’ll inevitably reach a point where nudes simply don’t matter. If anyone can create a nude of someone else with next to no effort in seconds, then a nude getting “leaked” would have next to no impact or relevance.
Right now we live in a pretty puritan society, so the transitional phase is going to suck and people are going to be hurt. Obviously that’s awful, and none of this should take away from that fact, I feel horrible for the people negatively impacted by this. And while that’s all true, it’s also true that as we continue going down this road we’ll reach a point where it simply won’t matter anymore.
You get to tell me what I can and cannot think about in my own head?
WTF?
There is a huge ass difference between your personal thoughts and using a subject’s social media, a database of existing nudes, and AI to have REAL MEDIA produced.
Seriously, not even remotely similar, and it’s frankly disturbing that this is even your thought process.
If thought crime is a thing, I’m out.
That is crossing the Rubicon.
There is no harm done to you or your body with an AI-generated image or video.
Blackmail and extortion are crimes of their own, as are rape and sexual assault.
But thinking about something and using tools to visualize it are not crimes.
Maybe society overreacts to nudity. Maybe society’s attitude to sex needs to change. Maybe oppression and regulation of sex have been a major form of control over society and oppression of certain groups.
People are too concerned with their own junk to see the actual issue.
lmao
It’s not thoughtcrime you giant crybaby.
This is SUCH a huge leap. You have a right to your thoughts, not to databases, programs, and services that generate media.
Stay classy, fascist.
You want to limit what data people can collect and share, what programs they can write.
Why stop there?
Prohibit what paintings they can make. What drawings can be drawn. What words can be written.
Fascist.
LMAO!
Picture me openly laughing right in your damn face. This is such a hilarious use of the word that I can’t even begin to take it seriously, and it’s a tragedy that YOU are taking it seriously. I’m sorry you feel so entitled to other people’s bodies and likeness. That is honestly very sad for you.
I’m saying it’s wrong to do this because of the harm it causes, not calling for outlawing and executing those who write the code. But that distinction is lost on you, because “fascism” is anything you don’t like. Absolutely hilarious.
(P.S. I am screenshotting this and showing it to as many people as I can.)
Can U haz chzbrgr?
If the people you show this to have any intelligence, or understanding of the Bill of Rights, they will laugh in your face.