this post was submitted on 27 Jan 2024
280 points (100.0% liked)

Not The Onion


cross-posted from: https://mbin.grits.dev/m/mews/t/22301

White House calls for legislation to stop Taylor Swift AI fakes

[–] [email protected] 163 points 1 year ago (5 children)

Nobody cares until someone rich is impacted. Revenge porn has been circulating on platforms unchecked for many years, but the second it happens to a major celebrity, suddenly there's a rush to do something about it.

[–] [email protected] 89 points 1 year ago

What?

This isn't revenge porn, it's fakes of celebrities.

That's something that's been done for decades, and it was one of the biggest parts of early Reddit. So it's not "the second" either.

The only thing that's changed is that people are now generating it with AI.

The ones made without AI (which have been around for decades) are a lot more realistic and a lot more explicit. They just take skill and time, which is why people were only making them for celebrities.

The danger of AI is that any random person could take some pictures off social media and make explicit images. The technology isn't there yet, but it won't take much longer.

[–] [email protected] 22 points 1 year ago

I think it's more about the abject danger that unregulated AI replication of noteworthy figures poses to basically everything.

Also, revenge porn is illegal in, I think, every state but South Carolina, and even there it may have been banned since I last saw that stat.

[–] [email protected] 10 points 1 year ago

What a braindead take. Both the US and many other countries have enacted AI safety and regulation rules; this is an extension of that effort. The idea is to set a precedent for this kind of behavior. They're also looking into how AI is being used for election interference, like having an AI Biden tell people not to vote.

Everybody cares; just because it's not all in place on day 0 doesn't mean nobody does.

[–] [email protected] 66 points 1 year ago* (last edited 1 year ago) (3 children)

This wasn't a problem until the rich white girl got it. Now we must do... something. Let's try panic!

-The White House, probably.

[–] [email protected] 27 points 1 year ago (2 children)

Honestly, I kind of don't even care. If that's what it takes to get people to realize that it's a serious problem, cool. I mean, it's aggravating, but at least now something might actually happen that helps protect people who aren't megastars.

[–] [email protected] 16 points 1 year ago (6 children)

You must be new to capitalism, lol

[–] [email protected] 34 points 1 year ago (2 children)

This is what the White House is concerned about? Fuck them. There's so much worse going on in America, but oh no, one person has AI fake porn images, heaven forbid!

[–] [email protected] 23 points 1 year ago (1 children)

The White House is capable of having a position on more than one issue at a time. There also doesn't seem to be a particular bill they are touting, so this seems to be more of a "This is messed up. Congress should do something about it" situation than a "We're dropping everything to deal with this" one.

[–] [email protected] 9 points 1 year ago

The White House is capable of having a position on more than one issue at a time.

Doubt.

[–] [email protected] 33 points 1 year ago* (last edited 1 year ago) (1 children)

U.S. government be like:

Thousands of deep fakes of poor people: I sleep.

Some deep fakes of some privileged Hollywood elite: R E A L S H I T.

[–] [email protected] 33 points 1 year ago (3 children)

Do you want more AI gens of nude Taylor Swift? Because that's how you get more AI gens of nude Taylor Swift.

[–] [email protected] 28 points 1 year ago (2 children)

Surely this should be a priority.

[–] [email protected] 24 points 1 year ago* (last edited 1 year ago) (1 children)

Well, it's not really just about Swift. There are probably many other people going through this. Not every person who generates nudes of someone else is going to make the news, after all.

I could see this being a problem in high schools as really mean pranks. That is not good. There are a million other ways I could see fake nudes being used against someone.

If someone spread pictures of me naked, I would 1. be flattered and 2. really ask why anyone wants to see me naked in the first place.

If anything, just an extension of existing slander(?) laws would work. It's going to be extremely hard to enforce any law though, so there is that.

However, how long have revenge porn laws been a thing? Were they ever really a thing?

[–] [email protected] 19 points 1 year ago* (last edited 1 year ago)

I remember a headline from a few weeks back; this is already happening in schools. It's really not about Swift.

[–] [email protected] 26 points 1 year ago (4 children)

This will be interesting.

How do you write legislation that stops AI nudes but not photoshopping or art? I am not at all sure it can be done. And even if it can, will it withstand a free speech challenge in court?

[–] [email protected] 25 points 1 year ago (3 children)

Wait.. They want to stop only Taylor Swift AI fakes? Not every AI fake representing a real person???

[–] [email protected] 16 points 1 year ago

Y'all need to read the article and stop rage baiting. It's literally a click away.

"Legislation needs to be passed to protect people from fake sexual images generated by AI, the White House said this afternoon."

[–] [email protected] 16 points 1 year ago (1 children)

Only AI fakes of billionaires. They're just admitting that there's a two tiered legal system, and if you're below a certain "value," you will not be protected.

[–] [email protected] 8 points 1 year ago

If the value level is Taylor Swift we're all doomed

[–] [email protected] 24 points 1 year ago

Taylor is just trying to distract us from her jet emissions again, just like her new PR relationship with that Kelce guy was almost certainly meant to distract us from her dating that Matty Healy dude who openly said he enjoys porn that brutalizes black women (and also from her jet emissions).

She's not stupid. She's a billionaire who's very aware of how news cycles work.

[–] [email protected] 23 points 1 year ago (29 children)

I'd much rather we do nothing and let it proliferate to the point where nobody trusts nudes at all anymore.

[–] [email protected] 21 points 1 year ago (13 children)

The number of people in this thread who don't see this as a problem is disturbing.

[–] [email protected] 12 points 1 year ago* (last edited 1 year ago) (2 children)

I think the main issue many people have is that these fakes have been around for a while, but only NOW is there a call for legislation, when the victim is a billionaire.

Of course it's a problem, and I've said before that this needs to be discussed at a legislative level, but even I'm rolling my eyes that it took a literal billionaire being exposed to it to have any impact.

[–] [email protected] 19 points 1 year ago (2 children)

Is the law going to explicitly protect her and no one else?

[–] [email protected] 12 points 1 year ago

It’s called setting a precedent bud.

[–] [email protected] 8 points 1 year ago (1 children)

Click on the link, read the article, and answer your own question.

[–] jaschen 16 points 1 year ago (21 children)

It's a victimless crime. I mean, making it illegal doesn't stop people from doing it.

I think once it gets to a point that nobody trusts AI porn, like how we don't trust photoshopped porn, then nobody would care anymore.

[–] [email protected] 13 points 1 year ago (1 children)

I hope you are joking. The target here is very much a victim. I don't want people fucking up my reputation just 'cause.

[–] jaschen 15 points 1 year ago (2 children)

So is someone photoshopping your face onto another person also illegal? What if someone talks with their friends in private about you doing a hypothetical sex act? Would that be illegal?

In all three instances, your reputation is fucked up. But which ones are illegal? Where does it end?

[–] [email protected] 14 points 1 year ago

Oh look, we've got this generation's moral panic figured out.

[–] [email protected] 11 points 1 year ago (16 children)

Dear world, please stop making fakes kthx.

We need to change society so people don't want to make celebrity deep fakes.

[–] [email protected] 10 points 1 year ago (3 children)

What do you propose? Keep in mind, we can't even change society so people don't constantly try to kill each other over nothing.

[–] [email protected] 8 points 1 year ago

Man, I hope this doesn't distract Biden from his important work pretending to dislike the genocide in Palestine that he is materially supporting!
