This post was submitted on 29 Jul 2024
188 points (100.0% liked)

No Stupid Questions

39834 readers
1899 users here now

No such thing. Ask away!

!nostupidquestions is a community dedicated to being helpful and answering each other's questions on various topics.

The rules for posting and commenting, besides the rules defined here for lemmy.world, are as follows:



Rule 1- All posts must be legitimate questions. All post titles must include a question.

All posts must be legitimate questions, and all post titles must include a question. Joke or trolling questions, memes, song lyrics as titles, etc. are not allowed here. See Rule 6 for all exceptions.



Rule 2- Your question subject cannot be illegal or NSFW material.

Your question subject cannot be illegal or NSFW material. You will be warned first, banned second.



Rule 3- Do not seek mental, medical, or professional help here.

Do not seek mental, medical, or professional help here. Breaking this rule will not get you or your post removed, but it will put you at risk, and possibly in danger.



Rule 4- No self promotion or upvote-farming of any kind.

That's it.



Rule 5- No baiting or sealioning or promoting an agenda.

Questions which, instead of being of an innocuous nature, are specifically intended (based on reports and in the opinion of our crack moderation team) to bait users into ideological wars on charged political topics will be removed and the authors warned - or banned - depending on severity.



Rule 6- Regarding META posts and joke questions.

Provided it is about the community itself, you may post non-question posts using the [META] tag on your post title.

On Fridays, you are allowed to post meme and troll questions, on the condition that they are in text format only and conform with our other rules. These posts MUST include the [NSQ Friday] tag in their title.

If you post a serious question on a Friday and are looking only for legitimate answers, then please include the [Serious] tag in your post title. Irrelevant replies will then be removed by moderators.



Rule 7- You can't intentionally annoy, mock, or harass other members.

If you intentionally annoy, mock, harass, or discriminate against any individual member, you will be removed.

Likewise, if you are a member, sympathiser, or supporter of a movement that is known to largely hate, mock, discriminate against, and/or want to take the lives of a group of people, and you were provably vocal about your hate, then you will be banned on sight.



Rule 8- All comments should try to stay relevant to their parent content.



Rule 9- Reposts from other platforms are not allowed.

Let everyone have their own content.



Rule 10- Most bots aren't allowed to participate here. This includes using AI responses and summaries.



Credits

Our breathtaking icon was bestowed upon us by @Cevilia!

The greatest banner of all time: by @TheOneWithTheHair!

founded 2 years ago

Thanks ahead of time for your feedback

top 50 comments
[–] [email protected] 155 points 8 months ago (6 children)

I think it's 'barrier to entry'.

Photoshop took skills that not everyone has/had, keeping the volume low.

These new generators require zero skill or technical ability, so anyone can do it.

[–] [email protected] 21 points 8 months ago* (last edited 8 months ago) (2 children)

When Photoshop first appeared, image manipulations that would seem obvious and amateurish by today’s standards were considered very convincing—the level of skill needed to fool large numbers of people didn’t increase until people became more familiar with the technology and more vigilant at spotting it. I suspect the same process will play out with AI images—in a few years people will be much more experienced at detecting them, and making a convincing fake will take as much effort as it now does in Photoshop.

[–] [email protected] 16 points 8 months ago

Nope, the AI will continue to get better, and soon spotting the fakes will be nearly impossible.

[–] [email protected] 11 points 8 months ago (2 children)

Have you tried to get consistent, goal-oriented results from these AI tools?

To reliably generate a person you need to configure many components, fiddle with the prompts and constantly tweak.

To do this well in my eyes is a fair bit harder than learning how to use the magic wand in Photoshop.

[–] [email protected] 27 points 8 months ago (1 children)

I mean, inpainting isn't particularly hard to make use of. There are also tools specifically for the purpose of generating "deepfake" nudes. The barrier for entry is much, much lower.

[–] [email protected] 10 points 8 months ago

You could also just find the prompts online and paste them in.

[–] [email protected] 7 points 8 months ago (2 children)

Ehhhh, I like to think that eventually society will adapt to this. When everyone has nudes, nobody has nudes.

[–] [email protected] 14 points 8 months ago

Unfortunately, I doubt it will be everyone. It will primarily be young women, because we hyper-sexualize those...

[–] [email protected] 7 points 8 months ago (1 children)

You might think so, but I don't hold as much hope.

Not with the rise of holier-than-thou moral crusaders who try to slut-shame anyone who shows any amount of skin.

[–] [email protected] 5 points 8 months ago

I like to be optimistic: eventually such crusaders will have these tools turned against them, and that will be that. Even they will begin doubting whether any nudes are real.

Still, I'm not so naive that I think it can't turn out any other way. They might just do that thing they do with abortions, the line of reasoning that goes "the only acceptable abortion is my abortion", now changed to "the only fake nudes are my nudes".

[–] [email protected] 47 points 8 months ago (3 children)

It's a bit of a blend: it has always been a big deal, and it is indeed even more of a big deal now because of how easy, accessible, and believable AI can be. Even nowadays, Photoshop hits only one point of that triangle. And it was even less capable back in the day; it could hit half of one of those points at any given time.

Basically, a nude generated by a good AI has to be proven false. Because it doesn't always immediately seem as such at first. If you have seen obvious AI fakes, they are just that, obvious. There are many non-obvious ones that you might have seen and not known they were fake. That is, of course, assuming you have looked.

The other reason it can be more of a big deal now is that kids have been doing it to other kids. And since the results can be believable, the parents didn't know they were fake to start with, so it would blow up as if it were real before anyone found out it was AI. And anything involving that is going to be a big deal.

[–] [email protected] 35 points 8 months ago (2 children)

Because previously, if someone had the skills to get rich off making convincing fake nudes, we could arrest and punish them; people with similar skill sets would usually prefer more legitimate work.

Now some ass in his basement can crank them out and it's a futile game of whack-a-mole to kill them dead.

[–] [email protected] 17 points 8 months ago* (last edited 8 months ago) (1 children)

it’s a futile game of whack-a-mole

It's still going to be futile even with this law in place. Society is going to have to get used to the fact that photo-realistic images aren't evidence of anything (especially since the technology will keep improving).

[–] [email protected] 13 points 8 months ago (1 children)

It blows my mind when I think about where we might be headed with this tech. We've gotten SO used to the ability to communicate instantly with people far away in the technology age; how will we adapt when we have to go back 300 years and can only trust something someone tells us in person? Will we go back to local newspapers? Or can we not even trust that? Will we have public amphitheaters in busy parts of town, where people gather around for the news, and we can only trust those people who have a direct chain of acquaintance all the way back to the source of the information? That seems extreme, but I dunno.

I think most likely we won't implement extreme measures like that, to ensure we're still getting genuine information. I think most likely we'll just slip into completely generated false news from every source, no longer have any idea what's really going on, but be convinced this AI thing was overblown, and have no idea we're being controlled.

[–] [email protected] 8 points 8 months ago (1 children)

I don't think it will be quite that bad. Society worked before photography was invented and now we have cryptographic ways to make sure you're really talking to the person you think you're talking to.
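The cryptographic idea in that last point can be sketched in a few lines. This is a minimal illustration using a shared secret and HMAC from Python's standard library; the `SECRET` value and function names are hypothetical, and real identity verification typically uses public-key signatures rather than a pre-shared key, but the principle is the same: a message tag that only the right party could have produced, and that any tampering invalidates.

```python
import hashlib
import hmac

# Hypothetical pre-shared key, e.g. exchanged in person.
SECRET = b"shared-secret-established-in-person"

def sign(message: bytes) -> str:
    # Sender derives an authentication tag from the secret and the message.
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    # Receiver recomputes the tag; constant-time compare resists timing attacks.
    return hmac.compare_digest(sign(message), tag)

msg = b"meet at noon"
tag = sign(msg)
print(verify(msg, tag))            # a genuine message verifies
print(verify(b"meet at 1pm", tag)) # a tampered message does not
```

The same shape, with asymmetric keys, is what lets you trust that a website, a signed email, or eventually a signed photo really came from who it claims to.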

[–] [email protected] 32 points 8 months ago

Well, I think everyone has already covered that it was a big deal at the time; it simply wasn't something we could wipe out as a society.

And it's still a big deal.

However, I don't think anyone has touched on why fake nudes, even ones that are obviously fake or even labeled as fake by the creator, are a problem.

It comes back to the entire idea of consent. That's for anyone, but women in particular are heavily sexualized, even well before they're women. There is a constant, unending pressure on women of knowing that they are going to be sexually objectified. It might not be every day, by everyone alive around them, but it is inescapable.

One can debate whether or not nudity should be a big deal, or whether it is only sexualized because of the rules a given culture has around nudity, but the hard truth is that nudity is sexualized. Ergo, images of a woman's body are something she deserves to have control over access to. If someone consents to images being available, great! If they don't, then there's a problem.

Fakes, even obvious and declared fakes, violate that barrier of body autonomy. They directly ignore the person's wishes regarding their naked body.

The better the fake, the worse that violation is, because (as others said), once a fake is good enough, the subject of the fake is put in the position of having to deny it's them. They shouldn't have to ever be in that position, no matter who it is.

Even a porn performer should have the ability to be free of fakes because they didn't consent to those fakes. They also have a very valid claim on it infringing on their income as well. Now, I'm certain that legal fakes will someday be a thing. There will be contracts for likeness rights to produce fake porn. Bet on it. If I had free income, I would immediately invest in such an endeavor because I guarantee it will make money.

But, as things stand, fakes are no better than someone taking a picture through a window shade, or using infrared to sneak by clothing. It's digital, and it's fake, but it is the direct equivalent of violating someone's privacy and body autonomy.

That's why it's a big deal to begin with.

And, yeah, it is something that's here to stay; it's unavoidable. And someone is bound to comment that they wouldn't care. Great, good for you. That doesn't obligate others not to care. But put it to the test: provide a few pictures of yourself in your comment so that someone can make a fake nude of you, then plaster it online with zero context, labeled with at least your user name so everyone running across it can direct responses to you.

It's all about personal privacy, consent, and body autonomy.

[–] [email protected] 32 points 8 months ago* (last edited 8 months ago) (1 children)

Because now, anyone can do it to anyone with zero effort and a single photo.

Sure, before, anyone with decent Photoshop skills could put together a halfway convincing fake nude, but it still took significantly more effort and time than most would be bothered with, and even then it was fairly easy to spot and dispute a fake.

Most people weren't concerned if a celebrity's fake nudes were spread around before, but now that a colleague, student, teacher, family member, or even a random member of the public could generate a convincing photo, the threat has become far more real and far more conceivable.

[–] [email protected] 6 points 8 months ago (1 children)

To be fair, Photoshop has made tasks like this incredibly simple. With a "good" photo, the process is much less esoteric now than it once was.

[–] [email protected] 8 points 8 months ago
  • it still takes time/knowledge and isn’t automated

  • it can’t be turned into an unending assembly line where one 16 year old with basic computer literacy can pump out thousands a day if they want

[–] [email protected] 28 points 8 months ago (2 children)

If AI is so convincing, why would anyone care about nudes being controversial anymore? You can just assume it's always fake. If everything is fake, why would anyone care?

[–] [email protected] 11 points 8 months ago* (last edited 8 months ago) (4 children)

You’re right. I’m going to go make convincing images of your partner/sibling/parents/kids/etc. and just share them here since no one should care.

In fact, I’ll share them on other sites as well.

[–] [email protected] 9 points 8 months ago

Specifically because it's convincing. You may just assume everything is fake, that doesn't mean everyone will. You may not care about seeing someone's imperceptibly realistic nude, but if it's depicting them they may care, and they deserve the right for people not to see them like that.

Just because it's not logistically feasible to prevent convincing AI nudes from spreading around doesn't make it ethical.

[–] [email protected] 28 points 8 months ago (12 children)

I have a similar opinion. People have been forging/editing photographs and movies for as long as the techniques have existed.

Now any stupid kid can do it; the hard part with AI is actually not getting porn. Maybe it can teach everyone that fake photos are a thing, and make nudes worthless (what's the point of a nude anyway? Genitals look like... genitals).

[–] [email protected] 8 points 8 months ago* (last edited 8 months ago)

Imagine this happening to your kid(s) or SO. Imagine this happening to you as a hormonal, confused 15-year-old.

Would you tell them “this doesn’t really matter”?

Kids have killed themselves because they feel ostracized by social media, by the act of just not being included or not having a “great life” like they think they see everyone else having. You think they’re simply going to adapt to a world where their classmates (and complete strangers) can torture them with convincing nude images?

This is the bullying equivalent of a nuclear weapon, IMO.

[–] [email protected] 24 points 8 months ago

It was always a big deal. But back then it was often pretty obvious when it was a fake. It's getting harder and harder to tell.

[–] [email protected] 22 points 8 months ago

It's always been a big deal; it just died down as Photoshop became normalized and people grew accustomed to it as a tool.

[–] [email protected] 15 points 8 months ago
  • easy for anyone to do it

  • easy to do it at scale

[–] [email protected] 15 points 8 months ago* (last edited 8 months ago) (1 children)

How do you prove it's not you in either case? Photoshop doesn't make a whole video of you fucking a sheep, but AI can, and it is actively being used that way. With Photoshop it was a matter of getting ahold of the file and inspecting it. Even the best Photoshop jobs have some key tells: artifacting, layering, all kinds of shading and lighting issues, how big the file is, etc.

I want to add something. What if all of a sudden it's your 12-year-old daughter being portrayed in this fake? What if it's your mom? It would have been a big deal to you to have that image out there of your loved one back in the '90s or early 2000s. It's the same kind of big deal now, but more widespread because it's so easy. It's not okay to use the image of someone in ways they didn't consent to. I have a similar issue with facial recognition, regardless of the fact that it's used in public places where I have no control over it.

[–] [email protected] 11 points 8 months ago

Doctored photos have always been a problem and, legally speaking, could lead to the faker being sued for defamation, depending on what was done with the person's image.

AI Photos are only part of the problem. Faking the voice is also possible, as is making "good enough" videos where you just change the head of the actual performer.

Another part of the problem is that this kind of stuff spreads like wildfire within groups (and it's ALWAYS groups where the victim is) and any voices stating that it's fake will be drowned by everyone else.

[–] [email protected] 10 points 8 months ago* (last edited 8 months ago) (3 children)

AI is much better. Photoshop was always a little off with size, angle, lighting, etc. Very easy to spot fakes.

[–] [email protected] 9 points 8 months ago* (last edited 8 months ago) (1 children)

I got a few comments pointing this out. But media is hell-bent on convincing people to hate AI tools and advancements. Why? I don't know.

The tin-foil-hat take is that it can be an equalizer. Powerful people who own media like to keep powerful tools to themselves and want regular folk to fear them and regulate ourselves out of using them.

Like, could you imagine if common folk rode dragons in GOT? Absolutely disgusting. People need to fear them, and only certain people can use them.

Same idea. If you're skeptical, go look up all the headlines about AI in the past year and compare them to right wing media's headlines about immigration. They're practically identical.

"Think of the women and children."

"They're TAKING OUR JOBS"

"Lot of turds showing up on beaches lately"

"What if they kill us"

"THEY'RE STEALING OUR RESOURCES"

[–] [email protected] 7 points 8 months ago (2 children)

It's only a big deal because of puritan societies like the US or UK in the first place. There are societies where nudity is not a big deal anyway, so nude photos of someone are also no big deal.

Look at Germany for example. Lots of FKK (nude) areas. No one really bats an eye. Of course there nudity is also not perfectly normalized, especially in some circles. But still, not many are concerned about nude pictures of themselves.

Obviously AI makes more nude pictures faster than someone skilled at Photoshop. So if your society has a problem with nudity, there will be more problems than before.

But really, there shouldn't be a problem anyway.

[–] [email protected] 13 points 8 months ago (1 children)

Look at Germany for example. Lots of FKK (nude) areas. No one really bats an eye.

We still don't appreciate our nudes posted online, fake or not.

[–] [email protected] 8 points 8 months ago* (last edited 8 months ago)

Of course there nudity is also not perfectly normalized, especially in some circles.

Also obviously because of privacy reasons people don't like their pictures posted online, nude or not.

[–] [email protected] 5 points 8 months ago (1 children)

They're not even actual nudes - they're fake. It seems to me to be no different than putting someone's head on a Big Bird photo.

That said, nobody gets to decide what's offensive to other people. Just do as they ask and don't do the fake nudes. It's not like there's a shortage of porn done by willing adults.

[–] [email protected] 6 points 8 months ago

In addition to the reduced skill barrier mentioned, the other side effect is the reduced time spent finding a matching photo and actually doing the work. Anyone can create it in their spare time, quickly and easily.
