[–] [email protected] 9 points 1 year ago* (last edited 1 year ago) (2 children)

> No such thing as neutral space

> it may not be intentional, but

> They can suggest similar [communities] so it can't be neutral

My guy, what? If all you did was look at cat pictures, you'd get suggested communities for sharing fucking cat pictures. These sites aren't to blame for "radicalizing" people into sharing cat pictures any more than they're to blame for pushing people toward actually harmful communities. By your logic, Lemmy can also radicalize people. I see anarchist bullshit all the time; I had to block those communities and curate my own experience. I took responsibility: instead of engaging with every post that pissed me off, I removed that content or avoided it. Should the instance I'm on be held responsible for not defederating radical instances? Should those communities be made to pay for radicalizing others?

Fuck no. People are not victims because of the content they're exposed to; they choose to let themselves become radical. This isn't an "I woke up and I really think Hitler had a point" situation; it's a gradual decline that isn't going to be fixed by censoring or obscuring extreme content. Companies already try to deal with the most flagrant forms of it, but holding them to account for all of it is truly and completely stupid.

Nobody should be held responsible because cat pictures radicalized you into becoming a furry. That's on you. The content changed you; the platform that suggested it isn't malicious, nor should it be held to account for that.
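To make the mechanism both comments are arguing over concrete, here is a minimal sketch of an engagement-weighted "suggest similar content" loop. It assumes a toy model where each item carries a topic tag and the ranker simply counts past engagement per topic; every name and data point is a hypothetical illustration, not any platform's actual algorithm. The point both sides circle is visible in the ranking step: the ranker boosts whatever the user already engages with, whether that is cat pictures or extremism.

```python
from collections import Counter

def recommend(engagement_history, candidates, k=3):
    """Rank candidate items by how often the user engaged with their topic."""
    topic_weights = Counter(item["topic"] for item in engagement_history)
    # The ranker never looks at what a topic *is*; it only sees engagement
    # counts, so it amplifies cat pictures and extremist content by the same rule.
    ranked = sorted(candidates, key=lambda c: topic_weights[c["topic"]], reverse=True)
    return ranked[:k]

# Hypothetical user history and candidate pool for illustration only.
history = [{"topic": "cats"}, {"topic": "cats"}, {"topic": "politics"}]
candidates = [
    {"id": 1, "topic": "cats"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "gardening"},
]
print(recommend(history, candidates))
# -> cat community ranked first, purely from past engagement counts
```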

[–] [email protected] 9 points 1 year ago (1 children)

This is an extremely childish way of looking at the world, IT infrastructure, social media content algorithms, and legal culpability.

[–] [email protected] 5 points 1 year ago (1 children)

As neutral platforms that will push far-right extremism just as readily as cat pictures, where the only difference is how much the user personally engages with it?

Whatever you say, CopHater69. You're definitely not extremely childish and radical.

[–] [email protected] 11 points 1 year ago (1 children)

Oh I'm most certainly a radical, but I understand what that means because I got a college degree, and now engineer the internet.

[–] [email protected] 2 points 1 year ago (1 children)

I doubt you could engineer a plug into your own asshole, but sure, I'll take your word that you're not just lying and actually do have expert knowledge in this field, even though you still refused to engage with the point and chose to sling insults instead.

[–] [email protected] 8 points 1 year ago (1 children)
[–] [email protected] 2 points 1 year ago

Always something about radicals and their need to point out "Ur triggered"

[–] herpaderp 7 points 1 year ago* (last edited 1 year ago)

I’ve literally watched friends of mine descend into far-right thinking, and I can point to the moment the algorithms started suggesting content that put them down a “rabbit hole.”

Like, you’re not wrong that they were right-wing initially, but over a period of six months or so they became the “lmao I’m an unironic fascist and you should be pilled like me” variety. They started stockpiling guns and so on.

This phenomenon is so commonly reported that it makes you start to wonder how all these people who supposedly decided to “radicalize themselves” came out in droves, seemingly all at once.

Additionally, these companies are responsible for their content-serving algorithms. If those algorithms didn’t matter for shaping users’ thinking, why would nation-state propaganda efforts work so hard to get their narratives and interests surfaced by them? Did we forget how these platforms helped spawn the Arab Spring and the fallout that followed?