this post was submitted on 15 Apr 2024
453 points (100.0% liked)

Solarpunk

6592 readers

The space to discuss Solarpunk itself and Solarpunk related stuff that doesn't fit elsewhere.

founded 3 years ago

I found that idea interesting. Will we consider it the norm in the future to have a "firewall" layer between news and ourselves?

I once wrote a short story where the protagonist was receiving news of the death of a friend, but it was intercepted by their AI assistant, which said: "when you have time, there is emotional news that does not require urgent action but that you will need to digest". I feel it could become the norm.

EDIT: For context, Karpathy is a very famous deep learning researcher who just came back from a two-week break from the internet. I don't think he's talking about politics there, but it applies quite a bit.

EDIT2: I find it interesting that many reactions here are (IMO) missing the point. This is not about shielding oneself from information one may be uncomfortable with, but about tweets specifically designed to elicit reactions, which are becoming a plague on Twitter due to its new incentives. It is about the difference between presenting news in a neutral way and presenting it as "incredibly atrocious crime done to CHILDREN and you are a monster for not caring!". The second one does feel a lot like an exploit of emotional backdoors, in my opinion.

top 50 comments
[–] [email protected] 57 points 1 year ago (7 children)

Yea, no thanks. I don't want things filtered based on what someone else thinks I should see.

[–] [email protected] 11 points 1 year ago (2 children)

What if it's based on what you think you should see?

[–] Worx 12 points 1 year ago (3 children)

Either it's you deciding as you see it (i.e. there is no filter), or it's past you who's deciding, in which case it's a different person. I've grown mentally and emotionally as I've gotten older, and I certainly don't want me-from-10-years-ago to be in control of what me-right-now is even allowed to see.

[–] [email protected] 4 points 1 year ago

Just like diet, some people prefer balancing food types and practicing moderation, and others overindulge on what makes them feel good in the moment.

Having food options tightly controlled would restrict personal liberty, but doing nothing and letting people choose will lead to bad outcomes.

The solution is to educate people on what kinds of choices are healthy and what are not, financially subsidize the healthy options so they are within reach to all, and only use law to restrict things that are explicitly harmful.

Mapping that back to news and media, I’d like to see public education promoting the value of a balanced media and news diet. Put more money into non-politically-aligned news organizations. Look closely at news orgs that knowingly peddle falsehoods and either bring libel charges against them or create new laws that address the public harm done by maliciously spreading misinformation.

But I’m no lawyer, so I don’t know how to do that last part without creating some form of tyranny.

[–] [email protected] 8 points 1 year ago (2 children)

isn't that what the upvote/downvote buttons are for? although to be fair, i'd much rather the people of lemmy decide which things are good and interesting than some "algorithm"

[–] [email protected] 4 points 1 year ago (2 children)

There's a real risk to this belief.

There are elements of lemmy who use votes to manipulate which ideas appear popular, with the intention of manipulating discourse rather than having open discussions.

[–] [email protected] 42 points 1 year ago (1 children)

That's why I stick with platforms where hardline communist teenagers can curate what I'm exposed to.

[–] [email protected] 7 points 1 year ago

That's the only way.

[–] [email protected] 35 points 1 year ago* (last edited 1 year ago) (2 children)

Without wanting to be too aggressive, with only that quote to go on it sounds like that person wants to live in a safe zone where they're never challenged, angered, made afraid, or have to reconsider their world view. That's the very definition of an echo chamber. I don't think you're meant to live life experiencing only "approved" moments, even if you're the one in charge of approving them. Frankly I don't know how that would be possible without an insane amount of external control. You'd have to have someone/something else as a "wall" of sorts controlling your every experience or else how would things get reliably filtered?

I'd much prefer to teach people how to be resilient so they don't have to be afraid of being exposed to the "wrong" ideas. I'd recommend things like learning what emotions mean and how to deal with them, coping/processing bad moments, introspection, how to get help, and how to check new ideas against your own ethics. E.g. if you read something and it makes you angry, what idea/experience is the anger telling you to protect yourself from and how does it match your morality? How do you express that anger in a reasonable and productive way? If it's serious who do you call? And so on.

[–] [email protected] 7 points 1 year ago (2 children)

I see where you're coming from, but if you look up Karpathy, you'll probably come to a different conclusion.

[–] [email protected] 5 points 1 year ago (3 children)

I think you are getting it wrong. I added a small edit for context. It is more about emotional distraction. I kinda feel like him: I want to remain informed, but please let me prepare a bit before telling me about civilians cut to pieces in a conflict, sandwiched between a funny cat video and a machine learning news item.

For the same reason we filter porn or gore images out of our feeds, highly emotional news should be filterable.

[–] [email protected] 20 points 1 year ago* (last edited 1 year ago) (3 children)

Our mind is built on that "malware". I think it's more accurate to compare brain + knowledge to our immune system: the more samples you have, the better you are armed against mal-information.

[–] [email protected] 4 points 1 year ago (2 children)

But that leaves out the psychological effects of long-term exposure to ideas. If you know for a fact that the earth is round, and for the next 50 years all the media you consume keeps telling you that the earth is flat, you will at some point start believing that (or at least become unsure).

Every piece of information you receive has some tiny effect on you.

[–] [email protected] 19 points 1 year ago (4 children)

The real question then becomes: what would you trust to filter comments and information for you?

In the past, it was newspaper editors, TV news teams, journalists, and so on. Assuming we can't have a return to form on that front, would it be down to some AI?

[–] [email protected] 8 points 1 year ago (1 children)

My mom, she always wants the best for me.

[–] [email protected] 5 points 1 year ago

Easily better than all the other options.

[–] [email protected] 8 points 1 year ago (1 children)

Why do people, especially here in the fediverse, immediately assume that the only way to do it is to give power of censorship to a third party?

Just have an optional, automatic, user-parameterized, auto-tagger and set parameters yourself for what you want to see.

Have a list of things that should receive trigger warnings. Group things by anger-inducing factors.

I'd love to have a way to filter things by actionability: for things I can get angry about but have little means of changing, there's no need to give me more than a monthly update.
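A user-parameterized auto-tagger like this can be sketched in a few lines. Everything below is hypothetical (the tag names, keyword lists, and the three actions are my own illustration, and a real tagger would likely use a local classifier rather than keyword matching), but it shows the key point: the policy table lives with the user, not with a platform.

```python
# Hypothetical category -> trigger-keyword lists. A real system would
# replace this with a locally run classifier; the routing logic is the same.
TAG_KEYWORDS = {
    "graphic-violence": ["massacre", "dismember", "gore"],
    "outrage-bait": ["you are a monster", "how dare", "disgusting"],
}

def auto_tag(text):
    """Return the set of tags whose trigger keywords appear in the post."""
    lowered = text.lower()
    return {tag for tag, words in TAG_KEYWORDS.items()
            if any(w in lowered for w in words)}

# The user, not a third party, decides what each tag does:
# "show" passes through, "digest" defers to a periodic summary, "hide" drops.
user_policy = {
    "graphic-violence": "digest",
    "outrage-bait": "hide",
}

def route(post, policy=user_policy):
    """Map a post to the most restrictive action its tags call for."""
    actions = {policy.get(tag, "show") for tag in auto_tag(post)}
    if "hide" in actions:
        return "hide"
    if "digest" in actions:
        return "digest"
    return "show"
```

So `route("you are a MONSTER for not caring!")` would come back `"hide"`, while an untagged post passes through as `"show"`; changing a single entry in `user_policy` changes the behavior with no third party involved.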

[–] [email protected] 4 points 1 year ago (1 children)

Because your "auto-tagger" is a third party and you have to trust it to filter stuff correctly.

[–] [email protected] 16 points 1 year ago* (last edited 1 year ago) (4 children)

I think the right approach would be to learn to deal with any kind of information, rather than to censor anything we might not like hearing.

[–] [email protected] 12 points 1 year ago (1 children)
[–] [email protected] 6 points 1 year ago* (last edited 1 year ago)

I really think that, just as the 20th century saw the rise of basic hygiene practices, the 21st is seeing us put mental hygiene practices in place.

[–] Whorehoarder 12 points 1 year ago (1 children)

Reminds me of Snow Crash by Nealyboi

[–] [email protected] 10 points 1 year ago (1 children)

Not really. An executable controlled by an attacker could genuinely "own" you. A toot, tweet, or comment cannot; it's just an idea or thought that you can accept or reject.

We already distance ourselves from sources of always bad ideas. For example, we're all here instead of on truth social.

[–] [email protected] 5 points 1 year ago

Jokes on you, all of my posts are infohazards that make you breathe manually when you read them.

[–] [email protected] 9 points 1 year ago* (last edited 1 year ago) (11 children)

Nah man, curl that shit into my bash and let me deal with it

[–] [email protected] 9 points 1 year ago (1 children)

Hüman brain just liek PC, me so smort.

[–] [email protected] 7 points 1 year ago (2 children)

It's definitely an angle worth considering when we talk about how the weakest link in any security system is its human users. We're not just "not immune" to propaganda, we're ideological petri dishes filled with second-hand agar agar.

[–] [email protected] 9 points 1 year ago* (last edited 1 year ago)

I think most people already have this firewall installed, and it's working too well - they're absorbing minimal information that contradicts their self-image or world view. :) Scammers just know how to bypass the firewall. :)

[–] [email protected] 9 points 1 year ago (1 children)

Sounds like we're reinventing forum moderation and media literacy from first principles here.

[–] [email protected] 3 points 1 year ago

Kind of, but given that the guy is a prominent LLM researcher, it hints at the possibility of automating that job rather than inflicting it on human moderators, and of sidestepping the pain of designing an apolitical structure for it.

[–] [email protected] 8 points 1 year ago

Reading, watching, and listening to anything is like this. You accept communications into your brain and sort them out there. It's why people censor things: to shield others and/or to prevent the spread of certain ideas/concepts/information.

Misinformation, lies, scams, etc. function entirely by exploiting this.

[–] [email protected] 6 points 1 year ago

I remember watching a video from a psychiatrist with Eastern monastic training. He explained why yogis spend decades meditating in remote caves: he said it was to control their exposure to information and stimuli.

Ideas are like seeds: once they take root, they grow. You can weed out unwanted ones, but it takes time and mental energy. It pulls at your attention and keeps you from functioning at your best.

The concept really spoke to me. It's easier to consciously control your environment than it is to consciously control your thoughts and emotions.

[–] [email protected] 6 points 1 year ago (1 children)

I mean, this is just called censorship. We censor things for kids and all kinds of people in our lives all the time. We censor things for ourselves when we don't feel like reading the news or opening a text from a specific person. This is not some novel concept.

[–] [email protected] 5 points 1 year ago (1 children)

Not really. This is user-controlled filtering. Censorship is imposed on people to push a specific worldview. Filtering is something we already do all the time, for spam for instance.

[–] [email protected] 5 points 1 year ago

In a way, the job of a teacher or journalist is to filter useful and/or relevant information for interested parties.

[–] [email protected] 5 points 1 year ago (1 children)

You are responsible for what you do with the information you process. You're not supposed to just believe everything you read, or let it affect you. We don't need some government or organization deciding what can be shown online. Read history and see what follows mass censorship.

[–] [email protected] 6 points 1 year ago (3 children)

I am bewildered that so many people construe this as suggesting it should be a government or a company deciding what to show you. Obviously any kind of firewall/filter ought to be optional and user-controlled!

[–] [email protected] 5 points 1 year ago (2 children)

Leaving aside the dystopian echo chamber this could result in, you could argue that it would help a lot with fake news. Fake news is easier to spread and more present than ever. And for every person there is probably that one piece of news that is just believable enough not to question. And then the next just-believable piece of news. And another. I believe no one is immune to being influenced by fake stories, maybe even radicalized if they are targeted just right. A firewall simply filtering out everything non-factual would already prevent a lot of societal damage, I think.

[–] [email protected] 4 points 1 year ago (1 children)

Yeah the firewall is called "going outside more"

[–] [email protected] 6 points 1 year ago

Trust me, getting your information "from going outside" is not the best either.

[–] [email protected] 4 points 1 year ago

you already have that firewall. it's your experiences and human connections, your understanding of media, your personal history and learning and the feelings you experience.

you don't need a firewall to keep you from being manipulated, you need to learn to fucking read and think and feel. to learn and question, to develop trusted friends and family you can talk to.

if it feels like your emotional backdoors are being exploited then maybe you're thinking or behaving like a monster and your mind is revolting against itself.

[–] [email protected] 4 points 1 year ago

Yes, lemmy too is that. We need to meet people and then form groups online. I once devised a solution for exchanging public keys in person and verifying each piece of content thereafter with that key.
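That exchange-in-person idea can be sketched end to end. The snippet below is a hypothetical illustration (all names are mine) using a one-time Lamport signature built from nothing but SHA-256, so it needs no crypto library; a real deployment would use an established scheme like Ed25519. The public key is what you would hand over in person; every later post is checked against it.

```python
import hashlib
import os

def _h(data):
    return hashlib.sha256(data).digest()

def keygen():
    """One-time Lamport keypair: 256 pairs of random secrets,
    and a public key made of their SHA-256 hashes."""
    priv = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pub = [(_h(a), _h(b)) for a, b in priv]
    return priv, pub

def sign(priv, message):
    """Reveal one secret per bit of SHA-256(message)."""
    digest = int.from_bytes(_h(message), "big")
    return [priv[i][(digest >> (255 - i)) & 1] for i in range(256)]

def verify(pub, message, sig):
    """Check that each revealed secret hashes to the matching
    public-key entry for that bit of the message digest."""
    digest = int.from_bytes(_h(message), "big")
    return all(_h(sig[i]) == pub[i][(digest >> (255 - i)) & 1]
               for i in range(256))
```

One caveat the scheme surfaces nicely: a Lamport keypair safely signs exactly one message (each signature reveals half the private key), so in practice you would exchange many keys at once, or just a single Ed25519 key.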

[–] [email protected] 3 points 1 year ago

i have a general distaste for the mind/computer analogy. no, tweets aren't like malware, because language isn't like code. our brains were not shaped by the same forces that computers are, they aren't directly comparable structures that we can transpose risks onto. computer scientists don't have special insight into how human societies work because they understand linear algebra and network theory, in the same way that psychologists and neurologists don't have special insight into machine learning because they know how the various regions of the human brain interact to form a coherent individual mind, or the neural circuits that go into sensory processing.

i personally think that trying to solve social problems with technological solutions is folly. computers, their systems, the decisions they make, are not by nature less vulnerable to bias than we are. in fact, the kind of math that governs automated curation algorithms happens to be pretty good at reproducing and amplifying existing social biases. relying on automated systems to do the work of curation for us isn't some kind of solution to the problems that exist on twitter and elsewhere, it is explicitly part of the problem.

twitter isn't giving you "direct, untrusted" information. it's giving you information served by a curation algorithm designed to maximize whatever it is twitter's programmers have built, and those programmers might not even be accurately identifying what it is that they're maximizing for. assuming that we can make a "firewall" that maximizes for neutrality or objectivity is, to my mind, no less problematic than the systems that already exist, because it makes the same assumption: that we can build computational systems that reliably and robustly curate human social networks in ways that are provably beneficial, "neutral", or unbiased. that just isn't a power that computers have, nor is it something we should want as beings with agency and autonomy. people should have control over how their social networks function, and that control does not come from outsourcing social decisions to black-boxed machine learning algorithms controlled by corporate interests.
