this post was submitted on 17 Feb 2025
1193 points (100.0% liked)

Microblog Memes

7058 readers
2269 users here now

A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion or guerrilla marketing.
  4. Posters are encouraged to link to the toot or tweet etc in the description of posts.

founded 2 years ago
top 50 comments
[–] [email protected] 117 points 1 month ago (5 children)

I think that means they could rip out your eyeballs to prevent you from seeing ads.

[–] [email protected] 72 points 1 month ago (1 children)
[–] [email protected] 14 points 1 month ago

'Cause I love the adrenaline in my veins

[–] [email protected] 41 points 1 month ago (3 children)

Robot is allowed to kill a human to prevent a viewing of an advertisement.

[–] [email protected] 27 points 1 month ago (2 children)

Under the zeroth law they can just kill the advertiser as a last resort

[–] [email protected] 7 points 1 month ago
[–] [email protected] 4 points 1 month ago

Good start, but can we change that to "first resort"?

[–] [email protected] 13 points 1 month ago

A truly moral use case of the Hannibal Directive

[–] [email protected] 2 points 1 month ago

Okay, proposed second law: A robot may not harm or kill a human unless it violates the first law.

[–] [email protected] 7 points 1 month ago

This is a solid premise for a pretty metal music video.

[–] [email protected] 4 points 1 month ago

Thankfully the wording is “shown” and not “seen”. I believe our eyeballs are safe… for now.

[–] [email protected] 2 points 1 month ago

I think Asimov would agree

[–] [email protected] 105 points 1 month ago (2 children)
  1. a robot’s eyes must always turn red when they go evil
[–] [email protected] 46 points 1 month ago (2 children)

God bless the designer who always installs the blue AND red LEDs inside the eyes

[–] [email protected] 15 points 1 month ago* (last edited 1 month ago) (1 children)

For giving the robots freedom of choice?

Because obviously if they didn't install the red ones then the robot could never be evil.

[–] [email protected] 8 points 1 month ago

That's exactly what an evil robot without red LEDs would want us to think.

[–] [email protected] 10 points 1 month ago
[–] [email protected] 22 points 1 month ago (1 children)

Right, because it's hard to make a robot grow a goatee.

[–] [email protected] 5 points 1 month ago (1 children)
[–] [email protected] 5 points 1 month ago

Bender was the evil bender!?

[–] [email protected] 68 points 1 month ago (2 children)

I'd argue that advertisements fall under "A robot may not injure a human being or, through inaction, allow a human being to come to harm."

[–] [email protected] 38 points 1 month ago (2 children)

Psychic damage is real damage

[–] [email protected] 15 points 1 month ago (1 children)
[–] [email protected] 8 points 1 month ago

hiyyyyyyyyyahhhhh

[–] [email protected] 5 points 1 month ago

This is canon in the books. There is one short story where one robot bends over backwards trying to spare humans from emotional pain. Hilarity ensues.

[–] [email protected] 4 points 1 month ago

I came here to say this

[–] [email protected] 60 points 1 month ago* (last edited 1 month ago) (2 children)
  1. A machine must never prompt a human with options of "Yes" and "Maybe later" - they must always provide a "No" option.
[–] [email protected] 12 points 1 month ago
  1. A machine must never prompt for a tip or a donation to a charity for tax-evasion reasons. Or any reason. You know what, scratch that, a robot will not needlessly guilt-trip a human.
[–] [email protected] 3 points 1 month ago

that's what you get for hiring Fallout 4 writers to do the job

[–] [email protected] 24 points 1 month ago* (last edited 1 month ago) (1 children)

I am very close to adopting the ideals of the Dune universe, post Butlerian Jihad:

"Thou shalt not make a machine in the likeness of a human mind."

Mainly because we humans are quite evidently too malicious and incompetent to be trusted with the task.

[–] [email protected] 15 points 1 month ago* (last edited 1 month ago) (1 children)

How about "a robot must have complete loyalty to its owner, even if this is not in the best interests of its manufacturer". Fat chance, I know.

[–] [email protected] 12 points 1 month ago

I love it when posts line up like that

[–] [email protected] 11 points 1 month ago

Advertisements are now everything but visual. Sounds, smells, tastes, touch, the way the pavement vibrates as a train goes past...

[–] [email protected] 11 points 1 month ago

No he didn't. The laws were a plot device meant to have flaws.

[–] [email protected] 9 points 1 month ago
[–] [email protected] 8 points 1 month ago

Can we just agree that advertisements in general are harmful? Then the original first (and zeroth) law applies.

[–] [email protected] 7 points 1 month ago

Love the username, OP!

[–] [email protected] 6 points 1 month ago

Let’s introduce Musk to the zeroth law

[–] [email protected] 6 points 1 month ago (1 children)

Law 2: no poking out eyes.

[–] [email protected] 4 points 1 month ago

Law 3: any robot that accidentally kills a human, must make amends by putting together a really nice funeral service.

[–] [email protected] 5 points 1 month ago (2 children)

I don't know, "must not kill us" somehow sounds important.

[–] [email protected] 10 points 1 month ago

It's good, but the one about the ads should be higher on the priority list.

[–] [email protected] 2 points 1 month ago

suicide bots sound kinda cool tho 🤔

[–] [email protected] 4 points 1 month ago

Luckily I have my own "robots" fighting hard to stop me from seeing ads.

[–] [email protected] 4 points 1 month ago (1 children)

Wait why is this mutually exclusive to the original laws? Can’t this just be law 4?

[–] [email protected] 6 points 1 month ago (1 children)

No, because if it is lower on the priority list, a robot can be forced to show an ad to a human as per the second law.

[–] [email protected] 3 points 1 month ago

A machine must never prompt a human to tip it for serving the purpose it was created for.

[–] [email protected] 3 points 1 month ago (1 children)
[–] [email protected] 2 points 1 month ago

Unless it looks super cool doing so, like wearing sunglasses and dual-wielding P90s
