this post was submitted on 08 Apr 2024
247 points (100.0% liked)

World News


As civilian casualties continue to mount in the war-torn Gaza Strip, reports of Israel's use of artificial intelligence (AI) in its targeting of Hamas militants are facing increasing scrutiny. A report by the Israeli outlets +972 Magazine and Local Call earlier this month said that Israeli forces had relied heavily on two AI tools so far in the conflict — "Lavender" and "Where's Daddy?"

While "Lavender" identifies suspected Hamas and Palestinian Islamic Jihad (PIJ) militants and their homes, "Where's Daddy" tracks these targets and informs Israeli forces when they return home, per the report, which cites six Israeli intelligence officers who had used AI systems for operations in Gaza, including "Where's Daddy?"

"We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity," one of the officers told +972 and Local Call. "On the contrary, the IDF bombed them in homes without hesitation, as a first option. It's much easier to bomb a family's home. The system is built to look for them in these situations," they added.

all 49 comments
[–] xmunk@sh.itjust.works 119 points 1 year ago (6 children)

Who the fuck names a program "Where's Daddy?" That's fucking sociopathic.

[–] Rottcodd@kbin.social 57 points 1 year ago

Yes - it is sociopathic.

That's not a coincidence.

[–] Passerby6497@lemmy.world 52 points 1 year ago (1 children)

The type of person who intentionally targets children for murder. I know, what a first for Israel...

[–] AMDIsOurLord@lemmy.ml 22 points 1 year ago

Israel

For them? This is tame.

[–] Silverseren@kbin.social 22 points 1 year ago

Really makes me wonder what the meaning behind "Lavender" is in this context. There could quite easily be some horrific intention behind using that specific word in connection with this murder system.

[–] snooggums@midwest.social 11 points 1 year ago* (last edited 1 year ago)

If they designed the program to be used to target people in their residential homes with their families, "sociopaths" is probably a good guess.

[–] nonailsleft@lemm.ee 3 points 1 year ago (3 children)

In this Information Age, more and more power is given to nerds. These people are known for many traits, but empathy and ethics are not among them.

[–] xmunk@sh.itjust.works 22 points 1 year ago

As a nerd, I find that offensive. I'm quite empathetic!

[–] Zacryon@feddit.de 8 points 1 year ago* (last edited 1 year ago)

And another day of "let's lump all people of group X together and judge them."
We have all this information available through the internet. We can research even the most difficult topics with a few keystrokes and a click. And yet there are still so many idiots. It's really mind-boggling.

[–] BestBouclettes@jlai.lu 3 points 1 year ago

I'm very much a nerd but not very empathetic. But that's the autism.

[–] PugJesus@kbin.social 27 points 1 year ago (1 children)

Every sane country on earth: "We use modern tools to reduce civilian casualties (because civilian casualties are bad PR)."

Old-school countries: "We use modern tools without regard for civilian casualties (because what are they going to do about it? fight back? lmao)"

Israel and Russia: "We use modern tools to INCREASE civilian casualties!"

[–] bamboo@lemm.ee 8 points 1 year ago

The civilian casualty count in Gaza is already higher than Ukraine. As bad as Russia is, they’re nothing in comparison to Israel.

[–] anas@lemmy.world 13 points 1 year ago

They can’t fight Hamas on the field, they have to assassinate them in other countries (funny how the best intelligence in the world knows exactly who the “Qatar billionaires” are but hasn’t done anything about it), murder them alongside their families while they sleep, or disguise as civilians to murder them in hospitals. Or the usual, take their frustration out on civilians.

[–] alternative_factor@kbin.social 12 points 1 year ago (1 children)

Really appreciating Israel's SkyNet 100% speedrun here; some really impressive moves. However, I'm starting to wonder if this run passes the suspect test. I feel like you can't get past regulators this well in most cases without an exploit.

[–] nonailsleft@lemm.ee 7 points 1 year ago

Spoiler: step one was to remove the regulators

"AI" in this context only serves to obfuscate responsibility. "You've just struck a bunch of children." "Well, sure, but this AI-generated model predicted an 80% chance that they were all little Hamaslings." Of course, the data the AI was trained on was bollocks, to ensure everyone counts as Hamas, including foreign NGO kitchen workers.

[–] radicalautonomy@lemmy.world 3 points 1 year ago

at home

confusedtravolta.gif