submitted 3 weeks ago by [email protected] to c/[email protected]

The Basque Country is implementing Quantus Skin in its health clinics after an investment of 1.6 million euros. Specialists criticise the artificial intelligence developed by the Asisa subsidiary due to its "poor" and "dangerous" results. The algorithm has been trained only with data from white patients.

[-] [email protected] 2 points 2 weeks ago

It is a direct result of structural racism, as it's a product of treating white men as the default. You see it all the time in medicine. There are conditions that disproportionately affect black people that we don't know enough about because time and money haven't been spent studying them.

Women face the same problem. Lots of conditions present differently in women. One example is why women have historically been underrepresented in e.g. autism diagnoses: it presents differently in women, so for a while the assumption was made that women just can't be autistic.

I don't necessarily think that the people who perpetuate this problem are doing so out of malice; they probably don't think of women or black people as lesser (hell, many of them probably are women and/or black), but that doesn't change the fact that structural problems require awareness and conscious effort to correct.

[-] [email protected] 1 point 18 hours ago

Again, no.

There are actual, normal reasons that can explain this. Don't assume evil when stupidity (or in this case, physics) explains it. Darker patches on darker skin are harder to detect, just as facial features on dark skin are harder to detect in the dark, because there is literally less light to work with.

Scream racism all you want, but you're cheapening the meaning of the word and you're not doing anyone a favor.

[-] [email protected] 1 point 10 hours ago

Don’t assume evil when stupidity

I didn't, though? I think that perhaps you missed the "I don’t think necessarily that people who perpetuate this problem are doing so out of malice" part.

Scream racism all you want, but you're cheapening the meaning of the word and you're not doing anyone a favor.

I didn't invent this term.

Darker patches on darker skin are harder to detect, just as facial features on dark skin are harder to detect in the dark, because there is literally less light to work with

Computers don't see things the way we do. That's why steganography can be imperceptible to the human eye, and why adversarial examples work even when the differences cannot be seen by humans.
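
A toy illustration of that point (just a sketch with numpy, nothing to do with the article's model): flipping the least significant bit of every pixel changes the image by at most 1/255 per channel, which no human will notice, yet a program reads the hidden bits back perfectly.

```python
# Sketch (assumes numpy is installed): hide one bit in every pixel's least
# significant bit. The change is at most 1/255 per channel -- invisible to
# a human viewer -- but trivially recoverable by a computer.
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)    # stand-in greyscale photo
secret = rng.integers(0, 2, size=image.shape, dtype=np.uint8)  # hidden bits

stego = (image & 0xFE) | secret   # overwrite each pixel's lowest bit
recovered = stego & 1             # the computer reads the bits back exactly

print("max pixel difference:", np.abs(stego.astype(int) - image.astype(int)).max())  # at most 1
print("secret recovered perfectly:", bool(np.array_equal(recovered, secret)))        # True
```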

If a model is struggling to do its job, it's because the data is bad, be it the input data or the training data. Historically, one significant contributor has been that the datasets aren't particularly diverse, and white men end up as the default. It's why all the "AI" companies popped "ethnically ambiguous" and other words into their prompts to coax their image generators into generating people who weren't white, and subsequently why those image generators gave us "ethnically ambiguous" memes and black German Nazi soldiers.
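
To make the training-data point concrete, here's a rough, hypothetical sketch (synthetic data and scikit-learn, not the actual Quantus Skin model): two groups whose "lesions" show up in slightly different features, with group B nearly absent from the training set. The same classifier scores noticeably worse on the group it barely saw.

```python
# Hypothetical sketch: group A dominates the training set, group B's disease
# shows up in different features, and the shared classifier performs worse
# on the group it barely saw during training.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

def make_group(n, shift):
    """Synthetic 'lesion features': class 1 is offset from class 0 by `shift`."""
    y = rng.integers(0, 2, size=n)
    X = rng.normal(size=(n, 5)) + np.outer(y, shift)
    return X, y

shift_a = np.array([1.0, 1.0, 0.0, 0.0, 0.0])  # how disease shows up in group A
shift_b = np.array([0.0, 0.0, 1.0, 1.0, 0.0])  # ...and, differently, in group B

# Training data: 2000 patients from group A, only 50 from group B.
Xa, ya = make_group(2000, shift_a)
Xb, yb = make_group(50, shift_b)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluate on fresh, equally sized test sets for each group.
Xa_test, ya_test = make_group(1000, shift_a)
Xb_test, yb_test = make_group(1000, shift_b)
print("accuracy on group A:", model.score(Xa_test, ya_test))
print("accuracy on group B:", model.score(Xb_test, yb_test))
```

The numbers themselves don't matter; the gap between the two groups is the point, and it comes entirely from who was represented in the training data.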
