[–] [email protected] 32 points 2 months ago (8 children)

This is deeply unethical. When doing research you need to respect the people who participate, and you have to respect their story. Using a regurgitative artificial idiot (RAI) to change their minds respects neither them nor their story.

The people who were experimented on were not compensated for their time or for the work they contributed. While compensation isn't required, it is good practice in research not to actively burn bridges with people, so that they will want to participate in future studies.

These people were also never told they were participating in a study, nor were they given the choice to withdraw their contributions at will. That alone makes the study unpublishable, since the data was not gathered with fucking consent.

This isn’t even taking into account the other ethical lines that were crossed. None of the “researchers” involved should ever be allowed to conduct or participate in a study of any kind again. Their university should be fined and heavily scrutinized for its role in enabling this shit. These assholes have done damage to researchers globally, who will now have a harder time pitching real studies to potential participants who remember this story and how “researchers” took advantage of unknowing individuals. Shame on these people, and I hope they face real consequences.

[–] [email protected] 2 points 2 months ago* (last edited 2 months ago) (5 children)

This is deeply unethical.

I feel like maybe we've gone too far on research ethics restrictions.

We couldn't do the Milgram experiment today under modern ethical guidelines. I think that it was important that it was performed, even at the cost of the stress that participants experienced. And I doubt that it is the last experiment for which that is true.

If we want to mandate careful scrutiny of such experiments, and require after-the-fact compensation for participants subjected to trauma-producing deception, maybe that'd be reasonable.

That doesn't mean every study that violates present ethics standards should be greenlighted, but I do think that the present bar is too high.

[–] iknowitwheniseeit 9 points 2 months ago

From the link you provided:

In 2012, Australian psychologist Gina Perry investigated Milgram's data and writings and concluded that Milgram had manipulated the results, and that there was a "troubling mismatch between (published) descriptions of the experiment and evidence of what actually transpired." She wrote that "only half of the people who undertook the experiment fully believed it was real and of those, 66% disobeyed the experimenter".[26][27] She described her findings as "an unexpected outcome" that "leaves social psychology in a difficult situation."[28]

I mean, maybe it shouldn't have been done?
