this post was submitted on 31 Aug 2023
40 points (88.5% liked)

[–] [email protected] 5 points 2 years ago (1 children)

There will always be loopholes. The nice thing about AI is that it's constantly learning and adapts to new situations very quickly.

[–] [email protected] 5 points 2 years ago (1 children)

That’s not inherently true. AI (at least in this sense, where it’s actually machine learning) does not learn on the fly. It learns from its base training data and applies those findings until it’s retrained.
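To make the "trained once, applied until retrained" point concrete, here's a toy sketch of a static keyword filter. This is a hypothetical example, not any real moderation system: the "model" is just the set of words seen only in banned posts, and it cannot catch a new spelling until someone retrains it on fresh data.

```python
# Toy moderation "model": trained once on base data, then frozen.
# It does not adapt to new loopholes until retrained.

def train(labeled_posts):
    """Collect words that appear only in banned posts (toy 'training')."""
    banned_words, allowed_words = set(), set()
    for text, banned in labeled_posts:
        words = set(text.lower().split())
        if banned:
            banned_words |= words
        else:
            allowed_words |= words
    return banned_words - allowed_words

def is_banned(text, model):
    """Flag a post if it contains any word the model learned as spam."""
    return bool(model & set(text.lower().split()))

# Hypothetical base training data.
base_data = [
    ("free gold giveaway", True),
    ("free weekly discussion thread", False),
]
model = train(base_data)

# The deployed model catches the spam it was trained on...
print(is_banned("gold giveaway today", model))    # True
# ...but a new loophole spelling sails through until the next retrain:
print(is_banned("g0ld g1veaway today", model))    # False
```

The point of the sketch is that between retrains the model's behavior is fixed; "learning" only happens when `train` is run again on updated data.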

[–] [email protected] 0 points 2 years ago (1 children)

You're correct, and that's way more efficient than teaching dozens of people what to ban. People make mistakes; tech doesn't (as long as it's coded correctly).

[–] [email protected] 2 points 2 years ago (1 children)

I reject that pretty majorly. Tech makes mistakes at a much higher rate than humans, even when built correctly. Tech just makes consistent mistakes instead.

I don’t trust AI moderation of anything.

[–] [email protected] 1 points 2 years ago (1 children)

Do you have an example of correctly built tech that makes consistent mistakes?

[–] [email protected] 1 points 2 years ago

Pretty much any AI system.

Photo AIs still have trouble telling dogs and cats apart. Cancer-detection AIs analyzing x-rays turned out to be basing their decisions on which doctor had signed them.

The Boeing 737 MAX had an autopilot system that worked as built, but Boeing didn't train pilots correctly, so pilots expected functionality similar to older versions, and that caused plane crashes. The software was 100% right, but it still made mistakes because the human input was different than expected.