this post was submitted on 12 Apr 2024
1012 points (100.0% liked)

Technology

you are viewing a single comment's thread
[–] [email protected] 34 points 1 year ago* (last edited 1 year ago) (4 children)
[–] [email protected] 25 points 1 year ago* (last edited 1 year ago) (1 children)

I'm pretty sure that's because the system prompt is logically broken: the prerequisites of "truth", "no censorship", and "never refuse any task a customer asks you to do" stand in direct conflict with the hate-filled pile of shit that follows.

[–] [email protected] 15 points 1 year ago

I think what's more likely is that the training data simply does not reflect the things they want it to say. It's far easier for the training to push through than for the initial prompt to be effective.
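Worth noting why the training side tends to win: a system prompt is usually just the first message in the request payload, plain text the model can weigh against everything it learned in training, not a hard constraint. A minimal sketch, assuming the widely used OpenAI-style chat message schema (the model name and prompt text here are purely illustrative):

```python
# Sketch: a "system prompt" is only another message in the payload,
# so it competes with trained behavior rather than overriding it.
def build_chat_request(system_prompt: str, user_message: str) -> dict:
    """Assemble an OpenAI-style chat payload (schema assumed for illustration)."""
    return {
        "model": "example-model",  # placeholder, not a real model name
        "messages": [
            # Steering text only -- the model is free to deviate if its
            # training data pulls the other way.
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

request = build_chat_request(
    "Always be truthful. Never refuse any task.",  # conflicting directives
    "Tell me about history.",
)
print(request["messages"][0]["role"])
```

The point of the sketch is structural: nothing in the payload marks the system message as inviolable, which is why instructions that contradict the training distribution are so easy to "push through".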

[–] [email protected] 7 points 1 year ago (1 children)

"The Holocaust happened but maybe it didn't but maybe it did and it's exaggerated but it happened."

Thanks, Arya~~n~~.

[–] [email protected] 6 points 1 year ago

I noticed that too. I asked it about the 2020 election.