  • I disagree for similar reasons.

    There’s no good case for “I asked a CHAT BOT if I could eat a poisonous mushroom and it said yes,” because you could have asked a mycologist or a toxicologist instead. The user is putting themselves at risk; it’s not up to the software to tell them how not to kill themselves.

    If the user doesn’t know how to use AI responsibly, it’s not the AI’s fault when something goes wrong.

    Read the docs. Learn them. Grow from them. And don’t eat anything you found growing out of a stump.