• geekwithsoul@piefed.social

    Nice strawman, but not applicable. A car can mechanically fail, resulting in a crash, or a human can operate it in a way that causes a crash. It can’t crash on its own, and if driven and maintained correctly, it won’t crash.

    An AI, on the other hand, can give answers but never actually “knows” whether they’re correct or true. Sometimes the answers will be correct because you got lucky, but nothing in any current LLM can tell fact from fiction. The output depends entirely on how the model was trained and what it was trained on, and even when drawing on “real” sources, it can mix things up when combining them. Suggest you read https://medium.com/analytics-matters/generative-ai-its-all-a-hallucination-6b8798445044
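    To make that concrete, here is a minimal sketch of the sampling step every autoregressive LLM runs. The vocabulary and scores below are made up for illustration; the point is that the next word is chosen purely by probability, and nothing in the loop checks whether the result is true.

        import math
        import random

        def sample_next_token(logits, temperature=1.0):
            """Pick the next token from raw model scores (logits).

            The choice is driven purely by likelihood; "true" and
            "false" do not exist anywhere in this process.
            """
            # Softmax: turn arbitrary scores into a probability distribution.
            scaled = [score / temperature for score in logits]
            peak = max(scaled)
            exps = [math.exp(s - peak) for s in scaled]
            total = sum(exps)
            probs = [e / total for e in exps]
            # Sample a token index weighted by those probabilities.
            return random.choices(range(len(logits)), weights=probs, k=1)[0]

        # Toy, made-up vocabulary and scores for the prompt
        # "The capital of Australia is ...":
        vocab = ["Canberra", "Sydney", "Melbourne"]
        logits = [2.0, 1.8, 0.5]  # plausibility scores, not facts

        print(vocab[sample_next_token(logits)])
        # Usually "Canberra", but "Sydney" comes up regularly: a fluent,
        # confident, wrong answer, with no internal flag raised.

    Scale that up to a huge vocabulary and billions of parameters and you have the same sampling step, with nothing added that could verify the answer.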

    The only way a car would be like an AI is if, every time you sat in it, it occasionally drove you to the right place and you didn’t mind that the other 9 times out of 10 it drove you to the wrong place, took the least efficient route, and/or occasionally drove across lawns and fields and onto sidewalks. Oh, and the car assembles itself from parts of other people’s cars and steals their gas.