• rozodru@piefed.social
    2 days ago

    It’s getting worse, too.

    Just this morning I asked Claude a very basic question, and it hallucinated the answer three times in a row, with zero correct solutions. First, it hallucinated what a certain CLI daemon does. Second, it offered an alternative that it had invented outright; the thing didn’t exist at all. Third, it hallucinated how another application works, along with a git repo for said application (A. the app doesn’t even do the thing Claude described, and B. the repo it linked had NOTHING to do with the application it described). I just gave up, went to my searx instance, and found the answer myself. I shouldn’t have been so lazy.

    ChatGPT isn’t much better anymore either.