• WraithGear@lemmy.world · 2 hours ago

    The bot is lying about the reason it stopped doing that task.

    If I were to guess, the tokens allotted to the user ran out, causing whatever process it was running to simply hang.
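
    You don't have to ask the model why it stopped; the API metadata tells you directly. A minimal sketch, assuming the OpenAI Python SDK (the model name and prompt are placeholders I picked for illustration):

    ```python
    # Sketch: detect a token-limit cutoff from API metadata rather than by
    # asking the model, which will just confabulate a plausible reason.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": "Summarize this document ..."}],
        max_tokens=50,        # deliberately small cap to force truncation
    )

    choice = resp.choices[0]
    if choice.finish_reason == "length":
        # Generation hit the token cap mid-output; the text just stops.
        print("output truncated by token limit:")
    print(choice.message.content)
    ```

    The `finish_reason` field is the ground truth here; the model's own text never admits it was cut off.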

    They are, in effect, “programmed” to “lie”: they produce the statistically most acceptable answer, the one the user is most likely to accept. The bot looked back after the fact and selected a plausible-sounding scenario rather than admitting core facts about how it functions.

    It has no concept of the material it’s working on, the user, or anything else for that matter.

    LLM’s can be useful, but you have to narrow the scope of what you want from them to stuff they can actually do, like pulling relevant data from documents or text books.