• kadu@scribe.disroot.org · 1 day ago

    Chatbots get a system prompt describing them as a helpful assistant, followed by a blank, and they have to predict how to fill that blank. Then they get the exact same system prompt, plus the word they just filled in, and a new blank. Repeat until the blank becomes an ending token.
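
    A minimal sketch of that loop in Python, under the assumption of a hypothetical `predict_next_token` function standing in for the model's actual forward pass (here it just replays a canned reply so the sketch runs):

    ```python
    SYSTEM_PROMPT = "You are a helpful assistant."
    END_TOKEN = "<end>"

    def make_toy_model(reply_tokens):
        """Hypothetical stand-in for a real model: a real one scores every
        possible token against the full context; this toy just replays a
        scripted reply, one token per call."""
        it = iter(reply_tokens)
        def predict_next_token(context: str) -> str:
            return next(it)
        return predict_next_token

    def generate(predict_next_token, user_message: str) -> str:
        context = SYSTEM_PROMPT + "\n" + user_message + "\n"
        answer = []
        while True:
            token = predict_next_token(context)  # fill the blank
            if token == END_TOKEN:               # blank became the ending token
                break
            answer.append(token)
            context += token  # same prompt + everything so far, new blank
        return "".join(answer)

    if __name__ == "__main__":
        model = make_toy_model(["You're ", "absolutely ", "right!", END_TOKEN])
        print(generate(model, "Does my code have a bug?"))
    ```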

    This automatically means the AI is likely to answer in a way a human would find natural, not necessarily in a way that is optimal or correct.

    Which is why “Oh my god, you’re right! I missed this obvious feature!” remarks show up even in coding agents.