• rtxn@lemmy.world

> auto complete

It’s called lexical analysis or lexical tokenization. It has existed for as long as high-level programming languages have, long before LLMs, since lexical analysis of the source is the first step of compilation. It doesn’t rely on stolen code, and it doesn’t consume a small village’s worth of electricity. Superficial parallels with chatbots do not make it AI – it’s a fucking algorithm.
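To make the point concrete, here is a minimal sketch of lexical tokenization: a toy lexer for arithmetic expressions built on regular expressions. The token names and patterns are illustrative, not taken from any particular compiler or editor.

```python
import re

# Each pair is (token kind, regex pattern); order matters for alternation.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("LPAREN", r"\("),
    ("RPAREN", r"\)"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Yield (kind, text) pairs for each token; whitespace is skipped."""
    for match in MASTER.finditer(source):
        kind = match.lastgroup
        if kind != "SKIP":
            yield (kind, match.group())

print(list(tokenize("x = 2 * (y + 10)")))
# → [('IDENT', 'x'), ('OP', '='), ('NUMBER', '2'), ('OP', '*'),
#    ('LPAREN', '('), ('IDENT', 'y'), ('OP', '+'), ('NUMBER', '10'),
#    ('RPAREN', ')')]
```

A deterministic pass like this is what drives classic identifier-based autocomplete: the editor tokenizes the buffer and matches prefixes against known identifiers. No model, no training data, no data center.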

    Besides, there is a world of difference between asking a clanker to spit out a Python function that multiplies two matrices, and putting the knock-off Shadowheart from TEMU in a million-dollar game.