My favorite is when someone tells me that they are too old to learn about new technology, or that they can’t use a device because they aren’t very tech-y. No, you just refuse to learn.

  • Iconoclast@feddit.uk · 7 hours ago

    LLMs are AI - always have been. The term “artificial intelligence” has always been broad in computer science: it covers anything that performs a cognitive task normally requiring human intelligence. A chess engine from 1999 is AI. A spam filter is AI. An LLM is AI. Narrow AI, sure, but still AI.

    The confusion comes from people equating “AI” with sci-fi AGI (human-level general intelligence, HAL/JARVIS/Skynet/etc.). That’s a specific subset, not the whole category. When companies say “AI-powered” they’re not claiming AGI - they’re saying the product uses machine learning or pattern recognition in some way. Marketing inflates the language, yes, but the underlying tech is real and fits the definition.

    If/when we reach actual AGI, it will be a civilization-level shift - far beyond today’s spell-checker-that-sometimes-hallucinates. People will look back and say “we had AI for years,” but they’ll mean narrow tools, not the thing that can invent new science or run a company autonomously. The goalposts aren’t moving; the hype is just using the broad term loosely.

    • grue@lemmy.world · 7 hours ago

      > LLMs are AI - always have been. The term “artificial intelligence” has always been broad in computer science: it covers anything that performs a cognitive task normally requiring human intelligence.

      On the contrary, it’s not “AI” unless it’s a fuckton of hand-programmed if statements. I dunno what this newfangled “neural network” shit is, but it’s way too brain-like to be AI! /s