My favorite is when someone tells me that they are too old to learn about new technology, or that they can’t use a device because they aren’t very tech-y. No, you just refuse to learn.

  • Sterile_Technique@lemmy.world · 12 hours ago

    WAY too many people don’t realize “AI” is just marketing bullshit, and genuinely think that LLMs and shit are literal intelligence in a computer.

    For one, it’s driving every company under the sun to shove it into every product under the sun; and for two, if we ever do create a true AI (what we’re calling “AGI” now, at least until marketing drives that one to meaninglessness too and we have to move the goalposts again), it’s going to be a humanity-changer on par with shit like discovering fire… and people will be confused as all hell because “wE’ve hAd tHAt foR yEArS!” cuz they’ll think it’s the same spell-checker-that’s-wrong-occasionally-and-generates-nudes that we have today.

    • Iconoclast@feddit.uk · 7 hours ago

      LLMs are AI - always have been. The term “artificial intelligence” has always been broad in computer science: it covers anything that performs a cognitive task normally requiring human intelligence. A chess engine from 1999 is AI. A spam filter is AI. An LLM is AI. Narrow AI, sure, but still AI.

      The confusion comes from people equating “AI” with sci-fi AGI (human-level general intelligence, HAL/JARVIS/Skynet/etc.). That’s a specific subset, not the whole category. When companies say “AI-powered” they’re not claiming AGI - they’re saying the product uses machine learning or pattern recognition in some way. Marketing inflates the language, yes, but the underlying tech is real and fits the definition.

      If/when we reach actual AGI, it will be a civilization-level shift - far beyond today’s spell-checker-that-sometimes-hallucinates. People will look back and say “we had AI for years,” but they’ll mean narrow tools, not the thing that can invent new science or run a company autonomously. The goalposts aren’t moving; the hype is just using the broad term loosely.

      • grue@lemmy.world · 7 hours ago

        > LLMs are AI - always have been. The term “artificial intelligence” has always been broad in computer science: it covers anything that performs a cognitive task normally requiring human intelligence.

        On the contrary, it’s not “AI” unless it’s a fuckton of hand-programmed if statements. I dunno what this newfangled “neural network” shit is, but it’s way too brain-like to be AI! \s