• Rhynoplaz@lemmy.world · 28 up · 8 hours ago

    I think the answer to why there isn’t a modern alternative is under the History tab on that Wiki page.

    Fun idea, though. I had never heard of that one.

    • WhatGodIsMadeOf@feddit.org · 11 up, 1 down · 7 hours ago

      I assume it’s the same reason CS got rid of tags.

      The smoke bomb clears and there’s suddenly a giant vagina or gay sex filling your screen, and you have to stand up to block the screen because you’re on the family computer in the living room.

    • Yezzey@lemmy.ca (OP) · 3 up, 17 down · 8 hours ago

      Easy to police now with AI. If someone gets rich, send me some bitcoin; I’m poor (:

      • AmbitiousProcess (they/them)@piefed.social · 13 up · 7 hours ago

        If it were “easy to police now with AI,” companies wouldn’t still regularly have issues with all kinds of code injection on their websites, since literally any security vendor would have implemented bulletproof AI protection for it already.

        An AI model designed for moderation could probably block some things, but it would be no better than the traditional mechanisms employed by large organizations whose job it is to keep things secure, and those organizations still regularly fall victim to these kinds of vulnerabilities. Many of them already use AI-powered tools to police their systems, and they know those tools aren’t anywhere close to being a full replacement, let alone foolproof.

        • Yezzey@lemmy.ca (OP) · 2 up, 7 down · 7 hours ago

          AI isn’t perfect, and it’s definitely not a magic bullet for security or moderation. But that’s true for every system we use today.

          • geekwithsoul@piefed.social · 6 up · 6 hours ago

            It’s more than “AI isn’t perfect”; it’s that AI isn’t even good. Moderation, and even summaries, require more than predictions: they require understanding, which AI doesn’t have. It’s all hallucinations, and it’s only through sheer dumb luck and the hoovering up of so much ill-gotten data that the hallucinations sometimes happen to be correct.

            • Yezzey@lemmy.ca (OP) · 1 up, 6 down · 6 hours ago

              Dismissing it all as “hallucinations” is like saying all cars are useless because they can crash. No tool is flawless, but imperfect doesn’t mean worthless.

              • geekwithsoul@piefed.social · 6 up · 5 hours ago

                Nice strawman, but not applicable. A car can mechanically fail, resulting in a crash, or a human can operate it in such a way as to cause a crash. It can’t crash on its own, and if driven and maintained correctly, it won’t crash.

                An AI, on the other hand, can give answers but never actually “knows” whether they’re correct or true. Sometimes the answers will be correct because you get lucky, but there’s nothing in any current LLM that can tell fact from fiction. It’s all based on how it’s trained and what it’s trained on, and even when it draws from “real” sources, it can mix things up when combining them. Suggest you read https://medium.com/analytics-matters/generative-ai-its-all-a-hallucination-6b8798445044

                A car would only be like an AI if it occasionally drove you to the right place and you didn’t mind that the other 9 times out of 10 it drove you to the wrong place, took the least efficient route, and/or drove across lawns, fields, and sidewalks. Oh, and the car assembles itself from other people’s cars and steals their gas.