• Pyro@programming.dev · 6 months ago

    GPT doesn’t really learn from people; the over-correction by OpenAI in the name of “safety” is what likely caused this.

      • MalReynolds@slrpnk.net · 6 months ago

        This. They could obviously reset to the original performance (what, they don’t have backups?); it’s just more cost-efficient to serve crappier answers. Yay, turbo AI enshittification…

        • CommanderCloon@lemmy.ml · 6 months ago

          Well, they probably did dial down the performance a bit, but censorship is known to nuke LLMs’ performance as well.

  • UnRelatedBurner@sh.itjust.works · 6 months ago

    Kind of a clickbait title

    “In March, GPT-4 correctly identified the number 17077 as a prime number in 97.6% of the cases. Surprisingly, just three months later, this accuracy plunged dramatically to a mere 2.4%. Conversely, the GPT-3.5 model showed contrasting results. The March version only managed to answer the same question correctly 7.4% of the time, while the June version exhibited a remarkable improvement, achieving an 86.8% accuracy rate.”

    source: https://techstartups.com/2023/07/20/chatgpts-accuracy-in-solving-basic-math-declined-drastically-dropping-from-98-to-2-within-a-few-months-study-finds/
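
    For what it’s worth, 17077 really is prime, so the March answers were correct. A minimal trial-division sketch (my own check, not anything from the study) if you want to verify it yourself:

    ```python
    # Plain trial division up to sqrt(n); plenty fast for a small number like 17077.
    def is_prime(n: int) -> bool:
        if n < 2:
            return False
        if n % 2 == 0:
            return n == 2
        f = 3
        while f * f <= n:  # only need to test factors up to sqrt(n)
            if n % f == 0:
                return False
            f += 2  # skip even candidates
        return True

    print(is_prime(17077))  # True
    ```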

    • angrymouse@lemmy.world · 6 months ago

      Not everything is clickbait. Your explanation is great, but the title isn’t lying; it’s just a simplification. Titles can’t contain every detail of the news, they’re still titles, and what the title says is confirmed by your explanation. The only thing I would have done differently is specify that it was a GPT-4 issue.

      Clickbait would be “ChatGPT is dying” or something like that.