• Gsus4@mander.xyz · 1 upvote · 51 minutes ago

    Hahaha, AI will replace all jobs and turn professionals into … proofreaders?

    Jokes aside: can’t they use a second sweep to proofread the thing?

    • Flic@mstdn.social · 1 upvote · 48 minutes ago

      @Gsus4 @TheBat the more worrying thing is how many times it has made something up that nobody has spotted because it looks normal.
      Proofreading spots spelling/punctuation/formatting issues. You need a deeper copyedit, not just a scan, to check the sense of something. And that won’t necessarily catch factually untrue, but perfectly plausible, things.

  • fodor@lemmy.zip · 17 upvotes · 1 day ago

    Can we put these pigs on the Brady List and charge them with perjury? They swear under oath that those reports are true. Can’t have it both ways.

  • doopen@lemmy.world · 26 upvotes · 2 days ago

    “The body cam software and the AI report writing software picked up on the movie that was playing in the background, which happened to be ‘The Princess and the Frog,’” police sergeant Rick Keel told the broadcaster, referring to Disney’s 2009 musical comedy. “That’s when we learned the importance of correcting these AI-generated reports” (bold and italic emphasis mine).

    • Soup@lemmy.world · 4 upvotes · 22 hours ago

      It’s wild considering one of the first things people did with LLMs was lawyers having them do their work for them, and they showed up to court with documents that were partially made up. That was literally within the legal sphere, and these guys still couldn’t learn from the mistakes of others.

      AI is so fucking stupid.

    • village604@adultswim.fan · 11 upvotes · 2 days ago · edited

      Seems like a pretty reasonable response if you were sold a product claiming to be able to write reports.

      Working in public sector Infosec, I can promise IT is rarely consulted or listened to before the contracts are negotiated, and the people who negotiate the contracts don’t know enough to be skeptical.

      At least they learned the lesson in a relatively benign way, although this should be cause to order a review of every report the software has written to date.

  • Echo Dot@feddit.uk · 21 upvotes · 2 days ago · edited

    But did the officer turn into a frog? We’re assuming the AI messed up, but maybe he actually did. It’s important to check these things.

  • nyan@lemmy.cafe · 23 upvotes · 2 days ago

    According to records obtained by the group, “it’s often impossible to tell which parts of a police report were generated by AI and which parts were written by an officer.”

    This does not give me a great impression of the literacy level of American police officers. Another good reason to stay out of that country.

  • Sparrow_1029@programming.dev · 18 upvotes · 2 days ago

    “The AI hallucinated the brutality, your honor.”

    “Ladies and gentlemen of the jury, just because they’re a dissident who often speaks out against authority, and whom we would clearly like to silence, doesn’t mean the AI is biased in its assessment of their guilt.”