• BlackLaZoR@lemmy.world · 3 hours ago

    Reminds me of the case when Google effectively swatted a dad who sent medical images of his infant son's intimate areas to a doctor (a direct visit wasn't possible due to COVID lockdowns).

    Nice to be accused of pedophilia based on your perfectly legit medical documentation.

    Fuck you Google.

  • SleeplessCityLights@programming.dev · 6 hours ago

    This reminds me that there are thousands of sysadmins who stumbled onto the Trumpstein emails. They were not hiding anything at all; that shit was setting off flags everywhere. The question is what happened next. Did the CIA show up at their door, telling them they'll put you in the ground if they have to, and remind them that their Google employee NDA shuts them up? Maybe the honeypot was so ingrained in Google that the emails were auto-flagged as "part of an ongoing investigation" and IT just ignored them.

  • 🍉 Albert 🍉@lemmy.world · 15 hours ago

    severely mixed feelings.

    glad they caught him, but corporations casually snooping through your data and reporting whatever they want is definitely not a good thing.

    • Leon@pawb.social · 12 hours ago

      There was a gay guy here in Sweden who got assaulted and kidnapped by masked police because some American company had found CSAM on his account while crawling through his Yahoo email.

      Only it wasn't CSAM; the photos depicted the man's 30-year-old twinky boyfriend.

      No restitution. No police were punished for assaulting a suspect proven innocent. The man and his boyfriend were both humiliated.

      I’ve no mixed feelings about it. Spying through private data is entirely unforgivable. There are plenty of pedos out there who get caught and nothing happens anyway. They don’t need to violate innocent people’s privacy to do their job.

      Like, if the ends justify the means, you can end all suffering in the world by just nuking everything. All problems solved.

      Edit: pesos → pedos

    • tidderuuf@lemmy.world · 14 hours ago

      Microsoft has been doing this for years. It started with OneDrive, but now that they've enabled "analytics" in every product that might connect to the internet, they can have it all searched.

      Supposedly it is first filtered by algorithms but that shit is still being uploaded somewhere other than your hard drive.

      • wizardbeard@lemmy.dbzer0.com · 14 hours ago

        I believe it was in preview builds of Win 7 or 10 that researchers found it was sending the generated thumbnails of images on your PC to Redmond (MS HQ). Can't remember if they said it was for CSAM detection or just a debugging feature in the preview builds.

        • 0x0@lemmy.zip · 10 hours ago

          the generated thumbnails of images on your PC

          So the precursor to Recall?

    • obvs@lemmy.world · 13 hours ago

      Unfortunately, the negative effects from companies like Google turning in completely ethical people for doing things that should be completely legal and uncontroversial will do drastically more damage than the positive effects from said companies turning in the poorest of the pedophiles.

        • obvs@lemmy.world · 6 hours ago

          The country is literally building death camps, installing statues of genociders, is run by the RICH pedophiles (who have ZERO interest in seeing pedophiles prosecuted), and is using Palantir and Flock cameras to monitor everything, meanwhile having secret police disappear people and just openly slaughter them.

          The United States Government is well beyond deserving the benefit of the doubt.

          • Zamboni_Driver@lemmy.ca · 4 hours ago

            Great, do you have a single example of what you're claiming, lol? Google turning in a perfectly ethical person for doing something that should be legal and uncontroversial?

            You’re moving the goal posts and changing your argument.

    • CompactFlax@discuss.tchncs.de · 14 hours ago

      They’re suggesting it was automated hash-based recognition.

      I don’t have a problem with CSAM hash matching.
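
      For context on what hash matching means in its strict form: the scanner compares file digests against a list of known-bad hashes, with no analysis of the content itself. Here's a minimal, stdlib-only Python sketch; the hash set and file bytes are made up for illustration, and real systems (e.g. PhotoDNA) use perceptual hashes rather than plain SHA-256:

```python
import hashlib

# Illustrative only: a made-up "known bad" hash set. Real scanning systems
# match against databases of hashes supplied by clearinghouses like NCMEC.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-file").hexdigest(),
}

def is_flagged(file_bytes: bytes) -> bool:
    """Exact-match check: hash the file and look the digest up in the set."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

print(is_flagged(b"example-flagged-file"))   # True: byte-identical file
print(is_flagged(b"example-flagged-file!"))  # False: one changed byte, new hash
```

      The exact-match property is the crux: a cryptographic hash only flags byte-identical files, so providers layer fuzzier perceptual matching and AI classifiers on top, and that is where false positives come in.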

      • UnspecificGravity@piefed.social · 13 hours ago

        Sure, until it starts flagging normal pictures with its janky AI and you get your door kicked in based on a warrant signed by Google.

        • Leon@pawb.social · 13 hours ago

          This literally already happened here in Sweden. A guy got assaulted by masked police in the middle of the night because an American company had gone through photos in his Yahoo mail and flagged photos of his 30-year-old boyfriend as possible CSAM.

          Long article in Swedish.

          People like to think that Sweden is progressive etc. and I’d rebut it with this. If it can happen here, it could happen anywhere.

      • 🍉 Albert 🍉@lemmy.world · 14 hours ago

        my issue is that we have a framework for corporations to scan all your data and inform the state. it's used to stop CSAM, but it's a matter of state policy whether said structure will be used to fight dissent.

        • CompactFlax@discuss.tchncs.de · 14 hours ago

          I agree. We’ve seen this happening in the USA: “yes, technically they can do that, but they would never.” Now we know better.

  • Pika@sh.itjust.works · 14 hours ago

    In the US, companies (based on where the company is located, last I knew) are legally mandated to report specific things such as CSAM if they come across them.

    The issue shouldn't be the fact that they are reporting it; the issue should be that they have the capability to see it in the first place in order to report it.

    This isn't me defending CSAM or anything like that, but in a decent storage system, Google shouldn't even be able to see what files you have, let alone what the images actually are.
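
    The design being pointed at here is client-side (end-to-end) encryption: files get encrypted on the device with a key the provider never holds, so the server only ever stores ciphertext. A toy, stdlib-only Python sketch of the idea; XOR with a random one-time pad stands in for a real cipher such as AES-GCM, and should never be used for actual data:

```python
import secrets

def encrypt(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR: key must be as long as the data and never
    # reused. Applying it twice with the same key decrypts.
    return bytes(d ^ k for d, k in zip(data, key))

photo = b"family photo bytes"
key = secrets.token_bytes(len(photo))   # stays on the client, never uploaded

uploaded = encrypt(photo, key)          # all the provider ever stores
print(encrypt(uploaded, key) == photo)  # True: only the client can decrypt
```

    Under this model the provider physically cannot scan content, which is exactly why mandatory-scanning proposals and end-to-end encryption keep colliding in policy debates.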

  • MountingSuspicion@reddthat.com · 15 hours ago

    The detective alleges that that photograph and others she examined appeared to be stored in a folder on the iPhone titled “Girls I Drugged And Raped.”

  • Crt_static@lemmy.world · 12 hours ago

    Kind of fine with this. It gives me the ick that they can do that, but so does CSAM, and I don't see a middle ground.