• ExLisper@lemmy.curiana.net
    6 hours ago

    The jury ordered Meta to pay the maximum penalty under the law of $5,000 per violation, totaling $375m in civil penalties for violating New Mexico’s consumer protection laws.

    Meta: I guess I will only be able to spend $79,635,000,000 on my next useless venture.

  • deathbird@mander.xyz
    8 hours ago

    “The design feature changes the state is seeking include ‘enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors’.”

    Oh fuck right off.

    I’m sorry but this is a bad “think of the children” decision. There are limits to what Meta or any platform can do about bad actors at that size without structural changes.

    What might actually help: only show people content from groups and people that they follow, preferably in chronological order, rather than suggesting new groups and pages algorithmically all the time and thereby increasing the likelihood of children interacting with strangers on the Internet.

    And improve parental controls for children’s accounts. I’m sure there’s nothing currently giving a “parent” account high level control over a “child” account, but I’m happy to be corrected if I’m wrong.

    But also: require interoperability with other platforms and a standardized profile data export format, so people can leave Facebook but stay in touch with the people who still use it.
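
    As a toy sketch of what a standardized, portable profile export could look like (this is not any real Meta or ActivityPub schema — the field names and values are all hypothetical, purely for illustration):

```python
import json

# Hypothetical minimal profile export: a standardized, human-readable
# format that any other platform could parse and import.
profile = {
    "handle": "alice@example.social",
    "display_name": "Alice",
    "follows": ["bob@another.network", "carol@example.social"],
    "posts": [
        {"id": "1", "created": "2024-01-01T00:00:00Z", "text": "hello"},
    ],
}

exported = json.dumps(profile, indent=2)  # portable export blob
restored = json.loads(exported)           # a receiving platform imports it
print(restored["handle"])                 # alice@example.social
```

    The point isn't the specific fields; it's that once the export format is standardized and import is mandatory, the social graph stops being a lock-in mechanism.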

    • lmmarsano@group.lt
      5 hours ago

      And improve parental controls for children’s accounts. I’m sure there’s nothing currently giving a “parent” account high level control over a “child” account, but I’m happy to be corrected if I’m wrong.

      Parental controls already exist in every major OS, they suffice to restrict & monitor social media, and they go unused.

      A better solution might be for laws to provide parents resources & incentives to parent children’s online activity (including training to use resources they already have) & to provide children education in online safety & literacy. Decades ago, federal courts citing commission findings & studies recommended these alternatives as superior in effectiveness, meeting government duties to minimize impact on civil liberties, allocation of law enforcement resources, etc. For the permanent injunction to COPA, the judge wrote

      Moreover, defendant contends that: (1) filters currently exist and, thus, cannot be considered a less restrictive alternative to COPA; and that (2) the private use of filters cannot be deemed a less restrictive alternative to COPA because it is not an alternative which the government can implement. These contentions have been squarely rejected by the Supreme Court in ruling upon the efficacy of the 1999 preliminary injunction by this court. The Supreme Court wrote:

      Congress undoubtedly may act to encourage the use of filters. We have held that Congress can give strong incentives to schools and libraries to use them. It could also take steps to promote their development by industry, and their use by parents. It is incorrect, for that reason, to say that filters are part of the current regulatory status quo. The need for parental cooperation does not automatically disqualify a proposed less restrictive alternative. In enacting COPA, Congress said its goal was to prevent the “widespread availability of the Internet” from providing “opportunities for minors to access materials through the World Wide Web in a manner that can frustrate parental supervision or control.” COPA presumes that parents lack the ability, not the will, to monitor what their children see. By enacting programs to promote use of filtering software, Congress could give parents that ability without subjecting protected speech to severe penalties.

      I also agree and conclude that in conjunction with the private use of filters, the government may promote and support their use by, for example, providing further education and training programs to parents and caregivers, giving incentives or mandates to ISP’s to provide filters to their subscribers, directing the developers of computer operating systems to provide filters and parental controls as a part of their products (Microsoft’s new operating system, Vista, now provides such features, see Finding of Fact 91), subsidizing the purchase of filters for those who cannot afford them, and by performing further studies and recommendations regarding filters.

      Adult supervision, child education on online safety & literacy, parental controls & filters are more effective at less expense to fundamental rights. Governments know this & conveniently forget it.

    • Dr. Moose@lemmy.world
      6 hours ago

      What actually might help: hold people who design these tools criminally liable. Everyone knows what they are doing but you can’t really say no to your employer because “don’t worry you’re not liable” so everyone continues on building the Torment Nexus.

      • deliriousdreams@fedia.io
        41 minutes ago

        Are you suggesting that we should be able to criminally prosecute people who build end-to-end encryption software and tools? Or algorithms that find people you may know? Because those seem to be key to the Meta lawsuit as far as they are involved. That, and the fact that Meta deliberately misled the public about the safety of the website for kids. Social media as it exists today isn’t really safe for children, and at best the people who should be held accountable for that are the executives who made the decision to lie.

        But your average programmer isn’t designing tools for the purpose of making kids less safe. They aren’t designing tools for the purpose of being addictive. And they aren’t designing tools for predators. They happen to have designed tools used by predators because of flaws in the design, and the fact that their executives found those flaws advantageous to their bottom line, so they played them up. Leaned in, if you will.

        It was literally part of the 2021 leak that they had discovered their algorithm had certain effects, and the C-suite literally went about making sure they could use that for monetary gain to keep people on the site and scrolling. Not just young users, but users of all ages.

        The main thing is that it’s really easy to social engineer on a social media website where people are encouraged to give out all kinds of information that can be used against them in social engineering attacks. That, combined with the addiction fostered there and the encrypted chat methods owned by Meta and used by much of the world en masse, is what created this situation.

    • RecallMadness@lemmy.nz
      6 hours ago

      Unfortunately you can’t codify specifically how platforms work into law.

      But you could possibly make companies explicitly liable for promoting “detrimental” content, then define “promoting” as something like “surfacing content to a user beyond the reach of the user’s immediate network, i.e. algorithmic suggestions or advertising”.
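
      That definition is mechanical enough to express as a toy predicate (all names here are hypothetical, just illustrating the proposed legal test, not any real platform API):

```python
def is_promoted(post_author: str, user_follows: set[str]) -> bool:
    """A post counts as 'promoted' when it is surfaced to a user who
    does not follow its author, i.e. it came from outside the user's
    immediate network (an algorithmic suggestion or an ad)."""
    return post_author not in user_follows

# A user who follows exactly two accounts:
follows = {"grandma", "local_news"}
print(is_promoted("grandma", follows))        # False: within the network
print(is_promoted("brand_account", follows))  # True: surfaced algorithmically
```

      Under a rule like that, content passing the predicate is where the platform's liability would attach; a purely chronological follow-only feed would never trigger it.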

    • ExLisper@lemmy.curiana.net
      6 hours ago

      What might actually help: only show people content from groups and people that they follow, preferably in chronological order, rather than suggesting new groups and pages algorithmically all the time and thereby increasing the likelihood of children interacting with strangers on the Internet.

      You would simply have big groups like “I ❤️ New Mexico” where people comment on the same posts and interact. If you limited all content, including comments and likes, to users someone personally follows, without the ability to discover other users, you would basically turn Facebook into WhatsApp. It would definitely solve the issue, but it would also make the platform look empty and kill it. Which would not necessarily be bad, but sadly killing Facebook is too radical for anyone to support.

  • Fredselfish@lemmy.world
    11 hours ago

    So… it’s a fucking fine, which is way less than he made by doing this. Until we throw these fucks in jail, this shit will continue.

    • staircase@programming.devOP
      11 hours ago

      In the next phase of the legal proceedings, due to begin on 4 May, the attorney general’s office will seek additional financial penalties and court-mandated changes to Meta’s platforms that “offer stronger protections for children”, said Torrez.

      The design feature changes the state is seeking include “enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors”.

      Unclear how age verification would play out with their Digital Childhood Alliance efforts.

            • paraphrand@lemmy.world
              6 hours ago

              The victims of child exploitation? Or the lawyers representing them? Or…? I’m not asking about vague general “save the children” stuff. I’m talking about this lawsuit and similar.

              • obre@slrpnk.net
                2 hours ago

                Not the person you were responding to, but IMO it’s the defense attorneys / legal department working to ensure that the legal outcome is as beneficial to the corporations as possible, even if they “lose”. In this case the fine is a cost of doing business, not nearly enough to actually discourage malfeasance, and the legal/PR pivot to blaming encryption rather than their algorithms is something they hope will tee them up to do even more massive surveillance in the near future.

    • Lost_My_Mind@lemmy.world
      10 hours ago

      Until we throw these fucks in jail, this shit will continue.

      Which is exactly why that won’t happen. Our president is a pedophile. There’s a whole network of wealthy pedophiles who no longer have an island. The pedophiles are in power.

      • waddle_dee@lemmy.world
        2 hours ago

        Or it could be because it’s a civil case with no penal repercussions. For them to go to jail, the DOJ would have to file criminal charges against Meta. They won’t do that, not because of the Pedo President, but because the DOJ has been too chicken-shit since Enron to go after anyone.

      • OwOarchist@pawb.social
        8 hours ago

        who no longer have an island

        *who now have a different island that we don’t yet know about.

        • FudgyMcTubbs@lemmy.world
          45 minutes ago

          Y’all are gonna end up with a Pizzagate situation. We need real leaders who will hire an effective DOJ to investigate and charge the monsters in a timely but just manner. We need a new viable party.

  • BlackCat@piefed.social
    9 hours ago

    Meta has generated high volumes of “junk” reports by overly relying on AI to moderate its platforms, investigators said. These reports were useless to law enforcement, and meant crimes could not be investigated, they said.

    shocker.

  • TwilitSky@lemmy.world
    11 hours ago

    You mean we shouldn’t have put our children’s safety in the waxy grasp of a sentient Annabelle in a t-shirt and jeans?

  • andybytes@programming.dev
    9 hours ago

    People just need to get off these platforms. Unless I can just pop off, I don’t want what you got, and these techno turdz can eat my arse. Facebook needs to die, along with Roblox.

  • darklamer@feddit.org
    10 hours ago

    So they want us to believe that the company that knowingly profited from genocide in Myanmar also knowingly profited from child exploitation? Really? OK then, I can believe that.

  • Brem@lemmy.world
    10 hours ago

    Go figure, a company that exploits habits is found guilty of exploiting children…

  • Maeve@kbin.earth
    11 hours ago

    “We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content,” said a Meta spokesperson. “We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.”

    Internal Meta documents and testimony obtained by the New Mexico department of justice during the litigation revealed that both company employees and external child safety experts repeatedly warned about risks and harmful conditions on Meta’s platforms.

    Evidence presented to the jury included details of the 2024 arrest of three men charged with sexually preying on children through Meta’s platforms, and attempting to meet up with them. This was part of a sting investigation operated by undercover agents and dubbed “Operation MetaPhile” by the attorney general’s office.