• deathbird@mander.xyz · +67 · 13 hours ago

    “The design feature changes the state is seeking include ‘enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors’.”

    Oh fuck right off.

    I’m sorry, but this is a bad “think of the children” decision. There are limits to what Meta or any platform of that size can do about bad actors without structural changes.

    What might actually help: only show people content from groups and people that they follow, preferably in chronological order, rather than suggesting new groups and pages algorithmically all the time and thereby increasing the likelihood of children interacting with strangers on the Internet.
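    The follow-only chronological feed described above boils down to a filter plus a sort. A minimal illustrative sketch (the `Post` class and `build_feed` are hypothetical, not any real platform's API):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author_id: int
    created_at: datetime
    text: str

def build_feed(posts, followed_ids):
    # Only posts from accounts the user follows, newest first.
    # No algorithmic suggestions: nothing outside followed_ids appears,
    # so strangers' content never reaches the feed.
    return sorted(
        (p for p in posts if p.author_id in followed_ids),
        key=lambda p: p.created_at,
        reverse=True,
    )
```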

    And improve parental controls for children’s accounts. As far as I know there’s nothing currently giving a “parent” account high-level control over a “child” account, but I’m happy to be corrected if I’m wrong.

    But also: require interoperability with other platforms and a standardized profile data export format so people can leave Facebook but stay in touch with the people who still use it.
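    A standardized export could be as simple as a documented JSON schema. A hypothetical shape, invented here for illustration (the format tag and field names are not an existing standard, though ActivityPub actor/outbox objects cover similar ground in practice):

```python
import json

def export_profile(user):
    # Serialize a user's portable data to an assumed interchange format
    # so another platform could import contacts and post history.
    return json.dumps({
        "format": "profile-export/0.1",        # hypothetical version tag
        "handle": user["handle"],
        "contacts": sorted(user["contacts"]),  # handles the user follows
        "posts": user["posts"],                # text + ISO-8601 timestamps
    }, indent=2)
```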

    • lmmarsano@group.lt · +13/−1 · edited · 9 hours ago

      And improve parental controls for children’s accounts. As far as I know there’s nothing currently giving a “parent” account high-level control over a “child” account, but I’m happy to be corrected if I’m wrong.

      Parental controls already exist in every major OS, they suffice to restrict & monitor social media, and they go unused.

      A better solution might be for laws to give parents resources & incentives to supervise their children’s online activity (including training to use the tools they already have) & to give children education in online safety & literacy. Decades ago, federal courts, citing commission findings & studies, recommended these alternatives as more effective and as better meeting the government’s duties to minimize the impact on civil liberties, the allocation of law-enforcement resources, etc. In the permanent injunction against COPA, the judge wrote:

      Moreover, defendant contends that: (1) filters currently exist and, thus, cannot be considered a less restrictive alternative to COPA; and that (2) the private use of filters cannot be deemed a less restrictive alternative to COPA because it is not an alternative which the government can implement. These contentions have been squarely rejected by the Supreme Court in ruling upon the efficacy of the 1999 preliminary injunction by this court. The Supreme Court wrote:

      Congress undoubtedly may act to encourage the use of filters. We have held that Congress can give strong incentives to schools and libraries to use them. It could also take steps to promote their development by industry, and their use by parents. It is incorrect, for that reason, to say that filters are part of the current regulatory status quo. The need for parental cooperation does not automatically disqualify a proposed less restrictive alternative. In enacting COPA, Congress said its goal was to prevent the “widespread availability of the Internet” from providing “opportunities for minors to access materials through the World Wide Web in a manner that can frustrate parental supervision or control.” COPA presumes that parents lack the ability, not the will, to monitor what their children see. By enacting programs to promote use of filtering software, Congress could give parents that ability without subjecting protected speech to severe penalties.

      I also agree and conclude that in conjunction with the private use of filters, the government may promote and support their use by, for example, providing further education and training programs to parents and caregivers, giving incentives or mandates to ISP’s to provide filters to their subscribers, directing the developers of computer operating systems to provide filters and parental controls as a part of their products (Microsoft’s new operating system, Vista, now provides such features, see Finding of Fact 91), subsidizing the purchase of filters for those who cannot afford them, and by performing further studies and recommendations regarding filters.

      Adult supervision, child education on online safety & literacy, parental controls & filters are more effective at less expense to fundamental rights. Governments know this & conveniently forget it.

    • Dr. Moose@lemmy.world · +15/−1 · 11 hours ago

      What might actually help: hold the people who design these tools criminally liable. Everyone knows what they’re doing, but it’s hard to say no to your employer when the answer is “don’t worry, you’re not liable”, so everyone keeps on building the Torment Nexus.

      • deliriousdreams@fedia.io · +3/−1 · 5 hours ago

        Are you suggesting that we should be able to criminally prosecute people who build end-to-end encryption software and tools? Or algorithms that find people you may know? Because those seem to be key to the Meta lawsuit as far as they are involved. That, and the fact that Meta deliberately misled the public about the site’s safety for kids. Social media as it exists today isn’t really safe for children, and at best the people who should be held accountable for that are the executives who decided to lie.

        But your average programmer isn’t designing tools for the purpose of making kids less safe. They aren’t designing tools to be addictive, and they aren’t designing tools for predators. They happen to have designed tools used by predators because of flaws in the design, and because their executives found those flaws advantageous to the bottom line and played them up. Leaned in, if you will.

        It was literally part of the 2021 leak that they had discovered their algorithm had certain effects, and the C-suite literally went about making sure they could use that for monetary gain, keeping people on the site and scrolling. Not just young users, but users of all ages.

        The main thing is that it’s really easy to social engineer on a social media website where people are encouraged to give out all kinds of information that can be used against them in social engineering attacks. That, combined with the addiction fostered there and the Meta-owned encrypted chat apps used by much of the world en masse, is what created this situation.

        • Dr. Moose@lemmy.world · +3 · 3 hours ago

          There’s a difference between making an encryption tool and hiring top psychologists to design abusive systems.

    • RecallMadness@lemmy.nz · +2 · 11 hours ago

      Unfortunately, you can’t codify specifically how platforms work into law.

      But you could explicitly make companies liable for promoting “detrimental” content, then define “promoting” as something like “surfacing content to a user beyond the reach of the user’s immediate network, i.e. algorithmic suggestions or advertising”.
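      That definition of “promoting” boils down to a one-line test. An illustrative sketch (names are hypothetical; “immediate network” is taken here to mean the accounts a user follows):

```python
def is_promoted(author_id, followed_ids, is_ad=False):
    # "Promoted": content surfaced to a user from beyond their immediate
    # network, i.e. an algorithmic suggestion or an advertisement.
    # Under the proposed rule, liability would attach only when this is True.
    return is_ad or author_id not in followed_ids
```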

    • ExLisper@lemmy.curiana.net · +2/−1 · 11 hours ago

      What might actually help: only show people content from groups and people that they follow, preferably in chronological order, rather than suggesting new groups and pages algorithmically all the time and thereby increasing the likelihood of children interacting with strangers on the Internet.

      You would simply have big groups like “I ❤️ New Mexico” where people comment on the same posts and interact. If you limited all content, including comments and likes, to users someone personally follows, with no ability to discover other users, you would basically turn Facebook into WhatsApp. It would definitely solve the issue, but it would also make the platform look empty and kill it. Which wouldn’t necessarily be bad, but sadly killing Facebook is too radical for anyone to support.