• wewbull@feddit.uk · +9 · 5 hours ago (edited)

    The fallback argument for the social media ban is that it’s better than nothing. But with results like these, it may be worse than nothing, given it potentially creates new problems. Children will remain online with arguably less supervision and support, new privacy and digital security vulnerabilities seem to have appeared and the worst aspects of social media lay largely unaddressed.

    I wish more people understood this. Changing something can mean you’ve caused harm unintentionally, even if you haven’t identified it yet. Too many people seem to have the thought process “We have to do something! This is something. Let’s do this.” without ever considering the harm they might do.

  • Jimbel@lemmy.world · +14 · 8 hours ago

    The addictive design of platforms, software and algorithms should be addressed, not the users’ age.

    And the tech companies should be held responsible for designing healthier platforms.

    The problem is the design of the tech, not the people using it.

    • coolmojo@lemmy.world · +2 · 5 hours ago

      But without the addictive design the users don’t spend enough time to see all the ads and tracking required to reach the target growth. Somebody think of the shareholders /s

  • melsaskca@lemmy.ca · +11/-3 · 7 hours ago

    Censorship is never the answer. Teaching values, and the ethics and morals that come with them, is closer to the answer. A world where you burn down shit just to get a job as a firefighter makes this path a bit more difficult to follow.

    • Reviever@lemmy.world · +4/-1 · 5 hours ago

      Censorship was never their intention. So they couldn’t give any less fucks. They just want to control us.

    • UnderpantsWeevil@lemmy.world · +2/-1 · 5 hours ago

      Censorship is never the answer.

      https://en.wikipedia.org/wiki/Paradox_of_tolerance

      Formally banning certain forms of vulgar and bigoted expression establishes a code of conduct for the community, even if the bans aren’t strictly enforced.

      Teaching values and the corresponding ethics and morals that come with it is closer to the answer.

      Morality is as much about the proactive and affirmative pursuit of justice as it is about internalized codes of conduct.

      If there is no social consequence for immoral behavior, there is no reason to believe the act is immoral.

    • wewbull@feddit.uk · +1/-1 · 5 hours ago

      WHAAAAT?!?! Educating people is better than telling them what to do?

  • blind3rdeye@aussie.zone · +13 · 8 hours ago

    I’ve talked to heaps of parents and heaps of kids about this. What I think is interesting is that, face-to-face, people seem to be generally supportive of the law. They say that social media is problematic, and that the law helps by discouraging its use. A few different kids have said that it helps them break an addiction. Other kids say they don’t care, because it hasn’t blocked them. So mostly positive or neutral responses when face-to-face.

    But every time I see this mentioned on the internet, it’s very negative. There are always heaps of comments saying that it is a failure, could never work, and that the government is stupid; and there are often other comments saying it is part of a secret plan for the government to track us or whatever. In any case, mostly negative views - with just a sprinkling of fairly neutral views such as “it hasn’t been active for very long. Let’s wait and see.”

    I just think that’s interesting. I guess my real-world social circles don’t totally match my internet social circles.

    • emmy67@lemmy.world · +7/-1 · 6 hours ago

      Kids will often just repeat what they’ve heard to adults.

      But the largest problem with these laws is the way they affect minority groups. If followed, the law would disproportionately affect disabled and queer teens who may suddenly be unable to access help and community.

      I suspect there’s some selection bias in the kids you’re speaking to.

    • oortjunk@sh.itjust.works · +6/-2 · 7 hours ago

      Or, the internet, the same medium upon which the noisome roots of social media depend, has some induced self-selection bias for increasing connectivity. It’s basically behaving like a weird superorganism and advocating for conditions to make it grow. At, I might add, the expense of the host species.

  • GreenBeanMachine@lemmy.world · +4/-1 · 8 hours ago (edited)

    Get ready for even more surveillance, censorship and restrictions. That’s all they know about how to fix problems - bandaids to hide symptoms instead of addressing the root cause of issues.

    Perhaps this was always the plan. Introduce a law for “protecting children” knowing it won’t work as it stands, so it will be easier to introduce even more surveillance and restrictions to fix the current law.

    All in the name of protecting children. How can you be against it? /s

  • arcine@jlai.lu · +3 · 8 hours ago

    Support public education about the risks of social media, and better parental control software! That is the only way to actually fix this mess.

  • deathbird@mander.xyz · +34 · 15 hours ago

    Key point: “Ultimately, the fundamental problem with age-gating is that it fails to address any of the root problems with our current online landscape – that is, the extractive business models and pernicious design features of mainstream tech companies. We all exist in a highly commercialised information ecosystem, rife with algorithmically amplified misinformation, scams, harmful content and AI slop. Children are particularly vulnerable to these issues but the reality is that it impacts everyone, even if you’re blissfully absent from Facebook or Instagram.”

    • imjustmsk@lemmy.ml · +8 · 14 hours ago

      They don’t wanna solve the root problem; they just want to make the big tech companies happy, as well as the people who are saying shit about social media. Age verification is their stupid answer, which translates to “We don’t give a flying shit about kids”.

  • daannii@lemmy.world · +10/-3 · 13 hours ago (edited)

    Right it’s going to take longer than a few months to enforce properly and undo the damage and protect new generations from its negative effects.

    At least it’s a start.

    • fodor@lemmy.zip · +10/-4 · 12 hours ago

      Or maybe it’s never going to work, because you can’t enforce it properly when the parents don’t want it enforced. And the damage you’re talking about isn’t backed by science as solid as you’d want before passing a law like this.

      But many people are of the mindset of “oh my God, we have to do something, and this is something, and therefore it’s better than nothing” - and they’re wrong. If you don’t have a good plan, that doesn’t make your bad plan reasonable.

      • Neocorporation@lemmy.world · +3/-2 · 9 hours ago

        So your solution is to do nothing until there is a good plan. What is a good plan? How do we measure if something is a good plan before implementing it? Especially on the scale of moderating the internet… And what would your good plan be?

        • bigmamoth@lemmy.world · +1/-1 · 7 hours ago

          Children are the responsibility of parents. Enforcing parental controls on devices, as is done in Europe for example, is far more useful than giving this responsibility to platforms that have no financial interest in doing it. You can also reasonably run internet education courses, the same way some do about drugs or sex.

          • Neocorporation@lemmy.world · +1 · 2 hours ago

            Unfortunately, I think parents have by and large failed to parent this aspect of a child’s life. Do we continue to trust parents when they are so clearly failing? I suppose education is the long-term answer, but I’d rather just remove the ability for kids to access such harmful content.

  • Baggie@lemmy.zip · +18 · 16 hours ago

    This and the porn thing have been massively invasive in terms of privacy. It’s so transparently just building a database of facial data. It doesn’t even make an attempt to comprehensively block everything on the internet, or realistically enforce compliance.

  • M0oP0o@mander.xyz · +8/-3 · 14 hours ago

    What? There is an immense amount of joy in “I told you so”. The majority of people warned them this was a stupid idea, and now you want to piss on the good feeling of smugly and correctly calling it a failure? Fuck off.

    • Psythik@lemmy.world · +32 · 20 hours ago (edited)

      Similar thing happened where I live with porn. Recently passed a law requiring ID. Instead of complying, I just started going to different websites. No way am I giving up my identity to a sketchy porn site, no matter what the law says.

    • BygoneNeutrino@lemmy.world · +9/-28 · 24 hours ago

      I still think it’s a step in the right direction. Once you make it illegal for children to use social media, you can start going after the platforms for knowingly manipulating children.

      • evilcultist@sh.itjust.works · +48/-1 · 22 hours ago (edited)

        Or we can just go after the platforms for knowingly manipulating everyone. And for their invasive data collection. This is probably one reason why Meta spent more on lobbying (primarily for age verification) than Boeing and Lockheed Martin did on lobbying last year. Once the kids are identified, no one gives a shit about the adults so the problem (for them) just fades away.

        • BygoneNeutrino@lemmy.world · +7/-14 · 23 hours ago (edited)

          Prohibition is effective; it just doesn’t work for easy-to-manufacture compounds such as alcohol or marijuana. Every known human culture has independently discovered alcohol, and marijuana is a weed that is ready to smoke in its natural form.

          As far as social media goes, my country has reached a point where TikTok and Facebook are preinstalled on every phone. If a parent buys their kid a phone and removes them, they will reinstall themselves after an automatic update. When you take into consideration the “streamlined” registration process, one can argue this is a means to target prepubescent children.

          …I guess an 8 year old could download a VPN and steal their parents identification, but I feel like some form of prohibition would help.

          • Alwaysnownevernotme@lemmy.world · +17 · 23 hours ago

            So you not only create a grey market, you immediately inculcate children into it.

            Prohibition is generally ineffective for anything that doesn’t involve violating someone else’s rights.

            If we’re talking about getting rid of slopware, I’m all for it. But this law, and other laws like it, are an incredibly thinly veiled attempt to silence dissent by tying people’s online comments to their employment, and subsequently their housing and healthcare.

            And I will never believe that this is done out of concern for children.

  • Fluffy Kitty Cat@slrpnk.net · +161/-1 · 1 day ago

    It was never designed to protect children

    Glad to see it’s not even working. Let’s keep fighting against these evil laws.

    • expr@piefed.social · +61/-12 · 1 day ago

      I mean, social media should be banned for everyone, not just teenagers. It’s a great evil in the world today, and a functional democracy that wasn’t braindead would ban it outright for the mass harm and destruction it has caused.

      That being said, I fully understand that the motivations of countries for these kinds of bans have little to do with the harm of social media and are much more about surveillance.

      • arcine@jlai.lu · +3 · 8 hours ago

        Do you realize you posted this very comment on social media? Do you think they should ban the fediverse as well!?

      • Link@rentadrunk.org · +19 · 23 hours ago

        Which type of social media are we referring to here?

        Doesn’t Lemmy count as social media?

        • nguarracino@programming.dev · +7/-1 · 18 hours ago

          There’s a list of 10 or 12 social networks that are banned: YouTube, Instagram, TikTok, etc.

          Lemmy is still legal.

          • Grainne@lemmy.dbzer0.com · +14/-4 · 17 hours ago

            Lemmy is legal because it’s too small for them to notice.

            And YouTube is an incredible resource for finding information. It’s not social media at all.

      • Virtvirt588@lemmy.world · +4 · 20 hours ago

        I agree, social media is harmful for all, no matter the age. We shouldn’t keep segmenting and disenfranchising individuals solely because they’re “inferior” based on age or any other discriminatory factor. The thing is, who is the victim and who is the abuser in this case? Because the situation at hand looks like the victims being punished for the wrongdoings of the abuser.

        This is where we are at: the corporations flipped the script, and we as a society gulped it all down, tightening the handcuffs around the wrong hands.

        But beside the point, regarding the logic within your statement: who are you trying to ban here? You mention both “everyone” and “them”, which makes it ambiguous and introduces a double meaning.

      • yardratianSoma@lemmy.ca · +9 · 1 day ago

        It’s so bonkers how most of the older generations agree that being on the internet cannot make you social, yet it became the default method of communication.

        Ban it for everyone? I mean, lemmy itself is a social network platform, if you want it to be. But I know what you mean: social media being the most used platforms, Google, Facebook, Tik-Tok, etc . . . And for that, yeah, I do agree with a full ban. We need a cultural reset, where we aren’t being fed sensationalist bullshit and pure brainrot as entertainment via an algorithm trained on our insufficient capacity to regulate our attention.

        • Dave@lemmy.nz · +24 · 23 hours ago

          In my view social media is probably not the problem, but the algorithms they use that are designed to be addictive and manipulative.

          I saw an article once arguing that the algorithms should be regulated in a similar way to medicine. Give some base ingredients they can use freely (e.g. sort by newest first), then require any others to run studies to prove they are not harmful.

          There would be an expert board that approves or declines the new algorithm in the same way medicines are approved today (the important bit being that they are experts, not politicians making the decision).
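          That “base ingredients” idea can be sketched in a few lines. A minimal, hypothetical illustration (the Post type and its fields are invented for this sketch, not taken from any real platform), assuming “sort by newest first” is the one pre-approved default:

```python
from dataclasses import dataclass

# Hypothetical post record; the fields are invented for this sketch.
@dataclass
class Post:
    author: str
    created_at: float  # unix timestamp

def chronological_feed(posts):
    """The pre-approved 'base ingredient': newest first, no engagement signals."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

feed = chronological_feed([
    Post("alice", 100.0),
    Post("bob", 300.0),
    Post("carol", 200.0),
])
print([p.author for p in feed])  # → ['bob', 'carol', 'alice']
```

          Anything beyond this default - engagement weighting, personalisation - would, under the proposal, need studies and board approval before deployment.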

          • Instigate@aussie.zone · +8 · 11 hours ago

            This is the correct response. Social media, as a construct, is not evil and does not harm anyone. The commodification and commercialisation of social media by capitalistic companies is what has caused the harm we see today.

            All of the harms and evils of social media can be boiled down to a single concept: the algorithm. Because algorithmic recommendation of content wants to encourage people to stay on a platform (for capitalistic reasons), and the most enticing and attention-grabbing content is hate-content, these companies have forced hate-inducing concepts down the throats of people in an endeavour to make more money and destroyed individuals and families/friends in the process.

            If we regulate the algorithms, we regulate the harm without disempowering anyone. We can, and we should, regulate algorithms on social media to turn it back into what it was 20-odd years ago - a measure to keep in touch with people you know or care about.

            • Rekorse@sh.itjust.works · +2 · 7 hours ago

              Social media does cause harm. It tricks you into thinking you are socializing with those near you when you aren’t. It tricks you into thinking people are talking in good faith, similar to in person communication. Finally, social media is a huge attack vector for scams and abuse due to the anonymity and ability to connect anywhere in the world.

              All of these things produce an overwhelmingly negative social experience from social media. That wouldn’t be a problem if our defining trait wasn’t how we socialize in groups. Socializing is as important as water and food for humans.

          • richmondez@lemdro.id · +4 · 10 hours ago

            I wish I saw this kind of insightful point of view more often in the discourse over social media. It stopped being about being social once algorithmic content curation became the norm to drive engagement and advertising money, which is the real evil.

        • expr@piefed.social · +7/-1 · 1 day ago

          If you take such a broad definition of social media, then nearly the entire Internet becomes “social media” and the term loses its meaning, IMO.

    • Lodespawn@aussie.zone · +20/-46 · 1 day ago

      I don’t think they are evil. A bunch of people with good intentions who didn’t understand the problem are trying to solve it with a gut feeling rather than analysis and evidence. It’s really disappointing that they would waste so much of our time and money like this.

      • Scotty_Trees@lemmy.world · +16 · 1 day ago

        Former Facebook higher-ups have gone on the record to say that Facebook uses destructive algorithms to keep people hooked; they know exactly what they are doing and don’t care how it affects us, as long as they can squeeze more info from us for more profit. Thinking Silicon Valley tech billionaires actually care about you? Bro, you need to wake up.

        • Lodespawn@aussie.zone · +6/-5 · 1 day ago

          We’re talking about Australian legislation not social media itself. The problem is real, the legislation is ineffective and poorly implemented. Calling the legislation evil is a stretch. Modern social media is most certainly evil.

      • [deleted]@piefed.world · +37/-5 · 1 day ago

        There is no problem to solve that hasn’t already been addressed with parental controls.

        • BrianTheeBiscuiteer@lemmy.world · +9/-2 · 1 day ago

          If the parental control comes from the social media site itself then it’s likely the parent that’s being controlled. The most important control is limiting screen time and not every site allows parents to set hard limits.

        • Lodespawn@aussie.zone · +15/-8 · 1 day ago

          There is a problem with social media addiction but the solution isn’t restricting teens from it. The solution, as with most things, is education. Educating the kids, educating their parents and making sure they both have the tools available to them to make smart decisions.

          • Rekorse@sh.itjust.works · +2/-1 · 7 hours ago (edited)

            Surely there’s always time for both the kids and parents to set aside to learn every new technology and the appropriate controls to restrict them.

            And when someone says it makes far more sense to just not give their kids said technology until they are older, we have people like you arguing on behalf of the technology.

            Aww, poor little Facebook, we don’t want to hurt its fee-fees! Let’s just give it another chance! I’m sure we can trust it this time if we just learn how to use it right!

            • Lodespawn@aussie.zone · +1 · 6 hours ago

              How am I arguing on behalf of the technology? I want people to understand the technology so they know how to protect themselves effectively if they use it and so can make effective decisions on how their kids interact with it.

              The technology sucks, but the technology is not going away and any fucking moron can bypass the age verification. If you think age verification is stopping teenagers from using tiktok then you’re an idiot. I’m arguing that the implemented solution is not in fact anything close to a solution, and that pulling this thread and trying to implement something in the same vein that would actually work is a terrible idea because it fucks the privacy of every Australian on the internet, even more so than the current solution.

              • Rekorse@sh.itjust.works · +1 · 5 hours ago

                Maybe, just maybe, parents shouldn’t let their kids into awful places, and awful places shouldn’t let kids in. Turns out parental controls are bullshit, and the only real answer is RTFM or don’t use it at all. A business and its customers don’t exist without each other, so the blame is on both sides.

                All that said, the government definitely could help the parental understanding and controls side of things too, using regulation.

          • supamanc@lemmy.world · +4/-2 · 19 hours ago

            That’s a bit like saying ‘there is a problem with smack/nicotine/alcohol addiction, but the solution is not restriction, it’s education’. You can educate all you want, but very clever people make a lot of money by saying ‘fuck your education’.

            • Amnesigenic@lemmy.ml · +4/-1 · 16 hours ago

              Drug prohibition has also historically not worked out very well for anyone except prison industry shareholders.

              • a_gee_dizzle@lemmy.ca · +2 · 6 hours ago (edited)

                But we still prohibit children from having drugs. Legal drugs (alcohol, nicotine, cannabis) are illegal to sell to children, even though we can legally sell them to adults.

            • Lodespawn@aussie.zone · +1 · 19 hours ago

              Restriction is fine if it’s a workable solution, but this one is not and anyone with half a brain could see that from its very first announcement.

              • supamanc@lemmy.world · +2 · 19 hours ago

                But the idea that we just need education is ridiculous. It’s the exact defence that the social media platforms espouse, because they know it’s bullshit and ineffective.

          • [deleted]@piefed.world · +12 · 1 day ago

            Limiting total time spent on something is one of the parental control options. It isn’t just blocking things 100%.

            • Lodespawn@aussie.zone · +5/-5 · 1 day ago

              Just having parental controls exist isn’t an effective solution. Well implemented education is required to ensure it is used effectively.

            • Lodespawn@aussie.zone · +10/-5 · 1 day ago

              No but you can educate their support networks and build other systems to help them work through their addiction.

              • Rekorse@sh.itjust.works · +3/-1 · 7 hours ago

                Or we could focus on preventing the addiction to begin with.

                Great examples include making people wait until adulthood to smoke nicotine or cannabis, or to drink alcohol.

                • Übercomplicated@lemmy.ml · +1 · 6 hours ago

                  Great examples include making people wait until adulthood to smoke nicotine or cannabis, or to drink alcohol.

                  I mean, I agree with you, but high school is a thing… these laws are basically useless, to my knowledge. I think about 50% of my grade had smoked weed by tenth grade, and half again were addicted to nicotine by 12th. The only reason I didn’t fall victim (as many of my friends did) is because I was educated by my parents, from an early age, about addiction and these substances. I never even tried them, because I knew better, and thus never got addicted.

                • Lodespawn@aussie.zone · +2/-1 · 6 hours ago

                  You say that, but evidence shows it’s not a working solution. It’s a piece of legislation that doesn’t achieve anything close to the desired outcome of stopping a significant number of people under 16 from accessing social media. Further, there isn’t an actual way to make this work without banning VPNs and implementing a Chinese-style great internet filter.

                  Nicotine, cannabis and alcohol are all banned in Australia for under-18s, and you are kidding yourself if you think that has had any significant impact on stopping under-18s from getting their mitts on them.

        • MurrayL@lemmy.world · +8/-5 · 1 day ago

          The issue with this argument is that many kids don’t have good parents, and some don’t have any parents at all.

          Are those kids just supposed to be left to the mercy of bad actors because of their circumstances?

          • Lodespawn@aussie.zone · +11/-1 · 1 day ago

            The current solution just cuts those at risk kids off from all modern support networks.

          • [deleted]@piefed.world · +10/-3 · 1 day ago

            Guess we just let for-profit companies and authoritarian states suck up all the data on everyone, whether it works or not, then.

            • MurrayL@lemmy.world · +6/-4 · 1 day ago

              I didn’t say I approve of the current tactics; I’m just pointing out that circumstances can be more complex than simply saying ‘let the parents sort it out’ and leaving it at that.

        • bobzer@lemmy.zip · +10/-9 · 1 day ago

          If you set parental controls on your own teen’s device, all you’re doing is isolating them from their peers and making them the kid with the weird parent who doesn’t let them post on TikTok.

          Social media isn’t what it was when we were growing up. It’s designed to prey on them the same way slot machines create gambling addictions.

          I’m no puritan but I do truly believe banning kids from social media and restricting teens at a legislative level would be a net benefit for society. Same as alcohol or drugs.

          • some_kind_of_guy@lemmy.world
            link
            fedilink
            English
            arrow-up
            5
            arrow-down
            2
            ·
            1 day ago

            Prohibition didn’t work for drugs either, so why would it work here? Why do we need to learn that lesson over and over again?

            • bobzer@lemmy.zip
              link
              fedilink
              English
              arrow-up
              3
              arrow-down
              3
              ·
              21 hours ago

              Prohibition didn’t work for drugs either

              I didn’t realize it was common for 14 year olds to drink alcohol and take heroin where you’re from…

              • some_kind_of_guy@lemmy.world
                link
                fedilink
                English
                arrow-up
                4
                arrow-down
                1
                ·
                20 hours ago

                Underage drinking is still more common than it should be, despite strict laws. The point is, it doesn’t do any good to go after the consumer, regardless of age. In order to make a meaningful impact, legislation would have to destroy or significantly neuter social media companies altogether, globally. Anything else will be a disappointment.

                The more effective way to reduce these harms is through social/cultural change, but that’s easier said than done.

                • bobzer@lemmy.zip
                  link
                  fedilink
                  English
                  arrow-up
                  3
                  ·
                  edit-2
                  17 hours ago

                  Underage drinking is still more common than it should be,

                  Sure, but it’s significantly lower than legal drinking.

                  We as a society acknowledge the harm of underage drinking, so prohibition is effective. Prohibition of adult drinking was puritan bullshit the majority didn’t agree with, so it didn’t work.

                  I think you’d find a majority of parents agree social media is shit, but they’re unwilling to isolate their child. In this case prohibition would be effective.

          • [deleted]@piefed.world
            link
            fedilink
            English
            arrow-up
            4
            arrow-down
            3
            ·
            1 day ago

            Limiting total time spent on something is one of the parental control options. It isn’t just blocking things 100%.

      • Afaithfulnihilist@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        10
        arrow-down
        1
        ·
        1 day ago

        Good intentions without the spirit of cooperation or respect for consent is still evil.

        The main problem with all of these internet surveillance tools being marketed as ways to protect children is that people are engaging with them on that basis.

        As far as I’m concerned they haven’t done anything to establish that they actually intend to protect children or that this is a reasonable way to do it. This seems like a solution to a different problem that ignores all of the problems it creates.

        Parents should be responsible for their children. A random website creator shouldn’t have to be responsible for your children.

        Websites aren’t stores where people walk in off of a public street. They are services that people reach out to and engage with specifically and intentionally. If we can address the non-consensual, unintended side of internet tracking and surveillance, a lot of this stuff goes away. So maybe rather than regulating the website to protect your children, we should be regulating the website to protect consent.

        • Lodespawn@aussie.zone
          link
          fedilink
          English
          arrow-up
          3
          arrow-down
          4
          ·
          1 day ago

          I don’t agree that the legislators left out the spirit of cooperation or respect for consent because they are evil; I think they left them out because they are ignorant. They are inexperienced with both technology and social media, and they failed to appropriately engage people who might have helped them come up with a functional solution rather than an ineffective brute-force approach.

          I do however agree with everything else you’ve said above.

  • gurty@lemmy.world
    link
    fedilink
    English
    arrow-up
    114
    arrow-down
    1
    ·
    1 day ago

    ‘…internally the government was aware of a lack of evidence to support the ban before they passed the legislation anyway’

    Terrific job, gov.

    • Australis13@fedia.io
      link
      fedilink
      arrow-up
      48
      ·
      1 day ago

      Our government is usually technologically inept.

      The first online census (2016) crashed the system because they didn’t allow enough capacity. Anyone with half a brain could have told them that most people were going to try to use it during one particular time – after dinner (especially since the paper census is supposed to count everyone on that particular night). Instead, they decided to rate it for only 1 million form submissions per hour, despite estimating that two-thirds of Australians would fill it out online. At one submission per household, that’s around 4 million online submissions. Now factor in that the eastern states have most of the population (and are all in the same time zone at that time of year) and, predictably, the site went down after dinner on census night.

      https://www.abc.net.au/news/2016-08-09/abs-website-inaccessible-on-census-night/7711652
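      For what it’s worth, the back-of-envelope maths in that comment can be sketched in a few lines. The figures here are rough assumptions based on the comment itself (population, household size, peak window), not official ABS numbers:

      ```python
      # Rough 2016 census capacity estimate, per the comment above.
      population = 24_000_000        # approx. Australian population in 2016 (assumption)
      online_share = 2 / 3           # government's own estimate of online completion
      avg_household = 4              # one form per household; rough household size (assumption)

      forms = population * online_share / avg_household   # ~4 million online forms

      capacity_per_hour = 1_000_000  # the rated submission capacity
      peak_window_hours = 2          # most households submit after dinner, same time zone (assumption)
      peak_rate = forms / peak_window_hours               # ~2 million/hour, double the rated capacity

      print(f"forms: {forms:,.0f}, peak rate: {peak_rate:,.0f}/hr vs capacity {capacity_per_hour:,}/hr")
      ```

      Even with generous assumptions, the expected peak load lands at roughly double the rated capacity, which is the comment’s point: the failure was predictable from simple arithmetic.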

  • commander@lemmy.world
    link
    fedilink
    English
    arrow-up
    34
    arrow-down
    3
    ·
    1 day ago

    They’re propaganda laws. Internet censorship laws. The Palestinian genocide started trending on social media, and suddenly countries across the West wanted to start banning or controlling social media. Add to that the earlier push by Facebook to ban TikTok, trying to ladder-pull the market away from competitors.