• umami_wasabi@lemmy.ml · 6 months ago

    Thanks.

    I also had a brief read of the bill you linked and some relevant articles. The bill only cites “national security” yet never explains what national security threat TikTok actually poses.

    The Bloomberg article states a few reasons, but none of them convinced me that a ban is justified. For example, reason 1 points out that the feed-generating algorithm is advanced and intoxicating. So they should be punished for a well-written and effective algorithm?

    Yes, there is and was content on TikTok ranging from dumb to harmful. However, I think that is a content moderation issue, not a national security issue. I’ve heard people can find CSAM on Twitter and Discord; harmful and damaging as that is, should they be banned too over “national security” concerns? It just smells of unfairness.

    Just my two cents.

    Disclosure: I don’t use Facebook, Instagram, Twitter, or TikTok. I do have a Discord account.

    • abhibeckert@lemmy.world · 6 months ago (edited)

      They’re not worried about CSAM. They’re worried about TikTok users being influenced during an election campaign.

      And yes, it is a moderation issue. Specifically, the US doesn’t want the current moderation team to be in charge of moderation.

      Disclosure: I don’t use Facebook, Instagram, Twitter, nor TikTok

      To put it in perspective, about a quarter of the US population uses TikTok. Politics is a major discussion point there, and the political content you’re exposed to is selected by an algorithm that is opaque and constantly changing.

      It absolutely can be used to change the result of an election. And China has meddled in elections in the past (not least of all their own elections… but also foreign ones):

      “China has been interfering with every single presidential election in Taiwan since 1996, either through military exercises, economic coercion, or cognitive warfare, including disinformation or the spread of conspiracies”

      https://www.afr.com/world/asia/taiwan-warns-of-disturbing-election-interference-by-china-20240102-p5eunf

      • umami_wasabi@lemmy.ml · 6 months ago (edited)

        It’s not uncommon to see misinformation and fabricated information appear on many SNS platforms, including Facebook and Twitter. It’s also not unheard of for Russia to influence elections via popular US-based platforms. All SNS are subject to the same problem; TikTok simply has more active users and thus a further reach. But again, this is a content moderation problem, not an inherent fault of TikTok itself. Who performs content moderation is a business decision. It shouldn’t be dictated by law, though lawmakers can set moderation standards that companies must comply with. I think it’s a bit unfair to target only TikTok; any rule should apply universally.


        EDIT:

        political content you’re exposed to selected by an algorithm that is opaque and constantly changing

        Hasn’t TikTok opened access to its algorithm for review?

        Actually, it is not solely a content moderation problem. While some dumb and physically harmful content should be subject to moderation, speech should be protected. Isn’t America all about the word “Freedom”? People should be free to say what they believe, right?

        However, the recommendation algorithms might need some regulation that categorizes content and applies relevant display policies. For example, political content, whether user-generated or advertising, could be distributed equally across all viewpoints (i.e., a user would see content for all candidates for roughly the same amount of time). The “addictive” part shouldn’t be regulated, as that is the whole point of the algorithm: maximizing user engagement. However, there could be a rating system, similar to game ratings, that governs who at what age can use which platform. Otherwise, people should be free to get addicted to something, as long as it doesn’t cause physical harm to themselves or others.

    • atrielienz@lemmy.world · 6 months ago (edited)

      I have a question: how would it be moderated, and by whom? In an age where the War Thunder forums see a leak of classified info practically monthly, and the US is increasingly losing the cybersecurity war because people can’t do simple things like not plug random USB drives they found on the side of the road into their work computers, I don’t really understand why it’s hard to believe TikTok is a threat to national security.

      The permissions it asks for on your phone are kind of a red flag, specifically access to the camera and microphone. With the app effectively controlled by the CCP (as most successful Chinese businesses are), it is absolutely trivial for them to gather information “anonymously” about their users, de-anonymize it, and then target those users with anything and everything, including pro-CCP propaganda. That alone is reason enough for me to understand why federal employees aren’t allowed to use TikTok on any federal device (work phones and computers, for instance).

      I don’t necessarily think forcing them to sell to another entity will fix the problems with TikTok. I think this bill is intended as a “solution” to placate people, mostly because it doesn’t seem to have been written by people who understand the technology. But I also wouldn’t say that TikTok is harmless or blameless.

      Why does TikTok need to gather information about what banking apps I use? What healthcare apps I use? Why does it need my GPS location? Why can it collect this data without my consent? Why and how does it collect information on people who don’t use TikTok? Who have never used it?

      On top of that, TikTok got caught spying on reporters with the intent to track down their sources. That’s terrifying.

      https://www.welivesecurity.com/2023/03/24/what-tiktok-knows-you-should-know-tiktok/

      • abhibeckert@lemmy.world · 6 months ago (edited)

        How would it be moderated and by whom?

        That would be easier to answer if we had a list of companies that can afford to buy it (that’s a short list) and are also willing to buy it (an even shorter list).

        I don’t necessarily think forcing them to sell to another entity will fix the problems

        Sure; it obviously depends on who buys it. Elon Musk, for example, would probably be a bad steward.

        But what about Alphabet? That might not be so bad. As a fan of YouTube, I’d love to see the “shorts” feature killed off and all that content moved to a separate service where I can go the rest of my life without ever seeing a short repeating video.

        Whoever buys it: if the US can force TikTok to be sold once, they can do it again if the buyer also proves problematic.

        • atrielienz@lemmy.world · 6 months ago (edited)

          The invasion of privacy is bad regardless of who does it, though. This bill isn’t about protecting consumer privacy; it’s about sticking it to the CCP. Alphabet should also be considered a company that needlessly invades the privacy of its users, and laws should be made to protect those users. Just because TikTok is worse doesn’t mean any other company doing this isn’t bad.

          I’ll also say that YouTube Shorts views pay established creators significantly more than TikTok views do. I would rather the creators I enjoy get paid decently. Not that YouTube doesn’t have a lot of problems and anti-creator policies of its own, but $0.04 per 1K views is a lot worse than $18.00 per 1K views.