Feed algorithms are widely suspected to influence political attitudes. However, previous evidence from switching off the algorithm on Meta platforms found no political effects. Here we present results from a 2023 field experiment on Elon Musk’s platform X shedding light on this puzzle. We assigned active US-based users randomly to either an algorithmic or a chronological feed for 7 weeks, measuring political attitudes and online behaviour. Switching from a chronological to an algorithmic feed increased engagement and shifted political opinion towards more conservative positions, particularly regarding policy priorities, perceptions of criminal investigations into Donald Trump and views on the war in Ukraine. In contrast, switching from the algorithmic to the chronological feed had no comparable effects. Neither switching the algorithm on nor switching it off significantly affected affective polarization or self-reported partisanship. To investigate the mechanism, we analysed users’ feed content and behaviour. We found that the algorithm promotes conservative content and demotes posts by traditional media. Exposure to algorithmic content leads users to follow conservative political activist accounts, which they continue to follow even after switching off the algorithm, helping explain the asymmetry in effects. These results suggest that initial exposure to X’s algorithm has persistent effects on users’ current political attitudes and account-following behaviour, even in the absence of a detectable effect on partisanship.

  • Schwim Dandy@piefed.zip · 60 points · 24 hours ago

    This is such a Captain Obvious take. Musk publicly and not at all sneakily modified Grok to do things like give him accolades, be more anti-woke and sycophantic towards Trump and cronies. Of course he’s manipulating the platform to fall in line with his personal views.

  • gravitas_deficiency@sh.itjust.works · 13 points · 22 hours ago

    YouTube too, tbh. I’ve been getting absolute shitloads of ivermectin and conservative Christian and general right-leaning/pushing ads (including shit for fucking turning point USA) in the last year or so

    • atopi@piefed.blahaj.zone · 1 point · 18 hours ago

      I've gotten recommended a lot of left-wing content on YouTube and no right-wing content (I can't say for ads since I use an adblocker)

      • Mikina@programming.dev · 1 point · 4 hours ago

        The algorithm is probably made to maximize the time you spend on the platform, and it's really good at it. (I mean, just look at how good ML algorithms are at text -> picture, and consider that the algorithm doing your info -> engagement has decades of data and training on billions of people.)

        My theory is that it has become misaligned: it turned out that radicalizing people into right-wing bullshit glues them to the social network very effectively, so it just started doing that. It makes sense, too. Once you start spewing right-wing bullshit, it will probably isolate you from your IRL friends, you'll have an echo chamber on the social network, and it's made to sound like some kind of deep truth no one else knows.

        You getting left-wing content might simply be because it wouldn't be efficient to try to convert you, so the algorithm is trying something else that's more effective on the (minority?) of people like you.

  • toiletobserver@lemmy.world · 3 points · 21 hours ago

    I used to be a Democrat, but now i think dropping bombs on kids is ok because i need oil for my freedom-mobile. Thanks x!

    /s