The investigation looked at what content from political parties and politicians is shown to newly created accounts that are interested in parties and politicians from either the left or the right.
It found that the algorithms push content from the far right to the right-leaning and left-leaning accounts and, to a lesser extent, push content from the radical left to left-leaning accounts.



Well, duh. Antisocial media needs people to keep consuming in order to generate ad revenue. Currently the far right are the trending clowns, so that’s one explanation. Another is that twatter has always been a cesspool, while tiktok just shows whatever’s popular among brain-damaged individuals.
Jokes aside, this might as well be a demo of the multiple testing problem: run enough comparisons and some will look significant by chance alone. So implying that algorithmic feeds in general behave this way is premature (not that the article does this; just a note).
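To illustrate the multiple testing point: a quick simulation (my own toy sketch, not anything from the article) where every comparison is between two groups drawn from the exact same distribution, yet a chunk of them still come out "significant" at the usual 5% level.

```python
import random

random.seed(0)

def fake_experiment(n=100):
    """One null comparison: two groups from the SAME distribution,
    so any 'significant' difference is pure noise."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    mean_diff = sum(a) / n - sum(b) / n
    # Crude z-style test: each group mean has variance 1/n,
    # so the standard error of the difference is sqrt(2/n).
    se = (2 / n) ** 0.5
    return abs(mean_diff / se) > 1.96  # "significant" at roughly 5%

trials = 1000
false_positives = sum(fake_experiment() for _ in range(trials))
print(f"{false_positives} of {trials} null comparisons look 'significant'")
```

Roughly 50 of the 1000 hits are expected to be spurious; test one comparison and you're probably fine, test hundreds of account/party/platform combinations and some "findings" are guaranteed.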
P.S.: not defending current algorithmic feeds either; I’ve personally all but cut those out of my life for good, and I advocate for white-box approaches instead, like what’s done in reccomend_from_archive, which recommends tracks based on AA’s spotify scrape data and your listening history/playlist exports.
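The white-box idea in a nutshell: a transparent scoring rule you can read and audit, instead of an opaque engagement optimizer. This is my own minimal sketch, not the actual reccomend_from_archive code; the (track, artist) tuples and play-count scoring are assumptions for illustration.

```python
from collections import Counter

def recommend(archive, history, k=3):
    """Rank archived tracks by how often their artist appears in the
    user's listening history. archive: list of (track, artist) pairs;
    history: list of artist names played. Fully inspectable: the score
    of every track is just a play count you can print and check."""
    plays = Counter(history)
    scored = sorted(archive, key=lambda ta: plays[ta[1]], reverse=True)
    return [track for track, artist in scored[:k]]

archive = [("Song A", "Artist X"), ("Song B", "Artist Y"),
           ("Song C", "Artist X"), ("Song D", "Artist Z")]
history = ["Artist X", "Artist X", "Artist Y"]
print(recommend(archive, history))  # → ['Song A', 'Song C', 'Song B']
```

The point isn’t sophistication, it’s that there is no hidden objective: you can see exactly why a track was surfaced, which is the whole contrast with engagement-driven feeds.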