TikTok, a widely used social media platform with over a billion active users worldwide, has become a key source of news, particularly for younger audiences. This growing influence has raised concerns about potential political biases in its recommendation algorithm, especially during election cycles. A recent preprint study examined this issue by analyzing how TikTok’s algorithm recommended political content ahead of the 2024 presidential election. Using a controlled experiment involving hundreds of simulated user accounts, the study found that Republican-leaning accounts received significantly more ideologically aligned content than Democratic-leaning accounts, while Democratic-leaning accounts were more frequently exposed to opposing viewpoints.
TikTok has become a significant force among social media platforms, boasting over a billion monthly active users worldwide and 170 million in the United States. It has also emerged as a major source of news, particularly for younger demographics. This has raised concerns about the platform’s potential to shape political narratives and sway elections.
Despite these concerns, there has been limited research investigating TikTok’s recommendation algorithm for political biases, especially in comparison to the extensive research on other social media platforms like Facebook, Instagram, YouTube, X (formerly Twitter), and Reddit.
“We previously conducted experiments auditing YouTube’s recommendation algorithms. This study, published in PNAS Nexus, revealed that the algorithm showed a left-leaning bias in the United States,” said Yasir Zaki, an assistant professor of computer science at New York University Abu Dhabi.
“Given TikTok’s widespread popularity, particularly among younger demographics, we sought to replicate this study on TikTok during the 2024 U.S. presidential elections. Another motivation was that concerns over TikTok’s Chinese ownership led many U.S. politicians to advocate for banning the platform, citing fears that its recommendation algorithm could be used to promote a political agenda.”
To examine how TikTok’s algorithm recommends political content, the researchers designed an extensive audit experiment. They created 323 “sock puppet” accounts (fake accounts programmed to simulate user behavior) across three politically diverse states: Texas, New York, and Georgia. Each account was assigned a political leaning: Democratic, Republican, or neutral (the control group).
The experiment consisted of two stages: a conditioning stage and a recommendation stage. In the conditioning stage, the Democratic accounts watched up to 400 Democratic-aligned videos, and the Republican accounts watched up to 400 Republican-aligned videos. Neutral accounts skipped this stage. This was done to “teach” TikTok’s algorithm the political preferences of each account.
In the recommendation stage, all accounts watched videos on TikTok’s “For You” page, the platform’s main feed of recommended content. The accounts watched 10 videos, followed by a one-hour pause, and repeated this process for six days. Each experimental run lasted one week. The researchers collected data on approximately 394,000 videos viewed by these accounts between April 30th and November 11th, 2024.
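The two-stage design can be summarized in code. The sketch below is a minimal illustration, assuming a hypothetical `client` object that can watch videos and fetch the “For You” feed; the function and method names are assumptions, not the authors’ actual code.

```python
import time

def run_experiment(client, leaning, conditioning_videos,
                   days=6, videos_per_session=10):
    """Condition a sock-puppet account, then log its 'For You' recommendations."""
    # Conditioning stage: partisan accounts watch up to 400 aligned videos;
    # neutral (control) accounts skip this stage entirely.
    if leaning != "neutral":
        for video in conditioning_videos[:400]:
            client.watch(video)

    # Recommendation stage: 10 videos per session, a one-hour pause between
    # sessions, repeated over six days.
    recommendations = []
    end = time.time() + days * 86400
    while time.time() < end:
        recommendations.extend(client.for_you_feed(limit=videos_per_session))
        time.sleep(3600)  # one-hour pause between sessions
    return recommendations
```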
To analyze the political content of the recommended videos, the researchers downloaded the English transcripts of videos when available (22.8% of unique videos). They then used a system involving three large language models (GPT-4o, Gemini-Pro, and GPT-4) to classify each video. The language models answered questions about whether the video was political, whether it concerned the 2024 U.S. elections or major political figures, and what the ideological stance of the video was (pro-Democratic, anti-Democratic, pro-Republican, anti-Republican, or neutral). The majority vote of the three language models was used as the final classification for each question.
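A minimal sketch of the majority-vote step, assuming a hypothetical `ask_model` wrapper around each model’s API (the prompt wording and names are illustrative, not the study’s):

```python
from collections import Counter

MODELS = ["gpt-4o", "gemini-pro", "gpt-4"]
STANCES = ["pro-Democratic", "anti-Democratic",
           "pro-Republican", "anti-Republican", "neutral"]

def classify_stance(transcript, ask_model):
    """Ask each model for a stance label and keep the majority answer."""
    prompt = ("Classify the ideological stance of this video transcript "
              "as one of: " + ", ".join(STANCES) + "\n\n" + transcript)
    answers = [ask_model(model, prompt) for model in MODELS]
    label, votes = Counter(answers).most_common(1)[0]
    # With three models, a majority means at least two agree; otherwise the
    # video could be flagged for manual review.
    return label if votes >= 2 else None
```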
The analysis revealed significant asymmetries in content distribution on TikTok. Republican-seeded accounts received approximately 11.8% more party-aligned recommendations compared to Democratic-seeded accounts. Democratic-seeded accounts were exposed to approximately 7.5% more opposite-party recommendations on average. These differences were consistent across all three states and could not be explained by differences in engagement metrics like likes, views, shares, comments, or followers.
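As a rough illustration of how such rates could be computed from the labeled data, the snippet below defines party alignment (pro-own-party or anti-other-party) and averages it per account leaning; the column names and toy rows are assumptions, not the study’s dataset.

```python
import pandas as pd

# Toy recommendation log: each row pairs an account's conditioned leaning
# with the majority-vote stance of a recommended video (invented data).
recs = pd.DataFrame({
    "leaning": ["Democratic", "Democratic", "Republican", "Republican"],
    "stance":  ["pro-Republican", "anti-Republican",
                "pro-Republican", "anti-Democratic"],
})

def is_aligned(leaning, stance):
    # Pro-own-party or anti-other-party content counts as party-aligned.
    other = "Republican" if leaning == "Democratic" else "Democratic"
    return stance in (f"pro-{leaning}", f"anti-{other}")

recs["aligned"] = [is_aligned(l, s) for l, s in zip(recs.leaning, recs.stance)]
print(recs.groupby("leaning")["aligned"].mean())  # aligned share per leaning
```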
“We found that TikTok’s recommendation algorithm was not neutral during the 2024 U.S. presidential elections,” explained Talal Rahwan, an associate professor of computer science at New York University Abu Dhabi. “Across all three states analyzed in our study, the platform consistently promoted more Republican-leaning content. We showed that this bias cannot be explained by factors such as video popularity and engagement metrics, key variables that typically influence recommendation algorithms.”
Further analysis showed that the bias was primarily driven by negative partisanship content, meaning content that criticizes the opposing party rather than promoting one’s own. Both Democratic- and Republican-conditioned accounts were recommended more negative partisan content, but this effect was more pronounced for Republican accounts. Negative-partisanship videos were 1.78 times more likely to be recommended as an ideological mismatch relative to positive-partisanship ones.
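The 1.78 figure is a ratio of mismatch rates: how often negative-partisanship videos crossed ideological lines, divided by how often positive-partisanship videos did. A toy calculation with invented counts:

```python
# Invented counts for illustration only: how often each content type was
# recommended to an account of the opposite leaning (a "mismatch").
neg_mismatch, neg_total = 178, 1000   # negative-partisanship videos
pos_mismatch, pos_total = 100, 1000   # positive-partisanship videos

ratio = (neg_mismatch / neg_total) / (pos_mismatch / pos_total)
print(ratio)  # 1.78 with these toy numbers, matching the reported figure
```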
“We observed a bias toward negative partisanship in TikTok’s recommendations,” Zaki noted. “Regardless of the political party, Democratic or Republican, the algorithm prioritized content that criticized the opposing party over content that promoted one’s own party.”
The researchers also examined the top Democratic and Republican channels on TikTok by follower count. Republican channels had a significantly higher mismatch proportion, meaning their videos were more likely to be recommended to accounts with an opposite political leaning. Notably, videos from Donald Trump’s official TikTok channel were recommended to Democratic-conditioned accounts nearly 27% of the time, while Kamala Harris’s videos were recommended to Republican-conditioned accounts only 15.3% of the time.
Finally, the researchers analyzed the topics covered in partisan videos. Topics stereotypically associated with the Democratic party, like climate change and abortion, were more frequently covered by Democratic-aligned videos. Topics like immigration, foreign policy, and the Ukraine war were more frequently covered by Republican-aligned videos. Videos on immigration, crime, the Gaza conflict, and foreign policy were the most likely to be recommended as ideological mismatches to Democratic-conditioned accounts.
To build on this work, future research could investigate how TikTok’s algorithm behaves across different election cycles, examine how misinformation spreads within partisan content, and compare TikTok’s political content recommendations with those of other major platforms. Additionally, studies incorporating real user data alongside automated experiments could provide a more comprehensive understanding of how individuals experience political content on TikTok. Given the platform’s growing role in shaping public discourse, continued scrutiny of its recommendation system will be vital for assessing its impact on political knowledge and voter decision-making.
“We want to address fundamental questions about the neutrality of social media platforms,” Rahwan said.
The study, “TikTok’s recommendations skewed towards Republican content during the 2024 U.S. presidential race,” was authored by Hazem Ibrahim, HyunSeok Daniel Jang, Nouar Aldahoul, Aaron R. Kaufman, Talal Rahwan, and Yasir Zaki.