Deep divides on social media, but algorithm tweaks may not help: Study on FB, Insta

By Binayak Dasgupta, New Delhi
Jul 28, 2023 12:12 AM IST

The experiments showed there were no easy answers — ideological divides were deep, and just changing algorithms did not help address them

There is a significant ideological divide in the political information Facebook and Instagram users are exposed to, and users spend less time on these services when the specially curated feeds are switched off, a set of new studies from the largest under-the-hood analysis of social media platforms yet has found, including clues that many of the problems on these platforms may be hard to address with technical tweaks alone.


Over a period of three months during the 2020 US presidential election season, researchers in each of the studies — three due to be published early on Friday in Science and a fourth in Nature at the same time — examined what sort of content over 200 million American users saw and whether tweaking algorithms in specific experiments with smaller subgroups could help address problems such as polarisation and misinformation.

The experiments showed there were no easy answers — ideological divides were deep, and just changing algorithms did not help address them.

“We now know just how influential the algorithm is in shaping people’s on-platform experiences, but we also know that changing the algorithm for even a few months isn’t likely to change people’s political attitudes. What we don’t know is why,” said professor Talia Stroud of the University of Texas at Austin during a virtual interaction with journalists on Tuesday ahead of the reports’ release.

Stroud and Joshua Tucker of New York University coordinated Facebook and Instagram parent Meta’s research partnership to understand how its products affected the 2020 US elections.

“These findings add to a growing body of research showing there is little evidence that social media causes harmful ‘affective’ polarisation or has any meaningful impact on key political attitudes, beliefs or behaviours. They also challenge the now commonplace assertion that the ability to re-share content on social media drives polarisation,” said Nick Clegg, Meta’s president for global affairs, in advance comments shared by the company.

But the researchers added an important caveat: “it was only a three-month study”.

“It could be because the length of time for which the algorithms were changed wasn’t long enough, or these platforms have been around for decades already, or that while Facebook and Instagram are influential sources of information, they are not people’s only sources,” said Tucker.

The problem: Ideological segregation

Among the most crucial findings, significant because the researchers analysed the entire American user base of Facebook, was that people were being exposed to “ideologically segregated” information. “Conservatives and liberals are so engaged with different sets of political news,” said Sandra González-Bailón of the University of Pennsylvania.

González-Bailón’s study also found that “pages” and “groups” contribute much more to segregation than user posts themselves and that there were cases when “political news URLs were almost exclusively seen by conservatives and political news URLs exclusively seen by liberals”.

“And then the fourth takeaway is that the large majority of political news rated as false by Meta’s third-party fact checker program were seen by more conservatives than liberals,” she said during an interaction with journalists over a video call.

The detailed study paper put this problem in stark terms: almost 97% of the political news flagged for being fake or misleading was seen by more conservatives than liberals.

Mixed effects of algorithmic tweaks

In two of the other papers, the researchers used multiple approaches to isolate the effects of the algorithms, the computer code that learns what a user will typically want to see or interact with and amplifies such content accordingly.

In one, they switched people’s feeds to a simple reverse chronological order of posts (newest first) and had participants answer questionnaires. It was this experiment that led people to spend less time on the services. “The reverse chronological feed decreased the time users spent on the platforms and their activity… by about 20% on Facebook, and 11 or 12% on Instagram, [and] in some cases, this led people to use other platforms instead,” said Andrew Guess of Princeton University.

Doing so also led to people seeing more posts from groups and pages instead of friends, less content from like-minded sources, and more political content than usual.

Crucially, moving to the reverse chronological feed “caused the proportion of content from untrustworthy sources to more than double on Facebook and increase by more than 20% on Instagram”. In other words, it exposed users to more dodgy content.
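The difference between the two feed types is easy to picture. Below is a minimal, purely illustrative sketch in Python (not code from the studies or from Meta) contrasting an engagement-ranked feed, the platform default, with the reverse chronological feed used in the experiment; the Post fields and the predicted_engagement score are hypothetical stand-ins.

# Illustrative sketch only; not code from the studies or from Meta.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # hypothetical score from a learned ranking model

def ranked_feed(posts):
    # Default-style feed: posts the user is predicted to engage with come first.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def reverse_chronological_feed(posts):
    # Experimental condition: newest posts shown first, ignoring engagement predictions.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)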

The headline finding that Clegg referred to came from surveys of participants — over 23,000 on Facebook and 21,000 on Instagram — which found that removing the algorithmic feed “did not significantly alter levels of issue polarisation, affective polarisation, political knowledge, self-reported political participation, political behaviour or other key attitudes” during the study period.

This is significant since switching to a reverse chronological feed is being explored by many policymakers as one of the ways to mitigate social media harms.

“These findings suggest social media algorithms may not be the root cause of phenomena such as increasing political polarisation, [but] it raises the stakes for finding out what other online factors—such as the incentives created by the advertising model of social media — or offline factors — such as long-term demographic changes, partisan media, rising inequality, or geographic sorting — may be driving changes that affect democratic processes and outcomes,” said the study led by Guess.

Guess and his colleagues also carried out a second experiment by suppressing content that was “re-shared” (rather than originally posted) by people. Here, there were some useful effects in the context of misinformation.

“Removing re-shares substantially decreased the amount of political news and content from untrustworthy sources people saw on their feeds, decreased overall clicks and reactions and reduced clicks on posts from partisan news sources,” said Guess.

“Second, removing re-shares reduced the proportion of political content in people’s feeds by nearly 20%. And the proportion of political news by more than half,” he added.

In the fourth study, when researchers reduced the prevalence of politically like-minded content in participants’ feeds, there were no measurable effects on affective polarisation, ideological extremity, candidate evaluations or belief in false claims.

Michael Wagner, professor at the University of Wisconsin and a special rapporteur asked to act as an independent observer, said the research experiments had several strengths, including the outside academics having “control rights” for the papers — meaning that in the event of a dispute between the outside academics and the Meta researchers, the lead author, who would always be an outside academic, would have the final decision.

But, he added, such an exercise “cannot be a model for future industry-academy collaborations”, a form of access that many studying the harms of Big Tech have demanded.

“Simply put, researchers don’t know what they don’t know, and the incentives are not clear for industry partners to reveal everything they know about their platforms,” Wagner wrote in a separate paper.

This was, after all, an exercise Meta agreed to voluntarily. “In the end, independence by permission is not independent at all. Rather, it is a sign of things to come in the academy: incredible data and research opportunities offered to a select few researchers at the expense of true independence. Scholarship is not wholly independent when the data are held by for-profit corporations, nor is it independent when those same corporations can limit the nature of what is studied,” Wagner said.
