Facebook Shut Down Research Firm Investigating Instagram’s Algorithm
Researchers from Germany-based AlgorithmWatch say that Facebook forced them to abandon their research project into Instagram’s algorithm after the company came after them with legal threats.
In March 2020, AlgorithmWatch launched a project it says was designed to monitor Instagram’s newsfeed algorithm. Over the following 14 months, more than 1,500 volunteers installed a browser add-on that scraped their newsfeeds and sent that data to AlgorithmWatch, which used it to determine how Instagram prioritized pictures and videos on a timeline.
“With their data, we were able to show that Instagram likely encouraged content creators to post pictures that fit specific representations of their body, and that politicians were likely to reach a larger audience if they abstained from using text in their publications,” AlgorithmWatch writes.
Facebook denies both of these claims. The first finding showed that Instagram seemed to encourage users to show more skin. When AlgorithmWatch discovered this, it reached out to Facebook for comment but was initially ignored; the company later said it found the researchers’ work “flawed in a number of ways.”
“Although we could not conduct a precise audit of Instagram’s algorithm, this research is among the most advanced studies ever conducted on the platform,” AlgorithmWatch continues.
The project was supported by the European Data Journalism Network and by the Dutch foundation SIDN and done in partnership with Mediapart in France, NOS, Groene Amsterdammer, and Pointer in the Netherlands, and Süddeutsche Zeitung in Germany.
1/9 ⚠️ AlgorithmWatch was forced to shut down its #Instagram monitoring project after threats from #Facebook!
👉🏻 Read our story: https://t.co/wx0d9nwZb3
✍🏻 And sign our open letter to European lawmakers to protect future research on online platforms: https://t.co/naWQBz9elg
— AlgorithmWatch (@algorithmwatch) August 13, 2021
In a blog post originally spotted by The Verge, the AlgorithmWatch team says that it was called to a meeting by Facebook in May, and in it, the social media giant informed the group that they had breached the company’s Terms of Service and that Facebook would have to “move to a more formal engagement” if AlgorithmWatch did not “resolve” the issue on Facebook’s terms — what AlgorithmWatch calls a “thinly veiled threat.”
AlgorithmWatch says it decided to go public with this conversation after Facebook shut down the accounts of researchers working on the Ad Observatory at New York University. That group had built a browser add-on that collected some data about advertisements on the platform.
As reported by the Associated Press, Facebook says those researchers violated its terms of service and were involved in unauthorized data collection from its network. The researchers argued that the company is attempting to exert control over any research that paints it in a negative light.
“This is not the first time that Facebook aggressively goes against organizations that try to empower users to be more autonomous in their use of social media,” AlgorithmWatch continues. “In August 2020, it threatened Friendly, a mobile app that lets users decide on how to sort their newsfeed. In April 2021, it forced several apps that allowed users to access Facebook on their terms out of the Play Store. There are probably more cases of bullying that we do not know about. We hope that by coming forward, more organizations will speak up about their experiences.”
While AlgorithmWatch was forced to stop its research, it says it is urgently important for organizations to shed light on Instagram’s algorithms. It points to several cases in which the company appears to have taken specific action against the spread of certain types of information; for example, both Colombian and Palestinian users noticed that content posted about ongoing protests in their countries tended to disappear.
3/9 🔎 Only if we understand how our public sphere is influenced by platforms’ #AlgorithmicChoices, can we take measures to ensure they do not undermine our autonomy and freedom, and the collective good. To do so, we need access to platform data!
— AlgorithmWatch (@algorithmwatch) August 13, 2021
“Large platforms play an oversized, and largely unknown, role in society, from identity-building to voting choices. Only by working towards more transparency can we ensure, as a society, that there is an evidence-based debate on the role and impact of large platforms – which is a necessary step towards holding them accountable,” AlgorithmWatch concludes.
Image credits: Header photo licensed via Depositphotos.