Amid privacy firestorm, Facebook curbs research tool
Facebook has curbed access to a controversial feature allowing searches of the vast content within the social network -- a tool which raised privacy concerns but was also used for research and investigative journalism.
The leading social network acknowledged this week it had "paused" some elements of its "graph search," a feature introduced in 2013 that sparked criticism for allowing posts and other content to be unearthed with a simple query.
But graph search turned out to be an important tool for researchers, rights activists and journalists. It has been used to track the activity of suspected war criminals and human traffickers and to monitor extremists.
"We paused some aspects of graph search late last week," a Facebook spokesperson said. "We're in conversations with a few researchers to learn more about how they used this tool."
Jennifer Grygiel, a Syracuse University professor who follows social media, said the move is the latest in a series of steps tightening access to Facebook data since the scandal over Cambridge Analytica, the consultancy that hijacked the personal data of tens of millions of Facebook users.
The new curbs make it harder for researchers to find Facebook posts about topics ranging from war crimes to anorexia to the anti-vaccine movement, Grygiel said.
Grygiel said while the move may be seen as promoting privacy, it also limits the ability of researchers to investigate Facebook itself and its efforts to weed out hate speech and extremist content.
"Researchers like myself were using social graph to show how bad Facebook's content moderation was," she told AFP. "This may be a public relations move because Facebook is tired of having everyone understand how bad their privacy is."
After being introduced in 2013, graph search drew immediate fire from privacy activists as a "creepy" tool that could enable stalking or unwanted disclosures.
Facebook has made changes over the years to graph search and offered users privacy settings limiting what information is unearthed. The company did not respond to an AFP query to elaborate on the most recent changes or the reason for the new policy.
- Legitimate research or abuse? -
"This tool was used both for abusive purposes and for legitimate research," said Adi Kamdar, a legal fellow at the Knight First Amendment Institute at Columbia University.
Kamdar said the move may be troubling given the company's ongoing "obstruction of good-faith journalism and research on the platform."
The Knight Institute this week sent a letter to Facebook signed by some 200 journalists and researchers asking the social network for improved access to study the platform.
"Digital research and journalism serve the public interest by advancing public understanding of the social media platforms," the letter said.
"These platforms -- and Facebook's in particular -- have a powerful but poorly understood influence on public discourse and, by extension, on societies around the world."
The letter urged Facebook to create a "safe harbor" for certain researchers and journalists that "would permit us to do our work without impeding Facebook's ability to protect the privacy of its users and the integrity of its platform."
Kamdar said the letter was drafted before Facebook's latest change and that the new curbs "may be seen as a setback" for research, for example, on how Facebook's algorithms and recommendations work.
Investigative journalist Michael Hayden of the Southern Poverty Law Center, which monitors right-wing extremist groups, expressed concern that the change could limit the ability to track hate speech and violent activity on Facebook.
"It's important to remember that these users generally want as many people to see their content as possible," Hayden said. "The greater concern would be if these changes enabled them to organize in secret on the platform. This would particularly concern me with people who plot violence and terrorism."
Casey Fiesler, a University of Colorado professor and social computing researcher, said Facebook should not shut off access to its data for research but should be selective.
"Any tool that people use to gather data can be used for good or bad purposes," Fiesler said.
"I would rather see ethical decisions be made about a particular use of data rather than to shut it down entirely."