One of the more perceptive articles I have seen on the recent Facebook controversy and the broader issue of algorithms' lack of neutrality:
The question isn’t whether Facebook has outsize power to shape the world — of course it does, and of course you should worry about that power. If it wanted to, Facebook could try to sway elections, favor certain policies, or just make you feel a certain way about the world, as it once proved it could do in an experiment devised to measure how emotions spread online.
There is no evidence Facebook is doing anything so alarming now. The danger is nevertheless real. The biggest worry is that Facebook doesn’t seem to recognize its own power, and doesn’t think of itself as a news organization with a well-developed sense of institutional ethics and responsibility, or even a potential for bias. Neither does its audience, which might believe that Facebook is immune to bias because it is run by computers.
That myth should die. It’s true that beyond the Trending box, most of the stories Facebook presents to you are selected by its algorithms, but those algorithms are as infused with bias as any other human editorial decision.
“Algorithms equal editors,” said Robyn Caplan, a research analyst at Data & Society, a research group that studies digital communications systems. “With Facebook, humans are never not involved. Humans are in every step of the process — in terms of what we’re clicking on, who’s shifting the algorithms behind the scenes, what kind of user testing is being done, and the initial training data provided by humans.”
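Caplan's point is easy to see in miniature. The sketch below is purely illustrative (Facebook's actual ranking code is not public, and every name, weight, and field here is my own invention): a toy feed-ranking function in which the "automatic" ordering is built entirely out of human choices about which signals to measure and how much each should count.

```python
# Toy feed-ranking sketch (hypothetical; not Facebook's code).
# Every element -- the features chosen, the hand-tuned weights,
# the predictions a model was trained to make -- reflects a human
# editorial decision, even though the final ordering is "automatic".

def rank_feed(stories):
    """Order stories by a score whose weights someone chose."""
    # Hypothetical weights: an engineer decided that predicted likes
    # matter twice as much as shares, and recency matters most of all.
    WEIGHTS = {"predicted_like": 2.0, "predicted_share": 1.0, "recency": 3.0}

    def score(story):
        return sum(WEIGHTS[feature] * story[feature] for feature in WEIGHTS)

    return sorted(stories, key=score, reverse=True)

stories = [
    {"title": "Local election results", "predicted_like": 0.2, "predicted_share": 0.1, "recency": 0.9},
    {"title": "Cute animal video",      "predicted_like": 0.9, "predicted_share": 0.7, "recency": 0.5},
]
for story in rank_feed(stories):
    print(story["title"])
```

Change any one weight and a different story wins the top slot; that is the sense in which tuning an algorithm is an editorial act.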
Everything you see on Facebook is therefore the product of these people’s expertise and considered judgment, as well as their conscious and unconscious biases, quite apart from possible malfeasance or corruption. It’s often hard to know which, because Facebook’s editorial sensibilities are secret. So are its personalities: Most of the engineers, designers and others who decide what people see on Facebook will remain forever unknown to its audience.
Facebook also has an unmistakable corporate ethos and point of view. The company is staffed mostly by wealthy coastal Americans who tend to support Democrats, and it is wholly controlled by a young billionaire who has expressed policy preferences that many people find objectionable. Mr. Zuckerberg is for free trade, more open immigration and a certain controversial brand of education reform. Instead of “building walls,” he supports a “connected world and a global community.”
You could argue that none of this is unusual. Many large media outlets are powerful, somewhat opaque, operated for profit, and controlled by wealthy people who aren’t shy about their policy agendas — Bloomberg News, The Washington Post, Fox News and The New York Times, to name a few.
But there are some reasons to be even more wary of Facebook’s bias. One is institutional. Many mainstream outlets have a rigorous set of rules and norms about what’s acceptable and what’s not in the news business.
“The New York Times contains within it a long history of ethics and the role that media is supposed to be playing in democracies and the public,” Ms. Caplan said. “These technology companies have not been engaged in that conversation.”
According to a statement from Tom Stocky, who is in charge of the trending topics list, Facebook has policies “for the review team to ensure consistency and neutrality” of the items that appear in the trending list.
But Facebook declined to discuss whether any editorial guidelines governed its algorithms, including the system that determines what people see in News Feed. Those algorithms could have profound implications for society. For instance, one persistent worry about algorithmically selected news is that it might reinforce people’s previously held points of view. If News Feed shows news that we’re each likely to Like, it could trap us in echo chambers and contribute to rising political polarization. In a study last year, Facebook’s scientists asserted that the echo chamber effect was muted.
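To make the echo-chamber worry concrete, here is a deliberately simplified simulation of my own construction (not Facebook's model, and the numbers are arbitrary): a feed that always shows the topic a user has engaged with most will steadily narrow what that user sees, even if the user's underlying tastes never change.

```python
import random

# Toy echo-chamber simulation (illustrative only). A user starts with
# a mild lean; the feed shows whichever topic the user has clicked on
# most, and each click reinforces that lean.

random.seed(0)
clicks = {"left": 6, "right": 4}  # mild initial lean: 60/40
for step in range(50):
    # The feed optimizes for engagement: show the more-clicked topic.
    shown = max(clicks, key=clicks.get)
    # The user clicks what they are shown most of the time.
    if random.random() < 0.9:
        clicks[shown] += 1

total = sum(clicks.values())
print({topic: round(count / total, 2) for topic, count in clicks.items()})
# After 50 steps the "left" share has drifted from 0.60 toward roughly
# 0.9, purely through the show-what-gets-clicked feedback loop.
```

The loop is the point: the ranking rule and the user's behavior feed each other, so a small initial tilt compounds without anyone deciding to tilt the feed.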
But when Facebook changes its algorithm — which it does routinely — does it have guidelines to make sure the changes aren’t furthering an echo chamber? Or that the changes aren’t inadvertently favoring one candidate or ideology over another? In other words, are Facebook’s engineering decisions subject to ethical review? Nobody knows.