When Algorithms Discriminate – The New York Times
2015/07/13
Given that people have biases, it's not surprising that the algorithms they create reflect some of those biases:
Algorithms, which are sequences of instructions written by programmers, are often described as black boxes; it is hard to know why websites produce certain results. Often, algorithms and online results simply reflect people’s attitudes and behavior. Machine-learning algorithms learn and evolve based on what people do online. The autocomplete feature on Google and Bing is an example: a recent Google search for “Are transgender,” for instance, suggested “Are transgenders going to hell.”
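To see how a system can pick up bias without anyone designing it in, here is a toy sketch (not Google’s actual system; the query log is invented apart from the article’s own example). A purely frequency-based autocomplete just surfaces whatever users have typed most often, so a biased query log yields biased suggestions:

```python
from collections import Counter

# Illustrative only: a frequency-based autocomplete ranks the logged
# queries users have typed most often, so any bias in the query log
# is reproduced verbatim in the suggestions.
query_log = [
    "are transgenders going to hell",       # the article's example
    "are transgender rights protected",
    "are transgenders going to hell",
    "are transgender people born that way",
]

def autocomplete(prefix, log, k=3):
    """Return the k most frequent logged queries starting with prefix."""
    matches = Counter(q for q in log if q.startswith(prefix))
    return [q for q, _ in matches.most_common(k)]

print(autocomplete("are transgender", query_log))
# The offensive query tops the list simply because it was typed most.
```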
“Even if they are not designed with the intent of discriminating against those groups, if they reproduce social preferences even in a completely rational way, they also reproduce those forms of discrimination,” said David Oppenheimer, who teaches discrimination law at the University of California, Berkeley.
But there are laws that prohibit discrimination against certain groups, despite any biases people might have. Take the example of Google ads for high-paying jobs showing up for men and not women. Targeting ads is legal. Discriminating on the basis of gender is not.
The Carnegie Mellon researchers who did that study built a tool to simulate Google users that started with no search history and then visited employment websites. Later, on a third-party news site, Google showed an ad for a career coaching service advertising “$200k+” executive positions 1,852 times to men and 318 times to women.
The reason for the difference is unclear. It could have been that the advertiser requested that the ads be targeted toward men, or that the algorithm determined that men were more likely to click on the ads.
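Whatever the mechanism, we can at least check how lopsided those counts are. A minimal back-of-the-envelope test, assuming the two simulated cohorts were the same size (our assumption; the article doesn’t give the group sizes): if the ad were shown independently of gender, each impression would be equally likely to land in either group.

```python
from math import comb

# Counts reported in the article: 1,852 impressions to simulated men
# vs. 318 to simulated women.
men, women = 1852, 318
n = men + women

# Under a gender-neutral null (and our equal-cohort assumption), each
# impression goes to either group with probability 0.5. Exact one-sided
# binomial tail: P(X >= 1852 | n = 2170, p = 0.5).
p_value = sum(comb(n, k) for k in range(men, n + 1)) / 2 ** n
print(f"{men}/{n} impressions to men; one-sided p = {p_value:.3g}")
```

The tail probability is vanishingly small, so under that assumption the split is far too skewed to be chance; the open question is only which mechanism produced it.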
Google declined to say how the ad showed up, but said in a statement, “Advertisers can choose to target the audience they want to reach, and we have policies that guide the type of interest-based ads that are allowed.”
Anupam Datta, one of the researchers, said, “Given the big gender pay gap we’ve had between males and females, this type of targeting helps to perpetuate it.”
It would be impossible for humans to oversee every decision an algorithm makes. But companies can regularly run simulations to test the results of their algorithms. Mr. Datta suggested that algorithms “be designed from scratch to be aware of values and not discriminate.”
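A rough sketch of what such a simulation might look like, in the spirit of Mr. Datta’s suggestion: probe the system with matched profiles that differ only in gender and compare how often each is shown the ad. Everything here (the serve_ad function, the profile fields) is a hypothetical stand-in, not any real ad API.

```python
import random

def serve_ad(profile):
    # Stand-in for the opaque ad system under test; deliberately
    # biased here so the audit has something to find.
    rate = 0.9 if profile["gender"] == "male" else 0.2
    return random.random() < rate

def audit(base_profile, trials=10_000):
    """Show rates for matched profiles differing only in gender."""
    shown = {}
    for gender in ("male", "female"):
        profile = {**base_profile, "gender": gender}
        shown[gender] = sum(serve_ad(profile) for _ in range(trials)) / trials
    return shown

rates = audit({"history": ["visited employment sites"]})
print(rates)                                   # e.g. {'male': 0.9, 'female': 0.2}
print("parity gap:", abs(rates["male"] - rates["female"]))
```

A company running checks like this regularly would at least surface disparities like the Carnegie Mellon result before users or researchers do.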
“The question of determining which kinds of biases we don’t want to tolerate is a policy one,” said Deirdre Mulligan, who studies these issues at the University of California, Berkeley School of Information. “It requires a lot of care and thinking about the ways we compose these technical systems.”
Silicon Valley, however, is known for pushing out new products without necessarily considering the societal or ethical implications. “There’s a huge rush to innovate,” Ms. Mulligan said, “a desire to release early and often — and then do cleanup.”
