Using Algorithms to Determine Character – The New York Times
2015/07/27
Good piece on the increasing use of algorithms in granting loans and in the workplace, and the potential for (and limits to – see emphasized text) reducing bias:
Mr. Merrill, who also has a Ph.D. in psychology (from Princeton, in case Mr. Gu wants to lend him money), thinks that data-driven analysis of personality is ultimately fairer than standard measures.
“We’re always judging people in all sorts of ways, but without data we do it with a selection bias,” he said. “We base it on stuff we know about people, but that usually means favoring people who are most like ourselves.” Familiarity is a crude form of risk management, since we know what to expect. But that doesn’t make it fair.
Character (though it is usually called something more neutral-sounding) is now judged by many other algorithms. Workday, a company offering cloud-based personnel software, has released a product that looks at 45 employee performance factors, including how long a person has held a position and how well the person has done. It predicts whether a person is likely to quit and suggests appropriate things, like a new job or a transfer, that could make this kind of person stay.
It also characterizes managers as “rainmakers” or “terminators,” depending on how well they hold talent. Inside Workday, the company has analyzed its own sales force to see what makes for success. The top indicator is tenacity.
“We all have biases about how we hire and promote,” said Dan Beck, Workday’s head of technology strategy. “If you can leverage data to overcome that, great.”
People studying these traits will be encouraged to adopt them, he said, since “if you know there is a pattern of success, why wouldn’t you adopt it?”
In a sense, it’s no different from the way people read the biographies of high achievers, looking for clues for what they need to do differently to succeed. It’s just at a much larger scale, based on observing everybody.
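The article doesn't describe Workday's model, but the kind of attrition predictor it sketches — scoring quit risk from employee factors like tenure and performance — can be illustrated with a toy logistic regression. Everything here (the two features, the data, the weights) is invented for illustration, not Workday's actual product:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy data: (years_in_role, performance_rating 1-5) -> quit within a year?
# Invented examples; a real system would use dozens of factors.
data = [
    ((0.5, 2.0), 1), ((1.0, 2.5), 1), ((4.0, 4.5), 0),
    ((5.0, 4.0), 0), ((0.8, 3.0), 1), ((6.0, 3.5), 0),
]

def train(data, lr=0.1, epochs=2000):
    """Fit logistic-regression weights by stochastic gradient descent."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y                 # gradient of the log-loss
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def quit_probability(w, b, years, rating):
    return sigmoid(w[0] * years + w[1] * rating + b)

w, b = train(data)
# On this toy data, a short-tenured, low-rated employee scores as
# higher quit risk than a long-tenured, well-rated one.
```

A production system would layer on the "suggests appropriate things" step — ranking interventions (transfer, new role) by how much each one lowers the predicted probability.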
There are reasons to think that data-based character judgments are more reasonable. Jure Leskovec, a professor of computer science at Stanford, is finishing up a study comparing the predictions of data analysis against those of judges at bail hearings, who have just a few minutes to size up prisoners and decide if they could be risks to society. Early results indicate that data-driven analysis is 30 percent better at predicting crime, Mr. Leskovec said.
“Algorithms aren’t subjective,” he said. “Bias comes from people.”
That is only true to a point: Algorithms do not fall from the sky. Algorithms are written by human beings. Even if the facts aren’t biased, design can be, and we could end up with a flawed belief that math is always truth.
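A minimal sketch of that point: even when every input is factually accurate, a design choice — here, a hypothetical decision to weight a neighborhood feature that happens to correlate with a protected group — bakes bias into the output. The applicants, features, and weights are all invented:

```python
def score(applicant, weights):
    """Weighted sum of applicant features; the weights ARE the design."""
    return sum(weights[k] * applicant[k] for k in weights)

# Two applicants with identical financial behavior, differing only in
# a proxy attribute the designer chose to include.
a = {"income": 50, "on_time_payments": 0.95, "neighborhood_risk": 0.1}
b = {"income": 50, "on_time_payments": 0.95, "neighborhood_risk": 0.9}

biased_design = {"income": 0.4, "on_time_payments": 0.5, "neighborhood_risk": -0.3}
neutral_design = {"income": 0.4, "on_time_payments": 0.5, "neighborhood_risk": 0.0}

# Under the biased design, identical behavior yields different scores;
# zeroing the proxy weight makes the two applicants score the same.
```

No individual data point here is false — the bias lives entirely in which features the designer decided should matter, which is the sense in which algorithms "do not fall from the sky."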
Upstart’s Mr. Gu, who said he had perfect SAT scores but dropped out of Yale, wouldn’t have qualified for an Upstart loan using his own initial algorithms. He has since changed the design, and he said he is aware of the responsibility of the work ahead.
“Every time we find a signal, we have to ask ourselves, ‘Would we feel comfortable telling someone this was why they were rejected?’ ” he said.
Using Algorithms to Determine Character – The New York Times.
