To Understand This Era, You Need to Think in Systems

Good interview and insights from Tufekci that apply in so many areas, including racism and discrimination:

As a million media theorists have argued, in a few short decades (or, at most, centuries) we’ve moved from information scarcity, the problem that defined most of human history, to information abundance, the problem that defines our present. We know too much, and it’s paralyzing. The people worth following right now are those who seem able to find the signal in the noise. Few have a better track record of that in recent years than Zeynep Tufekci.

As my colleague Ben Smith wrote in an August profile, Tufekci has “made a habit of being right on the big things.” She saw the threat of the coronavirus early and clearly. She saw that the public health community was ignoring the evidence on masking, and raised the alarm persuasively enough that she tipped the Centers for Disease Control and Prevention toward new, lifesaving guidance. Before Tufekci was being prescient about the coronavirus, she was being prescient about disinformation online, about the way social media was changing political organizing, about the rising threat of authoritarianism in America.

So I asked Tufekci — who is a sociologist at the University of North Carolina, as well as a columnist at The Atlantic and a contributor to New York Times Opinion — to come on “The Ezra Klein Show” for a conversation about how she thinks, and what the rest of us can learn from it.

Tufekci describes herself as a “systems thinker.” She tries to learn about systems, and think about how they interact with one another. For instance, she studied authoritarian systems, and one rule for understanding them is that “you want to look at what they do and not what they say,” she said. So when China, after downplaying the severity of the virus early on, locked down Wuhan, she took it seriously.

“If a country like China is closing down a city of 11 million,” she told me, “this is a big deal. It is spreading, it is deadly, and we’re going to get hit.” Even then, many public health experts in the United States thought the Chinese were wrong, or lying, when they warned that the virus was spreading through asymptomatic transmission. But Tufekci knew that authoritarian systems tend to hide internal problems from the rest of the world. Only a true emergency would force them to change their public messaging. “There’s a principle called the principle of embarrassment,” she explained. “If a story is really embarrassing to the teller, they might be telling the truth.”

Here are a few other frameworks Tufekci told me she finds helpful:

  • Herding effects. Public health experts — including figures like Dr. Anthony Fauci who are lauded today — were slow to change guidance on disruptive measures like masking and travel bans. That led to a cascade of media failures that reflected what journalists were hearing from expert sources. One reason Tufekci was willing to challenge that consensus was that she saw experts as reflecting social pressure, not just empirical data. “The players in the institution look at each other to decide what the norm is,” she said. The problem is that social frameworks “have a lot of inertia to them,” because everyone is waiting for others to break the norm. That cost precious time in this crisis.
  • Thinking in exponents. The difficulty of exponential growth, as in the fable of the chessboard and the wheat, is that early phases of growth are modest and manageable, and then, seemingly all of a sudden, tip into numbers that are shocking in size — or, in this case, viral spread that is catastrophic in its scale. “My original area of study is social media,” Tufekci said, and that’s another area where the math tends to be exponential. This was, she said, a reason some in Silicon Valley were quick to see the danger of the virus. “A lot of venture capitalism, the VC world and the software people, they’re looking for that next exponential effect … so they had some intuition because of the field they were in.” (A small numeric sketch of this doubling dynamic follows the list.)
  • Population versus individual. In clinical medicine, Tufekci said, “we tend to really think about individual outcomes rather than public health and what we need at the population level.” But thinking at the population level changes the situation dramatically. For instance, a test with a high rate of false positives may be a terrible diagnostic tool for a doctor’s office. But if it could be done cheaply, and repeatedly, and at home, it could be a very useful tool for a population because it would give people a bit more information at a mass scale. Thinking in individual terms versus public health terms is, Tufekci said, why the Food and Drug Administration has been so resistant to approving rapid at-home antigen testing (though that is, at last, beginning to change). (A back-of-the-envelope version of this false-positive arithmetic also follows the list.)
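
To make the “thinking in exponents” point concrete, here is a minimal Python sketch (my own illustration, not from the interview) of the chessboard-and-wheat fable alongside a toy outbreak with a fixed doubling time; the early numbers look manageable right up until they don’t:

```python
# Illustrative only: doubling looks unremarkable for a while,
# then produces staggering numbers seemingly all at once.

def chessboard_wheat(squares=64):
    """Total grains if square n holds 2**(n-1) grains of wheat."""
    return sum(2 ** (n - 1) for n in range(1, squares + 1))

def outbreak_cases(initial_cases=100, doubling_time_days=3, days=0):
    """Cases under unchecked growth with a fixed doubling time (toy model)."""
    return initial_cases * 2 ** (days / doubling_time_days)

print(f"Wheat on square 10: {2 ** 9:,} grains")          # still modest
print(f"Wheat on square 64: {2 ** 63:,} grains")          # astronomical
print(f"Whole board total:  {chessboard_wheat():,} grains")

for day in (0, 15, 30, 45, 60):
    print(f"Day {day:2d}: ~{outbreak_cases(days=day):,.0f} cases")
```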
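
And a back-of-the-envelope sketch of the population-versus-individual point, using invented numbers rather than anything cited in the episode: at low prevalence, a single positive from a cheap test is often a false alarm, which makes it a weak clinical diagnostic, yet requiring a repeat positive (and testing often) makes it far more useful as a population-level screen.

```python
# Invented numbers, for illustration only: why a test that looks weak for
# diagnosing one patient can still be valuable as a population-level screen.

def ppv(sensitivity, specificity, prevalence):
    """Probability that a positive result is a true positive (Bayes' rule)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

sens, spec, prev = 0.80, 0.97, 0.005  # assumed sensitivity, specificity, prevalence

single_test = ppv(sens, spec, prev)
# Require a second positive on a retest (treated as independent here):
two_in_a_row = ppv(sens * sens, 1 - (1 - spec) ** 2, prev)

print(f"Chance a single positive is real:       {single_test:.0%}")   # ~12%, poor one-off diagnostic
print(f"Chance two positives in a row are real: {two_in_a_row:.0%}")  # ~76%
# Meanwhile, screening everyone a couple of times a week still flags roughly
# 80% of infectious people each round, which is what matters for slowing spread.
```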

There’s much more in our full conversation, of course, including Tufekci’s systems-level view of the Republican Party, why she thinks media coverage of the vaccines is too pessimistic, why Asian countries so decisively outperformed Western Europe and the United States in containing the coronavirus, and her favorite vegetarian Turkish food. You can listen by subscribing to “The Ezra Klein Show” wherever you get your podcasts, or pressing play at the top of this post.

Source: https://www.nytimes.com/2021/02/02/opinion/ezra-klein-podcast-zeynep-tufecki.html?showTranscript=1

YouTube, the Great Radicalizer – The New York Times

Good article on how social media reinforces echo chambers and pushes users toward ever more extreme views:

At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.

Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.

Since I was not in the habit of watching extreme right-wing fare on YouTube, I was curious whether this was an exclusively right-wing phenomenon. So I created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders, letting YouTube’s recommender algorithm take me wherever it would.

Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content that was more and more extreme than the mainstream political fare I had started with.

Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.

It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.

This is not because a cabal of YouTube engineers is plotting to drive the world off a cliff. A more likely explanation has to do with the nexus of artificial intelligence and Google’s business model. (YouTube is owned by Google.) For all its lofty rhetoric, Google is an advertising broker, selling our attention to companies that will pay for it. The longer people stay on YouTube, the more money Google makes.

What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with — or to incendiary content in general.

Is this suspicion correct? Good data is hard to come by; Google is loath to share information with independent researchers. But we now have the first inklings of confirmation, thanks in part to a former Google engineer named Guillaume Chaslot.

Mr. Chaslot worked on the recommender algorithm while at YouTube. He grew alarmed at the tactics used to increase the time people spent on the site. Google fired him in 2013, citing his job performance. He maintains the real reason was that he pushed too hard for changes in how the company handles such issues.

The Wall Street Journal conducted an investigation of YouTube content with the help of Mr. Chaslot. It found that YouTube often “fed far-right or far-left videos to users who watched relatively mainstream news sources,” and that such extremist tendencies were evident with a wide variety of material. If you searched for information on the flu vaccine, you were recommended anti-vaccination conspiracy videos.

It is also possible that YouTube’s recommender algorithm has a bias toward inflammatory content. In the run-up to the 2016 election, Mr. Chaslot created a program to keep track of YouTube’s most recommended videos as well as its patterns of recommendations. He discovered that whether you started with a pro-Clinton or pro-Trump video on YouTube, you were many times more likely to end up with a pro-Trump video recommended.

Combine this finding with other research showing that during the 2016 campaign, fake news, which tends toward the outrageous, included much more pro-Trump than pro-Clinton content, and YouTube’s tendency toward the incendiary seems evident.

YouTube has recently come under fire for recommending videos promoting the conspiracy theory that the outspoken survivors of the school shooting in Parkland, Fla., are “crisis actors” masquerading as victims. Jonathan Albright, a researcher at Columbia, recently “seeded” a YouTube account with a search for “crisis actor” and found that following the “up next” recommendations led to a network of some 9,000 videos promoting that and related conspiracy theories, including the claim that the 2012 school shooting in Newtown, Conn., was a hoax.

What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.
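
As a purely hypothetical sketch of that rabbit-hole dynamic (this is not YouTube’s actual system; the engagement model and numbers are invented), here is a toy recommender that greedily picks whichever “related” video maximizes expected watch time. If extremity correlates even mildly with watch time, the recommendations ratchet steadily toward the extreme end:

```python
import random

# Hypothetical toy model, not YouTube's real algorithm: each video has an
# "extremity" score in [0, 1], and we assume watch time rises slightly with
# extremity. A greedy watch-time maximizer then drifts upward over time.

random.seed(0)

def expected_watch_time(extremity):
    # Assumed engagement model: a small bonus for more extreme content.
    return 1.0 + 0.5 * extremity

def recommend_next(current_extremity, n_candidates=20):
    # Candidate pool drawn near the current video, like "related" videos.
    candidates = [min(1.0, max(0.0, current_extremity + random.gauss(0, 0.15)))
                  for _ in range(n_candidates)]
    # Pick whichever candidate maximizes expected watch time.
    return max(candidates, key=expected_watch_time)

extremity = 0.1  # start with fairly mainstream content
for step in range(1, 21):
    extremity = recommend_next(extremity)
    if step % 5 == 0:
        print(f"After {step:2d} recommendations, extremity ≈ {extremity:.2f}")
```

The point of the toy is only that no engineer needs to intend radicalization: a mild engagement gradient plus greedy optimization is enough to produce the drift the article describes.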

Human beings have many natural tendencies that need to be vigilantly monitored in the context of modern life. For example, our craving for fat, salt and sugar, which served us well when food was scarce, can lead us astray in an environment in which fat, salt and sugar are all too plentiful and heavily marketed to us. So too our natural curiosity about the unknown can lead us astray on a website that leads us too much in the direction of lies, hoaxes and misinformation.

In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal. Over time, our tastes adjust, and we seek even more sugary, fatty foods, which the restaurant dutifully provides. When confronted about this by the health department and concerned citizens, the restaurant managers reply that they are merely serving us what we want.

This situation is especially dangerous given how many people — especially young people — turn to YouTube for information. Google’s cheap and sturdy Chromebook laptops, which now make up more than 50 percent of the pre-college laptop education market in the United States, typically come loaded with ready access to YouTube.

This state of affairs is unacceptable but not inevitable. There is no reason to let a company make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.

via YouTube, the Great Radicalizer – The New York Times