Terry Glavin: Democracy is a shambles, and you’re a citizen. Get out of your echo chamber

Good advice by Glavin:

A healthy distrust of “experts,” the media, government and the business class is not a bad thing, but democracy is in a shambles the world round, and it’s not going to get better by heading for the hills, or by retreating into some safe space, or by listening only to people you agree with, or by ignoring information that challenges your opinions. You’re a citizen. Act like one. Get out of your echo chamber.

You might be surprised by what you find out there.

Source: Terry Glavin: Democracy is a shambles, and you’re a citizen. Get out of your echo chamber | National Post

Facebook’s AI boss: Facebook could fix its filter bubble if it wanted to – Recode

While Zuckerberg is correct that we all have a tendency to tune out other perspectives, the role that Facebook and other social media have in reinforcing that tendency should not be downplayed:

One of the biggest complaints about Facebook — and its all-powerful News Feed algorithm — is that the social network often shows you posts supporting beliefs or ideas you (probably) already have.

Facebook’s feed is personalized, so what you see in your News Feed is a reflection of what you want to see, and people usually want to see arguments and ideas that align with their own.

The term for this, often associated with Facebook, is a “filter bubble,” and people have written books about it. A lot of people have pointed to that bubble, as well as to the proliferation of fake news on Facebook, as playing a major role in last month’s presidential election.

Now the head of Facebook’s artificial intelligence research division, Yann LeCun, says this is a problem Facebook could solve with artificial intelligence.

“We believe this is more of a product question than a technology question,” LeCun told a group of reporters last month when asked if artificial intelligence could solve this filter-bubble phenomenon. “We probably have the technology, it’s just how do you make it work from the product side, not from the technology side.”

A Facebook spokesperson clarified after the interview that the company doesn’t actually have this type of technology just sitting on the shelf. But LeCun seems confident it could be built. So why doesn’t Facebook build it?

“These are questions that go way beyond whether we can develop AI technology that solves the problem,” LeCun continued. “They’re more like trade-offs that I’m not particularly well placed to determine. Like, what is the trade-off between filtering and censorship and free expression and decency and all that stuff, right? So [it’s not a question of if] the technology exists or can be developed, but … does it make sense to deploy it. This is not my department.”

Facebook has long denied that its service creates a filter bubble. It has even published a study defending the diversity of people’s News Feeds. Now LeCun is at the very least acknowledging that a filter bubble does exist, and that Facebook could fix it if it wanted to.

And that’s fascinating because while it certainly seemed like a fixable problem from the outside — Facebook employs some of the smartest machine-learning and language-recognition experts in the world — it once again raises questions around Facebook’s role as a news and information distributor.

Facebook CEO Mark Zuckerberg has long argued that his social network is a platform that leaves what you see (or don’t see) to computer algorithms that use your online activity to rank your feed. Facebook is not a media company making human-powered editorial decisions, he argues. (We disagree.)

But is showing its users a politically balanced News Feed Facebook’s responsibility? Zuckerberg wrote in September that Facebook is already “more diverse than most newspapers or TV stations” and that the filter-bubble issue really isn’t an issue. Here’s what he wrote.

“One of the things we’re really proud of at Facebook is that, whatever your political views, you probably have some friends who are in the other camp. … [News Feed] is not a perfect system. Research shows that we all have psychological bias that makes us tune out information that doesn’t fit with our model of the world. It’s human nature to gravitate towards people who think like we do. But even if the majority of our friends have opinions similar to our own, with News Feed we have easier access to more news sources than we did before.”

So this, right here, explains why Facebook isn’t building the kind of technology that LeCun says it’s capable of building. At least not right now.

There are some benefits to a bubble like this, too, specifically user safety. Unlike Twitter, for example, Facebook’s bubble is heightened by the fact that your posts are usually private, which makes it harder for strangers to comment on them or drag you into conversations you might not want to be part of. The result: Facebook doesn’t have to deal with the level of abuse and harassment that Twitter struggles with.

Plus, Facebook isn’t the only place you’ll find culture bubbles. Here’s “SNL” making fun of a very similar bubble phenomenon that has come to light since election night.

Escaping the election cocoon

Good piece by Scott Gilmore on the risks of living in a bubble (I try to ensure my newsfeed includes a range of perspectives). As always, it starts from mindfulness of one’s own biases, and it applies to more than just politics.

Sound advice:

Unfortunately, our habit of tuning out ideas and voices we don’t like is part of our biological programming. “Confirmation bias,” the tendency to search for information that confirms our beliefs and to remember it longer, is a well-documented and inescapable element of our behavior. As a result, we instinctively tailor our universe to limit the emotionally upsetting views that contradict us. Until recently, the shortage of media choices made this hard to do. Left or right, we all watched the same suppertime newscast. Now, it’s finally possible to be bound in a nutshell, and count ourselves kings of infinite space, because we can avoid any bad dreams.

This has been very apparent in the refugee debate. A significant number of Canadians are opposed to allowing in more Syrians, due to the possibility that they would include Islamic State supporters, or that they would spread Islam, or because we should be helping our own poor first. If you listen to a specific set of radio stations, read certain blogs and interact with people similar to yourself on Facebook, these ideas aren’t only defensible, they are overwhelmingly obvious.

Likewise, another group of Canadians who subscribe to different newspapers, listen to the CBC and read the Huffington Post are equally convinced of the self-evident fact that there is a clear need for Canada to do more, and that accepting far more refugees would strain neither our economy nor our social fabric. In reality, both sides are filtering out important pieces of information, making it impossible to see the full picture. Which is why neither group can grasp how anyone could possibly be so asinine as to dispute what is so clearly self-evident.

This is bad, and not just because it prevents us from having civil conversations about Canada’s refugee and immigration policies. It creates a lack of empathy that leads us to denigrate and dismiss the opinions of others. The leaders of all political parties, who are equally unable to acknowledge they do not have a monopoly on the truth, demonstrate this attitude repeatedly.

Our self-made cocoons also impair our ability to make intelligent decisions. In this election, most voters will not watch a single debate, read any of the party platforms or attend any campaign events. They don’t need to. They already know whom they’re going to vote for and, coincidentally, everyone else in his or her cocoon is voting the same way.

And for those we ultimately elect? Their own filters will make their governing decisions less effective. Ruling parties of all stripes tend only to listen to academics who support their agenda, only attend rallies that contain true believers, only read newspapers that endorse their policies and only engage constituents who already voted for them. If it looks as if the Conservative party has only been thinking about its base for the last nine years, it’s because that’s literally true.

There are ways to cut through these cocoons, however. Just being aware that you are constantly self-censoring the information that reaches you helps. You can also consciously resist the urge to mute the outspoken critic on Twitter, or unfollow the Facebook friend who shares articles in support of that politician you loathe. One step further would be to actually read some of those articles, or pick up a newspaper you wouldn’t normally read, no matter how much of a rag you think it is.

Source: Escaping the election cocoon