Liberal, Moderate or Conservative? See How Facebook Labels You – The New York Times

Not surprising that Facebook is doing this kind of analysis. Does not appear to work for Canadian political leanings when I checked my profile (no “Canadian politics” tab):

You may think you are discreet about your political views. But Facebook, the world’s largest social media network, has come up with its own determination of your political leanings, based on your activity on the site.

And now, it is easy to find out how Facebook has categorized you — as very liberal or very conservative, or somewhere in between.

Try this (it works best on your desktop computer):

Go to facebook.com/ads/preferences on your browser. (You may have to log in to Facebook first.)

That will bring you to a page with your ad preferences. Under the “Interests” header, click the “Lifestyle and Culture” tab.

Then look for a box titled “US Politics.” In parentheses, it will describe how Facebook has categorized you, such as liberal, moderate or conservative.

(If the “US Politics” box does not show up, click the “See more” button under the grid of boxes.)

Facebook makes a deduction about your political views based on the pages that you like — or on your political preference, if you stated one, on your profile page. If you like the page for Hillary Clinton, Facebook might categorize you as a liberal.

Even if you do not like any candidates’ pages, if most of the people who like the same pages that you do — such as Ben and Jerry’s ice cream — identify as liberal, then Facebook might classify you as one, too.
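As a rough illustration of the kind of inference being described, here is a minimal sketch, assuming a simple "majority vote among users who share your likes" rule (my own illustration, not Facebook's actual model; the users, pages and labels are invented for the example):

```python
from collections import Counter

# Toy data: the pages each user likes, plus a self-declared leaning where known.
# Every name and label here is invented for illustration; Facebook's real model
# and features are not public.
page_likes = {
    "alice": {"Hillary Clinton", "Ben and Jerry's"},
    "bob":   {"Ben and Jerry's", "Nascar"},
    "carol": {"Ben and Jerry's"},
    "dave":  {"Nascar"},
}
declared = {"alice": "liberal", "dave": "conservative"}

def infer_leaning(user):
    """Guess a user's leaning from the declared leanings of users who
    share at least one page like with them."""
    if user in declared:                 # a stated preference wins outright
        return declared[user]
    votes = Counter()
    for other, pages in page_likes.items():
        if other != user and other in declared and page_likes[user] & pages:
            votes[declared[other]] += 1
    return votes.most_common(1)[0][0] if votes else "unknown"

print(infer_leaning("carol"))  # "liberal": carol shares a like with alice
```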

Facebook has long been collecting information on its users, but it recently revamped the ad preferences page, making it easier to view.

The information is valuable. Advertisers, including many political campaigns, pay Facebook to show their ads to specific demographic groups. The labels Facebook assigns to its users help campaigns more precisely target a particular audience.

For instance, Donald J. Trump’s presidential campaign has paid for its ads to be shown to those whom Facebook has labeled politically moderate.

Campaigns can also use the groupings to show different messages to different supporters. They may want to show an ad to their hard-core supporters, for example, that is unlike an ad targeted at people just tuning in to the election.
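A minimal sketch of how a campaign might key different messages off such labels (purely illustrative; the segment names and ad copy are invented, and this is not Facebook's ad-targeting API):

```python
# Purely illustrative: a campaign choosing different creative for different
# audience segments. The labels, names and ad copy are all invented.
audience = [
    {"name": "user_1", "us_politics": "very liberal"},
    {"name": "user_2", "us_politics": "moderate"},
    {"name": "user_3", "us_politics": "conservative"},
]

ads_by_segment = {
    "very liberal": "You're already with us. Help get out the vote!",
    "moderate": "Undecided? Here is our plan for the economy.",
    "conservative": "See where we stand on the issues that matter to you.",
}

for person in audience:
    ad = ads_by_segment.get(person["us_politics"], "Learn more about the campaign.")
    print(f'{person["name"]}: {ad}')
```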

It is not clear how aggressively Facebook is gathering political information on users outside the United States. The social network has 1.7 billion active users, including about 204 million in the United States.

Political outlook is just one of the attributes Facebook compiles on its users. Many of the others are directly commercial: whether you like television comedy shows, video games or Nascar.

To learn more about how political campaigns are targeting voters on social media, The New York Times is collecting Facebook ads from our readers with a project called AdTrack. You can take part by visiting nytimes.com and searching for “Send us the political ads.”

Source: Liberal, Moderate or Conservative? See How Facebook Labels You – The New York Times

Israel: Facebook experiment reveals how ‘terror-related’ posts are treated differently

Interesting and revealing:

Two Israelis — an Arab and a Jew — posted messages on Facebook saying they were going to kill someone on the other side of the Israeli-Palestinian conflict.

The two posters were real people with active Facebook pages, but the threat was part of an experiment conducted by an Israeli news station last week. The goal was to monitor the reactions of individuals and Israeli authorities who are tasked with keeping tabs on social-media posts that they say might inspire terrorist attacks.

Critics in both communities say social media has served as a conduit for unstoppable deadly violence. While the low-intensity Israeli-Palestinian conflict has been burning for decades, the platforms have given rise to individual extremists and lone-wolf attackers who are much more difficult to stop, officials say.

After posting that he had been inspired to kill Jews, Shadi Khalileh, the Arab citizen, received calls from concerned friends and family. Israeli Arab members of parliament, who heard about his post via word of mouth, even called to ask why he would post such a message, or whether his page had been hacked. Only 12 people “liked” his post.

The Jewish citizen, Daniel Levy, wrote that he had to seek revenge after a Palestinian killed a 13-year-old Jewish girl in her bed. His post drew some 600 “likes,” 25 shares and comments such as “I am proud of you” and “you are a king.” One comment urged him to “please take the post down before you are arrested.”

Israeli police questioned Khalileh about his post — it took some work to convince them that it had all been an experiment. But Levy’s post went undetected by the authorities, the news station said.

In neither case did Facebook flag the posts, which remained online until the station ended the experiment.

The failure of social-media platforms to take action against posts calling for the murder of Israelis or Palestinians, Jews or Arabs, has become a growing issue for those on both sides of this decades-old conflict.

Source: Facebook experiment reveals how ‘terror-related’ posts are treated differently | Toronto Star

Facebook’s Bias Is Built-In, and Bears Watching – The New York Times

One of the more perceptive articles I have seen on the recent Facebook controversy and the overall issues regarding the lack of neutrality in algorithms:

The question isn’t whether Facebook has outsize power to shape the world — of course it does, and of course you should worry about that power. If it wanted to, Facebook could try to sway elections, favor certain policies, or just make you feel a certain way about the world, as it once proved it could do in an experiment devised to measure how emotions spread online.

There is no evidence Facebook is doing anything so alarming now. The danger is nevertheless real. The biggest worry is that Facebook doesn’t seem to recognize its own power, and doesn’t think of itself as a news organization with a well-developed sense of institutional ethics and responsibility, or even a potential for bias. Neither does its audience, which might believe that Facebook is immune to bias because it is run by computers.

That myth should die. It’s true that beyond the Trending box, most of the stories Facebook presents to you are selected by its algorithms, but those algorithms are as infused with bias as any other human editorial decision.

“Algorithms equal editors,” said Robyn Caplan, a research analyst at Data & Society, a research group that studies digital communications systems. “With Facebook, humans are never not involved. Humans are in every step of the process — in terms of what we’re clicking on, who’s shifting the algorithms behind the scenes, what kind of user testing is being done, and the initial training data provided by humans.”

Everything you see on Facebook is therefore the product of these people’s expertise and considered judgment, as well as their conscious and unconscious biases apart from possible malfeasance or potential corruption. It’s often hard to know which, because Facebook’s editorial sensibilities are secret. So are its personalities: Most of the engineers, designers and others who decide what people see on Facebook will remain forever unknown to its audience.

Facebook also has an unmistakable corporate ethos and point of view. The company is staffed mostly by wealthy coastal Americans who tend to support Democrats, and it is wholly controlled by a young billionaire who has expressed policy preferences that many people find objectionable. Mr. Zuckerberg is for free trade, more open immigration and for a certain controversial brand of education reform. Instead of “building walls,” he supports a “connected world and a global community.”

You could argue that none of this is unusual. Many large media outlets are powerful, somewhat opaque, operated for profit, and controlled by wealthy people who aren’t shy about their policy agendas — Bloomberg News, The Washington Post, Fox News and The New York Times, to name a few.

But there are some reasons to be even more wary of Facebook’s bias. One is institutional. Many mainstream outlets have a rigorous set of rules and norms about what’s acceptable and what’s not in the news business.

“The New York Times contains within it a long history of ethics and the role that media is supposed to be playing in democracies and the public,” Ms. Caplan said. “These technology companies have not been engaged in that conversation.”

According to a statement from Tom Stocky, who is in charge of the trending topics list, Facebook has policies “for the review team to ensure consistency and neutrality” of the items that appear in the trending list.

But Facebook declined to discuss whether any editorial guidelines governed its algorithms, including the system that determines what people see in News Feed. Those algorithms could have profound implications for society. For instance, one persistent worry about algorithmically selected news is that it might reinforce people’s previously held points of view. If News Feed shows news that we’re each likely to Like, it could trap us into echo chambers and contribute to rising political polarization. In a study last year, Facebook’s scientists asserted the echo chamber effect was muted.
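To make the echo-chamber worry concrete, here is a minimal sketch, assuming a ranker that simply orders stories by each user's predicted chance of Liking them (my own illustration, not Facebook's News Feed algorithm; the stories, viewpoints and scores are invented):

```python
# Minimal illustration of the echo-chamber concern: rank stories by each
# user's predicted affinity for the story's viewpoint. This is not Facebook's
# News Feed algorithm; the stories, viewpoints and scores are invented.
stories = [
    {"headline": "Candidate A surges in the polls", "viewpoint": "A"},
    {"headline": "Candidate B lays out a policy plan", "viewpoint": "B"},
    {"headline": "Candidate A stumbles in the debate", "viewpoint": "B"},
]

# Affinity inferred from past Likes: this user mostly Likes viewpoint-A stories.
user_affinity = {"A": 0.9, "B": 0.2}

def rank_feed(stories, affinity):
    """Order stories by the predicted probability that the user will Like them."""
    return sorted(stories, key=lambda s: affinity.get(s["viewpoint"], 0.5),
                  reverse=True)

for story in rank_feed(stories, user_affinity):
    print(story["headline"])
# The viewpoint-A story floats to the top, so past preferences shape what is
# seen next: the feedback loop the article worries about.
```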

But when Facebook changes its algorithm — which it does routinely — does it have guidelines to make sure the changes aren’t furthering an echo chamber? Or that the changes aren’t inadvertently favoring one candidate or ideology over another? In other words, are Facebook’s engineering decisions subject to ethical review? Nobody knows.

Source: Facebook’s Bias Is Built-In, and Bears Watching – The New York Times

Facebook’s Sandberg: Counter Hate Speech With Positivity | Re/code

Not sure how realistic or effective a strategy this can be. While it is always better to be respectful in person and online, I am not sure to what degree it can counter some of the more extreme language and views being expressed.

Presumably, given all the information Facebook collects, there must be some data, rather than anecdotes, on the effectiveness of this approach.

But it is telling that Facebook is not willing to make changes to its News Feed algorithm, effectively outsourcing the issue to its users.

My way of handling the few comments on my blog that border on hate speech is either to ignore them or to throw back a few questions at the writer, aimed at provoking reflection. Sometimes people engage, sometimes not:

How should you fight back against people spewing hate speech in your Facebook News Feed? Kill ’em with kindness, of course!

That’s according to Facebook COO Sheryl Sandberg, who spoke on a panel Wednesday at the World Economic Forum in Davos, Switzerland. Sandberg talked about how Facebook tries to combat hate speech on its platform, and part of the strategy is encouraging counter-speech, the usually uplifting messages that provide the opposite viewpoints to degrading or negative language online.

Sandberg told a specific story about users in Germany who “Liked” a neo-Nazi Facebook page and then flooded it with positive messages. She called the effort a “Like attack.”

“The best antidote to bad speech is good speech. The best antidote to hate is tolerance,” she said. “Amplifying … counter-speech to the speech that’s perpetrating hate is, we think, by far the best answer.”

The strategy feels pretty “Kumbaya,” but that’s how Facebook has approached the issue of hate speech on its service, specifically when it comes to religious extremism and terrorism. Facebook will take down hate speech when it is flagged by a user, but it doesn’t go looking for it. That means the company is leaning on its user base to create positive content to fight against extremist material.
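A toy sketch of that reactive, flag-driven model (my own illustration, not Facebook's actual review pipeline; the posts, threshold and data are invented):

```python
# Toy sketch of flag-driven moderation: nothing is reviewed until a user
# reports it. This is not Facebook's actual review pipeline; data is invented.
posts = {
    101: {"text": "An ordinary status update", "flags": 0, "removed": False},
    102: {"text": "A post containing hate speech", "flags": 3, "removed": False},
}

FLAG_THRESHOLD = 1  # anything reported at least once goes to review

def review_flagged(posts):
    """Send flagged posts to review; unflagged posts are never examined."""
    for post in posts.values():
        if post["flags"] >= FLAG_THRESHOLD and not post["removed"]:
            # In reality a human reviewer makes the call; here we just remove it.
            post["removed"] = True

review_flagged(posts)
print(posts[102]["removed"])  # True: flagged, so it was reviewed and removed
print(posts[101]["removed"])  # False: nobody flagged it, so nobody looked
```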

It’s making some effort to prod users in the right direction. Facebook is partnering with the U.S. government to encourage college students to launch anti-terrorism campaigns, for example. It is also partnering with the German government to better locate and remove hateful content. In both cases, Facebook is making financial contributions to the cause.

But the company is not using what is perhaps its most valuable asset in this matter: Its News Feed algorithm. Facebook claims that it doesn’t elevate this counter-speech in News Feed; it is instead offering a neutral playing field and hoping that positive speech wins out.

Facebook’s role in all of this has been top of mind for U.S. government officials, especially since a mass shooting took place in San Bernardino, Calif., back in December. Sandberg was part of a meeting between top government officials and Silicon Valley bigwigs earlier this month to discuss this very issue. Sandberg hasn’t spoken publicly about those meetings, so Wednesday’s panel was the first we’ve heard from her on this issue.

If you want to watch the entire panel you can do so here. Sandberg’s comments on counter-speech start right around the 18:00 minute mark.

Source: Facebook’s Sandberg: Counter Hate Speech With Positivity | Re/code

The Spiral Of Silence – Social Media « The Dish

Another angle on how social media narrows discussion and debate:

It seems counter-intuitive: if we’re getting only what Facebook thinks we want to get based on everything they know about us, which is a lot, shouldn’t we assume we are always among friends? But it makes sense. We’re worried about losing friends, which is to say that we’re worried our number of friends will diminish.

What’s peculiar about the Pew study is how the questions were asked. Though the survey took place in the months after Snowden’s revelations, the subjects were asked will you and would you… not did you. Using the conditional to report on behavior that already might or might not have happened tends to make the whole exercise, well, an exercise.

It turns out, too, that the spiral of silence does not only extend to individuals. Take this week’s revelation about the NSA’s Google-like search engine that shares something on the order of 850 billion data points such as private emails obtained without a warrant from ordinary American citizens among numerous government agencies. This is a big deal for many reasons, not the least of which is that it may enable the FBI or the DEA to illegally obtain evidence and cover their tracks while so doing. Yet the mainstream media almost uniformly ignored the story. When I searched ICREACH today, only the online tech media had picked it up and run with it. Is it possible that the mainstream media is afraid of losing friends, too?

Source: The Spiral Of Silence – Social Media « The Dish