Regg-Cohn: Surprised that some Black people and Latinos voted for Trump? Try looking at them as individuals

Good commentary on the diversity within groups:

In other news, it turns out that more Blacks, Latinos and gays turned out for Donald Trump this time than last time.

Why is that news? The only surprise is that anyone is surprised.

That certain groups are presumed to vote in their supposed self-interest — as determined by other groups who know better what’s best for them — is not merely presumptuous. It’s profiling.

Today, some of the same social critics who warn against stereotyping Blacks or Latinos are now scratching their heads about why they didn’t vote as expected in the U.S. presidential election. Profiling can be perilous.

It is a human impulse. But impossibly dehumanizing at times.

Profiling seeks out similarities, but it is pointless if we forget individual differences. It relies on the notion that people of similar backgrounds or aspirations hold similar beliefs, live in similar neighbourhoods, and so on.

The biggest problems with profiling are the premises and definitions that underlie it. That more Latinos voted for Trump this time tells us little of interest, because it’s such an imprecise term (and is overshadowed by the overpowering reality that whites voted massively and decisively for him).

Latinos range from anti-Communist arch-capitalists in Miami’s Cuban émigré community to impoverished Honduran refugees fleeing drug wars via Mexico, to second-generation strivers in Texas or Arizona aspiring to join the ruling Republican establishment. Ethnic is not monolithic.

Just as LGBTQ voters can be Republican or Democrat, Latinos are more different than they are alike.

Profiling is a tool and a template. It is a form of demography and part of democracy, for better or for worse — which is why pollsters, political operatives and party fundraisers mine the data to harvest votes and donations at election time.

They’re just more sophisticated than the rest of us in slicing and dicing the fruit salad. They know that skin colour is only skin deep, so they drill down for other demographic details such as education, income, location.

That’s why postal codes are the preferred proxies for pollsters. Yet zeitgeist and zip codes are rarely congruent.

My own education in demographic divisions came when I was posted to the Toronto Star’s Middle East bureau years ago. Despite my background as a political reporter, I only realized as a foreign correspondent how many ways Israelis could be subdivided.

Not merely as hawks versus doves, but ethnic Ashkenazi versus Sephardi; secular Russian immigrants versus ultra-Orthodox Haredi; socialist kibbutzniks versus modern Orthodox Jewish settlers; urban versus suburban; Muslim and Christian Arab citizens versus Jewish citizens; and last but not least, left versus right. The miracle was how quickly those internecine divisions melted away when Israelis faced an external enemy and existential threat; and how quickly the internal tensions returned (Palestinians, too, fought their own civil war in Gaza between Islamist Hamas rejectionists and secular Yasser Arafat loyalists).

The security services typecast people as safe or threatening based not only on background but back story and behaviour — whether at airport check-ins, military checkpoints or political rallies. Which is why Yitzhak Rabin’s security guards let down their guard when a kippah-wearing Orthodox Jew chatted them up before assassinating the prime minister — he didn’t fit their Palestinian profile of a clear and present danger.

Stephen Harper’s Tories made inroads in the GTA suburbs by appealing to the traditional values of many immigrant communities that converged with conservatism. His then-minister of multiculturalism, Jason Kenney, once sat me down to demonstrate his mastery of Chinese Canadian demographics — delineating early anti-Communist immigrants from Taiwan, subsequent waves of Cantonese-speaking Hong Kong dual citizens, and more recent (more apolitical) arrivals from mainland China.

The New Democratic Party — founded as an alliance between the co-operative agricultural movement and the labour movement — long ago learned the working class would not reflexively rally to their side. If workers are reluctant to recognize their own enlightened self-interest — rallying to Doug Ford’s Tories even when they campaigned on cancelling a minimum wage hike and then freezing it for years — why are progressives perplexed when Blacks or Latinos warm up to Trump?

Vote-determining issues are more likely to be economic than ethnic, and political preferences are often more idiosyncratic than ideological. That’s only human.

The point is that profiling tells you everything and nothing about people. Just as postal codes are imprecise — people are unpredictable.

Political parties bank on profiling because there’s much to gain from voters and donors, and little to lose from mass mailings or email blasting that misses the mark. The minimal cost of bulk postage and mass spamming is a mere rounding error.

The rest of us can’t afford to be so reckless with our wild guesses, unproven hunches and dehumanizing assumptions. If the penalty of your profiling is an assassin’s bullet, or an airplane bombing, or a human rights humiliation, then the miscalculation yields an incalculable cost.

Source: https://www.thestar.com/politics/political-opinion/2020/11/11/surprised-that-blacks-and-latinos-voted-for-donald-trump-try-looking-at-them-as-individuals.html

How German cops learned to ignore political correctness to get tough on refugee crime

Another take on profiling or targeting those deemed at greater risk of crime:

Those who have branded Europe, and Germany in particular, too weak and politically correct to stop a purported wave of crime brought on by the arrival of more than a million asylum seekers should pay attention to the news. German police haven’t taken long to get their act together, and immigrant crime is down sharply. Their methods, which include a sort of racial, or at least behavioural, profiling, may be controversial, but they are proving effective.

On New Year’s Eve 2015, more than 500 women were sexually assaulted, and 22 raped, in the vicinity of the central station in Cologne by crowds of young men, many of them of North African extraction. Police were outnumbered and humiliated. A few days later, the city’s police chief was fired. Mayor Henriette Reker was ridiculed for advising women to stick to a “code of conduct” that included keeping at “arm’s length” from strangers. It made Germany look enfeebled and confused, and the many critics of Chancellor Angela Merkel’s decision to open the country’s borders to asylum seekers had a field day.

On Dec. 31, 2016, the central station neighborhood in Cologne was flooded by 1,700 police. They were checking documents and pushing young men, more than a hundred at the last count, into vans. While this was going on, a tweet appeared on the Cologne police force’s account: “At Central Station, several hundred Nafris are being checked.” Nafris is shorthand for North Africans, and it set off waves of predictable criticism from left-wing politicians who called the term “dehumanizing” and accused Cologne police of racial profiling. The police chief, Juergen Mathies, apologized for “Nafris” — it was only a “working term” police used, he said — but not for his officers’ actions. After all, only a handful of assaults, and no rapes, were reported.

“From the experience of last New Year’s Eve and from experience gained in raids in general, a clear impression has emerged here about which persons to check,” he said. “There were no gray-haired older men or blonde, young women there.”

Though the German Interior Ministry also winced at the “Nafris” tweet, Mathies will not be fired. His pre-emptive action has been lauded by federal and local officials including Mayor Reker, that softie from a year ago. Lip service has been paid to politically correct language, but everyone knows what the police chief had to deal with.

German police didn’t catch the perpetrator of the pre-Christmas terror attack in Berlin — an Italian patrolman ended up shooting him — but the investigation that led to a Europe-wide manhunt for Anis Amri was quick and precise. Just before New Year’s, police arrested a Syrian who had apparently planned another terror attack. Germany’s security apparatus is clearly on high alert, and it’s been increasingly well-funded. In 2016, the Ministry of the Interior received a 1.5 billion euro ($1.56 billion) budget increase compared with the previous year, and the federal police were allowed to hire 3,000 additional officers. In 2017, the ministry’s budget is set to rise by another 500 million euros to 8.3 billion euros.

High immigration — in the 11 months through November, 723,027 asylum applications were filed in Germany, compared with 476,649 in all of 2015 — is driving the budget increases. That’s based on some hard facts. In 2015, 6.5 percent of all crimes in Germany were committed by immigrants, compared with 3.6 percent in 2014. In 2016, the proportion is likely to be higher — in the first nine months, immigrants committed 214,600 crimes, more than the 206,201 registered in all of 2015, and the general crime rate in Germany has been steady in recent years. Immigrants from North Africa are the least law-abiding group: They make up 2 percent of Germany’s immigrant population, but in the first nine months of 2016, they accounted for 22 percent of immigrant crime.

In the third quarter of 2016, however, crime by immigrants dropped 23 percent compared with the first three months of the year. One reason could be that police are taking account of the numbers and the trends they reflect, and they are not being too sentimental or too careful of being branded racist.

Source: How German cops learned to ignore political correctness to get tough on refugee crime | National Post

Ontario firm’s social-media monitoring software linked to racial profiling by U.S. police

Not surprising. While the underlying technology may or may not be neutral, how it is used, and which terms it looks for, is not:

A London company’s software has been implicated in racial profiling by police departments in the United States and banned from Twitter.

Media Sonar has sold software to police and law-enforcement agencies, marketing it as a tool to gather data from social media to help identify threats to public safety.

But an investigation by the American Civil Liberties Union (ACLU) has found that police used the London-made technology to monitor such hashtags as #BlackLivesMatter, #DontShoot, #ImUnarmed and #PoliceBrutality, to name a few.

“Law enforcement should not be using tools that treat protesters like enemies,” the ACLU, which did not have a spokesperson available to comment directly, said in a blog entry about the issue that it sent to The London Free Press.

“The utter lack of transparency, accountability and oversight is particularly troubling because social media surveillance software used by California law enforcement — tools like Media Sonar … — have been marketed in ways to target protesters.”

David Strucke, a partner in Media Sonar, was unavailable for comment in response to repeated Free Press phone calls and emails.

“Their software is very intelligent, tracking activities online. It is a great tool for law-enforcement agencies,” said Jaafer Haidar, a London technology observer and entrepreneur who founded Carbyn and is launching Socialseek.

“But it is not the responsibility of the technology company to police their customers. Customers have to be held responsible for how they use technology.”

The online news site Daily Dot reported that Media Sonar, from 2014 to 2016, sold the technology to 19 government services that spent at least $10,000 on the software.

The larger issue is the balance, and tension, between technology firms and law enforcement in using technology, added Haidar.

He pointed to Apple’s refusal to aid the FBI in hacking the phone of a shooter in the attack on a San Bernardino, Calif., Christmas party in 2015 that left 14 dead, and reports that BlackBerry has for years worked with police to hand over data from phone users, as proof that it’s uncharted territory.

“There is a lot of pressure on companies from government and law enforcement to use technology to survey (suspects),” Haidar said.

In an October interview with The Free Press, Strucke, chief executive of Media Sonar, described the company’s software as a “social media and online data investigation platform.”

The software tracks online actions, especially social media, and gives customers the ability to gather online and social media data and filter, analyze and search to gather information on individuals police want to keep an eye on.
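
To make concrete what “filter, analyze and search” means in practice, here is a minimal sketch of a generic keyword filter over a feed of posts. It is not Media Sonar’s code: the Post record, the watchlist (seeded with the hashtags the ACLU cited above) and the matching logic are all assumptions for illustration.

```python
# Illustrative sketch only: a generic hashtag filter over social media posts.
# This is not Media Sonar's implementation; all names here are hypothetical.
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Post:
    author: str
    text: str
    location: str  # e.g. a city or geotag, where the platform exposes one

# Hashtags the ACLU found police monitoring, per the article above.
WATCHLIST = {"#blacklivesmatter", "#dontshoot", "#imunarmed", "#policebrutality"}

def matches_watchlist(post: Post, watchlist: set = WATCHLIST) -> bool:
    """Return True if any watched hashtag appears in the post text."""
    tokens = {t.lower().rstrip(".,!?") for t in post.text.split()}
    return bool(tokens & watchlist)

def filter_posts(posts: Iterable[Post]) -> List[Post]:
    """Keep only posts containing a watched hashtag: the 'filter and search'
    step the article describes, detached from any real data source."""
    return [p for p in posts if matches_watchlist(p)]

if __name__ == "__main__":
    sample = [
        Post("user_a", "Marching downtown today #BlackLivesMatter", "Ferguson"),
        Post("user_b", "Great game last night!", "Cleveland"),
    ]
    for p in filter_posts(sample):
        print(p.author, "->", p.text)
```

Nothing in such a filter is technically remarkable, which is the critics’ point: the civil liberties question lies almost entirely in which terms go on the watchlist and whose speech is swept up as a result.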

Media Sonar’s software is being used by police forces in Toronto, Cleveland and Tampa Bay, and by the Los Angeles County sheriff’s office, to name a few.

The company also sells to sports teams, universities and corporations for “asset and executive protection.”

In recent years, sales at Media Sonar have grown by about 300 per cent every year, on average.

“This is an ethical issue a lot of (technology) companies are facing,” Johanna Westar, a Western University professor and technology analyst, said of privacy versus security.

She draws a parallel to the police carding issue, where police stop people to gather data, frequently targeting visible minorities.

“We have to decide how technology will be used, and it is a decision we have to make as a society.”

The ACLU of California scoured “thousands of pages” of public records and found law-enforcement agencies were secretly acquiring social media spying software.

The investigation also found that police did not receive approval or permission to buy or use the software.

Social-media monitoring software — two U.S. software businesses also have been implicated and banned from social media sites — was used by police to monitor protesters in Ferguson, Mo., and rioters in Baltimore after the killing of unarmed black men by police.

“The racist implications of social-media surveillance technology are not surprising. We know that when law enforcement gets to conceal the use of surveillance technology, they also get to conceal its misuse,” said the ACLU.

“Discriminatory policing that targets communities of colour is unacceptable … and secretive, sophisticated surveillance technologies supersize the impact of racial profiling and abuse.”

Racial profiling, by a computer? Police facial-ID tech raises civil rights concerns. – The Washington Post

The next frontier of combatting profiling:

The growing use of facial-recognition systems has led to a high-tech form of racial profiling, with African Americans more likely than others to have their images captured, analyzed and reviewed during computerized searches for crime suspects, according to a new report based on records from dozens of police departments.

The report, released Tuesday by the Center for Privacy & Technology at Georgetown University’s law school, found that half of all American adults have their images stored in at least one facial-recognition database that police can search, typically with few restrictions.

The steady expansion of these systems has led to a disproportionate racial impact because African Americans are more likely to be arrested and have mug shots taken, one of the main ways that images end up in police databases. The report also found that criminal databases are rarely “scrubbed” to remove the images of innocent people, nor are facial-recognition systems routinely tested for accuracy, even though some struggle to distinguish among darker-skinned faces.

The combination of these factors means that African Americans are more likely to be singled out as possible suspects in crimes — including ones they did not commit, the report says.
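
As a rough illustration of the mechanism the report describes, here is a minimal sketch of a one-to-many face search: a crime-scene probe image is compared against every enrolled face, and anything above a similarity threshold lands on a candidate list. The embedding vectors, the cosine measure and the threshold value are all assumptions; real systems are far more elaborate.

```python
# Illustrative sketch only: a toy one-to-many face search against an enrolled
# database. Embeddings, threshold and names are hypothetical; real systems differ.
import math
from typing import Dict, List, Tuple

def cosine(a: List[float], b: List[float]) -> float:
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def search(probe: List[float],
           database: Dict[str, List[float]],
           threshold: float = 0.80) -> List[Tuple[str, float]]:
    """Return every enrolled identity whose similarity to the crime-scene
    probe exceeds the threshold: the candidate list investigators follow up on."""
    hits = [(name, cosine(probe, emb)) for name, emb in database.items()]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda h: h[1], reverse=True)

if __name__ == "__main__":
    enrolled = {"person_a": [0.9, 0.1, 0.2], "person_b": [0.1, 0.8, 0.3]}
    probe = [0.85, 0.15, 0.25]  # embedding extracted from a crime-scene image
    print(search(probe, enrolled))
```

Two levers in this toy version map onto the report’s findings: who is enrolled in the database (for example, via mug shots that are never scrubbed) and how reliable the similarity score is for a given face. If a group is over-enrolled and the matcher is less accurate on that group, more innocent members of that group will clear the threshold, producing the disparity the report describes without anyone writing an explicitly race-aware rule.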

“This is a serious problem, and no one is working to fix it,” said Alvaro M. Bedoya, executive director of the Georgetown Law center that produced the report on facial-recognition technology. “Police departments are talking about it as if it’s race-blind, and it’s just not true.”

The 150-page report, called “The Perpetual Line-Up,” found a rapidly growing patchwork of facial-recognition systems at the federal, state and local level with little regulation and few legal standards. Some databases include mug shots, others driver’s-license photos. Some states, such as Maryland and Pennsylvania, use both as they analyze crime-scene images in search of potential suspects.

At least 117 million Americans have images of their faces in one or more police databases, meaning their resemblance to images taken from crime scenes can become the basis for follow-up by investigators. The FBI ran a pilot program this year in which it could search the State Department’s passport and visa databases for leads in criminal cases. Overall, the Government Accountability Office reported in May, the FBI has had access to 412 million facial images for searches; the faces of some Americans appear several times in these databases.

Source: Racial profiling, by a computer? Police facial-ID tech raises civil rights concerns. – The Washington Post

New airline passenger vetting could amount to racial profiling: [privacy] watchdog

Always a thorny issue, how to use big data and other tools to focus on where the risk is the greatest. Our family joke, whenever I was subjected to enhanced screening (always politely), was that I was the token white person selected to demonstrate there was no profiling.

While some of the big data used includes this kind of racial or ethnic data, some likely also includes information regarding travel patterns and other behavioural data (e.g., paying for tickets with cash):

The federal border agency’s new system for scrutinizing incoming air passengers could open the door to profiling based on race or other personal factors, warns Canada’s privacy czar.

Privacy Commissioner Daniel Therrien is pressing the Canada Border Services Agency to explain the program’s rationale and build in safeguards to protect civil liberties.

Canadian law requires commercial airlines to provide the border agency with specific information about passengers flying to Canada, including name, birthdate, citizenship, seat number and other data.

For years the border agency has used the information to try to zero in on terrorists or other serious international criminals. Travellers are assessed for risk, allowing the agency to single out those with high-risk scores for closer examination at the airport.

The border agency is moving to a system known as scenario-based targeting, already used by the United States, as part of Canada’s commitment to work closely with Washington under a perimeter security pact forged in 2011.

The border agency says the new scheme will be more efficient, effective and accurate, directing the focus to a smaller segment of the travelling population who represent a potential high risk.

The new scenario-based method uses Big Data analytics — extensive number-crunching to identify patterns — to evaluate all data collected from air carriers, says Therrien’s office, which reviewed the border agency’s privacy impact assessment of the project.

“Designed to harmonize with the system used by the U.S., it could allow the operator to, for example, search for all males aged between 18 and 20 who are Egyptian nationals and who have visited both Paris and New York,” Therrien says in his recently released annual report.
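
The CBSA has not published its scenarios, so the following is only a hedged sketch of what a rule of the kind the commissioner describes might look like: a Boolean predicate evaluated against each passenger record assembled from the airline data listed earlier. The field names and the example rule mirror the commissioner’s hypothetical; everything else is assumed for illustration.

```python
# Illustrative sketch only: a 'scenario' evaluated against airline passenger
# records, modelled on the Privacy Commissioner's hypothetical example.
# The field names and the rule itself are assumptions for illustration.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Passenger:
    name: str
    birthdate: date
    citizenship: str
    sex: str            # part of the "other data" airlines provide (assumed field)
    seat: str
    visited: List[str] = field(default_factory=list)  # recent travel history (assumed)

def age_on(birthdate: date, when: date) -> int:
    """Age in whole years on a given date."""
    return when.year - birthdate.year - (
        (when.month, when.day) < (birthdate.month, birthdate.day))

def scenario_matches(p: Passenger, flight_date: date) -> bool:
    """The commissioner's example scenario: males aged 18-20, Egyptian
    nationals, who have visited both Paris and New York."""
    return (p.sex == "M"
            and p.citizenship == "Egypt"
            and 18 <= age_on(p.birthdate, flight_date) <= 20
            and {"Paris", "New York"} <= set(p.visited))

def flag_for_examination(manifest: List[Passenger], flight_date: date) -> List[Passenger]:
    """Return the subset of the manifest singled out for closer examination."""
    return [p for p in manifest if scenario_matches(p, flight_date)]
```

The commissioner’s concern falls out of the shape of such a rule: every predicate is an immutable or near-immutable attribute (sex, age, nationality) or a broad travel pattern, so anyone who happens to share them is flagged on every trip, regardless of anything they have actually done.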

The privacy commissioner is concerned travellers may now be targeted for increased scrutiny if they fit the general attributes of a group — “subjected to recurring and unnecessary attention at the border because of characteristics they cannot change,” such as age, gender, nationality, birthplace, or racial or ethnic origin.

Therrien’s office recommended the border agency:

— Demonstrate the necessity of scenario-based targeting, beyond the general purpose of aligning Canada’s system with that of the U.S.;

— Be more transparent by fleshing out the privacy impact assessment with general descriptions of the types of scenarios that might be used to identify potentially high-risk travellers;

— Conduct regular reviews of the “effectiveness and proportionality of scenarios,” including an examination of impacts on civil liberties and human rights;

— Prepare a broader privacy assessment of the overall program used to collect passenger information from airlines.

The border agency “responded positively” to all of the recommendations, Therrien’s office says in the annual report.

Source: New airline passenger vetting could amount to racial profiling: watchdog – The Globe and Mail