‘Racism plays a role in immigration decisions,’ House Immigration Committee hears

While it is always important to recognize that bias and discrimination can influence decisions, different acceptance rates can also reflect other factors, and misrepresentation may be more prevalent in some regions than others.

Training guides and materials need to provide illustrations and examples. Meurrens is one of the few lawyers who regularly looks at the data, but his challenge to the training guide's statement that "Kids in India are not back-packers as they are in Canada" is odd, given that the data likely confirms it.

Moreover, the call for more transparency, welcome and needed, may provide opportunities for the more unscrupulous to “game the system.”

“Kids in India are not back-packers as they are in Canada” reads a note appended to a slide in a presentation used to train Canadian immigration officials in mid-2019.

Immigration lawyer Steven Meurrens said he received a copy of the presentation through a recent access to information request. The presentation, used in a training session by Immigration, Refugees and Citizenship Canada (IRCC) officials, is dated April 2019 and titled "India [Temporary Resident Visa]s: A quick introduction." He shared the full results of the request with The Hill Times.

The slides, which detail the reasons why Indians may apply for a Temporary Resident Visa (TRV) and what officials should look for in applications, have notes appended to them, as if they were speaking notes for the person giving the presentation. On one slide detailing potential reasons for travel to Canada, the notes read: "Kids in India are not back-packers as they are in Canada."

In an interview, Meurrens spoke to an apparent double standard for Indian people looking to travel to Canada.

“It drives me nuts, because I’ve often thought that, as a Canadian, a broke university student, I could hop on a plane, go anywhere, apply for visas, and no one would be like, ‘That’s not what Canadians do,’” Meurrens said, adding that he’s representing people from India who did in fact intend to come to Canada to backpack through the country.

A screenshot of the page wherein an IRCC presentation notes that ‘Kids in India are not back-packers as they are in Canada.’ Image courtesy of IRCC

"To learn that people are trained specifically that Indian people don't backpack" was "over the top," he said. It reminded him of another instance of generalizations made within IRCC about different nationalities: in 2015, an ATIP he received showed that training materials within the department stated that a Chinese person marrying a non-Chinese person was a likely indicator of marriage fraud.

At the time, the department said that document was more than five years old, and no longer in use.

“[I’d like us] to get to a state where someone’s country of origin doesn’t dictate the level of procedural fairness that they’ll get and how they’re assessed,” he said.

Systemic racism within Immigration, Refugees and Citizenship Canada (IRCC) is not a new issue; evidence of it was uncovered through what is colloquially known as the Pollara report. This report, conducted by Pollara Strategic Insights and released in 2021, was the result of focus groups conducted with IRCC employees to better understand "current experiences of racism within the department."

The report found that within the department, the phrase "the dirty 30" was widely used to refer to certain African nations, and that Nigerians in particular were stereotyped as "particularly corrupt or untrustworthy."

As the House Immigration Committee heard last week, there remains much work to be done to combat systemic racism within IRCC.

On March 22, the House Committee on Immigration and Citizenship began its study on differential outcomes in immigration decisions at IRCC, and Immigration Minister Sean Fraser (Central Nova, N.S.) appeared at the committee on March 24. Other issues brought up by witnesses included a lack of transparency from the department, as well as concerns that systemic racism and bias could be embedded in any artificial intelligence (AI) the department uses to assess applications.

From students in Nigeria being subjected to English-language proficiency tests when they hail from an English-speaking country, to the differential treatment of some groups of refugees versus others, to which groups are eligible for resettlement support and which are not, the committee heard several examples of differential treatment of potential immigrants to Canada due to systemic racism and bias within IRCC.

“I know it’s very uncomfortable raising the issue of racism,” said Dr. Gideon Christian, president of the African Scholars Initiative and an assistant professor of AI and law at the University of Calgary.

“But the fact is that we need to call racism for what it is—as uncomfortable as it might be. … Yes, this is a clear case of racism. And we should call it that. We should actually be having conversations around this problem with a clear framework as to how to address it,” he said.

According to Christian, Nigerian students looking to come to Canada to study through the Nigerian Study Express program are subjected to an English-language proficiency test, despite the fact that the official language in Nigeria is English, that English is the language used in all official academic institutions there, and that academic institutions in Canada do not require a language test from Nigerian students for their admission.

A spokesperson for IRCC said the department does not single out Nigeria in its requirement for a language test.

“IRCC is committed to a fair and non-discriminatory application process,” reads the written statement.

“While language testing is not a requirement to be eligible for a study permit, individual visa offices may require them as part of their review of whether the applicant is a bona fide student. This includes many applicants from English-speaking countries, including a large number from India and Pakistan, two nations where English is widely taught and top countries for international students in Canada.”

“Nigeria is not singled out by the requirement of language tests for the Nigeria Student Express initiative,” the spokesperson said.

Systemic racism embedded in AI

Christian, who has spent the last three years researching algorithmic racism, expressed concern that the "advanced analytics" IRCC uses to triage its immigration applications, including the Microsoft Excel-based software system called Chinook, has systemic racism and bias embedded within it.

"IRCC has in its possession a great deal of historical data that can enable it to train AI and automate its visa application processes," Christian told the committee. As revealed by the Pollara report, systemic bias, racism and discrimination do account for differential treatment of immigration applications, particularly when it comes to study visa refusals for those applying from Sub-Saharan Africa, he said.

"External [studies] of IRCC—especially the Pollara report—have revealed systemic bias, racism and discrimination in IRCC processing of immigration applications. Inevitably, this historical data [in the possession] of IRCC is tainted by the same systemic bias, racism and discrimination. Now the problem is that the use of these tainted data to train any AI algorithm will inevitably result in algorithmic racism. Racist AI, making immigration decisions," he said.

The Pollara report echoed these concerns in a section that laid out a few ways processes and procedures adopted for expediency’s sake “have taken on discriminatory undertones.” This included “concern that increased automation of processing will embed racially discriminatory practices in a way that will be harder to see over time.”
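The mechanism Christian describes, where biased historical labels produce a biased model, can be shown with a deliberately minimal toy sketch. Everything here is invented for illustration and has no connection to IRCC's actual systems: the "model" simply memorizes the majority decision per feature value, which is enough to show how a proxy feature that correlates with a protected attribute reproduces past refusal patterns.

```python
from collections import Counter

def train(records):
    """'Train' by memorizing the majority decision for each feature value.

    Real models are more sophisticated, but the failure mode is the same:
    whatever regularities exist in the labels, including biased ones, are
    what the model learns to reproduce.
    """
    by_feature = {}
    for feature, decision in records:
        by_feature.setdefault(feature, Counter())[decision] += 1
    return {f: counts.most_common(1)[0][0] for f, counts in by_feature.items()}

# Invented history in which refusals clustered in region "B" regardless
# of file quality. The region code acts as a proxy feature.
history = (
    [("A", "approve")] * 8 + [("A", "refuse")] * 2
    + [("B", "refuse")] * 7 + [("B", "approve")] * 3
)

model = train(history)
# The model now defaults to refusing region "B" applicants: the historical
# bias has become an automated rule.
```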

Meurrens, who also appeared at committee on March 22, said a lack of transparency from the government impedes the public’s ability to assess whether it is indeed making progress on the issue of addressing systemic racism or not.

He said he’d like to see the department publish Access to Information results pertaining to internal manuals, visa office specific training guides, and other similar documents as downloadable PDFs on its website, pointing out this is how the provincial government of B.C. releases its ATIP responses. He also said he thinks IRCC should publish “detailed explanations and reports of how its artificial intelligence triaging and new processing tools work in practice.”

“Almost everything public today [about the AI programs] has been obtained through access to information results that are heavily redacted and which I don’t believe present the whole picture,” he said.

Whether the concerns were actually reflected in the AI itself, Meurrens said, could not be known without more transparency from the department.

“In the absence of increased transparency, concerns like this are only growing,” he said.

Fraser: racism is a ‘sickness’

On Thursday, Fraser told the committee that he agrees that racism is a problem within the department, calling it a “sickness in our society.”

“There are examples of racism not just in one department but across different levels of government. It’s a sickness in our society that limits the productivity of human beings who want to fully participate in our communities. IRCC is not immune from that social phenomenon that hampers our success as a nation, and we have to do everything we can to eradicate racism, not just from our department,” he said.

Fraser said there is “zero tolerance for racism, discrimination, or harassment of any kind,” but acknowledged those problems do exist within the department.

The minister pointed towards the anti-racism task force which was created in 2020 and “guides the department’s strategy to eliminate racism and applies an anti-racism lens” to the department’s work. He also said IRCC has been “actively reviewing its human resource systems so that Indigenous, Black, racialized peoples and persons with disabilities are better represented across IRCC at every level.”

Fraser also referenced a three-year anti-racism strategy for the department, which includes plans to implement mandatory bias training, anti-racist work and training objectives, and trauma coaching sessions for Black employees and managers to recognize the impacts of racism on mental health, among other things.

“It’s not lost on me that there have been certain very serious issues that have pertained to IRCC,” he said.

These measures are different from the ones witnesses and opposition MPs are calling for, however.

NDP MP Jenny Kwan (Vancouver East, B.C.) said her top priority on this topic is to convince the government to put an independent ombudsperson in place whose job it would be to assess IRCC policies and the application of said policies as they relate to differential treatment, systemic racism, and gender biases.

“Let’s dig deep. Have an officer of the House do this work completely independent from the government,” she said in an interview with The Hill Times.

At the March 22 meeting, Kwan asked all six witnesses to state for the record if they agreed that the government should put such an ombudsperson in place. All six witnesses agreed.

Kwan questioned the ability of the department to conduct its own internal reviews.

“As the minister said [at committee], he’s undertaking a variety of measures to address these issues and to see how they can rectify it. … But how deeply is it embedded? And if it’s done internally, then how independent is it?” she wondered.

Fraser said the implementation of an ombudsperson was something he would consider after reading the committee’s report.

Conservative MP Jasraj Singh Hallan (Calgary Forest Lawn, Alta.), his party’s immigration critic and the vice-chair of the committee, agreed with Meurrens’ calls for increased transparency. “We need more evidence that the government is serious about this,” he said in an interview.

Hallan also said he wants to see consequences for those within the department who participated in the racism documented by the Pollara report.

“[Fraser] should start by approaching those employees of IRCC that made these complaints from that Pollara report and find out who is making these remarks. Reprimand them, fire them if they need to be,” he said.

Source: ‘Racism plays a role in immigration decisions,’ House Immigration Committee hears

Canada is refusing more study permits. Is new AI technology to blame?

Given the high volumes (from which immigration lawyers and consultants benefit), expanded use of technology and templates is inevitable and necessary, although thorough review and safeguards are also necessary.

An alternate narrative: given reporting on the abuse and exploitation of international students and on the program itself (The reality of life in Canada for international students), perhaps a system generating more refusals has merit:

Soheil Moghadam applied twice for a study permit for a postgraduate program in Canada, only to be refused with an explanation that read like a templated answer.

The immigration officer was “not satisfied that you will leave Canada at the end of your stay,” he was told.

After a third failed attempt, Moghadam, who already has a master’s degree in electronics engineering from Iran, challenged the refusal in court and the case was settled. He’s now studying energy management at the New York Institute of Technology in Vancouver.

His Canadian lawyer, Zeynab Ziaie, said that in the past couple of years, she has noticed a growing number of study permit refusals like Moghadam’s. The internal notes made by officers reveal only generic analyses based on cookie-cutter language and often have nothing to do with the particular evidence presented by the applicant.

“We’re seeing a lot of people that previously would have been accepted or have really what we consider as complete files with lots of evidence of financial support, lots of ties to their home country. These kinds of files are just being refused,” said Ziaie, who added that she has seen more than 100 of these refusals in her practice in the past two years.

Lawyers like Ziaie suspect the culprit is a Microsoft Excel-based system called Chinook.

Its existence came to light during a court case involving Abigail Ocran, a woman from Ghana who was refused a study permit by the Immigration Department.

Government lawyers in that case filed an affidavit by Andie Daponte, director of international-network optimization and modernization, who detailed the workings and application of Chinook.

That affidavit has created a buzz among those practising immigration law, who see the new system — the department’s transition to artificial intelligence — as a potential threat to quality decision making, and its arrival as the harbinger of more troubling AI technology that could transform how immigration decisions are made in this country.

All eyes are now on the pending decision of the Ocran case to see if and how the court will weigh in on the use of Chinook. 

Chinook was implemented in March 2018 to help the Immigration Department handle an exponential growth in cases within its existing, and antiquated, Global Case Management System (GCMS).

Between 2011 and 2019, before everything slowed down during the pandemic, the number of visitor visa applications skyrocketed by 109 per cent, with the caseload of applications for overseas work permits and study permits up by 147 per cent and 222 per cent, respectively.

In 2019 alone, Daponte said in his affidavit, Canada received almost 2.2 million applications from prospective visitors, in addition to 366,000 from people looking to work here and 431,500 from would-be international students.

Meanwhile, the department’s 17-year-old GCMS system, which requires officers to open multiple screens to download different information pertaining to an application, has not caught up. Each time decision-makers move from screen to screen they must wait for the system to load, causing significant delays in processing, especially in countries with limited network bandwidth.

Chinook was developed in-house and implemented “to enhance efficiency and consistency, and to reduce processing times,” Daponte said.

As a result, he said, migration offices have generally seen an increase of between five per cent and 35 per cent in the number of applications they have been able to process.

Here’s how Chinook works: an applicant’s information is extracted from the old system and populated in a spreadsheet, with each cell on the same row filled with data from that one applicant — such as name, age, purpose of visit, date of receipt of the application and previous travel history.

Each spreadsheet contains content from multiple applicants and is assigned to an officer to enable them to use “batch processes.”

After the assessment of an application is done, the officer will click on the decision column to prompt a pop-up window to record the decision, along with a notes generator if they’re giving reasons in the case of a refusal.

(An officer can refuse or approve an application, and sometimes hold it for further information.)

When done, decision-makers click a button labelled “Action List,” which organizes data for ease of transfer into the old system. It presents the decision, reasons for refusal if applicable, and any “risk indicators” or “local word flags” for each application.

The spreadsheets are deleted daily after the data transfer for privacy concerns.
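Based solely on the workflow Daponte's affidavit describes, the spreadsheet-row, notes-generator, and "Action List" pattern might be sketched roughly as follows. Every field name, template string, and function below is hypothetical; Chinook's actual code is not public, so this is only a reading of the publicly described workflow, not the tool itself.

```python
from dataclasses import dataclass, field

@dataclass
class ApplicantRow:
    """One spreadsheet row: all of a single applicant's data on one line."""
    name: str
    age: int
    purpose: str
    travel_history: list = field(default_factory=list)
    flags: list = field(default_factory=list)        # "risk indicators" / "local word flags"
    decision: str = ""
    refusal_reasons: list = field(default_factory=list)

# A "notes generator" offers standard refusal language that officers may
# select and modify. These template strings are invented examples.
REFUSAL_TEMPLATES = {
    "ties": "Not satisfied that you will leave Canada at the end of your stay.",
    "funds": "Not satisfied that you have sufficient funds for your stay.",
}

def record_decision(row, decision, reason_keys=()):
    """Record a decision on a row, pulling templated text for any refusal."""
    row.decision = decision
    if decision == "refused":
        row.refusal_reasons = [REFUSAL_TEMPLATES[k] for k in reason_keys]

def action_list(batch):
    """Organize the batch's decisions, reasons, and flags for transfer back
    into GCMS; after this step the spreadsheet itself would be deleted."""
    return [(r.name, r.decision, r.refusal_reasons, r.flags) for r in batch]
```

A batch of many such rows, processed with a handful of templated refusal strings, is what critics say makes the resulting reasons read as cookie-cutter language detached from each applicant's evidence.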

While working on the spreadsheet, said Daponte, decision-makers continue to have access to paper applications or electronic documents and GCMS if needed.

“Chinook was built to save decision-makers time in querying GCMS for application information and to allow for the review of multiple applications,” Daponte noted.

However, critics are concerned that the way the system is set up may be guiding the officers toward certain conclusions, giving them the option of not reviewing all the material presented in each case, and that it effectively shields much of the decision making from real scrutiny.

According to Daponte’s court affidavit, the notes generator presents standard language that immigration officers may select, review and modify to fit the circumstances of an application in preparing reasons for refusal. The function is there to “assist them in the creation of reasons.”

Ziaie believes that explains the templated reasons for refusals she’s been seeing.

“These officers are looking at a spreadsheet of potentially 100 different applicants. And those names don’t mean anything to the officers. You could mix up rows. You could easily make errors,” said the Toronto lawyer.

“There’s no way to go back and check that because these decisions end up with very similar notes that are generated right when they’re refused. So my concern is about accountability. Every time we have a decision, it has to make sense. We don’t know if they make mistakes.”

That’s why she and other lawyers worry the surge of study permit refusals is linked to the implementation of Chinook. 

In fact, that question was put to Daponte during the cross-examination in the Ocran case by the Ghanaian student’s lawyer, Edos Omorotionmwan.

Immigration data obtained by Omorotionmwan showed the refusal rate of study permit applications had gone from 31 per cent in 2016 to 34 per cent in 2018, the year Chinook was launched. The trend continued in 2019 to 40 per cent and reached 53 per cent last year.

“Is there a system within the Chinook software requiring some oversight function where there is some other person to review what a visa officer has come up with before that decision is handed over to the applicants?” asked Omorotionmwan.

“Within Chinook, no,” replied Daponte, who also said there’s no mechanism within this platform to track if an officer has reviewed all the support documents and information pertaining to an applicant’s file in the GCMS data.

“This idea of using portals and technology to speed up the way things are done is the reality of the future,” said Vancouver-based immigration lawyer Will Tao, who has tracked the uses of Chinook and blogged about it.

“My concern as an advocate is: who did this reality negatively impact and what systems does it continue to uphold?”

Tao said the way the row of personal information is selected and set out in the Chinook spreadsheet “disincentivizes” officers to go into the actual application materials and support documents out of convenience.

“And then the officers are supposed to use those notes generators to justify their reasoning and not go into some of the details that you would like to see to reflect that they actually reviewed the facts of the case. The biggest problem I have is that this system has had very limited oversight,” he said.

“It makes it easier to refuse because you don’t have to look at all the facts. You don’t have to go through a deep, thoughtful analysis. You have a refusal notes generator that you can apply without having read the detailed study plans and financial documents.”

He points to Chinook’s built-in function that flags “risk factors” — such as an applicant’s occupation and intended employer’s information — for inconsistency in an application, as well as “local flag words” to triage and ensure priority processing of time-sensitive applications to attend a wedding or a funeral.

Those very same flag words used in the spreadsheet can also be misused to mark a particular group of applicants based on their personal profiles and pick them out for refusals, said Tao.

In 2019, in a case involving the revocation of the citizenship of the Canadian-born sons of two Russian spies, the Supreme Court of Canada made a landmark ruling that helps guide judges reviewing the decisions of immigration officials.

In the unanimous judgment, Canada’s highest court ruled it would be “unacceptable for an administrative decision maker to provide an affected party formal reasons that fail to justify its decision, but nevertheless expect that its decision would be upheld on the basis of internal records that were not available to that party.”

Tao said he’s closely watching how the Ocran decision is going to shed light on the application of Chinook in the wake of that Supreme Court of Canada ruling over the reasonableness standard.

“Obviously, a lot of these applications have critical points that they get refused on and with the reasons being template and standard, it’s hard for reviewers to understand how that came to be,” he said.

In a response to the Star’s inquiry about the concerns raised about Chinook, the Immigration Department said the tool is simply to streamline the administrative steps that would otherwise be required in the processing of applications to improve efficiency.

“Decision makers are required to review all applications and render their decisions based on the information presented before them,” said spokesperson Nancy Caron.

"Chinook does not fundamentally change the way applications are processed, and it is always the officer that gives the [rationale] for the decisions and not the Chinook tool."

For immigration lawyer Mario Bellissimo, Chinook is another step in the Immigration Department’s move toward digitalization and modernization.

Ottawa has been using machine learning technology since 2018 to triage temporary resident visa applications from China and India, using a “set of rules derived from thousands of past officer decisions” then deployed by the technology to classify applications into high, medium and low complexity.

Cases identified as low complexity and low risk automatically receive positive eligibility decisions, allowing officers to review these files exclusively on the basis of admissibility. This enables officers to spend more time scrutinizing the more complex files.
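The triage described above, rules derived from past decisions sorting files into complexity tiers, can be sketched in miniature. The rule names, weights, and thresholds below are invented for illustration; only the tier structure (low-complexity files auto-approved on eligibility, leaving officers to review admissibility) comes from the public description.

```python
def triage(application, rules):
    """Apply derived rules to an application dict and return a
    complexity tier: 'low', 'medium', or 'high'."""
    score = sum(weight for predicate, weight in rules if predicate(application))
    if score <= 1:
        # Low complexity/low risk: automatic positive eligibility decision;
        # an officer reviews only admissibility.
        return "low"
    if score <= 3:
        return "medium"
    return "high"

# Example rules (invented): prior refusals and undocumented funds
# raise a file's complexity score more than first-time status does.
rules = [
    (lambda a: a.get("prior_refusals", 0) > 0, 2),
    (lambda a: not a.get("funds_documented", True), 2),
    (lambda a: a.get("first_time_applicant", False), 1),
]
```

Bellissimo's worry about hypothetical "age discriminators" fits this shape exactly: adding one more rule keyed on age would silently stream a whole group into a different tier, which is why critics want the actual rules disclosed.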

Chinook, said Bellissimo, has gone beyond the triage. He contends it facilitates the decision-making process by officers.

The use of templated responses from the notes generator makes the refusal reasons “devoid of meaning,” he noted.

“Eventually, do you see age discriminators put into place for study permits when anyone over the age of 30 is all automatically streamed to a different tier because they are less likely bona fide students? This is the type of stuff we need to know,” Bellissimo explained.

“When they’re just pulling standard refusal reasons and just slapping it in, then those decisions become more difficult to understand and more difficult to challenge. Who made the decision? Was technology used? And that becomes a problem.”

He said immigration officials need to be accountable and transparent to applicants about the use of these technologies before they are rolled out, not after they become an issue.

Petra Molnar, a Canadian expert specializing in migration and technology, said automated decision-making and artificial intelligence tools are difficult to scrutinize because they are often very opaque, including how they are developed and deployed and what review mechanisms, if any, exist once they are in use.

“Decisions in the immigration and refugee context have lifelong and life-altering ramifications. People have the right to know what types of tools are being used against them and how they work, so that we can meaningfully challenge these types of systems.”

Ziaie, the lawyer, said she understands the tremendous pressure on front-line immigration officers, but if charging a higher application fee — a study permit application now costs $150 — can help improve the service and quality of decisions, then that should be implemented.

“They should allocate a fair amount of that revenue toward trying to hire more people, train their officers better and give them more time to review the files so they actually do get a better success rate,” she said. “By that, I mean fewer files going to Federal Court.”

As a study permit applicant, Moghadam said it’s frustrating not to understand how an immigration officer reaches a refusal decision because so much is at stake for the applicant.

It took him two extra years to finally obtain his study permit and pursue an education in Canada, not to mention the additional application fees and hefty legal costs.

“Your life is put on hold and your future is uncertain,” said the 39-year-old, who had a decade of work experience in engineering for both Iranian and international companies.

“There’s the time, the costs, the stress and the anxiety.”

Source: https://www.thestar.com/news/canada/2021/11/15/canada-is-refusing-more-study-permits-is-new-ai-technology-to-blame.html