Canada is refusing more study permits. Is new AI technology to blame?

Given the high volumes (which immigration lawyers and consultants benefit from), expanded use of technology and templates is inevitable and necessary, although thorough review and safeguards are essential.

An alternate narrative: given reporting on the abuse and exploitation of international students and on the program itself (The reality of life in Canada for international students), perhaps a system generating more refusals has merit:

Soheil Moghadam applied twice for a study permit for a postgraduate program in Canada, only to be refused with an explanation that read like a templated answer.

The immigration officer was “not satisfied that you will leave Canada at the end of your stay,” he was told.

After a third failed attempt, Moghadam, who already has a master’s degree in electronics engineering from Iran, challenged the refusal in court and the case was settled. He’s now studying energy management at the New York Institute of Technology in Vancouver.

His Canadian lawyer, Zeynab Ziaie, said that in the past couple of years, she has noticed a growing number of study permit refusals like Moghadam’s. The internal notes made by officers reveal only generic analyses based on cookie-cutter language and often have nothing to do with the particular evidence presented by the applicant.

“We’re seeing a lot of people that previously would have been accepted or have really what we consider as complete files with lots of evidence of financial support, lots of ties to their home country. These kinds of files are just being refused,” said Ziaie, who added that she has seen more than 100 of these refusals in her practice in the past two years.

The suspected culprit: a Microsoft Excel-based system called Chinook.

Its existence came to light during a court case involving Abigail Ocran, a woman from Ghana who was refused a study permit by the Immigration Department.

Government lawyers in that case filed an affidavit by Andie Daponte, director of international-network optimization and modernization, who detailed the workings and application of Chinook.

That affidavit has created a buzz among those practising immigration law, who see the new system — the department’s transition to artificial intelligence — as a potential threat to quality decision making, and its arrival as the harbinger of more troubling AI technology that could transform how immigration decisions are made in this country.

All eyes are now on the pending decision of the Ocran case to see if and how the court will weigh in on the use of Chinook. 


Chinook was implemented in March 2018 to help the Immigration Department handle an exponential growth in cases within its existing, and antiquated, Global Case Management System (GCMS).

Between 2011 and 2019, before everything slowed down during the pandemic, the number of visitor visa applications skyrocketed by 109 per cent, with the caseload of applications for overseas work permits and study permits up by 147 per cent and 222 per cent, respectively.

In 2019 alone, Daponte said in his affidavit, Canada received almost 2.2 million applications from prospective visitors, in addition to 366,000 from people looking to work here and 431,500 from would-be international students.

Meanwhile, the department’s 17-year-old GCMS system, which requires officers to open multiple screens to download different information pertaining to an application, has not caught up. Each time decision-makers move from screen to screen they must wait for the system to load, causing significant delays in processing, especially in countries with limited network bandwidth.

Chinook was developed in-house and implemented “to enhance efficiency and consistency, and to reduce processing times,” Daponte said.

As a result, he said, migration offices have generally seen an increase of between five per cent and 35 per cent in the number of applications they have been able to process.

Here’s how Chinook works: an applicant’s information is extracted from the old system and populated in a spreadsheet, with each cell on the same row filled with data from that one applicant — such as name, age, purpose of visit, date of receipt of the application and previous travel history.

Each spreadsheet contains content from multiple applicants and is assigned to an officer to enable them to use “batch processes.”

After the assessment of an application is done, the officer clicks on the decision column to prompt a pop-up window to record the decision, along with a notes generator for recording reasons in the case of a refusal.

(An officer can refuse or approve an application, and sometimes hold it for further information.)

When done, decision-makers click a button labelled “Action List,” which organizes data for ease of transfer into the old system. It presents the decision, reasons for refusal if applicable, and any “risk indicators” or “local word flags” for each application.

The spreadsheets are deleted daily after the data transfer, to address privacy concerns.

While working on the spreadsheet, said Daponte, decision-makers continue to have access to paper applications or electronic documents and GCMS if needed.

“Chinook was built to save decision-makers time in querying GCMS for application information and to allow for the review of multiple applications,” Daponte noted.
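As a rough illustration only, the batch workflow Daponte describes — one spreadsheet row per applicant, a decision column, a notes generator assembling refusal reasons from standard language, and an "Action List" organizing results for transfer back into GCMS — might be sketched as follows. All names, fields and template text here are invented for illustration; the actual tool is an internal Excel-based system, not public code.

```python
# Hypothetical sketch of a Chinook-style batch workflow.
# Fields, function names and template wording are invented for illustration.

REFUSAL_TEMPLATES = {
    "ties": "I am not satisfied that you will leave Canada at the end of your stay.",
    "funds": "I am not satisfied that you have sufficient funds for your studies.",
}

def build_row(applicant):
    """Extract one applicant's data into a single spreadsheet row."""
    return {
        "name": applicant["name"],
        "age": applicant["age"],
        "purpose": applicant["purpose"],
        "received": applicant["received"],
        "travel_history": applicant.get("travel_history", []),
        "decision": None,
        "notes": "",
    }

def record_decision(row, decision, template_keys=()):
    """Record a decision; on refusal, assemble notes from standard templates."""
    row["decision"] = decision
    if decision == "refused":
        row["notes"] = " ".join(REFUSAL_TEMPLATES[k] for k in template_keys)
    return row

def action_list(rows):
    """Organize completed rows for ease of transfer into the old system."""
    return [
        (r["name"], r["decision"], r["notes"])
        for r in rows
        if r["decision"] is not None
    ]

applicants = [
    {"name": "A. Applicant", "age": 29, "purpose": "study", "received": "2021-06-01"},
]
rows = [build_row(a) for a in applicants]
record_decision(rows[0], "refused", template_keys=("ties",))
print(action_list(rows))
```

The critics' point maps directly onto this structure: nothing in such a workflow forces the decision-maker to open the underlying application before a templated refusal note is generated.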

However, critics are concerned that the way the system is set up may be guiding the officers toward certain conclusions, giving them the option of not reviewing all the material presented in each case, and that it effectively shields much of the decision making from real scrutiny.

According to Daponte’s court affidavit, the notes generator presents standard language that immigration officers may select, review and modify to fit the circumstances of an application in preparing reasons for refusal. The function is there to “assist them in the creation of reasons.”

Ziaie believes that explains the templated reasons for refusals she’s been seeing.

“These officers are looking at a spreadsheet of potentially 100 different applicants. And those names don’t mean anything to the officers. You could mix up rows. You could easily make errors,” said the Toronto lawyer.

“There’s no way to go back and check that because these decisions end up with very similar notes that are generated right when they’re refused. So my concern is about accountability. Every time we have a decision, it has to make sense. We don’t know if they make mistakes.”

That’s why she and other lawyers worry the surge of study permit refusals is linked to the implementation of Chinook. 

In fact, that question was put to Daponte during the cross-examination in the Ocran case by the Ghanaian student’s lawyer, Edos Omorotionmwan.

Immigration data obtained by Omorotionmwan showed the refusal rate of study permit applications had gone from 31 per cent in 2016 to 34 per cent in 2018, the year Chinook was launched. The trend continued in 2019 to 40 per cent and reached 53 per cent last year.

“Is there a system within the Chinook software requiring some oversight function where there is some other person to review what a visa officer has come up with before that decision is handed over to the applicants?” asked Omorotionmwan.

“Within Chinook, no,” replied Daponte, who also said there’s no mechanism within this platform to track if an officer has reviewed all the support documents and information pertaining to an applicant’s file in the GCMS data.


“This idea of using portals and technology to speed up the way things are done is the reality of the future,” said Vancouver-based immigration lawyer Will Tao, who has tracked the uses of Chinook and blogged about it.

“My concern as an advocate is: who did this reality negatively impact and what systems does it continue to uphold?”

Tao said the way the row of personal information is selected and set out in the Chinook spreadsheet “disincentivizes” officers to go into the actual application materials and support documents out of convenience.

“And then the officers are supposed to use those notes generators to justify their reasoning and not go into some of the details that you would like to see to reflect that they actually reviewed the facts of the case. The biggest problem I have is that this system has had very limited oversight,” he said.

“It makes it easier to refuse because you don’t have to look at all the facts. You don’t have to go through a deep, thoughtful analysis. You have a refusal notes generator that you can apply without having read the detailed study plans and financial documents.”

He points to Chinook’s built-in function that flags “risk factors” — such as an applicant’s occupation and intended employer’s information — for inconsistency in an application, as well as “local flag words” to triage and ensure priority processing of time-sensitive applications to attend a wedding or a funeral.

Those very same flag words used in the spreadsheet can also be misused to mark a particular group of applicants based on their personal profiles and pick them out for refusals, said Tao.

In 2019, in a case involving the revocation of the citizenship of the Canadian-born sons of two Russian spies, the Supreme Court of Canada made a landmark ruling that helps guide judges in reviewing the decisions of immigration officials.

In the unanimous judgment, Canada’s highest court ruled it would be “unacceptable for an administrative decision maker to provide an affected party formal reasons that fail to justify its decision, but nevertheless expect that its decision would be upheld on the basis of internal records that were not available to that party.”

Tao said he’s closely watching how the Ocran decision is going to shed light on the application of Chinook in the wake of that Supreme Court of Canada ruling over the reasonableness standard.

“Obviously, a lot of these applications have critical points that they get refused on and with the reasons being template and standard, it’s hard for reviewers to understand how that came to be,” he said.

Responding to the Star’s inquiry about the concerns raised over Chinook, the Immigration Department said the tool simply streamlines the administrative steps that would otherwise be required in processing applications, in order to improve efficiency.

“Decision makers are required to review all applications and render their decisions based on the information presented before them,” said spokesperson Nancy Caron.

“Chinook does not fundamentally change the way applications are processed, and it is always the officer that gives the rationale for the decisions and not the Chinook tool.”

For immigration lawyer Mario Bellissimo, Chinook is another step in the Immigration Department’s move toward digitalization and modernization.

Ottawa has been using machine learning technology since 2018 to triage temporary resident visa applications from China and India, using a “set of rules derived from thousands of past officer decisions” then deployed by the technology to classify applications into high, medium and low complexity.

Cases identified as low complexity and low risk automatically receive positive eligibility decisions, allowing officers to review these files exclusively on the basis of admissibility. This enables officers to spend more time scrutinizing the more complex files.
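The triage process described above — rules derived from past officer decisions sorting files into complexity tiers, with low-complexity, low-risk files receiving automatic positive eligibility decisions — could be sketched, purely hypothetically, like this. The rules, thresholds and field names are invented for illustration; the department's actual model and its criteria are not public.

```python
# Hypothetical rule-based triage sketch; the real model's rules and
# thresholds are not public, and these are invented for illustration.

def triage(application):
    """Classify an application into low / medium / high complexity."""
    score = 0
    if application.get("prior_refusals", 0) > 0:
        score += 2
    if not application.get("travel_history"):
        score += 1
    if application.get("funds", 0) < 10_000:
        score += 2
    if score <= 1:
        return "low"
    if score <= 3:
        return "medium"
    return "high"

def route(application):
    """Low-complexity files get an automatic positive eligibility decision,
    so the officer reviews admissibility only; all others get a full review."""
    tier = triage(application)
    if tier == "low":
        return {"eligibility": "approved", "review": "admissibility only"}
    return {"eligibility": "pending", "review": "full officer review"}

print(route({"travel_history": ["UK"], "funds": 25_000}))
```

Bellissimo's worry about "age discriminators" is, in effect, a worry about what rules end up inside a function like `triage` — and about whether anyone outside the department can see them.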

Chinook, said Bellissimo, has gone beyond the triage. He contends it facilitates the decision-making process by officers.

The use of templated responses from the notes generator makes the refusal reasons “devoid of meaning,” he noted.

“Eventually, do you see age discriminators put into place for study permits, where anyone over the age of 30 is automatically streamed to a different tier because they are less likely to be bona fide students? This is the type of stuff we need to know,” Bellissimo explained.

“When they’re just pulling standard refusal reasons and just slapping it in, then those decisions become more difficult to understand and more difficult to challenge. Who made the decision? Was technology used? And that becomes a problem.”

He said immigration officials need to be accountable and transparent to applicants about the use of these technologies before they are rolled out, not after they become an issue.

Petra Molnar, a Canadian expert specializing in migration and technology, said automated decision-making and artificial intelligence tools are difficult to scrutinize because they are often very opaque, including how they are developed and deployed and what review mechanisms, if any, exist once they are in use.

“Decisions in the immigration and refugee context have lifelong and life-altering ramifications. People have the right to know what types of tools are being used against them and how they work, so that we can meaningfully challenge these types of systems.”

Ziaie, the lawyer, said she understands the tremendous pressure on front-line immigration officers, but if charging a higher application fee — a study permit application now costs $150 — can help improve the service and quality of decisions, then that should be implemented.

“They should allocate a fair amount of that revenue toward trying to hire more people, train their officers better and give them more time to review the files so they actually do get a better success rate,” she said. “By that, I mean fewer files going to Federal Court.”

As a study permit applicant, Moghadam said it’s frustrating not to understand how an immigration officer reaches a refusal decision because so much is at stake for the applicant.

It took him two extra years to finally obtain his study permit and pursue an education in Canada, to say nothing of the additional application fees and hefty legal costs.

“Your life is put on hold and your future is uncertain,” said the 39-year-old, who had a decade of work experience in engineering for both Iranian and international companies.

“There’s the time, the costs, the stress and the anxiety.”

Source: https://www.thestar.com/news/canada/2021/11/15/canada-is-refusing-more-study-permits-is-new-ai-technology-to-blame.html

About Andrew
Andrew blogs and tweets public policy issues, particularly the relationship between the political and bureaucratic levels, citizenship and multiculturalism. His latest book, Policy Arrogance or Innocent Bias, recounts his experience as a senior public servant in this area.
