IRCC Settlement Services Statistics 2018-2022 to date

I recently received settlement service data from IRCC (open data tables date from 2019). Some highlights below.

Starting with the monthly data by service type, the effect of COVID-19 is clearly visible; levels have now largely caught up with the pre-pandemic period, albeit in a context of higher immigration levels.

The second chart compares July 2022 with July 2021 and July 2020, and full-year 2021 with full-year 2018, highlighting the increase on a monthly basis and the overall decrease compared to pre-pandemic levels.

Again, given that immigration has increased significantly since 2018, this understates the decline from 2018.

While the regional breakdown has generally been fairly stable, the recent increase in the share of European-origin users of settlement services reflects increased Ukrainian users, while the decline in the Asian share reflects fewer users from Syria, China, India and Afghanistan.

Of the top 10 countries, Afghan users have increased the most following the Taliban takeover and the resulting refugee flows. Ukrainian users, not shown, increased about tenfold following Russia's invasion and the resulting migration flows.

The last chart compares users by province, with Alberta showing the greatest monthly increases and Atlantic Canada the only region showing an increase compared to 2018.

Increase in Cuban Migration Has No Historical Precedent

Interesting and significant shift:

When it comes to immigration across the U.S.-Mexico border, media coverage tends to focus on the increasing numbers of migrants attempting to cross it. What’s missing from the conversation, however, is the changing demographics of these migrants.  

Historically, the majority of people who attempted to cross the southwestern border between border crossing stations — officially called ports of entry, or “POEs” — were Mexican nationals. This began to change in recent years, when U.S. Customs and Border Protection (CBP) began encountering large numbers of Central American migrants also attempting the crossing. Honduras, El Salvador, and Guatemala are known as the Northern Triangle, and migrants from these countries often attempted the crossing in family units.

Beginning in 2020, however, CBP began to encounter increasing numbers of migrants from outside Mexico and the Northern Triangle. According to CBP data, the number of migrants from countries other than these four has increased 11,000% since 2007, with the sharpest increase occurring in the past two years. Migrants from countries beyond Mexico and Central America’s Northern Triangle accounted for 9% of Border Patrol apprehensions in fiscal year 2019, but that share climbed to 22% in 2021 and 40% in 2022. In fact, CBP encounters with migrants from these “other” countries are on track to outpace encounters with migrants from Mexico and the Northern Triangle.

Migrants from these “other” countries come from a handful of nations, including Cuba, Colombia, Nicaragua and Venezuela. Each of these countries has seen dramatic increases in encounters at the southwest border over the past two years. The rapid increase in Cuban migrants is particularly notable. 

Cubans who remain on the island face widespread poverty, inflation, power blackouts, basic supply shortages, and intense government repression following massive anti-government protests in 2021. These conditions are driving a historic increase in Cuban migration, surpassing the 1980 Mariel boatlift. CBP has reported nearly 176,000 encounters with Cuban migrants at the southwest land border since October. 

Hundreds of unaccompanied Cuban children have arrived at the U.S.-Mexico border in the past year, as more parents appear to be sending their children to safety amid deteriorating conditions in Cuba.  

Since October 2021, CBP has reported 662 encounters with unaccompanied Cuban children at the southern border, compared with 32 encounters in FY 2021 and 57 in FY 2020, an increase of 1,969%.
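As an arithmetic check, the reported increase follows directly from the counts above:

$$\frac{662 - 32}{32} \times 100\% \approx 1{,}969\%$$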

In the midst of these increased numbers, USCIS has restarted the Cuban Family Reunification Parole program “to provide a safe, orderly pathway to the United States for certain Cuban beneficiaries of approved family-based immigrant petitions.” 

Source: https://www.boundless.com/blog/boundless-weekly-immigration-news/

Govender: B.C.’s new anti-racism legislation allows us to turn intersectional data into systemic change

From BC’s Human Rights Commissioner:

The “grandmother perspective” to data collection, which I first learned from Gwen Phillips of the Ktunaxa Nation, suggests that government should collect data as a grandmother would collect information about her family: to better care for them, rather than exercise control with a big-brother mentality. This perspective formed the basis for the recent recommendations on disaggregated data collection from British Columbia’s Office of the Human Rights Commissioner to the provincial government.

A grandmother collects her grandchildren’s stories like pencil marks on the wall, measuring their growth. Data can also tell a story – one that helps us to understand people’s needs at a community level. Policy makers, too, need good information to design good law, policy and services.

This week, the B.C. government introduced the Anti-Racism Data Act, new legislation to collect disaggregated demographic data. If passed, the new law would facilitate the collection of personal information for the purposes of identifying systemic racism and advancing racial equity.

Disaggregated demographic data are information based on different aspects of our identities: for example, information broken down by race, gender or educational status. While the Statistics Canada census already collects disaggregated data in relation to the general population, this new law will facilitate the collection of such data in relation to government policies, practices and services, such as health care. Comparing statistics based on these two datasets can reveal patterns and inequalities. Information about inequalities, in turn, can help us design better policies to tackle systemic discrimination. We can’t act on what we don’t know.

Importantly, data must be collected on more than just racism in order to be effective. If we can’t understand how gender, race, age and other factors work together or intersect to inform our experiences in the world – and more accurately, how sexism, racism, ageism, ableism and so on inform our experiences – then we won’t be able to create good public policy that meets people’s real needs.

Race-based data only tell part of the story. For example, we know that in Canada, racialized men earn 78 cents for every dollar earned by white men, according to a 2019 Canadian Centre for Policy Alternatives report. But that gap widens significantly for racialized women, Indigenous women, transgender women and women with disabilities. Indeed, racialized women earn only 59 cents for every dollar earned by white men.

Treating all racialized people as a homogenous group not only obscures the problem, but it also reinforces it by leading to solutions that are only tailored to the experiences of the dominant subgroup within that category.

We may identify racist stereotypes as being one of the barriers contributing to the wage gap. But we also need to understand that stereotypes of racialized women may be quite different than stereotypes of racialized men. Ignoring these gendered differences silences and omits the experiences of racialized women. We need to truly understand the scope and complexity of the wage gap in order to solve it; intersectional data collection and analysis is key to that end.
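To make the intersectional point concrete, here is a minimal sketch of how an analyst might disaggregate earnings data by race and gender together rather than by race alone. The records, column names and numbers below are invented for illustration (loosely echoing the 78-cent and 59-cent figures above) and are not drawn from the CCPA report.

```python
import pandas as pd

# Hypothetical individual-level earnings records, invented for illustration.
records = pd.DataFrame({
    "racialized": [False, False, True, True, True, True],
    "gender":     ["man", "woman", "man", "woman", "man", "woman"],
    "earnings":   [100_000, 82_000, 78_000, 59_000, 80_000, 60_000],
})

# Benchmark: average earnings of non-racialized men.
benchmark = records.loc[
    (~records["racialized"]) & (records["gender"] == "man"), "earnings"
].mean()

# Race-only disaggregation: a single gap figure for all racialized workers.
race_only_gap = records.loc[records["racialized"], "earnings"].mean() / benchmark

# Intersectional disaggregation: race and gender together.
intersectional_gap = (
    records.loc[records["racialized"]]
    .groupby("gender")["earnings"].mean() / benchmark
)

print(f"Racialized workers overall: {race_only_gap:.2f} per $1 earned by white men")
print(intersectional_gap.round(2))  # the gap is much wider for racialized women
```

Grouping on the race flag alone yields one averaged gap, while the race-by-gender grouping surfaces the much wider gap for racialized women that the paragraph above describes.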

However, there is a serious downside to collecting all this information. Despite its power to focus the gaze of policy makers on real world inequities, data also have the power to reinforce negative stereotypes, and some people have legitimate concerns about sharing this information.

In recent recommendations from my office, we called on the provincial government to put control over data in the hands of those from whom the data are being collected. For example, disaggregated demographic data about a First Nation should only be collected in service of that community and upon the consent of that Nation, and used at their direction. The new legislation creates advisory roles for those who are directly impacted to embed this democratic approach to data and to counter any harms it may cause.

For decades, racialized communities, scholars and activists have been calling for this kind of legislation. Over the last two years, the public calls for disaggregated-data collection have grown louder. Protests against police brutality, a growing awareness of the ways in which racism impacts health outcomes, including those of COVID-19, and a movement to push back on the mainstream emergence of white nationalism have brought systemic racism into the consciousness of the masses. While data may not be the most glamorous call to action, they may be one of the most fruitful.

The new legislation is an important marker of our growth toward a more equal society. However, data collection is just one pencil mark on the wall; the next milestone to measure will be whether we are able to use it to create real social change. Implementation requires using intersectional data and a meaningful community governance model to turn information into action.

Kasari Govender is British Columbia’s first independent Human Rights Commissioner. BC’s Office of the Human Rights Commissioner exists to address the root causes of inequality, discrimination and injustice in the province by shifting laws, policies, practices and cultures.

Source: B.C.’s new anti-racism legislation allows us to turn intersectional data into systemic change

Sarantakis: Taking data seriously: A call to public administrators

An important piece flagging the importance of data for governments and how they increasingly lag the private sector in collecting, analyzing and using data and AI to understand citizen needs.

However, it is striking that a senior official would make the case without acknowledging how hard this is for the public sector: each time the government does so, significant criticism follows, whether for IRCC’s use of the Chinook system, Statistics Canada’s use of anonymized credit card information to understand consumer spending, or PHAC’s collection of anonymized COVID-19 phone data.

Perhaps a second piece on this harder issue?

It is said that the first step in overcoming a problem is first admitting its existence. So, here goes: Contemporary public administration is data-challenged.

This would have been an implausible statement to utter, historically. After all, public administrators as individuals know how important data is to public policy formulation and program delivery. Public administration has proved its worth over time through record-keeping and through creating and using data — recording, ordering, sorting and tabulating counts of people, forests, geography, geology, tanks, guns and things like the production of butter.

Indeed, the two great and insatiable needs of the early state, as formulated by Yale scholar James C. Scott, were taxation and conscription. Without revenues and the capacity to pay to defend sovereignty, states are not durable. In turn, without public administrators recording, ordering, sorting and tabulating data, the state does not endure.

Historically, public administration has been on the cutting edge of data. Entities often went to various state organs and state registries for data. The public service apparatus of the state knew, even in a state formed explicitly to curb government involvement in the daily affairs of its citizens.

But something dramatic has happened. The administrative state – that part of government that continues regardless of whether elections yield majorities or minorities that are red, blue, orange, green, or purple – is no longer on the cutting edge of data. Yes, the state still knows, but often it only now knows after, while private sector entities know now. Even more powerfully, with predictive analytics, sophisticated private entities increasingly know before.

How can we understand this switch? How can we understand public administration losing its historical position of relative data supremacy? To do that, we need to detour from public administration for a moment and veer into the private-sector economy. What we find gives us important clues to our mystery vis-à-vis data and public administration.

The factors of production 

Since Adam Smith, we have understood three core factors of production: land, labour and capital. There are others that have competed to be added to this list. Channeling Peter Drucker, some have argued for “management” – those who direct resources. Others have argued for “entrepreneurs” – those who combine resources in new and innovative ways. But Smith’s formulation has proven remarkably durable for more than two centuries.

If Smith were to return and look at some of the most valuable and dynamic corporations of our era – the digital giants Google, Meta (formerly Facebook), Amazon, Apple, Spotify and others – he would likely be mystified. Yes, he would see some land. Yes, he would see some labour. But nowhere near enough to justify the heady heights – and incredible influence and power – of the digital giants. Finally, he would also see some capital. But remarkably, that capital would largely be a by-product of “production,” and not a driver of production.

Seeing the most valuable and powerful entities on earth during his era, Smith would have seen people – lots and lots of labour. He would have seen land. He would have seen capital in the form of constructed ships, and tools, and extracted then refined natural resources. He would have seen stuff – tangible things that he could touch.

But the contemporary Adam Smith would see negligible numbers of people and negligible amounts of land in today’s largest companies. Certainly nothing approaching their value, status or power. These companies, perhaps most surprisingly of all, “consume” relatively little capital.

So if you are generating enormous profits but not drawing heavily on the “factors of production” … something makes no sense. What is going on?

Brains? Computers? Digital? Algorithms? Cloud computing?

Yes, yes, yes, yes, yes, and lots more.

But fundamentally, what is going on now is the fourth factor of production.

Data.

Data as differentiator 

Data has now become the most valuable commodity on earth. Data stocks are more valuable than natural resources. Data is more valuable than manufacturing facilities; more valuable than land; more valuable than labour. Data – the new oil? Oil should be so lucky.

Why?

Data is now the differentiator. Data is now the value-add. As computers, software, micro-processing power, storage, cloud computing and algorithms all become (or all trend toward) commodity status, it is the quantity and quality of data that will transform the mediocre into the successful.

A commodity is an interchangeable and undistinguished part. Where I buy a barrel of oil or a bar of gold or a truckload of gravel or road salt is overwhelmingly just price-contingent. The lowest price wins. To avoid becoming a commodity in data – valued only for how cheaply you can deliver something – you need more and better data than the competition. Increasingly, if you are data-deficient, you will not be competitive or sustainable as an entity.

Put another way, Company A and Company B already compete based on the quantity and the quality of their data. This will also increasingly be true in the coming years for Country A and Country B. Countries have competed forever for oil and gas and timber and nickel. Now they are also adding “quantity and quality of data” to that list of competitions.

Spotify is a data company that deals in music. Netflix is a data company that deals in entertainment. Tesla is a data company on wheels. Google is a data company that deals in information. Amazon is a data company that provides many things – same with Instagram, same with Facebook.

Computing, computation, communication, software, digital distribution – all are, or are rapidly becoming, commodities. Algorithms still have differentiating value, but as advances in artificial intelligence continue, these too will invariably trend toward commodity status. What really adds value in production, increasingly, is the quality and quantity of data.

Data and public administration

What does all this have to do with public administration? At first glance, perhaps nothing. But on closer examination, a great deal.

The digital giants became digital giants because they understood – before others – the enormous value of enormous quantities of data. They understood – like the early state understood the power of knowing the quantity and location of trees and people and minerals – that data is power.

As Shoshana Zuboff expertly describes in The Age of Surveillance Capitalism, data becomes the nexus of power. But the power of data in the contemporary age isn’t about counting trees and people, it is rather about the “instrumentalization of behavior for the purposes of modification, prediction, monetization, and control.”

Contemporary public administration, which traces its very heritage back to data, is far less sophisticated in data today than the digital giants. Data is not utilized for public good applications anywhere near the degree to which data is utilized for commercial gain.

Over time, that will harm us all because the public-good realm will have less access to rich data than the private profit realm. Over time, that will make public administration a dinosaur. We need to better understand the power and application of data.

Public administration and real-time actionable data

States often revert to using blunt policy instruments because public administrations do not have the granularity of data – in real time – that is available to the digital giants. When you don’t have real-time actionable data, you estimate. You ask people to apply. You create programs with criteria instead of directly applying funding to public policy objectives.

That worked in a world where real-time actionable data either did not exist or was enormously expensive to actualize. But that is not today’s world. The percentage of the economy migrating online grows every day, and the online economy has grown much faster than the analog economy in recent years. But something else is happening, too. With the internet of things (IoT), our toasters and our refrigerators and our lightbulbs and our ventilation systems and our water treatment plants and our garage doors and our pacemakers are all migrating online. The enormous oceans of data we have today will, in a few short years, look like little trickles of water once the IoT takes hold in full flight.

Public administration is already behind. Imagine what happens when the volume of data being generated every moment of every day by billions of connected things across the globe increases at an even faster rate.

Does public administration understand the power of data? Do we understand how to use it to serve public policy goals? Do we understand how to regulate it for the public good? Do we have the systems in place to capture data? Do we have the systems in place to safeguard data? Do we have the systems in place to safeguard its use by non-state actors?

These are the many questions facing public administration today. The faster we get the answers, the better public administrators will be able to serve their political decision-makers and their state populations.

Time is not our friend on these questions.

Source: Taking data seriously: A call to public administrators

Canadian government braces for surge in passport renewals ahead of U.S. border reopening

Some interesting data. It is surprising that there is not a monthly report in IRCC’s “Operational Processing” open data sets, some eight years after the passport program was moved from Global Affairs to IRCC in 2013:

Source: Canadian government braces for surge in passport renewals ahead of U.S. border reopening

How CBC is diving deeper when it comes to newsroom diversity

While promotional, there is some interesting data on diversity within the CBC, both in the newsroom and in management, highlighting the relative under-representation of different visible minority and Indigenous groups. There is also some interesting analysis of the diversity of people being interviewed (but not of thought diversity, which is harder to measure and assess):

Soon after the news broke about the discovery of unmarked graves at the former Kamloops Indian Residential School, we convened a small group of our leaders and Indigenous journalists from across the country to act as an advisory committee for the CBC division of News, Current Affairs and Local.

We knew the story would only grow. There would be more discoveries in many different parts of Canada in the months ahead. We knew there was important accountability and investigative journalism to be done, building on years of excellent work tracking Truth and Reconciliation in Canada. (See Beyond 94, for example.)

We were also aware of the pain and trauma our journalism could create, not only for survivors and their families, but for our own staff with ties to this terrible legacy.

The committee was quick to identify areas in which we could support our staff. We rolled out a special edition of our “Reporting in Indigenous Communities” training course to about 30 leaders and assignment editors involved in deploying people to cover the story. We connected with the Dart Center for Journalism and Trauma at Columbia University to create a training program specific to the residential school story that will help our journalists understand trauma and how to approach people in affected communities, while also managing their own mental well-being.

And we created a dedicated residential school unit to ensure sustained, focused investigative journalism in the months ahead. The unit created an email tip line, wherearethey@cbc.ca, which received more than 200 messages in the first few weeks. It now has a toll-free number: 1-833-824-0800.

That early and proactive impulse to set up a committee and regularly consult with our Indigenous staff as this difficult story emerged resulted in greater sensitivity and understanding — and ultimately better, more nuanced journalism.

It’s a good example of what’s possible when a news organization like ours embraces the call for greater racial representation, equity and inclusion in everything it does, at every level. It’s a step forward on a long journey, with many more steps and undoubtedly years of hard work still to come.

We are 15 months into the cultural and social revolution sparked by the murder of George Floyd. As I’ve written before, this revolution swept news organizations the world over and resulted in some profound self-reflection about how we hire and promote, our core journalistic values and who defines them, and the stories, voices and perspectives we include — or exclude — as we cover the news.

To be clear, we started this important work long before May 2020 in many parts of our organization. We have always had a duty and responsibility to authentically portray this country and, as a result, at the root of nearly every inclusion challenge we face are four key questions: Who’s at the table? Who’s speaking? Who’s missing? Who’s deciding?

Here’s a brief update on some of the work happening at CBC News, Current Affairs and Local to keep us on the path forward:

Newsroom diversity survey

We are participants in the Canadian Newsroom Diversity Survey led by the Canadian Association of Journalists (CAJ). The results, expected this fall, will offer a comparative analysis of the gender and racial makeup of at least 170 news organizations in Canada.

CBC/Radio-Canada is an industry leader when it comes to tracking and reporting on equity and staffing, having done so since the 1980s. As a federally regulated Crown corporation, CBC reports annually on our overall staffing composition per the Employment Equity Act, but many of us want more detail.

Are we reflective of Canada’s demography in the voices you hear, see or read each day? What about behind the scenes? Does management look different from part-time staff? Can we get more detail about specific racial groups as opposed to broad Employment Equity Act definitions such as “visible minority” or terms like “people of colour”?

We saw a great opportunity to get some of these answers in the CAJ initiative.

The measurement is imperfect. For instance, our numbers — a now-outdated snapshot in time as of December 2020 — come from self-declarations on a “cultural census” that we ask staff to complete. Many employees are captured under the broad equity definitions, but they have not completed the cultural census declaration for various reasons, which means we are forced to report many “unknowns” when asked for specific information about ethnocultural identity. Our gender data is binary (CBC is in the process of changing that to include non-binary). Biracial and multiracial staff may self-identify with one or more of the available categories in the survey. How should they be more accurately represented?

Still, the data will offer a baseline and provide some clarity on where we need to focus our recruitment and promotion efforts as a news organization. Here are a few of the topline results for CBC’s journalism division, with more detail to come in the CAJ release this fall:

On gender, our newsrooms skew female at all levels: senior leadership is 54 per cent female and 46 per cent male; journalists are 56 per cent female and 44 per cent male; supervisors are 59 per cent female and 41 per cent male; part-time staff are 60 per cent female and 40 per cent male.

Of senior newsroom leaders in management positions, 22 per cent are people of colour or Indigenous. Here are a few graphs that show breakdowns in more detail:

[Charts: Journalists (full time); Journalists (part time); Supervisors; Senior leadership]

* Notes on Senior Leadership: As this is a relatively small group of leaders, we reconciled inconsistencies in the CBC cultural census data with what we know of our leadership. We tallied leaders identified under one of the five ethnic categories and grouped everyone else as uncategorized.

JSP and inclusion

We are also months into a review of how our Journalistic Standards and Practices (JSP) — the framework that guides our journalism — are interpreted through the lens of inclusion. A staff-led consultation led to 65 recommendations. We are moving immediately on 20 action items and continuing consultations on the rest. Among the biggest commitments included in that first set of 20:

  • We will create an advisory group of Black and Indigenous journalists and journalists of colour to support the JSP office.
  • We will create a separate staff advisory committee with representation from various communities to consult and support ongoing changes to our internal language and style guide.
  • We will reinforce that lived experience and being part of a community do not constitute a conflict of interest when covering that community. We will remind all that we value lived experience and community connections in our journalists because they help us to broaden and deepen our journalism.
  • We will continue to hire and promote representation at all levels of our organization, including leadership and decision-making roles. We will exceed 55 per cent representation for new hires from three equity-deserving groups (people of colour, Indigenous peoples and people with disabilities) in the year ahead.

Content tracking

In addition, more than 25 CBC journalistic programs have been involved in a staff-led content-tracking pilot project that tracks who appears on our airwaves and websites. Each team aims to identify at least three aspects: gender, race/ethnicity and whether or not the subject is speaking about their race or ethnicity. We are also tracking people who have publicly identified themselves as non-binary. Additional customized questions, such as the role of the guest on the program, can be added by the teams participating in this content-tracking project.

The results provide a baseline: a check on our assumptions and intentions around gender and racial equity. We learned, for example, that of nearly 5,000 guests counted across all the participating programs, 60 per cent were male. Hard numbers like that give our teams direction and ensure they course-correct. One consumer program saw that male experts appeared more often than female experts, for example, and the team made a concerted effort to bring more female guests onto the show.
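For concreteness, here is a minimal sketch of the kind of tally such a content-tracking pilot could produce. The field names and sample records are hypothetical, not CBC's actual tracking schema.

```python
from collections import Counter, defaultdict

# Hypothetical guest-appearance log, invented for illustration; each record
# captures the three aspects the pilot describes: gender, race/ethnicity,
# and whether the guest spoke about their race or ethnicity.
appearances = [
    {"gender": "male",   "race_ethnicity": "white",       "spoke_about_identity": False},
    {"gender": "female", "race_ethnicity": "Black",       "spoke_about_identity": True},
    {"gender": "female", "race_ethnicity": "Indigenous",  "spoke_about_identity": True},
    {"gender": "male",   "race_ethnicity": "South Asian", "spoke_about_identity": False},
]

# Baseline share by gender (the kind of figure cited above, e.g. 60% male).
gender_counts = Counter(a["gender"] for a in appearances)
total = sum(gender_counts.values())
for gender, count in gender_counts.items():
    print(f"{gender}: {count / total:.0%} of guests")

# Share of each group whose appearance touched on their race or ethnicity.
identity_talk = defaultdict(lambda: [0, 0])  # group -> [spoke_about, total]
for a in appearances:
    tally = identity_talk[a["race_ethnicity"]]
    tally[0] += a["spoke_about_identity"]
    tally[1] += 1
for group, (spoke, n) in identity_talk.items():
    print(f"{group}: {spoke}/{n} appearances discussed identity")
```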

We learned that 64 per cent of Indigenous guests and story subjects who appeared in our programs during the pilot spoke about their race and ethnicity, compared to 34 per cent of Black guests and story subjects. There is no right or wrong with these figures, considering how prominent the story of the Indigenous experience in Canada has been in recent months of news coverage. But the data forces us to self-reflect and discuss how we should incorporate the perspectives and experiences of these equity-deserving groups in all stories we are doing, beyond just issues related to aspects of their identities.

We aim to make this project a permanent, consistent practice across News, Current Affairs and Local. The staff leading this change have done extensive research and have years of experience in content tracking in Canada. They have already been asked to share their learnings with other newsrooms with similar efforts, including the BBC, NPR and many more.

What’s next?

We’ve come a long way. We have a long way to go.

The goal is clear: We will deepen our journalism and relevance to Canadians by broadening the perspectives at all levels of our organization and in the stories we tell.

Those four fundamental questions continue to guide us: Who’s at the table? Who’s speaking? Who’s missing? Who’s deciding?

Because as Canada’s public broadcaster, with one of the most trusted news services in the country, it is critical we are authentically and truly representing this country and all of its diversity.

Source: How CBC is diving deeper when it comes to newsroom diversity

Canada’s data gaps hampered pandemic response, hurting vaccination tracking: report

An area that governments need to address:

The pandemic has exposed significant problems with how Canada gathers and processes data on everything from case numbers to vaccinations, which has hurt the country’s response to COVID-19, a new report conducted for the federal government says.

Canada could not track the spread of the virus as effectively as it needed to last year, according to a report prepared by the Pan-Canadian Health Data Strategy Expert Advisory Group that will be made public Thursday. The country is now struggling to keep tabs on vaccine effectiveness because of flaws in the system, including how different jurisdictions record and share information.

These data gaps, created by a patchwork of health systems that don’t always work together and often code data in different ways, need to be addressed with a national approach, the report warns.

“There is no doubt that our response to the pandemic has been severely limited as a result,” says an advance copy of the report, which was reviewed by The Globe and Mail.

The report was ordered by Ottawa last year to examine data problems exposed by COVID-19. The group will put together a list of recommendations to the Public Health Agency of Canada and other departments on how to fix these weaknesses, said Vivek Goel, who chaired the review.

When the COVID-19 outbreak hit, problems in reporting new cases, symptoms and other crucial data became apparent in Canada’s patchwork system. Since provincial and territorial jurisdictions don’t necessarily use the same standards for collecting or codifying information, pooling crucial data on a national level became difficult.

“Early on it was challenging to get a full national picture, even of basic case counts,” Dr. Goel said, noting that crucial information such as the sites of the outbreaks, or the occupations of those who became ill, wasn’t always collected, codified, or shared between health jurisdictions. This prevented policy makers from knowing where and how hot spots were developing, and where the next crisis might be lurking.

“That [information] is something that is collected on the front lines of public health as people do their interviews, or it is collected at the time someone goes for testing. But if it’s not collected in a consistent way in every place and then coded and loaded into the system, we don’t wind up with a good picture,” Dr. Goel said.

“I would say if we had some of that information in a more timely manner, we might have had some decisions [by the government] being made sooner,” Dr. Goel said.

The country got better at processing information as the pandemic progressed, but “Canada had had some pretty significant challenges early on in even getting some of that basic data shared and uploaded,” he said.

These data gaps have become magnified as the country tries to mount a rapid immunization campaign across those same varied jurisdictions. Lacking the ability to quickly and effectively pool data from around the country, Canada is struggling to track, in real time, how effectively the vaccines are working in the broader population.

“Probably the most important question around vaccination in Canada is around the effectiveness of the vaccines in the real world with the dosing schedules and approaches that we’ve taken in Canada, because we’re the country that’s taken the longest dose interval,” Dr. Goel said.

“We’ve got reports that have started to come out, but they’re coming out at the provincial level,” he said. “We don’t have a national report, and every province’s systems are slightly different. So we wind up with slightly different estimates. They’re not going to be comparable.”

More detailed data on vaccine uptake is also difficult to compile, he said. “We need to have data coming together around how many people have been immunized by age group, occupation codes, all sorts of information. For example, people want to know how many teachers have had [the vaccine]. But we don’t have systems that really allow us to easily bring that kind of data together,” Dr. Goel said.

Questions specific to Canada, such as the effectiveness of mixing vaccines, are also hard to answer without properly collecting and analyzing data from across the country, he said. “We’ve got more of this mixing and matching coming up, so we need to be generating real-world evidence on how well it’s working,” Dr. Goel said.

The findings echo a report by the Auditor-General of Canada in March that said the government lacked proper data procedures to accurately track the spread of the virus. Dr. Goel said the issues are due to a number of causes, from lack of investment and concerns over privacy breaches to provinces simply wanting to oversee their own systems.

He also noted that various reports and governments have tried to address these issues in the past, but the problems were never fixed. After the 2003 SARS outbreak, Ottawa oversaw the creation of a database system known as Panorama, intended to improve infectious-disease surveillance and immunization tracking on a national level. However, the project struggled to gain support, ran into numerous roadblocks and was never effective.

“Despite all these good intentions, we don’t seem to make the progress we’d like to see,” said Dr. Goel, a professor at the University of Toronto’s Dalla Lana School of Public Health who is leaving to become president of the University of Waterloo next month.

The report calls for Ottawa to work with provinces and territories, as well as First Nations, Inuit and Métis organizations, to build a system where health data, including information on outbreaks and immunization, can be pooled effectively, and governments can act faster. Overcoming privacy concerns is a key challenge, and any such initiative must ensure that personalized information is protected, the report says.

“We need to tackle the root causes of the problems that have plagued our ability to make progress toward a common aim for all Canadians,” the report says. “Put simply, our systems, processes and policies are geared towards an analog world, while we live in a digital age.”

Dr. Goel said there are several examples of countries that collect, share and process data better than Canada, while still protecting privacy and respecting regional autonomy. Several Scandinavian countries have systems Canada should seek to emulate, he said, while the British, despite having data challenges of their own, have a more effective surveillance system implemented across England, Scotland, Wales and Northern Ireland.

“There are models for how we could approach that in Canada, but until we get to the point where we work together on these things, we wind up with these siloed sorts of approaches across the country,” Dr. Goel said.

“These issues have been underscored through Canada’s response to COVID-19,” the report says. The challenges include “timely collection and use of testing, case and vaccination data; assessing impacts of the pandemic in specific populations; sharing genomic data for management of variants; and the persistent challenges of long-term care.”

Source: https://www.theglobeandmail.com/canada/article-canadas-data-gaps-hampered-pandemic-response-hurting-vaccination/

Government’s failure to keep stock of PPE reserves hurt us when we needed it most

Good commentary on the long history of government data management and use issues, brought to prominence during COVID-19, along with systemic accountability issues.

And yes, the default for government data should be public (and to be fair, the open government initiative has resulted in greater availability of data):

Seventeen years ago, there was a cabinet minister named Reg Alcock, the President of the Treasury Board, who invited people to his office for lectures about data.

The late Mr. Alcock was a hefty, 6-foot-8 mountain of a man with two main interests: Liberal Party organizing in Manitoba and dragging the government into the digital age. Part of the lecture he gave in 2004 was a question: Why is it that corporate executives have computers that can tell them, for example, how many trucks their company owns, but a prime minister would need a year to get the same answer from government?

On Wednesday, Auditor-General Karen Hogan issued a report on the government’s handling of stockpiles of PPE that let it be known that Mr. Alcock’s question is still hanging in the air, nearly two decades later.

Ms. Hogan’s team reported that the Public Health Agency of Canada (PHAC) had a stockpile of personal protective equipment and medical devices, but it didn’t have a policy about what should be in it, or what was in it, or whether the equipment had expired.

When the biggest public-health crisis of modern times hit and provinces needed N95 masks and ventilators from the National Emergency Strategic Stockpile, well, there wasn’t enough useful stuff there. The data were so unreliable the auditors couldn’t tell how badly it fell short.

The haphazard management of the stockpile wasn’t a new thing. Internal audits in 2010 and 2013 raised those issues.

Citizens might think a decade of disregarded warnings is a scandal that will shake the halls of power in Ottawa. But for a politician, it is cause for relief. The best kind of failure is one that was going on long before you took office. Prime Minister Justin Trudeau’s advisers will be happy enough that the Auditor-General credited the government for responding after the crisis hit.

But note that PHAC did draft a proposal to develop a better inventory management system in January, 2020 – just as COVID-19 was spreading – but agency officials told auditors “it was put on hold because of budget constraints.”

Mr. Alcock, back in the day, didn’t just want government to get computer systems – they have a lot – but to manage data, to make more information available and usable, so that government knows better what is happening within government.

But politicians in charge aren’t good at driving change in long-term, systemic issues that voters don’t even see. Mr. Alcock, for example, was preaching for IT in a Paul Martin government busy with Liberal scandals and non-confidence votes in Parliament.

Two PMs later, and governments still have a hard time seeing what government is doing. The National Emergency Strategic Stockpile wasn’t much use in a crisis because it didn’t do the kind of information management that happens at a grocery store: figuring out what you will need, buying it, tracking what goes in and out and what is going bad.

By now we know that bad data management, not knowing what you don’t know, raises risk in a crisis. And there’s something else: Most of that data can and should be made public.

Why not let the public see, on the web, the running tally of N95 masks or ventilators in inventory? Most people won’t look at it, but perhaps a few experts in universities and elsewhere will analyze the policies, crunch the data and, we can hope, point out when they’re messed up. Or just missing. That applies to other kinds of data, too.

In Britain, this week’s remarkable testimony of Dominic Cummings, a former aide to Prime Minister Boris Johnson, about the chaotic initial response to the pandemic made it pretty clear that it’s no longer necessary, or wise, to leave the data inside government.

Mr. Cummings testified to a parliamentary committee that false assumptions, bad analysis, and groupthink inside government led Mr. Johnson’s government to a disastrous notion that it should try to reach herd immunity rather than slowing the spread of COVID-19. Scientists outside government, notably a mathematician, helped convince him that was “catastrophically wrong,” he said. He and the government’s top science adviser later agreed data should have been released earlier, to get input.

That’s not the same thing as PHAC’s failure to keep track of a stockpile. But then, if we want to encourage the government to keep tabs on the data, one good way is to demand to see it.

Source: https://www.theglobeandmail.com/politics/article-governments-failure-to-keep-stock-of-ppe-reserves-hurt-us-when-we/

Canada votes to collect data to document ‘environmental racism’

Interesting, likely correlates with lower income as well:

Canada will collect data on the impact of siting a disproportionate number of polluting industries and landfills in areas inhabited by racial minority communities, federal lawmakers voted Wednesday.

The bill aims to tackle “environmental racism,” where Indigenous, Black and other racial minority communities are exposed to higher levels of dirty air, contaminated water or other toxins and pollutants.

One of the most famous cases is in the Indigenous Grassy Narrows First Nation community in Ontario, where residents have since the 1960s suffered health impacts from mercury contamination produced by a former pulp and paper mill.

Source: Canada votes to collect data to document ‘environmental racism’

President Trump Reduced Legal Immigration. He Did Not Reduce Illegal Immigration

Usual solid analysis by Cato Institute:

President Trump entered the White House with the goal of reducing legal immigration by 63 percent. Trump was wildly successful in reducing legal immigration: by November 2020, his administration had reduced the number of green cards issued to people abroad by at least 418,453 and the number of non-immigrant visas by at least 11,178,668 over his term. President Trump also entered the White House with the goal of eliminating illegal immigration, but he oversaw a virtual collapse in interior immigration enforcement and the stabilization of the illegal immigrant population. Thus, Trump succeeded in reducing legal immigration and failed to eliminate illegal immigration.

Figure 1 shows the monthly number of green cards issued to immigrants outside of the United States. In most years, about half of all green cards are issued to immigrants who already reside in the United States on another visa. Thus, the number of green cards issued to immigrants abroad is a better metric of the annual inflow of lawful permanent residents than the total number issued. Trump cut the average number of monthly green cards issued by 18.2 percent relative to Obama’s second term, but that average monthly decline hides the virtual end of legal immigration from April 2020 onward.

In response to the recession and the COVID-19 outbreak, President Trump virtually ended the issuance of green cards to people abroad. In the last 6 months of the 2020 fiscal year (April-September 2020) the U.S. government only issued about 29,000 green cards. In the same period in 2016, the U.S. government issued approximately 309,000 green cards. Compared to the last half of FY2016, the number of green cards issued in the last half of FY2020 fell by 90.5 percent (please see note at the end of this blog post for how I estimated these figures).

Before the COVID-19 pandemic, during the period from January 2017 to February 2020, the average number of green cards issued per month was down only about 0.5 percent under Trump compared with the January 2013 to February 2016 period under the Obama administration, with cumulative numbers down just over 3.2 percent. Beginning in mid-to-late March, the Trump administration virtually halted the issuance of green cards to people abroad. Without the COVID-19 immigration restrictions unilaterally imposed by the President, the issuance of green cards to foreigners abroad would have barely declined relative to the second term of the Obama administration.

Figure 2 shows the monthly number of non-immigrant visas (NIVs) issued abroad. NIVs include tourist visas, work visas, student visas, and others that do not allow the migrant to naturalize. Trump cut the monthly average number of NIVs by about 27 percent relative to Obama’s second term, but that decline obscures the virtual end of NIVs from April 2020 onward.

As with immigrant visas, President Trump virtually ended NIV issuance in response to the recession and the COVID-19 outbreak. In the last 6 months of the 2020 fiscal year (April-September 2020) the U.S. government issued only 397,596 NIVs. In the same period in 2016, the U.S. government issued more than 5.6 million NIVs. Compared to the last half of FY2016, the number of NIVs issued in the last half of FY2020 fell by almost 93 percent (please see note at the end of this blog post for how I estimated these figures).
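As a rough arithmetic check, the half-year declines cited above follow from the rounded counts reported (small differences reflect rounding of the underlying figures):

$$\frac{309{,}000 - 29{,}000}{309{,}000} \approx 90.6\% \ \text{(green cards)}, \qquad \frac{5{,}600{,}000 - 397{,}596}{5{,}600{,}000} \approx 92.9\% \ \text{(NIVs)}$$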

Before the COVID-19 pandemic, during the period from January 2017 to February 2020, the average number of monthly NIVs issued was down about 12 percent under Trump compared with the January 2013 to February 2016 period under the Obama administration, and the cumulative numbers were down by just over 14 percent. Beginning in mid-to-late March, the Trump administration virtually halted the issuance of NIVs to people abroad. The COVID-19-related restrictions were the most severe and impactful part of Trump’s immigration policy.

Looking at the decline in the number of visas issued abroad under Trump through November 2020 compared to the second term of the Obama administration, Trump reduced the number of green cards issued by approximately 418,453 and the number of NIVs issued by about 11,178,668. That is a roughly 18 percent decline in green cards issued abroad and approximately a 28 percent decline in NIVs issued during Trump’s only term relative to Obama’s second term.

Although Trump succeeded in cutting legal immigration more than he initially planned, he oversaw the collapse of interior immigration enforcement. In 2020, removals of illegal immigrants from the interior of the United States were the lowest, both as an absolute number and as a share of the illegal immigrant population, since ICE was created in 2003 (Figure 3). Trump failed to increase removals because local jurisdictions refused to cooperate with his administration, continuing a trend begun during the Obama administration in response to its deportation efforts. As a result, the population of illegal immigrants remained about the same as when he took office (Figure 4).