Policing is the only civil authority sanctioned to use force against a society's own citizens. Democratic systems therefore institute safeguards to ensure police impartiality and accountability, often through multiple and overlapping forms of oversight.

But who’s keeping an eye on the growing use of technology in policing?

The police services board is one prevalent form of oversight. Boards are responsible for ensuring that community needs and values are reflected in policing priorities, objectives, programs and strategies, and that services are delivered in a manner consistent with community expectations.

Boards also provide an alternative to direct governance of police by municipal councils, creating separation between the executive and police to prevent partisan use of police powers.

While some argue that an apolitical and autonomous model is best, others argue for more direct control by the executive.

Opponents of autonomy argue that fear of partisan governments, in Canada and elsewhere, has produced policing that is remarkably unaccountable.

Canada has mostly taken a middle path by incorporating police services boards as intermediary authorities, insulating the police from elected political bodies while ensuring accountability.

In 2018, there were 141 stand-alone municipal police services and 36 First Nations self-administered services across Canada.

Boards can be responsible for budgets, collective bargaining, promulgation of rules and regulations governing the organization, structure and procedures of the force, recruitment and hiring, and general policy direction.

While boards may give orders and directions to the police chief, neither the board nor individual members can give an order or direction to any officer. Nor can boards give direction on specific operational decisions.

Boards, in turn, are accountable to their municipal councils, and those councils have final responsibility for budgets.

So boards insulate police from political interference and provide oversight of whether police services comply with the policies set for them.

But the emergence of widespread surveillance systems, facial and voice recognition technologies, and the vast stores of personal data held by platforms like Google, Apple and Amazon pose new challenges for police boards.

Boards are handicapped and unprepared for the technological innovations being assimilated into mainstream policing, often without their prior knowledge or approval.

It’s an emerging reality that leads some to feel policing is becoming remarkably unaccountable. A lack of technological expertise and understanding of surveillance technology on boards has contributed to the widening gap.

And these modern changes impact human rights, privacy laws and constitutional freedoms.

Alok Mukherjee, a one-time Toronto Police Services Board chair and now a professor at Ryerson University, has raised alarms about Toronto police using technologies that fundamentally alter surveillance and privacy without the prior knowledge of the service’s board.

Mukherjee points particularly to Clearview AI, a controversial facial recognition system that uses public data. Toronto police also use StingRay, which allows them to track people’s whereabouts and communications through their phones.

According to the New York Times, more than 600 law enforcement agencies have started using Clearview in the past year without any public scrutiny.

The most striking example of this next approach to profiling is the application of artificial intelligence to predictive policing, best known through commercial systems such as PredPol. Historical data is analyzed to predict the locations, times and types of crimes likely to be committed, and some systems go further, flagging the individuals deemed likely to commit a crime at a given time and place.

Practices like carding, which have been widely decried and legislated against in many jurisdictions, are quietly and quickly being replaced by even more invasive technologies for public surveillance and data retention – and they’re more difficult to monitor.

There has also been a lack of consistency in how police services across Canada view and implement technology. Calgary police use facial recognition software but Vancouver police say they have never used the software and have no intention of doing so. Ontario Provincial Police acknowledge they use the technology but the RCMP won’t say what tools they use.

Technological advances like these have relegated boards to rubber-stamping budgets and reviewing complaints.

Boards mustn’t leave policy decisions to the courts. They need to work with legislators, civil rights experts, and courts to consider policies and controls for these new tools.

Yet it’s unlikely that most boards have the expertise or information to grasp the scope and implication of such policing practices.

Most of us will never know how surveillance data is used, where it's retained, for how long, whether and how it's shared with other agencies or jurisdictions, or how it might factor into assessments of our security. And, most importantly, we will never know who determines the relevance of the data collected. We must rely on the oversight boards provide on our behalf.

Understanding and safeguarding personal freedom and civil rights is more critical than ever. The pace at which artificial intelligence is being developed and incorporated far outpaces the regulatory and ethical frameworks needed to protect our civil liberties.

Boards must become more engaged in the day-to-day implications of policy decisions and services related to emerging technologies.

Frontier Centre for Public Policy contributor Anil Anand served as a police officer with a Canadian service for 29 years in a variety of roles, including an assignment to Interpol. He holds a master of laws degree as well as an MBA, and has taught criminology and community policing courses. His book, Mending Broken Fences Policing, looks at the role of contemporary policing in modern society.


© Troy Media


