In one of our previous posts, we discussed biometric technology and the role it plays in Canadian law enforcement. It is, however, only one of the “predictive” tools police use in criminal investigations.
A new report by the Citizen Lab at the University of Toronto goes into alarming detail regarding the growth of algorithmic policing methods and how this technology compromises the privacy rights of Canadian citizens. The report is thorough and comprehensive, delving into how this controversial technique offends various sections of the Canadian Charter of Rights and Freedoms. First, though, it is important that our readers understand what algorithmic policing is.
The success of any algorithmic system rests on its ability to gather, store, and analyze data – and law enforcement’s systems are no different. A “location-focused” algorithmic approach seeks to predict which areas are more likely to see criminal activity. The system analyzes historical police data to identify geographical locations where crimes are, in theory, more likely to be committed. If this sounds familiar, you’ve likely heard of, or accessed, the Vancouver Police Department’s GeoDash crime map – an online tool where you can navigate a map of the City of Vancouver by crime occurrence. You can choose from a variety of offences on the dropdown list, including homicide, break and enter, mischief, theft, and “offences against a person,” which likely includes crimes such as sexual assault, assault causing bodily harm, and uttering threats.

By looking at this map, you get an idea of which neighborhoods in Vancouver are most vulnerable to crime – except that the underlying system is more sophisticated than that, and goes far beyond simply dropping a pin on the map. The public can see where a crime took place, but not who is alleged to have committed it. The offender’s personal information is logged in as much detail as possible and becomes part of a larger system dedicated to predictive surveillance – that is, it builds a profile of which individuals are more likely to commit a particular crime. This profile can be used to identify people who are “more likely to be involved in potential criminal activity, or to assess an identified person for their purported risk of engaging in criminal activity in the future”.
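To make the “location-focused” approach concrete, here is a minimal toy sketch of the core idea only – aggregate historical incidents by map grid cell and rank the cells as predicted “hot spots.” The coordinates and grid logic are invented for illustration; no police system’s actual code or data is represented here.

```python
from collections import Counter

# Toy historical incident data: (latitude, longitude) of past reported
# crimes. These coordinates are made up purely for illustration.
incidents = [
    (49.2820, -123.1171), (49.2825, -123.1169),
    (49.2601, -123.1139), (49.2823, -123.1172),
]

def grid_cell(lat, lon, cell_size=0.005):
    """Snap a coordinate onto a coarse grid so nearby incidents group together."""
    return (round(lat / cell_size), round(lon / cell_size))

# Aggregate incident counts per grid cell -- the "historical police data" step.
counts = Counter(grid_cell(lat, lon) for lat, lon in incidents)

# Rank cells by past incidents: the naive "prediction" is simply that the
# busiest cells are the likeliest sites of future crime.
hot_spots = counts.most_common()
for cell, n in hot_spots:
    print(cell, n)
```

Even this toy version makes the core critique visible: the “prediction” is nothing more than a ranking of past police records, so whatever bias shaped those records is carried straight into the forecast.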
While this information is concerning on its own, there is another issue: we have very little insight into the extent to which this technology is being used. We know that the methods by which police gather information have historically discriminated against minority groups and those living in marginalized communities. It follows that the VPD’s algorithmic investigative techniques rely on data that is often obtained through biased methods. We also know that Black and Indigenous individuals are disproportionately represented in the correctional system, which means they are disproportionately represented in the data feeding these algorithms.
Although not everyone agrees that systemic racism exists within the VPD, the calls to address, unravel, and mitigate the harm to marginalized groups continue to grow louder. The idea that information collected under an apprehension of bias will not only remain on record, but will be used to further future investigations, suggests that Canadian law enforcement’s road to redemption will be a bumpy one.
“Not only is this a concern with the possibility of misidentifying someone and leading to wrongful convictions, it can also be very damaging to our society by being abused by law enforcement for things like constant surveillance of the public”
– Nicole Martin, Forbes contributor
Star Trek. Back to the Future. District 9. I, Robot. These are only a few examples of films that have relied on biometrics – most familiarly in the form of Facial Recognition – as a theme for entertainment. All are works of fiction, and while you may have thought of biometrics as a tool used by elusive government agencies like the FBI and CIA, that isn’t the case at all. Advancements in biometric technology have been seized upon by various law enforcement and government agencies across Canada – raising serious concerns among privacy and civil liberty advocates and, of course, criminal defence counsel.
The Calgary Police Service began using Facial Recognition technology in 2014. The system it uses, known as NeoFace Reveal, works by analyzing an uploaded image and translating it into a mathematical pattern – a template, or “faceprint,” generated by an algorithm. That template is then logged in a database and used for comparison against other uploaded images.
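NeoFace Reveal’s internals are proprietary, but the generic comparison step described above can be sketched in miniature: each face becomes a numeric vector, and an uploaded “probe” is matched against the database by similarity. The vectors, names, and threshold below are all invented for illustration; real systems use neural-network templates with far more dimensions.

```python
import math

# Toy "faceprints": in a real system, a model converts each face image
# into a numeric template. These vectors are invented for illustration.
database = {
    "suspect_001": [0.12, 0.80, 0.33, 0.45],
    "suspect_002": [0.90, 0.10, 0.05, 0.40],
}

def cosine_similarity(a, b):
    """Similarity between two templates: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def best_match(probe, db, threshold=0.9):
    """Compare an uploaded template against every stored one; return the
    closest match above the threshold, or None if nothing clears it."""
    name, score = max(
        ((n, cosine_similarity(probe, t)) for n, t in db.items()),
        key=lambda pair: pair[1],
    )
    return (name, score) if score >= threshold else (None, score)

probe = [0.11, 0.79, 0.35, 0.44]  # template from a newly uploaded image
print(best_match(probe, database))
```

Note that the output is a *similarity score*, not a certainty: everything above the threshold is only a “potential match,” which is exactly why misidentification figures so heavily in the concerns discussed below.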
The Toronto Police Service hopped on board too, reporting that it had uploaded 800,000 images into its Repository for Integrated Criminalistic Imaging (RICI) by 2018. Its use of biometrics began with a trial in 2014, and in 2018 the Service purchased a system at a cost of about $450,000. Between March and December of 2018, the Toronto Police Service ran 1,516 searches, with about 910 of those searches (roughly 60%) resulting in a potential match. Of those potential matches, approximately 728 people were identified (about 80%). No statistics were provided in relation to ethnicity, age, or gender; however, research has raised concerns about the disproportionate effects of biometrics on people of color.
Manitoba police do not currently use biometric technology as an investigative tool, although the idea was floated in 2019 after the commission of a report on growing crime rates in Winnipeg’s downtown core. Manitoba’s provincial government went so far as to suggest that this technology could be used to identify violent behavior – which sounds a lot like active surveillance, an unethical use of biometrics that illustrates one of the most profound concerns surrounding this technology. And while it may only be a matter of time until Manitoba police adopt this technology, many retailers in the province are already using it.
At home here in British Columbia, the Vancouver Police Department denies using Facial Recognition technology as a mechanism to investigate crime. In fact, back in 2011, it turned down ICBC’s offer to help identify suspects involved in the Vancouver Stanley Cup riots with the aid of its software. The office of the BC Privacy Commissioner confirmed that any use of ICBC’s facial recognition data by the VPD would amount to a breach of privacy for ICBC’s customers. The office of the Privacy Commissioner of Canada has been keeping track since at least 2013 – yet there is little regulation of the use of biometrics in the public and private sectors.
The same cannot be said for the RCMP in British Columbia, who, as recently as two weeks ago, refused to confirm or deny using biometrics as an investigative tool. Questions have been raised as to whether the RCMP is a client of Clearview AI, a facial recognition startup founded by Hoan Ton-That. Clearview’s work has not gone unnoticed: Facebook and Twitter have issued cease-and-desist letters, making it very clear that they do not support Clearview’s objectives. Google issued a cease-and-desist letter as well; however, its position on biometrics is fuzzy, especially since it is trying to make advancements in this area too. So far, though, it has come under fire for its tactics and the results they have generated.
The Canadian Government’s position on the use of biometrics is set out on its website. When you submit your biometric information at Service Canada (for example), it isn’t actually stored there; rather, it is sent to the Canadian Immigration Biometric Identification System, where it will remain for a period of 10 years. Further, your biometric information will be shared with the United States, Australia, New Zealand, and the United Kingdom. And yes – you can refuse to provide this information – but doing so will likely put a kink in your travel plans.
One important factor to consider about all of these agencies and their use of biometric technology is that this tool was never intended for active surveillance, or as a method of intervening in crime in real time. Whether the offence is a violent assault, sexual assault, theft under or over $5,000, murder, or kidnapping, biometrics is an “after the fact” investigative mechanism. If it were used ethically, within parameters that preserve the privacy of all citizens 100% of the time, perhaps there would be no need for alarm – but that is incredibly unlikely. As more agencies adopt this technology, the lack of regulatory oversight is bound to create an enormous invasion of your privacy – and you may never know about it.