Hands typing on computer keyboard. Photo: Jeso Carneiro/Flickr

I enjoy reading science fiction, especially when it considers humanity's struggle to deal with new technologies. Often these stories present a cautionary tale about how new technologies can be misused to oppress people. This view of science fiction was summed up by author Ray Bradbury, who wrote: "The function of science fiction is not only to predict the future, but to prevent it."

One of my favourite science fiction writers is Philip K. Dick, who wrote a number of these cautionary tales. One of them, "The Minority Report" (which you may know instead as a Tom Cruise movie; the short story is better), presented a future where police did not investigate crimes that had already occurred. Instead, a "PreCrime" unit stopped crimes before they happened, based on predictions from precognitive mutants.

Reality imitates fiction

So imagine my surprise when I came upon an article discussing police use of a computer program called PredPol (short for "predictive policing") to identify areas that are more likely to experience crimes and to direct police resources to those areas. This program and others like it are apparently used by about 50 police forces across the United States. While Canadian police are watching what American forces are doing with this sort of technology, they have not yet adopted it, at least in part because of the civil liberties concerns it raises.

While PredPol focuses on identifying geographic areas that are allegedly more prone to crime, the Los Angeles Police Department (LAPD) is taking things one step further. The LAPD has developed a program called Los Angeles Strategic Extraction and Restoration (LASER) that assigns scores to individuals to try to identify people who are likely to commit crimes in the future. LASER assigns points for certain occurrences, including what the LAPD calls "field interviews" (the practice we call carding in Canada). As a result, an individual who is targeted by this arbitrary practice will accumulate points toward a determination that they are at a higher risk of committing crimes.

This result is problematic enough simply because of the arbitrary nature of carding, but it becomes a much more significant problem when you consider the racial profiling that was an undercurrent of the carding practice. If you doubt that racial profiling was an element of carding, take a look at the regulation enacted in Ontario to try to rein in the practice. The drafters of the regulation felt that, in addition to prohibiting police checks that were "arbitrary," it was necessary to specifically prohibit checks based on the fact that the individual in question is perceived to be a member of a particular racialized group, except where the officer is looking for a specific individual, part of the description of that individual is that they are a member of that racialized group, and the officer has additional information that further narrows down the description of the individual they are looking for.
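To make concrete how a point-based score can convert an arbitrary practice into an apparently objective risk ranking, here is a minimal sketch in Python. The point values and categories below are invented for illustration; they are not the LAPD's actual formula. The only detail drawn from the column is that LASER awards points for field interviews.

```python
# Illustrative only: these categories and point values are invented to
# show the mechanics of a point-based risk score. They are NOT the
# LAPD's actual LASER formula; the only detail taken from the column is
# that "field interviews" (carding stops) earn points.
POINTS = {
    "violent_crime_conviction": 5,
    "gun_related_arrest": 5,
    "parole_or_probation": 5,
    "field_interview": 1,  # one point per stop, however arbitrary
}

def risk_score(history):
    """Sum the points for every event in a person's recorded history."""
    return sum(POINTS.get(event, 0) for event in history)

# Two people with identical criminal records. Person A lives in a
# heavily patrolled area and has been carded eight times; B has not.
person_a = ["parole_or_probation"] + ["field_interview"] * 8
person_b = ["parole_or_probation"]

print(risk_score(person_a))  # 13: the arbitrary stops outweigh the record
print(risk_score(person_b))  # 5
```

Because each stop adds points whether or not it produced any evidence of wrongdoing, whoever is stopped most often ends up ranked as the greatest risk.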

Bias in, bias out

One argument that some might make to justify the use of these sorts of programs is that they will take such biases out of police work. The problem with that argument is best summed up by the computer science saying, "garbage in, garbage out." Any program, no matter how sophisticated or well constructed, is reliant on the data fed into it. If that data is faulty, the program will produce faulty output. If the data going into the program is affected by systemic or institutional bias, then the results produced by the program will reflect that bias as well. So if the information fed to the program leans toward a bias that racialized people, or areas in which racialized people live, are more likely to be involved in crimes, one should expect the program to "predict" that those people or areas are more prone to criminal activity.

And the data will be faulty. It is a fiction to think that the police, and society at large, are not affected by institutional or systemic bias. Just last year, Ontario's government passed the Anti-Racism Act (which I have previously blogged about), whose entire goal is to recognize the existence of and to address systemic racism in Ontario (as distinct from addressing specific instances of discrimination, which is the purpose of the Human Rights Code).
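To see how "garbage in, garbage out" plays out, consider the feedback loop critics describe. The following is a hypothetical toy simulation in Python, not PredPol's actual (proprietary) algorithm: two neighbourhoods have identical true crime rates, but one starts with more recorded incidents because it was patrolled more heavily in the past.

```python
import random

random.seed(1)

# Hypothetical toy model, not PredPol's actual (proprietary) algorithm.
# Neighbourhoods A and B have the SAME true crime rate, but historical
# over-policing means A starts with more *recorded* incidents.
TRUE_CRIME_RATE = 0.3             # chance a patrol witnesses a crime
recorded = {"A": 60, "B": 30}     # biased historical record

for day in range(100):
    for _ in range(10):           # ten patrols per day
        # "Predictive" step: allocate each patrol in proportion to the
        # recorded (biased) crime counts.
        total = recorded["A"] + recorded["B"]
        area = "A" if random.random() < recorded["A"] / total else "B"
        # Crime is only recorded where police happen to be looking.
        if random.random() < TRUE_CRIME_RATE:
            recorded[area] += 1

print(recorded)
# A's count pulls further ahead, so the model keeps "predicting" that A
# is the dangerous area; the initial skew is amplified, not corrected.
```

Run it a few times: the neighbourhood that starts with more recorded crime keeps attracting patrols, and because police mostly record crime where they are looking, the gap in the data widens even though the underlying behaviour in both areas is identical.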

Even if we accept that the development and use of these programs are motivated by the best of intentions, they are not acceptable if they have an adverse effect. It is well established in Canada that a discriminatory act or result is still discriminatory whether or not the discrimination was intended (see the case of Esposito v. BC for one example). To apply this concept to programs such as PredPol and LASER: if those programs generate results that target racialized groups or individuals, then there will be a strong argument that the use of the programs is discriminatory toward those groups or individuals.

Until the institutional bias that exists in society is addressed, the use of programs such as PredPol and LASER will not help to eliminate bias in police work. Instead, they will perpetuate that bias while hiding it behind the faulty argument that a logical, dispassionate computer program cannot be biased. And if those results are perceived as unbiased (even though they are implicitly biased, because the data fed into the programs is itself affected by institutional bias), the effect could be to undermine the argument that institutional bias exists and needs to be addressed.

In a world where the police need to work to gain the trust of racialized groups, and institutional bias needs to be opposed, we cannot afford to miss the cautionary tale of this science fiction story.


Iler Campbell LLP is a law firm serving co-ops, not-for-profits, charities and socially-minded small business and individuals in Ontario.

Pro Bono provides legal information designed to educate and entertain readers. But legal information is not the same as legal advice — the application of law to an individual’s specific circumstances. While efforts are made to ensure the legal information provided through these columns is useful, we strongly recommend you consult a lawyer for assistance with your particular situation to obtain accurate advice.

Submit requests for future Pro Bono topics to [email protected]. Read past Pro Bono columns here.



Pro Bono

Pro Bono is a monthly column written by lawyers and legal experts at Iler Campbell LLP that explores the murky legal waters activists regularly confront in doing their work.

Michael Hackl

Michael Hackl is a contributor to rabble’s Pro Bono column. Hackl is a lawyer with Iler Campbell LLP where he practices civil litigation, providing advice and representation to charities, non-profit...