“The Minority Report” and its subsequent film and television adaptations are often invoked in discussions of predictive policing, a practice Philip K. Dick dubs “Precrime.” In “The Minority Report,” the Precrime system relies on psychics who can predict an event before it happens, which lends an impersonality to how it singles individuals out. The precogs, as far as the story tells us, do not rely on identifying information about potential killers, such as race, gender, nationality, socioeconomic status, or sexuality, to make their predictions. This ideal of identifying potential criminals with scientific neutrality has long been a goal of criminologists and policing organizations. Just as in “The Minority Report,” these attempts at predictive policing begin with a desire to reduce deaths from violent crime, but they ultimately raise questions about human rights, freedom, and privacy.

The failed UCLA Center for the Study and Reduction of Violence, founded by psychiatrist Louis Jolyon West, stands as one of the earliest modern visions of a predictive policing system comparable to “The Minority Report.” West’s vision was to conduct psychiatric research on perpetrators of violence to create a database of human behavior. Police would then be able to use this vast database to identify what West called “dangerousness” in people, and thereby prevent crime. Interestingly, just as Anderton and Lisa see Precrime as the humane alternative to punitive policing, West viewed his approach as progressive. By predicting violence, he believed he would not only save innocent victims but also create an alternative to harsher policing methods. However, the center closed in the 1970s after a wave of public opposition. West’s first hire for the center, Dr. Frank Ervin, was a proponent of psychosurgery, the use of surgery on an otherwise healthy brain to attempt behavior modification. The center could not overcome the catastrophic ethical implications of its program.

With the rise of computing and the internet, those interested in creating databases for policing have an unprecedented amount of data at their fingertips. In 2009, the National Institute of Justice began offering grants to U.S. police departments developing predictive programs, marking a new era of Precrime. Instead of psychic precogs, police departments use computer algorithms to attempt to predict criminal behavior. Almost universally, the variables that feed these algorithms are proprietary secrets, leading human rights advocates to worry that bias is being built into the system. Since around 2012, the Chicago Police Department has used algorithms to create what it calls a heat list, identifying individuals believed likely to commit crimes, and heat maps, identifying places where crime is believed likely to occur. These predictions have resulted in increased video and audio surveillance of the neighborhoods and people the algorithms flag, raising privacy concerns. Critics also worry that such measures could become a self-fulfilling prophecy, railroading innocents into criminality. Another algorithm common in the U.S. justice system, COMPAS, predicts the likelihood that a convicted criminal will reoffend; however, a 2018 study found it no more accurate at predicting recidivism than untrained people asked to perform the same task. Even though their mechanisms differ from Dick’s psychics, his Precrime system anticipated many of the ethical debates that surround predictive policing, and its tightrope walk between social safety and individual liberty.