Predicting Crime, LAPD-style
The Los Angeles Police Department, like many urban police forces today, is both heavily armed and thoroughly computerised. The Real-Time Analysis and Critical Response Division in downtown LA is its central processor. Rows of crime analysts and technologists sit before a wall covered in video screens stretching more than 10 metres wide. Multiple news broadcasts
are playing simultaneously, and a real-time earthquake map is tracking the region’s seismic activity. Half a dozen security cameras are focused on the Hollywood sign, the city’s icon. In the centre of this video menagerie is an oversized satellite map showing some of the most recent arrests made across the city – a couple of burglaries, a few assaults, a shooting.
On a slightly smaller screen the division’s top official, Captain John Romero, mans the keyboard
and zooms in on a comparatively micro-scale section of LA. It represents just 500 feet by 500 feet. Over the past six months, this sub-block section of the city has seen three vehicle burglaries and two property burglaries – an atypical concentration. And, according to a new algorithm crunching crime numbers in LA and dozens of other cities worldwide, it’s a sign that yet more crime is likely to occur right here in this tiny pocket of the city.
The algorithm at play is performing what’s commonly referred to as predictive policing.
Using years – and sometimes decades – worth of crime reports, the algorithm analyses the data to identify areas with high probabilities for certain types of crime, placing little red boxes on maps of the city that are streamed into patrol cars. "Burglars tend to be territorial, so once they find a neighbourhood where they get good stuff, they come back again and again," Romero says.
"And that assists the algorithm in placing the boxes." […] "A really good officer would be able to go out and find these boxes. This kind of makes the average guys' ability to find the crime a little bit better."
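The mechanics described above – scoring small map cells from recent incident counts and flagging the densest ones as "boxes" – can be sketched in a few lines. To be clear, this is not PredPol's proprietary algorithm (its developers' published work describes self-exciting point-process models); the cell size comes from the article, but every function name and threshold below is invented for illustration.

```python
# Toy illustration of grid-based crime "hotspot" flagging.
# NOT PredPol's actual algorithm - just the general idea of placing
# "red boxes" on map cells with an atypical concentration of incidents.
# All names and the threshold are hypothetical.
from collections import Counter

CELL_FEET = 500  # side length of each square cell, as in the article


def cell_of(x_feet, y_feet):
    """Map a coordinate (in feet) to its 500ft x 500ft grid cell."""
    return (x_feet // CELL_FEET, y_feet // CELL_FEET)


def flag_hotspots(incidents, threshold=3):
    """Return the cells ("red boxes") with at least `threshold` incidents."""
    counts = Counter(cell_of(x, y) for x, y in incidents)
    return {cell for cell, n in counts.items() if n >= threshold}


# Example: five burglaries, all but one inside the same sub-block cell.
incidents = [(120, 430), (300, 80), (450, 499), (90, 10), (900, 1200)]
print(flag_hotspots(incidents))  # -> {(0, 0)}
```

A real system would weight incidents by recency and crime type rather than simply counting them, which is how the repeat-burglary pattern Romero describes would raise a cell's score.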
Predictive policing is just one tool in this new, tech-enhanced and data-fortified era of fighting and preventing crime. As the ability to collect, store and analyse data becomes cheaper and
easier, law enforcement agencies all over the world are adopting techniques that harness the
potential of technology to provide more and better information. But while these new tools have been welcomed by law enforcement agencies, they’re raising concerns about privacy, surveillance and how much power should be given over to computer algorithms.
P Jeffrey Brantingham is a professor of anthropology at UCLA who helped develop the predictive policing system that is now licensed to dozens of police departments under the brand name PredPol.
"This is not Minority Report," he’s quick to say, referring to the science-fiction story often associated with PredPol’s technique and proprietary algorithm. "Minority Report is about predicting who will commit a crime before they commit it. This is about predicting where and when crime is most likely to occur, not who will commit it."
[…] Jennifer Lynch, senior staff attorney at the Electronic Frontier Foundation (EFF), worries that there’s too much submissive acceptance of these technologies by the public, without consideration of exactly how this data is collected and used. She says that predictive policing, with its claims of reducing crime, will be given something of a free pass.
"What starts to happen is people think the results that come out of that must be accurate because there’s technology involved," Lynch says.
"But what we forget is that the information that went in may have been the subject of bias, may have been based on inaccurate assumptions about people, may have been collected in certain communities more than other communities. The problem is technology legitimises somehow the problematic policing that was the origination of the data to begin with."
[…] "We’re pretty careful about what people do here," Romero says. "I care about civil liberties and freedom. And I know that our constitution was not written to protect us from gang members and thieves and thugs; it was written to protect us from the government and overreach of the government."
But concerns persist. Gary T Marx, professor emeritus of sociology at the Massachusetts
Institute of Technology, says technology such as predictive policing creates 'categorical suspicion' of people in predicted crime areas, which can lead to unnecessary questioning or excessive stopping and searching.
And as data-driven policing expands, Marx worries that analysis and decision-making by machine will lead to what he calls "the tyranny of the algorithm".
"The Soviet Union had remarkably little street crime when they were at their worst of
their totalitarian, authoritarian controls," Marx says. "But, my god, at what price?"
source: Nate Berg, www.theguardian.com, 25/06/2014, accessed 05/07/2016
 to harness: control, make use of
 thug: a violent person