Title: We Humans and the Intelligent Machines
Author: Jörg Dräger
Publisher: Bookwire
Genre: Foreign applied and popular science literature
ISBN: 9783867938860
Vacca was irritated by the lack of openness in administrative procedures as early as the 1980s. At the time, he was annoyed by what he considered a shortage of personnel at the police precinct in the Bronx district he oversaw as district manager. When he turned to the relevant government agency, he was told that the crime rate in his district was too low to justify more police officers. The underlying formula used to calculate that rate, however, was withheld from him. He could therefore neither understand nor question the quota, let alone take action against it.
Vacca wanted more transparency. In August 2017, he presented the first version of his bill to the City Council. It would have required all public authorities to disclose the source code of their algorithms. Yet experts put on the brakes during the Committee on Technology hearing: The field, they argued, was still too poorly understood. Too much transparency would endanger public safety, make the systems vulnerable to hackers and violate software manufacturers’ intellectual property.
Vacca had to make concessions. A commission of academics and experts was set up to draft rules, due by the end of 2019, on how City Council members and the public will be informed about such automated decisions. Vacca was nevertheless satisfied because the commission has a clearly defined mandate: “If machines, algorithms and data make decisions about us, they must at least be transparent. Thanks to the transparency law, we will have a better overview and understanding of algorithmic decision-making, and we will be able to hold agencies accountable.”4 The trend towards more openness and regulation seems unstoppable.
The legislative initiative has already stimulated a number of changes. The use of algorithms is now on New York’s public agenda – in the City Council, in the media, among the city’s residents. Algorithms are a political issue. A debate is taking place about what they are used for. And they are already used very broadly.
In the service of safety
It is not only 911 emergency calls but also computer messages that send New York police officers out on their next assignment.5 No crime has occurred at the scene assigned to the police by the software. According to the automated data analysis, however, the selected area is likely to be the site of car theft or burglary in the next few hours – crimes that could be prevented by increased patrols.
Algorithms are managing law enforcement activities. In the 1990s, New York City was notorious for its high crime rate and gang violence. In a single year, the city recorded 2,000 murders, 100,000 robberies and 147,000 car thefts. New York was viewed as one of the most dangerous cities in the world. Politicians reacted. Under the slogan “zero tolerance,” tougher penalties and higher detection rates were meant to make one thing clear: Crime does not pay.
But what if modern technology could be used to prevent crime before it even occurs? The New York police also considered this, although it initially sounded like science fiction. The Spielberg thriller Minority Report, based on the short story by Philip K. Dick, played the idea through in 2002: In a seemingly utopian society, serious crimes no longer happen because three mutants with clairvoyant abilities reliably report every crime – a week before it is committed. Potential offenders are detained. Chief John Anderton, played by Tom Cruise, leads the police unit and is proud of its results – until one day the system spits out his own name. He is now considered a murderer-to-be and desperately tries to prove his innocence.
In New York City, algorithms play the same role that the three mutants do for Dick and Spielberg: They provide crime forecasts. Yet with one decisive difference: The computer does not predict who will commit a crime in the near future but where it will take place. The term for this is “predictive policing.”
And it works like this: Software evaluates the history of crime in each New York district over recent years and compares the patterns it identifies with daily police reports. Crime may seem random at first glance, but certain offenses such as burglary or theft in fact follow patterns that can be worked out. These patterns depend on demographics, the day of the week, the time of day and other conditions. Just as earthquakes occur at the edges of tectonic plates, crime clusters around certain hot spots, such as supermarket parking lots, bars and schools. The predictive policing software marks small squares, 100 to 200 meters on a side, where thefts, drug trafficking or violent crimes have recently taken place and which – according to the analysis – are often followed by further crimes nearby.
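How such a pattern analysis might work can be illustrated with a minimal sketch in Python. Everything in it is an assumption made for illustration – the cell size, the recency weighting, the sample data and all names – since the actual New York software is proprietary and, as described below, not publicly documented:

```python
# Illustrative sketch only: scores grid cells by recent crime history,
# weighting newer incidents more heavily, and flags the top-scoring
# cells as candidate hot spots for extra patrols.
from collections import defaultdict
from datetime import date

CELL_SIZE_M = 150  # hypothetical cell edge, within the 100-200 m range cited


def cell_of(x_m: float, y_m: float) -> tuple[int, int]:
    """Map a coordinate (in meters) to its grid cell."""
    return (int(x_m // CELL_SIZE_M), int(y_m // CELL_SIZE_M))


def hotspot_scores(incidents, today: date, half_life_days: float = 14.0):
    """Score each cell; an incident's weight halves every half_life_days."""
    scores = defaultdict(float)
    for x_m, y_m, reported in incidents:
        age_days = (today - reported).days
        scores[cell_of(x_m, y_m)] += 0.5 ** (age_days / half_life_days)
    return scores


def top_hotspots(incidents, today: date, k: int = 3):
    """Return the k highest-scoring cells for tomorrow's patrol plan."""
    scores = hotspot_scores(incidents, today)
    return sorted(scores, key=scores.get, reverse=True)[:k]


# Example: three recent burglaries near one parking lot, one older theft.
incidents = [
    (120, 80, date(2019, 5, 28)),   # (x meters, y meters, report date)
    (135, 95, date(2019, 5, 30)),
    (110, 70, date(2019, 5, 31)),
    (900, 400, date(2019, 4, 1)),
]
print(top_hotspots(incidents, today=date(2019, 6, 1)))
```

The design choice worth noting is the decaying weight: a burglary from yesterday says more about tomorrow’s risk than one from last month, which is why the daily police reports are so central to the forecast.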
Since law enforcement officers started using predictive policing, their day-to-day work has changed. In the past, they were only called in once a crime had been committed and needed to be solved. Today, the computer tells them where the next crime is most likely to occur. They used to take the same route every day; now the software identifies so-called crime hot spots where their presence is needed to monitor what is going on. The police can thus plan and deploy their resources better and work more preventively. “The hope is the holy grail of law enforcement – preventing crime before it happens,” says Washington law professor Andrew G. Ferguson.6 New York Mayor Bill de Blasio sees this more pragmatically and less poetically: Algorithmic systems, he argues, have made police work more effective and more trustworthy. The city is now safer and more livable.7 In fact, within 20 years the number of murders in New York City has fallen by more than 80 percent to about 350 per year. Thefts and robberies have also fallen by 85 percent. How much of this is attributable to predictive policing cannot be determined precisely. In any case, the software enables police officers to be where they are needed most.
The specific functioning of the algorithms, however, remains hidden from the public: How do these programs work? What data do they collect? There are lawsuits pending against the New York police for violating the Freedom of Information Act. People have just as little knowledge about where the algorithms are used, the plaintiffs argue, as they do about how the calculations take place. The first court to hear the case ruled in favor of the plaintiffs. Nevertheless, the police continue to refuse to publish detailed information about their predictive policing.
The New York Fire Department also prefers preventing fires to extinguishing them.8 But like the police, it struggles with limited resources. Not all of the 330,000 buildings in New York can be inspected every year. The firefighters must therefore set priorities and identify the buildings most at risk. But which ones are they? This selection process alone used to occupy an entire department. For a few years now, the firefighters have been using a computer program that algorithmically calculates the risk of each building catching fire. Taking into account the size, age, building material, pest infestation and inhabitant density as well as the history of fires in the neighborhood, the algorithm creates an inspection list for the next day (see Chapter 10).
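A comparable sketch can illustrate how such an inspection list might be ranked. The risk factors are the ones named above; the weights, field names and sample buildings are purely hypothetical, since the fire department’s actual model and its parameters are not public:

```python
# Illustrative sketch only: combines the risk factors named in the text
# into a single score and ranks buildings for the next day's inspections.
from dataclasses import dataclass


@dataclass
class Building:
    name: str
    size_sqm: float            # floor area
    age_years: float
    combustible: bool          # e.g. wooden construction
    pest_infested: bool
    residents_per_100sqm: float
    nearby_fires_5y: int       # fire history in the neighborhood


def fire_risk(b: Building) -> float:
    """Hypothetical linear score; a real system would fit these weights
    to data on past fires rather than set them by hand."""
    score = 0.001 * b.size_sqm
    score += 0.02 * b.age_years
    score += 1.5 if b.combustible else 0.0
    score += 1.0 if b.pest_infested else 0.0
    score += 0.3 * b.residents_per_100sqm
    score += 0.5 * b.nearby_fires_5y
    return score


def inspection_list(buildings: list[Building], capacity: int) -> list[Building]:
    """Return the highest-risk buildings tomorrow's crews can actually visit."""
    return sorted(buildings, key=fire_risk, reverse=True)[:capacity]


stock = [
    Building("A", 2000, 90, True, True, 4.0, 3),
    Building("B", 500, 10, False, False, 1.5, 0),
    Building("C", 1200, 60, True, False, 2.5, 1),
]
for b in inspection_list(stock, capacity=2):
    print(b.name, round(fire_risk(b), 2))
```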
In the service of justice
“Smaller, safer, fairer.”9 Under this motto, Mayor de Blasio presented his plan to close New York’s largest prison in June 2017.10 In the 1990s, most of the city’s then 20,000 prisoners were incarcerated on Rikers Island, once known as the new Alcatraz. Today, fewer than 10,000 New Yorkers are imprisoned, and Rikers Island, which costs $800 million a year to run, stands partly empty. Moreover, the prison was recently shaken by a scandal over the mistreatment of a juvenile detainee. De Blasio thus has several reasons for wanting to close the facility. He also wants to further reduce the number of prisoners: to 7,000 within five years and to 5,000 in the long term.
His biggest lever: algorithms. They are supposed to help New York’s judges better assess risks – for example, whether pre-trial detention is necessary.