of the judges but also to their workload, since they only have a few minutes to decide what bail to set.
What promises advantages for society can, however, result in tangible disadvantages for the individual. Hardly anyone knows this better than Eric Loomis, a resident of the state of Wisconsin. In 2013, he was sentenced to six years in prison for a crime that usually draws a suspended sentence. The COMPAS algorithm had predicted a high probability of recidivism, contributing to the judge’s decision in favor of a long prison sentence. The discrimination that can result from the use of algorithms will be discussed in more detail in Chapter 4.
In the service of efficiency
Every autumn in New York City, the application phase for high school begins.12 For many parents this is a time of stress and uncertainty because there are too few places at the popular schools known for getting their students into good colleges and thus providing better career prospects. The teenagers and their parents research secondary schools for months, and some have taken admission tests or gone in for interviews. The right high school should be academically challenging, have good sports facilities and ideally be located in the neighborhood. Naturally, it would also have a high graduation rate and be seen as competitive. Approximately 80,000 young people and their parents have until December 1 to choose 12 schools from over 400 options on the application form. The following March, the Department of Education will tell them which school they can attend.
Until 2003, the department’s staff had to allocate slots manually – a complex task that took place under considerable time pressure. The amount of administrative work was immense, and the result was unsatisfactory because 41 percent of the students did not get a place at one of the four schools they could select back then. Dissatisfaction among students and families was correspondingly high. Children with poor grades or from poorer households were seldom given a chance, while highly committed parents always came up with some new way to get their offspring into one of the best schools.
Today, New York’s young people have a better chance of getting into a school of their choice, since neither administrators nor lotteries select the secondary schools anymore. That is now the job of an algorithm. A method derived from game theory allows a much more accurate fit between students’ preferences and schools’ capacities. Ninety-six percent of the students in America’s largest city now attend a high school of their choice, and not only because the wish list has been expanded from four to twelve entries. Half of the students receive a place at their most preferred school, another third at their second choice. The new system also prevents what regularly happened in the past, when some children were accepted at several of their chosen schools and others at none at all. The matching has become far more efficient.
New York City uses algorithms to optimize a standard distribution problem: Too many applicants have to be assigned to too few places. With other high-demand goods, such as tickets for a popular concert, the solution would be simple: Prices would be raised until supply and demand are balanced. But access to public goods such as school education needs to be determined by other criteria – which were developed for New York by a Nobel Prize laureate. Alvin E. Roth of Stanford University designed an algorithm that only makes a final allocation after several preliminary rounds of virtual matching, taking into account both the students’ preferences and the schools’ capacities and selection criteria.
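The rounds of provisional matching can be illustrated with a minimal sketch of a student-proposing deferred-acceptance procedure, the family of mechanisms Roth’s work is based on. The data and names below are invented toy examples; the real New York match additionally handles screened programs, priorities and lottery tie-breaking, all of which are omitted here.

```python
# Minimal sketch of student-proposing deferred acceptance (toy data only).
# Each student proposes to schools in order of preference; schools keep the
# best-ranked proposers up to capacity and bump the rest, who propose again.

def deferred_acceptance(student_prefs, school_prefs, capacity):
    """student_prefs: student -> ordered list of schools
       school_prefs:  school  -> ordered list of students
       capacity:      school  -> number of seats"""
    rank = {sch: {st: i for i, st in enumerate(prefs)}
            for sch, prefs in school_prefs.items()}
    next_choice = {st: 0 for st in student_prefs}   # next school to propose to
    tentative = {sch: [] for sch in school_prefs}   # provisional acceptances
    unmatched = set(student_prefs)

    while unmatched:
        st = unmatched.pop()
        prefs = student_prefs[st]
        if next_choice[st] >= len(prefs):
            continue                                # wish list exhausted
        school = prefs[next_choice[st]]
        next_choice[st] += 1
        tentative[school].append(st)
        # keep only the best-ranked students up to the school's capacity
        tentative[school].sort(key=lambda x: rank[school][x])
        while len(tentative[school]) > capacity[school]:
            unmatched.add(tentative[school].pop())  # worst-ranked is bumped

    return {st: sch for sch, students in tentative.items() for st in students}

# Tiny illustration: three students, two schools with one seat each.
students = {"ana": ["north", "south"], "ben": ["north", "south"], "cem": ["south", "north"]}
schools = {"north": ["ben", "ana", "cem"], "south": ["ana", "cem", "ben"]}
seats = {"north": 1, "south": 1}
print(deferred_acceptance(students, schools, seats))
```

Because every acceptance stays provisional until the last round, no student gains anything by listing schools in a strategically distorted order – one reason this mechanism was attractive for a public school system.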
Nevertheless, this algorithm does not solve all the problems faced by the city’s education system: Efficient allocation does not eliminate social inequalities, nor does it change the fact that pupils from different backgrounds tend to end up at different schools. Furthermore, there are still not enough slots at the popular schools, and there is a clear gap between the educational opportunities in New York’s richer and poorer areas. Children from socially disadvantaged households and with lower grades still tend to end up in underfinanced, weaker schools. Parents in underserved neighborhoods may be happier because their child is given a place at the nearest school, but that does not make this school the best choice for the child. Students from more affluent households, on the other hand, often receive intensive support in drawing up their wish lists, including from professional consultants.
Algorithms that try to solve complex tasks more efficiently are not only used in New York City’s schools. The way claims for social welfare benefits are checked has also been automated.13 In 2009, 48,000 investigations into welfare fraud were carried out manually, recovering only $29 million. Today, an algorithm recognizes the patterns of fraud much more reliably. The number of investigations has been reduced and, with it, the number of false accusations; at the same time, the amount of money recovered has increased. In 2014, $46.5 million was recovered after only 30,000 investigations. However, the lack of transparency remains a problem here as well. Although fraud perpetrated at the expense of the general public can now be detected more efficiently than in the past, the individuals involved are given little insight into the criteria used to investigate them. Yet a high degree of transparency would be desirable, especially when it comes to distributing social benefits, since it would increase credibility and trust that administrative decisions are made fairly.
Setting the course
New York City is not the only American city where algorithms are omnipresent. Chicago and Los Angeles likewise give their judges software-based support or use predictive policing. Algorithmic systems are also used outside the US, for example in Australia, where they decide on social benefits and even automatically send reminders and warnings when potential fraud is suspected (see Chapter 9). Germany is not there yet, but initial applications do exist: In Berlin, places at primary schools are allocated using software (see Chapter 10), and algorithms check tax returns for plausibility. Six of the country’s states use different forms of predictive policing (see Chapter 11). Especially in large cities, public administration has become so complex that municipal services, from police patrols to waste collection, can hardly be managed without technological support – including the use of algorithms. They are part of the daily life of every citizen. But most citizens do not know these algorithms exist, let alone understand how they function. People do not need to understand, you might say. They should be happy if the garbage is picked up on time and no unnecessary costs arise for them as taxpayers.
Yet with decisions about imprisonment, access to the best educational path or governmental support, algorithms intervene deeply in the fundamental rights of individuals. This makes the software and its design highly political. Such seemingly intelligent systems should not only be debated behind closed doors or among academics but also in a broad social and political discourse – especially since even well-designed algorithms can discriminate. In the fight against crime, they can be self-reinforcing: The police find the most crime in the areas they investigate the most. Minor drug offenses, for example, which are common in most parts of a city, are detected disproportionately often in certain neighborhoods, leading to even more police checks there. Or in the case of the courts: When an algorithm sends people to prison for a longer period of time, they are more likely to remain unemployed after their release. They will also have less contact with family and friends and will therefore be more likely to become repeat offenders, which in turn confirms the algorithm’s predictions. Critics argue that all this reinforces the discrimination against and stigmatization of certain social groups.
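How such a feedback loop can arise is easy to show with a toy simulation. The numbers below are purely hypothetical and not based on any real police data or deployed system: two districts have identical underlying offense rates, patrols are repeatedly shifted toward whichever district produced more recorded offenses, and the chance of recording an offense rises with patrol presence.

```python
import random

# Toy model, purely illustrative: two districts with the SAME true number of
# minor offenses. Whichever district produced more *recorded* offenses last
# week gets the larger share of patrols this week. Because recording rises
# with patrol presence, an initial imbalance locks itself in.

random.seed(7)
TRUE_OFFENSES = 100            # identical underlying offenses per district, per week
DETECT_PER_UNIT = 0.02         # chance one patrol unit records a given offense
patrols = {"A": 11, "B": 9}    # district A starts with two extra patrol units

for week in range(1, 9):
    recorded = {}
    for district, units in patrols.items():
        p = 1 - (1 - DETECT_PER_UNIT) ** units          # detection probability
        recorded[district] = sum(random.random() < p for _ in range(TRUE_OFFENSES))
    # "hot spot" rule: the district with more recorded crime gets 15 of 20 units
    hot = max(recorded, key=recorded.get)
    cold = "B" if hot == "A" else "A"
    patrols = {hot: 15, cold: 5}
    print(f"week {week}: recorded={recorded} -> patrols next week {patrols}")
```

In this stylized setting, the district that happens to start with more patrols keeps producing more recorded offenses and therefore keeps attracting more patrols, even though the true offense rates never differ – the statistics confirm the deployment decision rather than the underlying reality.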
As New York City shows, algorithms can solve tasks that are too complex for humans. They can be useful helpers for us and our societies. But whether or not they are successful depends on the goals we set for them. They are neither inherently good nor bad. Ideally, they result in more safety, justice and efficiency. At the same time, however, they can reinforce existing social inequalities or even create new forms of discrimination. It is up to us to set the course so that things develop in the right direction.
James Vacca now teaches at Queens College, City University of New York. His years on the City Council are over since its members can serve a maximum of two consecutive terms. He proudly looks back on December