Progressive Post

We are in the midst of a digital transformation. One of its consequences is that countless automated decisions, powered by algorithms, are made every day. This may sound abstract, but the Swedish context shows how a seemingly straightforward algorithmic system resulted in extensive detrimental consequences for thousands of school children. This is but one example of public sector automated decision-making, commonly used for determining eligibility for economic and social benefits, and for such diverse purposes as taxation, policing and healthcare.
While these new, digitally empowered forms of administration are necessary for delivering modern welfare, we should not be blind to the increased risks. Recent cases include the childcare benefits scandal in the Netherlands and the Post Office scandal in the UK, both of which wrongly criminalised and punished ordinary citizens. Research – and experience – also shows that automated decision-making (ADM) systems carry dangers of discrimination, bias and unequal distribution. A Swedish ADM application for school placements perfectly illustrates how algorithmic injustice can arise and persist.
In the spring of 2020, the city of Gothenburg decided to use an algorithm to place children in public schools, but it went terribly wrong. Blinded by technological trust (though likely without intent), the city coded the system incorrectly. Instead of using parents’ preferences (their chosen schools) and children’s walking distances to the schools as the decisive principles for school placement, as Swedish law stipulates, the city coded the system to optimise placements solely on geographical distance as the crow flies. Unsurprisingly, this caused massive problems in a city divided by a large river, because children cannot fly. The incorrect algorithm left several hundred children placed in schools on the opposite riverbank, facing lengthy commutes. To make matters worse, these faulty initial placements meant that children were now occupying school spots far from home, forcing an equal number of children (who lived in the actual vicinity of these schools) into other placements further away, also against their will. In this way, every error was multiplied several times over.
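The core of the error can be made concrete with a small sketch. The code below is a hypothetical illustration, not the city’s actual system: it models a river crossable only at a single bridge (the coordinates, school names and bridge position are all invented for the example) and shows how assigning a child to the nearest school by straight-line distance can pick the opposite school from an assignment by walking distance.

```python
# Hypothetical illustration (not Gothenburg's actual code): why optimising
# on straight-line distance misassigns children in a city split by a river.
from math import dist

# Coordinates in km; the river runs along y = 0, crossable only at a bridge.
BRIDGE = (5.0, 0.0)

def straight_line(a, b):
    """Distance as the crow flies."""
    return dist(a, b)

def walking(a, b):
    """Same bank: walk directly. Opposite banks: detour via the bridge."""
    if (a[1] > 0) == (b[1] > 0):
        return dist(a, b)
    return dist(a, BRIDGE) + dist(BRIDGE, b)

child = (0.0, 0.5)  # lives on the north bank
schools = {"north_school": (2.0, 1.0), "south_school": (0.0, -0.5)}

by_crow = min(schools, key=lambda s: straight_line(child, schools[s]))
by_walk = min(schools, key=lambda s: walking(child, schools[s]))

print(by_crow)  # south_school: only 1 km away as the crow flies
print(by_walk)  # north_school: the south school is ~10 km on foot via the bridge
```

In this toy geometry the crow-flies metric sends the child across the river, while the legally required walking-distance metric keeps the child on the home bank – the same kind of divergence that, scaled up, misplaced hundreds of children.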
This is an example of a systematic error, where the problem can be traced to the code, not to individual decisions. Our legal systems, however, primarily allow for individual corrections: they require you to appeal individually if you believe your decision is incorrect. But systematic errors can never be corrected through individual redress. Besides, how would you even know whether a decision is correct, when the decision-making is hidden, and your decision may depend on another faulty decision five steps down the chain? ADM systems bring an opacity that makes errors particularly difficult to see and contest.
Motivated by the injustice of the situation – and by one of my sons being placed by the algorithm – I decided to sue the city. Not with the intent to correct the individual decision concerning my son, but to address the systematic aspect and every faulty decision resulting from the incorrect code. Hence, I asked the administrative court to assess the legality of the code and the implementation of the ADM system. With a background in law and a PhD in digital transformation, I was intrigued by this digital enigma and decided to turn it into a research project exploring what happens when algorithms end up in court. Would the judges understand algorithms as evidence, and would they grasp the legal implications of ADM?
To my despair, the case was lost. The court had placed the burden of proof on me as the applicant. The evidence I had provided to support my claims included a list and analysis of actual placements, from which I argued deductively how the algorithm must have looked, given these particular results. The court, however, demanded hard proof. To win the case, I would have needed to show the unlawful elements of the code and prove its illegal application. But how could I do that? How can a citizen without access to the code ever prove that it is illegal? This places the burden of proof on the party that has no way of meeting it. In this case, I had requested access to the algorithm in some 20 emails to the city, but it was never provided. This shows how even a fairly simple ADM system can be black-boxed, whether intentionally or not, by its users.
A year later, however, the truth came to light. Triggered by media attention, the city auditors of Gothenburg decided to scrutinise how the ADM system had been used and what errors the faulty implementation had caused. Their report confirmed the points I had already made through reverse engineering. For most of the affected children in Gothenburg, though, this was too little, too late. About 700 of them still had to complete their entire junior high school years at the wrong school.
The Gothenburg case – whose analysis is part of a forthcoming FEPS book, Algorithmic Rule: AI and the Future of Democracy in Sweden and Beyond – shows that our current legal systems have become outdated, effectively preventing legal redress when algorithms fail. Without a fair opportunity for systematic appeals, efficient scrutiny of the legality of code, and a burden of proof placed where it belongs, we will never achieve algorithmic justice, no matter how many new laws we make. Instead, we need to make sure that the rulebook still fits.
Photo credits: Shutterstock.com/3rdtimeluckystudio