is a former lawyer with ten years of experience in the legal field and holds a PhD from Chalmers University of Technology. She is currently conducting research on digital transformation at the Department of Applied IT at the University of Gothenburg.
24/09/2025
We are in the midst of a digital transformation. One of its consequences is that countless automated decisions, powered by algorithms, are made every day. This may sound abstract, but the Swedish context shows how a seemingly straightforward algorithmic system resulted in extensive detrimental consequences for thousands of school children. This is but one example of public sector automated decision-making, commonly used for determining eligibility for economic and social benefits, and for such diverse purposes as taxation, policing and healthcare.
While these new digitally empowered ways of administration are necessary for delivering modern welfare, we should not be blind to the increased risks. Recent cases such as the childcare benefits scandal in the Netherlands and the Post Office scandal in the UK wrongly criminalised and punished ordinary citizens. Research – and experience – also shows that automated decision-making (ADM) systems carry dangers of discrimination, bias and unequal distribution. A Swedish ADM application for school placements perfectly illustrates how algorithmic injustice can arise and persist.
In the spring of 2020, the city of Gothenburg decided to use an algorithm to place children in public schools, but it went terribly wrong. Blinded by technological trust (though likely without intent), the city coded the system incorrectly. Instead of using parents’ preferences (their chosen schools) and children’s walking distances to the schools as the decisive principles for placement, as stipulated by Swedish law, the system was coded to optimise placements solely on geographical distance as the crow flies. Unsurprisingly, this caused massive problems in a city divided by a large river, because children cannot fly. The incorrect algorithm placed several hundred children in schools on the opposite riverbank, resulting in lengthy commutes. To make matters worse, these initial faulty placements meant that children were now taking up school spots far from home, forcing an equal number of children (who actually lived in the vicinity of these schools) into other placements further away, also against their will. In this way, every error was multiplied several times over.
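The difference between the two placement logics can be sketched in a few lines of code. This is a minimal, hypothetical illustration – the names, coordinates and the toy bridge-detour rule are invented, not drawn from the actual Gothenburg system – but it shows how ranking schools purely by straight-line distance can produce the opposite result from ranking by the lawful criteria of parental preference and walking distance:

```python
from math import hypot

def crow_flies(a, b):
    """Euclidean distance between two (x, y) points -- ignores the river."""
    return hypot(a[0] - b[0], a[1] - b[1])

def faulty_rank(child, schools):
    """The flawed logic: rank schools purely by straight-line distance."""
    return sorted(schools, key=lambda s: crow_flies(child["home"], s["pos"]))

def lawful_rank(child, schools, walking_distance):
    """The lawful logic: parents' chosen schools first (in preference order),
    then the remaining schools by *walking* distance, which respects bridges."""
    chosen = [s for name in child["preferences"]
              for s in schools if s["name"] == name]
    rest = sorted((s for s in schools if s["name"] not in child["preferences"]),
                  key=lambda s: walking_distance(child["home"], s))
    return chosen + rest

# Toy map: the river runs along y = 0; crossing it requires a 5 km bridge detour.
def walking_distance(home, school):
    d = crow_flies(home, school["pos"])
    crosses_river = (home[1] > 0) != (school["pos"][1] > 0)
    return d + 5.0 if crosses_river else d

schools = [{"name": "North School", "pos": (0.0, 3.0)},
           {"name": "South School", "pos": (0.0, -1.0)}]
child = {"home": (0.0, 0.5), "preferences": ["North School"]}

# The faulty system sends the child across the river (closest as the crow flies);
# the lawful ranking honours the parents' choice on the child's own bank.
assert faulty_rank(child, schools)[0]["name"] == "South School"
assert lawful_rank(child, schools, walking_distance)[0]["name"] == "North School"
```

On this toy map the mis-specified objective alone flips the placement, before any cascading effects of displaced children are even considered.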
This is an example of a systematic error, where the problem can be traced to the code rather than to individual decisions. Our legal systems, however, primarily allow for individual corrections: if you believe a decision concerning you is incorrect, you must appeal it individually. But systematic errors can never be corrected through individual redress. Besides, how would you even know whether a decision is correct, when the decision-making is hidden and your decision may depend on another faulty decision five steps down the chain? In fact, ADM systems bring an opacity that makes errors particularly difficult to see and to contest.
Motivated by the injustice of the situation – and by one of my sons being placed by the algorithm – I decided to sue the city. Not with the intent to correct the individual decision concerning my son, but to address the systematic aspect and every faulty decision resulting from the incorrect code. I therefore asked the administrative court to assess the legality of the code and of the implementation of the ADM system. With a background in law and a PhD in digital transformation, I was intrigued by this digital enigma and decided to turn it into a research project exploring what happens when algorithms end up in court. Would the judges understand algorithms as evidence, and would they grasp the legal implications of ADM?
To my despair, the case was lost. The court had, in fact, placed the burden of proof on me as the applicant. The evidence I had provided to support my claims included a list and an analysis of actual placements, through which I tried to argue deductively what the algorithm must have looked like, given these particular results. The court, however, demanded hard proof. To win the case, I would have needed to show the unlawful elements of the code and prove its illegal application. But how could I do that? How can a civilian without access to the code ever prove that it is illegal? This puts the burden of proof on the party that has no way to meet it. In this case, I had requested access to the algorithm in about 20 emails to the city, but it was never provided. This shows how even a fairly simple ADM system can still be black-boxed, whether intentionally or unintentionally, by its users.
A year later, however, the truth came to light. Triggered by media attention, the city auditors of Gothenburg had decided to scrutinise how the ADM system had been used, and what errors the faulty implementation had caused. Their report confirmed the points that I had already made through reverse engineering. For most of the affected children in Gothenburg though, this was too little and too late. About 700 of them still had to complete their entire junior high school years at the wrong school.
The Gothenburg case – whose analysis is part of a forthcoming FEPS book, Algorithmic Rule: AI and the Future of Democracy in Sweden and Beyond – shows that our current legal systems have become outdated, effectively preventing legal redress when algorithms fail. Without a fair opportunity for systematic appeals, efficient scrutiny of the legality of code, and a burden of proof placed where it belongs, we will never achieve algorithmic justice, no matter how many new laws we make. Instead, we need to make sure that the rulebook still fits.