The Progressive Post
The European Digital Agenda: can there be trust when workers are ignored?
To address the global challenges that have been exacerbated by the Covid-19 crisis, the EU needs a digital transformation that will achieve more than a single market. The European Commission says what is needed is an ecosystem based on trust, but this can be achieved only if all stakeholders are involved, including workers. Worker participation, social dialogue and collective bargaining must be the key ingredients of a fair, sustainable and forward-looking digital Europe.
The EU's digital agenda: ambitious plans…
When she took office in 2019, European Commission President Ursula von der Leyen stated that she wanted the EU to lead the transition to a healthy planet and a new digital world. In her recent State of the Union address, she insisted on the need to realise “Europe's digital decade”, citing how the coronavirus pandemic has not only “proved the essential benefits of digitalisation” but has shown “the need to further accelerate the digital transformation of Europe”.
The Commission has tabled four main regulations to drive this transformation: the Data Governance Act (DGA), the Digital Services Act (DSA), the Digital Markets Act (DMA) and the Artificial Intelligence Act (AI Act). All have data in common, with data being seen as the “lifeblood of economic development”.
The first of these regulations, the DGA, focuses on increasing the quantity of data available for re-use and sharing. It introduces concepts such as ‘sector-specific data spaces’ and ‘data altruism’ – which encourages individuals to donate personal data for the general interest.
The DSA, for its part, updates outdated obligations for online intermediaries such as social media and online marketplaces. Its goal is to keep users safe, remove illegal content, and protect users’ fundamental rights online. The DMA then adds a layer to this by targeting the ‘gatekeepers’ – essentially platforms that hold a strong, durable and entrenched intermediation position (eg, Amazon marketplace, Facebook messenger, and the Google and Apple app stores).
The AI Act meanwhile prohibits some unacceptable uses of AI and regulates high-risk AI systems – given that the uptake of AI will radically change the way we live, work and interact within society. Here, however, ‘regulating’ very much means ‘deregulating’ and promoting the uptake of AI within society.
The DSA, DMA and AI Act all share an extraterritorial dimension – a clear sign of the EU’s intention to influence the rest of the world, but also to remain in charge and in control of Europe’s digital environment. With the Covid-19 pandemic and the rapid shift to a more digital life, which has in turn increased the power of big tech, a major priority for the European Commission is now to ensure the EU’s digital sovereignty.
… but where are the workers?
Building a strong digital Europe that is able to use the power of new and emerging technologies to deliver growth, exchange knowledge and tackle environmental challenges is both welcome and necessary. Unfortunately, the legislative digital train put in motion by the European Commission has been coloured by an obsession with creating a data-centred digital market, as opposed to a digital ecosystem that is centred on people and involves people in its governance.
Can the Commission’s approach work? Will people trust and embrace a market-oriented digital Europe? Several plans and strategies – such as the European Democracy Action Plan, the Reinforced Skills Agenda and the 5G and 6G Action Plan – are in the pipeline that target citizens and consumers in an attempt to gain social acceptance – but a key player has been overlooked: workers. The subordination relationship between employers and workers de facto places workers in a world of their own. Yet the four EU regulations include no specific provision on workers, on the role they can expect to play in this brave new digital world, or on the impact of digitalisation on labour.
If workers are left out in a digital grey zone – where they do not have sufficient protection, and where they are deprived of privacy, and at risk of falling through the cracks of the digital world of work – trust will never be achieved. Platform work is the epitome of this. After a consultation of European social partners on how to improve working conditions for people working through digital labour platforms, and after a resolution adopted by the European Parliament, the Commission is now due to propose a legislative initiative in December.
A close look at the legislative proposals shows that all share an internal market legal basis (Article 114 TFEU), but none contains a chapter on employment. New digital services and AI applications appear every day. Workers, as active users and passive subjects of these applications, should therefore be able to shape the laws on how these services and applications are implemented in the workplace. Algorithmic management, which automates certain components of management, is not adequately limited and regulated by existing legal provisions. Yet algorithmic management is the driving force behind the platform business model. The use of this model consequently needs to be discussed, and possibly discouraged, given the social harm it provokes and its lack of sustainability.
Furthermore, none of the proposals mentions the involvement of social partners – in particular, worker representatives – in the governing bodies that the proposals establish, nor do they mention the contribution that social partners can make to the future evolution of legislation. The governance frameworks established by the DSA and the AI Act rely on ‘expert groups’, whose members usually originate from industrial stakeholders rather than civil society. If the goal of the European Commission is, as stated, to build a fair, open and safe digital environment, workers should be involved as empowered actors, rather than remaining passive onlookers. Safeguards are needed to help workers stay in control of technology and co-implement it. When algorithms make mistakes – and they do – it needs to be possible to contest their recommendations. This is not yet the case: redress mechanisms are needed, and in the AI Act they simply do not exist.
What is more, the regulatory approach chosen by the European Commission is heavily based on self-assessment by industrial actors. The AI Act, again, relies on conformity assessments through internal checks and standardisation – with no requirements to entrust these to independent third parties. The DSA meanwhile allows Very Large Online Platforms (VLOPs) both to identify systemic risks and to put in place necessary mitigation measures. Here again, the circle needs to be opened up to involve citizens, workers and external actors.
For the digital transformation to be a success, more focus is needed on democracy at work and workers’ agency. By giving workers a genuine say in the deployment of technology in the workplace and the ability to exercise their rights, the disruptive impact of new and emerging technologies can be brought under some form of collective shared control. Trust then becomes a realistic option – which is what society, and the digital market, need to be able to thrive.
This article is also available in German here
Photo credits: Party people studio/Shutterstock