Shoshana Zuboff’s new book ‘Surveillance Capitalism’ explores a new stage in the history of capitalism, in which big tech, and increasingly other branches of the economy, make profits from data extracted from citizens without their consent and transformed into raw material for behaviour predictions – with destructive effects on the economy, democracy and individual lives.
Progressive Post: Your new book is called ‘Surveillance Capitalism’. What precisely do you understand by this concept?
Shoshana Zuboff: The way capitalism evolves is by taking things that live outside of the marketplace and bringing them into the market dynamic, in order to be sold and purchased. And in this respect, surveillance capitalism emulates this traditional pattern of capitalist history. But it does so with a dark twist. Surveillance capitalism unilaterally claims private experience and brings it into the marketplace, rendering it as behavioural data, as raw material for computational processes, where predictive patterns are discerned. And these new ‘prediction products’ are then sold into a new kind of marketplace that trades exclusively in these future bets on human behaviour.
PP: How did it come about?
SZ: Surveillance capitalism was invented at Google in 2001 as a reaction to a financial emergency. It was invented to quickly monetise the company’s online search service. And it became so successful that it migrated to Facebook and then, within the next few years, became the default option for most tech-sector start-ups, applications and so forth. But we can no longer say that surveillance capitalism is confined to the tech sector, because now we see it spreading across the entire economy: it’s in the insurance sector, the automobile sector, it’s in finance, health, education and now in virtually every product you encounter that has the word ‘smart’ in front of it. And every service that has the word ‘personalised’ in its name is participating in the ecosystems that define surveillance capitalism’s supply chains.
PP: Let me be naïve: they are not after my online banking details, neither are they judging or blackmailing somebody who watches porn online or even reads subversive political ideas. Why should we really fear this?
SZ: The unilateral claiming of private human experience is the essence of the surveillance relationship. There’s no one coming to you and saying: ‘here’s what we want to do – do you allow us to do this?’ Surveillance capitalists understand that the more people know about these kinds of practices, the more they protest and want ways to protect themselves. If these new entities are going to collect data in order to predict our future behaviour, they have to do it secretly. This is the fundamental social relationship of surveillance capitalism: it’s a one-way mirror. And it has a variety of implications. At the societal level, with surveillance capitalism and its secret ways of universally collecting every kind of depth and breadth of information about us, we have created private institutions that exist outside of constitutional governance – certainly in the United States, even if it is somewhat different in Europe. So until now, they have largely existed outside the rule of law, outside of democratic oversight and values, and they produce tremendous asymmetries of knowledge: they know everything about us, but we know almost nothing about them. And they use their knowledge of us for others’ commercial purposes.
PP: We haven’t named them yet, but it’s about the big ones: Facebook, Google and so on. Google still claims: ‘don’t be evil’ – but aren’t they?
SZ: This is not about people being evil, which is extremely important when it comes to issues of law and regulation. And it’s not even about bad people versus good people. This is about a new economic logic, with specific economic imperatives. These are companies that are now bound to these economic imperatives if they want to be successful.
PP: Karl Marx once wrote that if you have a hand mill, you get a society with a feudal lord and if you have a steam mill, you get a society with an industrial capitalist. Is there a determinism in technology here too? If you manage to lock people up in zillions of tiny, isolated and virtual treadmills, you get surveillance capitalism?
SZ: I think this is a fundamental category
error: the conflation of technology with
surveillance capitalism. I want to make very
clear that surveillance capitalism is not the
same as the digital.
Let me give you an example: back in the early 2000s, before the invention of surveillance capitalism, a very elite group of designers, data scientists and engineers at Georgia Tech had the idea of what they called the ‘aware home’ – very similar to what we call the ‘smart home’ today. But it had a single, closed loop: all the information went directly to the occupants of the home. And they were very explicit: because these data are so intimate and personal, only the occupants could decide what to do with them.
Fast forward to 2017: researchers at the University of London analysed one single ‘smart home’ device, the Nest thermostat, owned by Google. Nest is an ecosystem with a thermostat and other devices in your home that can be connected to that thermostat, and it collects a lot of data about all kinds of aspects of your behaviour in your home. The researchers found that to install one Nest thermostat, a conscientious consumer would need to review a minimum of one thousand privacy contracts, because all these behavioural data are now streaming through Nest to third parties.
So here we have the same technologies, but each one inhabited by a fundamentally different economic logic. And it’s the economic logic here, as Max Weber warned us so long ago, that is the determinant of how these technologies are brought into our lives, of their uses and their consequences.
PP: The question in the run-up to the European elections is: how do these means of surveillance capitalism interfere with democracy?
SZ: Here, the second category error comes into play: we can’t reduce surveillance capitalism to any single company. Right now, there’s a lot of focus on Facebook because most of what has disfigured our election processes in Europe and in America came through the channels of social media. But I think it’s important to bear in mind that the methods that have been used in the Cambridge Analytica case to hijack our election processes are the same methods that surveillance capitalists use every day to shape our behaviour towards their commercial ends.
We have a set of means of behavioural modification that we now know pivot to political outcomes. And in the most visceral way: political discourse and information come to us as if they were constructed by the Fourth Estate, by journalists, who have specific standards and criteria of truthfulness and professionalism. But they have been corrupted intentionally to trick us, to shape our behaviour in secret ways toward others’ ends. This obviously is a major challenge to democracy.
PP: Are there other challenges to democracy?
SZ: There are more subtle challenges as well: our democratic society is also eroded from the inside by these methodologies, because life is more and more defined by stimulus-response and by subliminal rewards and punishments that saturate our environments in this new digital media age. And this slowly erodes our capacity for moral autonomy.
And we have seen this intervention in our autonomy being experimented with literally at population level. In 2012, Facebook launched its massive online ‘contagion’ experiment, to see if they could use subliminal cues and awareness-shaping mechanisms to change our voting behaviour in the real world. A year later there was another contagion experiment, also with subliminal cues, to see if they could change our emotional valence, to make us sadder or happier. Both experiments were successful. And when they wrote these up in scholarly journals, they bragged about the fact that these methodologies were successfully evading user awareness.
PP: But if these companies are already so deep under our skins, or rather inside our heads, is there still room to even think of resistance?
SZ: I don’t think that resistance is going to be the problem. Today, it’s impossible for us to know exactly what aspect of our experience is being rendered, where those data are going and who is using them to what end. So, the first thing is that we must name these things because we know that when people find out about these kinds of activities, they do feel resistance. They do want to say no. So, the first thing is to open the curtains, shine light on all of this and then resistance will come as a very natural response.
It will produce a sea change in public opinion and that will bring demands for action. It will bring demands to our elected officials to become more rigorous in developing the next generation of law and regulation that will protect us from these kinds of activities.
Obviously, the European Union’s General Data Protection Regulation (GDPR) has already taken us much further ahead than we’ve been during the last 20 years. Now we have the possibility of standing on the shoulders of the GDPR in order to develop the kinds of regulatory regimes that are specifically targeted at these mechanisms.
We talk about data ownership as a solution for privacy. But when we understand the voraciousness of surveillance capitalism and how it takes, without asking, from every aspect of our experience, is data ownership really enough? Do we really want to be arguing about owning data that should not exist in the first place? I liken this to arguing about how many hours a 7-year-old should work in a factory when in fact we should be arguing about the fact that there should be no child labour at all.
We have to ask the questions of principle here: is it legitimate for our experience to be taken without any form of meaningful consent on our part? Is it legitimate for our experience to be rendered as behavioural data, as raw material for predictions? Is it legitimate for those predictions to be sold into secondary markets, to business customers who have a stake in predicting our future behaviour? And for those operations to be inaccessible to us, so that our futures are auctioned off to others for their profit and their commercial aims, and we have no say, no oversight and no protection from those activities?
PP: Beyond the public outrage that may come when people understand how their reality is being shaped around them and even inside them, what is your message to policy makers?
SZ: The first message for our lawmakers is that we have to understand that as important as it may be to regulate a specific company, as important as it may be to apply our antitrust laws and our privacy laws, we have to go further: we have to understand that surveillance capitalism is now pervading our economy. We have to understand its specific mechanisms and we have to have a public conversation as to whether or not these mechanisms are consonant with individual sovereignty and with democratic sovereignty, and then we have to understand the ways in which we can specifically interrupt and outlaw these mechanisms.
PP: But how to do that, in your view?
SZ: My view is that surveillance capitalism is a rogue mutation of capitalism. In the 20th century, we found a way for markets and democracies to create an equilibrium. But that was only because we had created the laws and the regulations that bound the excesses of capitalism and limited them and tethered them to the needs of a democratic society and to the well-being of individuals, both the social and the economic well-being of individuals. This is where we are now in history.
We’re in a world now where we can’t be effective in our daily lives without marching through these channels that are also surveillance capitalism’s supply chains, giving them our experience as behavioural data for these secondary operations that we have no knowledge of or control over. Hence, we must create alternatives. And as soon as those alternatives exist, we are all going to move to that side of the ship.
PP: There are already some alternatives: Telegram instead of WhatsApp or alternative search engines like DuckDuckGo instead of Google. But these things haven’t really taken off yet.
SZ: These things require scale. We do have a search engine like DuckDuckGo that preserves our privacy, and that’s terribly important. People may say that Google has a better search engine, but what they don’t understand is that Google might have a better search engine precisely because of the very practices we’ve been describing, and that improvement in its search ability comes at a cost that is invisible to most of us. We need to be aware of the real costs of buying into Google, its search and its practices, which take us all the way down the road where eventually we find Cambridge Analytica.
We have two tremendously different alternatives here. And when those two alternatives are confronted, they have to be confronted in their fullness with full knowledge and transparency of what each one entails. And as I said in the beginning: when people do have that full knowledge and transparency, they reject these practices.