We need to talk about the surveillance economy. Now.

May 1, 2019

Illustration by Jessica Fortner: a large, shadowy figure watches two people looking at their smartphones.

This article requests access to your
Location, Pictures, Microphone, Camera,
Audio, Contacts, Calendar.



It is the turn of the 20th century and the second industrial revolution continues to transform the human environment. The world no longer looks the same as it did 80, 40 or even 20 years ago. Concrete and steel landscapes fill the skyline. Nearly half the U.S. population now lives in cities—catching up to rates seen in Britain a half-century earlier—with railways and telegraph lines shortening the distance between them. Mortality rates are decreasing and lifespans slowly lengthening as the exchange of money and consumer goods increases in a first wave of globalization.

While the industrial revolution was paving the way for a new standard of living, it was not without its costs, some immediate and some slowly creeping up on us. Only now are the majority of people and most governments beginning to realize that this leap forward was the beginning of climate change. Back then, humanity was the proverbial frog in a pot of slowly heating water: it would be generations before we took note of deteriorating air, water and soil quality, global temperature increases, and the existential threats to fragile ecosystems from our ways of producing, trading and consuming. Like all revolutions, the consequences of industrial capitalism were knowable only in hindsight.

If Shoshana Zuboff is right, we are in the midst of a new revolution nestled within the digital leap forward of the Information Age. The data gathering activities of online communications giants, starting with Google, are transforming capitalism as radically as Henry Ford did with mass production. Surveillance capitalism, the phenomenon that gives Zuboff’s new book its title, presents a novel set of problems, some of which, much like climate change, threaten our human potential on this Earth. The good news, she writes, is that if we start talking about these threats now we might avoid another unforeseen catastrophe.


Just as industrial capitalism claimed the world’s natural resources for economic benefit, now in the digital age multi-billion-dollar industries are expropriating our private experiences to produce behavioural data that will ultimately be used in the marketplace. Instead of slowly changing our climate, Zuboff and other scholars of the digital economy fear that this time around we are changing human nature. This new threat is called surveillance capitalism.

Companies profit under surveillance capitalism by collecting information on our social-psychological behaviour, running it through opaque algorithms and using it to predict and even promote human behaviour that is valuable to other companies. Corporate actors are saturating online spaces, vying for attention from the endless niche markets their products can cater to. These economic pressures have created an information ecosystem of monitoring and sharing data from our personal online social spaces, with the goal of turning a profit through the regulation of our actions.

Just as the Industrial Revolution came with many advantages, such as more cost-effective production leading to cheaper goods, and eventually improved quality of life, so do data collection and processing in the Information Age. Most of the benefits come down to the quantity and quality of the information we can now gather about almost everything.

Self-optimization: Philosopher and media theorist Marshall McLuhan posited that technology is an extension of ourselves. Cars, by helping us get around faster, are extensions of our feet. Computers, by helping us work and process information more effectively, are extensions of our brains. In the digital age, new tools like the Fitbit or Apple Watch are extensions of our self-knowledge. They help us keep track of the steps we take, laps we swim and hours we sleep. This is known as self-logging or, as Wired editor and writer Gary Wolf dubbed it, the “quantified self.” These devices, through data collection often enabled by GPS technology, have the potential to improve personal well-being in some cases: a new asthma inhaler can track when and where someone has had an asthma attack, for example, which helps us better understand the environmental factors contributing to these events.

Societal optimization: The so-called internet of things (IoT), a term coined in 1999, refers to the extension of internet connectivity—not between people but between physical everyday objects—creating an ecosystem of devices that can communicate and interact amongst themselves. The most common applications are found in the home (the “smart” TV is now ubiquitous, but wider “smart home” services can connect power usage, security and voice-command features). According to the United Nations, the global population is expected to rise to 9.6 billion people by 2050. If nothing else changes in the way we produce and commute, that will mean more cars on the street, less available land, and more complicated food logistics. Engineers are developing self-driving cars connected through 5G networks that have the potential to take people where they need to go without everyone having to purchase vehicles of their own. Additionally, “smart” agriculture promises farmers the ability to monitor their soil remotely through ground sensors that relay the information to smartphones.

Optimized services: Applications such as Google Docs, Facebook Messenger and the iPhone wallet app help us work, communicate and store information more efficiently. There are even more complex applications that make once highly specialized tasks such as video editing and web design easier for the average person to do themselves. Where we once had to know how to code to build a blog, today people can use sites with predesigned interfaces that let you drag and drop content where you want it.

It’s easy enough to list the benefits of a networked, online world, which Silicon Valley prophets do regularly, loudly and often unrealistically. But people are also coming to know and understand the dark underbelly of this new reality, in which most of the information, the “smart” technology and the networks through which we communicate are owned privately, by corporations with no scruples about violating basic privacy norms—and frequently breaking the law—in the pursuit of rapid growth. Our consent and societal control of these technologies are severely compromised where the market and profit motive determine the shape of things to come.


The mass production of products in the age of industrial capitalism and the collection of data in the age of surveillance capitalism differ, according to Zuboff, in that producers and consumers were once dependent on each other for the profit motive to function. Corporate owners controlled the means of production and had most of the power under industrial capitalism, but workers and consumers could keep that power in check by organizing for a fair share of the wealth they generated, on the one hand, and boycotting unethical products on the other. In the age of surveillance capitalism, Zuboff claims, these counteracting forces are no longer interdependent.

Chart showing selected government lawsuits against tech companies

“Instead of labour, surveillance capitalism feeds on every aspect of every human’s experience,” writes Zuboff. To be more precise, she says that “behavioural surplus,” and not labour power, is the new raw material from which profits are taken—from all of us rather than the workers on the shop floor, in the retail outlet, at the call centre, or any other place where employers direct the work of employees. In a world where data collection is always turned on we have lost our bargaining chip (our labour) because Big Data does not need it for the profit motive to function. Furthermore, this ecosystem of information is controlled by only a handful of companies. How can we, the public, even stage a boycott when we have so little understanding of how the exchange of data works, or where the pressure points might be?

The power structures in the marketplace are shifting, and consequently we are relinquishing what little control we once had. In addition to accumulating behavioural capital (our private data) and surveillance assets, firms are accumulating our rights. The system “operates without meaningful mechanisms of consent,” Zuboff explains. If you want to use your GPS, you must give up your location. If you want to download iTunes, video-call friends or use any of Google’s apps, you must agree to their terms and privacy agreements. There is no negotiation in the digital marketplace. It is a unilateral relationship, one in which we have no guarantees of safety or protection, and no way of claiming our information back once it has been collected.


The term corporatocracy has emerged in recent years to describe a society that is governed by and for large corporations. Corporate influence on the nation-state and governance is not a new idea, but digital capitalism is furthering that power shift. Most firms have collaborative agreements with states, be they democratic or authoritarian.

In March 2016, China’s National Development and Reform Commission revealed that more than nine million airline and high-speed train tickets had been denied to citizens with low social credit scores. The social credit system was adopted a few years earlier in order to determine which citizens were trustworthy and which were not. Each person starts off with 1,000 points and loses points every time they litter, smoke in a non-smoking area or get in a fight. If they volunteer or pick up garbage from the street, they gain points. Those with low scores have been blocked from purchasing high-end travel options, charged higher mortgage rates, or flat out denied loans.

While most of this surveillance has been done the old-fashioned (human) way in China, as the country develops its own “smart” cities, including a network of facial recognition security cameras, the Communist Party will eventually have the option to spy on citizens wherever they go. The potential of these systems to chill dissent and undermine pro-democracy movements in China is significant. But wherever they’re applied, these technologies are ultimately designed to control and alter human behaviour.

Democratic states are also developing technology-enabled blacklists and have enlisted the help of the private sector to aid in their campaign. In 2013, Edward Snowden revealed that the National Security Agency (NSA) was tapping into users’ private Yahoo and Google accounts through approved court orders. The U.S. surveillance agency had also been secretly tapping into the companies’ data centres—evidence that even if private consent and privacy practices are improved, our information is at risk of being leaked to, or stolen by, government officials. As Zuboff points out, state surveillance agencies learned how to do this from the private sector: Google, Facebook and other Silicon Valley firms make billions on data mining of the legal and illegal variety.


In the age of surveillance capitalism we are no longer the customers of data-based goods and services. For Zuboff we are the raw material from which Big Data’s profits are extracted. This, she claims, is a first in the history of capitalism. Google first served the public as the top internet search engine on the web. It now exists primarily to serve advertisers. Information about customers and subscribers, collected across a vast network of Google apps and services, is now marketed to advertisers to precisely predict and shift our behaviours. “We no longer exist to be employed and served, we exist to be harvested,” writes Zuboff.

But if the raw materials are different than they were under industrial capitalism, the driving force behind surveillance capitalism—endless growth—is exactly the same. As companies seek to expand their profits they are compelled to expand the quantity and types of data they collect. This new age of capitalism is now a part of the fabric of our everyday lives. It has moved beyond the workplace to infiltrate our homes, our cars, our bodies and our relationships.

Engineers are developing “smart” homes that have the capacity to aid seniors, like those living with dementia, but also the potential to make the surveillance of human life complete. When the fridge door is left open, a speaker reminds the resident to close it. Lights sense when they get out of bed, and motion detectors inform them that it is still nighttime, so they should go back to sleep. Most impressive is the use of machines that can monitor breathing and movement throughout the house, and take a person’s pulse—information that is sent to caregivers but also stored in the cloud.

One of the most common features of a “smart” home is the Nest thermostat, bought by Google in 2014 for US$3.2 billion, which is Wi-Fi enabled and networked. According to the company, Nest “aims to learn a user’s heating and cooling habits to help optimize scheduling and power usage.” But the value of the device, according to Wired, “lies not in the hardware itself but the interconnectedness of that hardware. As the devices talk to each other, by building an aggregate picture of human behaviour, they anticipate what we want before even we know.”

Should a user choose not to agree with a data miner’s terms of service agreement, be it Nest or any other connected device or app, they will no longer be able to receive important updates. A detailed analysis of Nest and the companies in its ecosystem, mentioned in Zuboff’s book, revealed that a user would have to review nearly 1,000 contracts to determine who can have access to their data. Furthermore, engineers have proven that they can bypass the Nest software and alter the behaviour of a participating home. Our most private space, and the information on what we do there, is no longer safe.


The 20th-century housewife was one of the first mass marketing successes of the age of broadcast communication. Ad men cultivated in these women the desire to buy more household products to live up to the image advertisers had created of the modernized American home. This first stage of advertising instilled feelings of inadequacy, convincing people to want the manufactured life presented before them.

As competition from near-identical products emerged, companies had to distinguish their products by playing off people’s personalities and feelings. Are you a Subaru Outback or Toyota Venza kind of person? Adventurer or family person? Only end-of-year sales numbers could determine if the branding was working.

List of Internet of Things products for sale

With the advent of online targeted marketing, companies can predict with better accuracy that their desired demographic will be exposed to their advertisements. Click-through rates are better, because they show advertisers who is clicking on a banner ad promoting their goods or services. But techniques have moved well beyond that simplistic function. Advertising has evolved from making us believe we wanted something to knowing exactly what we want, and from catering to our general personalities to pinpointing exactly when and where we are most likely to do something. Where is the computer user mousing over? How long do they hold their eyes on your ad as they scroll through an article? All of this data and more can help Google, Facebook, Amazon and other surveillance capitalists estimate the probability that a single user will click on a link, and make predictions about their future behaviour.

“Prediction products are sold into a new kind of market that trades exclusively in future behaviour,” writes Zuboff. “Surveillance capitalism’s profits derive primarily from these behavioural futures markets.” Advertisers can predict that expecting parents might be looking for a new car and show them an ad for that brand-new Venza on Facebook, while the neighbours next door might be getting an ad for the Forester because they just finished shopping online for hiking gear.

Zuboff claims Google’s behavioural data is accurate enough that it could figure out the buying habits of a person with mental health issues; advertisers are then sold the likelihood that this person will make impulse purchases during low moods or manic episodes, pinpointed to exactly when those episodes will occur. Without our consent, Google is trading exclusively in these predictions about our most vulnerable emotional states. The search engine began collecting this “behavioural surplus” in 2001. But the company became tied irreversibly to the exploitation of this information as a business model after going public in 2004. Between those two dates, Google revenues increased by 3,590%.

“The new prediction systems are only incidentally about ads, in the same way that Ford’s new system of mass production was only incidentally about automobiles,” proposes Zuboff. “In both cases the systems can be applied to many other domains.” Prediction is good. Control is better. The “internet of things” is so intelligent that basic sensors can now become actuators, intervening in real-time to put us on a different path. In an experiment with people who use Facebook, Adam D.I. Kramer, one of the company’s data scientists, found that emotional contagion does not simply occur during human-to-human interactions.

“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

These findings suggest that in-person interactions and nonverbal cues are not necessary for emotional manipulation. We can now be automated remotely. As one data scientist told Zuboff, “we can engineer the context around a particular behaviour and force change that way.”

This process can and has had real impacts on democracy. Since the 2016 U.S. election, the public and government officials have become increasingly concerned about the role and influence that platforms like Facebook, Twitter and Google are having on the integrity of democratic processes. Cambridge Analytica had access to the data of 87 million Facebook users during the 2016 campaign, information that was used for targeted ad campaigns. Compounding the problem, Facebook does not yet have to disclose who is buying ads and where. By and large the surveillance economy is grossly underregulated, though this is slowly changing.

A relatively high number of Canadians, approximately 42%, report receiving their news and information from social media sites. Among Canadians younger than 45, more rely on Facebook as a source of breaking news than on television. Luckily, the Elections Modernization Act, passed late last year in time for the upcoming Canadian elections, bans foreign spending on partisan ads during Canadian elections and requires online platforms to keep detailed accounts of who is buying political ads, and whom they target. Not all countries have passed or will pass similar legislation, and in any case it leaves the underlying data-gathering business model of these platforms completely intact.

Outside of elections, political interest groups, notably far-right and white supremacist organizations, have benefited from the behavioural nudge features of online platforms. For example, YouTube’s algorithms are designed to keep viewers on the site as long as possible by suggesting videos they might like to watch. A process Zuboff calls “tuning” then takes over, involving “subliminal cues designed to subtly shape the flow of behaviour at the precise time and place for maximal efficient influence.” So if a person has just finished watching a speech on anti-immigration policies, the next video suggested will be slightly more provocative—a pattern that repeats until viewers potentially find themselves immersed in white supremacist hate. Users remain unaware of the choice architecture designed to shape their behaviour, disrupting the process of self-awareness and individual autonomy.


The internet is the new final frontier. Google, Facebook, Apple, Amazon, Microsoft and other tech giants have colonized it and hijacked our information without consent. They have been permitted to suck in overwhelming amounts of information, keep algorithmic sorting and targeting systems away from public scrutiny (using ever-stronger intellectual property rights and business information secrecy laws), and cannibalize all potential competition for our online attention through mergers and takeovers. These companies know much more about how this system works than we do, and their monopoly powers make it likely our calls for privacy will fall on deaf ears.

“Demanding privacy from surveillance capitalists,” says Zuboff, “or lobbying for an end to commercial surveillance on the internet is like asking old Henry Ford to make each Model T by hand. It’s like asking a giraffe to shorten its neck, or a cow to give up chewing. These demands are existential threats that violate the basic mechanisms of the entity’s survival.”

As the second Industrial Revolution was humming into the 20th century, Swedish chemist Svante Arrhenius pointed out the impact that burning fossil fuels was having on the environment, but no one took notice. Over 100 years later, a handful of people are still debating the science of climate change, and even those who accept it are slow to act. Where did we go wrong? Why didn’t we try to avert the catastrophe at the earliest warning signs?

No one wanted to listen. If Zuboff is right—and she is hardly alone in raising the alarm about the power of the tech giants—we face a similar moment. Surveillance capitalism has the potential to fundamentally change who we are. Left to their own devices, the data harvesters will continue to take in vast amounts of our personal information, with or without our consent. They are forced to follow the profit motive to its ultimate destination, however anti-democratic the consequences and whoever is harmed along the way. The time to start setting limits and finding solutions is now.

Jenna Cocullo is interning at the Monitor as part of her master’s program in journalism at Carleton University. She will be moving to British Columbia this May to pursue her passion for environmental reporting. Jenna writes a weekly column titled “Uninterrupted: Thoughts on Politics and the Patriarchy” for The Daily Times in Blantyre, Malawi.