The Age of Surveillance Capitalism

Author(s): Shoshana Zuboff
Published: 2019

Overall Thoughts & Roadmap

In this book, Zuboff explains the new economic and social system that has allowed Google, Facebook, and Amazon to become some of the largest companies in the world today. She then explains the implications of such a system and why we need to be cautious about its current and future developments.

This book caught my eye because it was recommended on multiple best-book lists in 2019 (including TIME, Bloomberg, and the New York Times, as well as Obama's personal book list). Is it worth your time? I think the ideas contained in this book are very important. But it is a BIG book, at times unnecessarily wordy, and in parts poorly structured. I hope these notes help distill its core ideas.

This book summary is divided into the following sections:
(1) What is surveillance capitalism?
(2) The Three Stages of the Surveillance Capitalist System
(3) Should we be worried about Surveillance Capitalism?

1. What is Surveillance Capitalism?

"Surveillance capitalism" (SC) is the name that Zuboff gives to the economic system that has promoted the growth of firms that profit from owning and selling data about human behaviour (e.g. our consumption patterns). These firms are "surveillance capitalists", e.g. Google, Facebook, and Amazon. She defines SC as follows:

"A new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales."

Let's break this down. Zuboff's model of the SC machinery looks like this:

  • Inputs: Data from our consumption behaviour (e.g. the posts we like on Facebook, the links we click on Amazon, what we type into the search bar on Google).
  • Production process: These data go into a manufacturing process called "machine intelligence" (i.e. artificial intelligence) - producers want accurate predictions, which they achieve by "training" these machines on more and more data.
  • The output: "Prediction products" that anticipate what we will do now, soon, and later (e.g. what we will buy next on Amazon, or watch next on Netflix).
  • Market: The prediction products are traded in a new kind of marketplace called the behavioral futures market.
  • The beneficiaries: Surveillance capitalists are the people who become wealthy from this trading operation (e.g. Bezos, Zuckerberg). There's huge demand for these prediction products because anticipating consumer behaviour is the core of marketing today.

How does SC differ from standard market capitalism (MC)?

To see the difference between SC and MC, let's think about the process of manufacturing cars (a typical manufacturing process in a MC system):

  • Inputs: Capital (machines) and labor (workers)
  • Production process: Assembly lines combining machines and labor - producers want high volume and low unit cost.
  • Output: Physical products like cars.
  • Market: Cars are traded in some market by car companies and bought by users.

Here are two key differences between SC and MC:

  1. The inputs into the SC model are information about us, most of which is collected without our awareness. To the extent that we're involved in the MC production model, it's our labor and machines that are being used.
  2. In SC, the end customers are not us. Instead, they are firms and businesses that want to anticipate and shape the behaviour of populations, groups, and individuals (e.g. companies that want to place ads on Amazon/Google). In contrast, in MC, the end-user is the consumer. In other words, workers buy cars, but people who use Facebook/Google/Amazon don't buy prediction products.

2. Three Stages of the Surveillance Capitalist System

Stage 1. Behavioural Surplus (Data Extraction)

The first stage of surveillance capitalism is the extraction of data from users. Crucially, Zuboff makes a distinction between data that is collected for the purpose of improving the product (e.g. how well the search engine works) and data that is collected for reasons unrelated to product improvement (i.e. for the benefit of the firms that the surveillance capitalists are selling prediction products to). She calls the latter "behavioural surplus".

A big claim of this book is that:

"Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to product or service improvement, the rest are declared as a proprietary behavioral surplus."

Example: Google

As people started to search online more, Google started to produce new data. For example, in addition to key words, each Google search query produces data such as the number and pattern of search terms, how a query is phrased, its spelling, punctuation, dwell times, click patterns, locations.

In its early days, Google used these data by-products to improve the search engine. By figuring out what people searched for and what they clicked on, Google's algorithms could learn to produce more relevant and comprehensive search results for its users. The logic is as follows - more queries mean more learning; more learning means more relevance; more relevance means a better search engine.

However, Google soon figured out that these data could provide a broad sensor of human behaviour. In the early 2000s, Google started to feed this "behavioural surplus" into a predictive model called "matching" in order to target ads at specific users. The basic idea is to match ads with queries: each time a user queries Google's search engine, the system simultaneously presents a particular configuration of a particular ad, generated by the matching system.

Google could then track how often users actually clicked on an ad (the "click-through" rate). The higher the click-through rate, the more other firms would want to advertise with Google. This matching system was formalized in a patent Google filed in 2003, titled "Generating User Information for Use in Targeted Advertising". Zuboff argues that this is problematic because the main beneficiaries of these targeted ads were the advertisers, rather than the person doing the searching on Google.
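The matching-plus-click-through loop described above can be illustrated with a toy sketch (my own illustration for intuition only - the ad names, keyword-overlap scoring, and CTR bookkeeping here are all invented, not Google's actual system):

```python
# Toy illustration of query -> ad "matching" with click-through feedback.
# Purely illustrative; real ad-serving systems are vastly more complex.

class AdMatcher:
    def __init__(self, ads):
        # ads: mapping of ad name -> set of keywords
        self.ads = ads
        self.impressions = {name: 0 for name in ads}
        self.clicks = {name: 0 for name in ads}

    def match(self, query):
        """Serve the ad whose keywords best overlap the query terms."""
        terms = set(query.lower().split())
        best = max(self.ads, key=lambda name: len(self.ads[name] & terms))
        self.impressions[best] += 1
        return best

    def record_click(self, ad):
        self.clicks[ad] += 1

    def ctr(self, ad):
        """Click-through rate: clicks per impression served."""
        shown = self.impressions[ad]
        return self.clicks[ad] / shown if shown else 0.0

matcher = AdMatcher({
    "running_shoes": {"running", "shoes", "marathon"},
    "coffee_maker": {"coffee", "espresso", "brew"},
})
ad = matcher.match("best running shoes for a marathon")
matcher.record_click(ad)
print(ad, matcher.ctr(ad))  # running_shoes 1.0
```

The point of the sketch is the feedback loop: every served ad and every click becomes new data, which is exactly the "surplus" Zuboff argues gets fed back into the machinery.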

Source: Shoshana Zuboff, The Age of Surveillance Capitalism

Stage 2. Behaviour Prediction

The second stage of surveillance capitalism is the prediction of our behaviour based on the raw data extracted in stage 1. (This is also called "economies of scope" in the book.)

Surveillance capitalist companies build models to predict what you're going to do next based on the data extracted in stage 1. This is the backbone of the Netflix/YouTube "recommendation" systems - based on what you've watched in the past, the model can suggest what you will like in the future. This prediction is therefore often spoken about under the banner of "personalization".

The key thing to remember is that for these models to make accurate predictions, surveillance capitalist firms need lots of data to "train" the model (i.e. you need to feed in past information about how users behave in order for these models/machines to learn how the current set of users will behave in the future). So as competition between surveillance capitalist companies grows, their incentive to extract more data from us in order to build better prediction models also increases.
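This train-on-past-behaviour loop can be sketched with a minimal example (my own illustration; the item names and the simple transition-counting "model" are invented, and real recommenders are far more sophisticated): count which item users watched after each item, then predict the most common follow-up.

```python
# Minimal sketch of "training" a behaviour-prediction model:
# learn, from past viewing histories, which item most often follows another.

from collections import Counter, defaultdict

def train(histories):
    """Count item -> next-item transitions across all user histories."""
    transitions = defaultdict(Counter)
    for history in histories:
        for current, nxt in zip(history, history[1:]):
            transitions[current][nxt] += 1
    return transitions

def predict_next(transitions, item):
    """Predict the most frequently observed follow-up to `item`."""
    if item not in transitions:
        return None
    return transitions[item].most_common(1)[0][0]

past_behaviour = [
    ["drama_a", "drama_b", "thriller_x"],
    ["drama_a", "drama_b"],
    ["comedy_c", "drama_b", "thriller_x"],
]
model = train(past_behaviour)
print(predict_next(model, "drama_a"))  # drama_b
print(predict_next(model, "drama_b"))  # thriller_x
```

Note how the model is only as good as the histories it is fed: with more (and more varied) past behaviour, the transition counts become more reliable, which is precisely the incentive to extract ever more data.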

Stage 3. Behavior Modification

The third stage in surveillance capitalism is the modification of our behaviour. (This is also called "economies of action" in the book.)

At this stage, machine processes are configured to intervene in the real world among real people and things. These interventions are designed to enhance the certainty that the user will do certain things - they nudge, tune, herd, and modify behaviour in specific directions.

There are three main paths to behavioural modification:

(1) Tuning

  • What: Subliminal cues designed to shape the flow of behaviour at the precise time and place for maximally efficient influence.
  • Examples: Nudging you to buy a particular product by timing the appearance of a BUY button on your phone.

(2) Herding

  • What: Controlling key elements in a person's immediate context.
  • Examples: using data on users' emotions, cognitive functions, and vital signs to tell machines what to do in the surrounding environment; shutting down a car engine when the machine senses you are tired or drunk; the fridge locking itself when it detects that you are overeating.

(3) Conditioning

  • What: Reinforcements to shape specific behaviours.
  • Examples: using data from wearable devices (e.g. Apple Watch) to capture behaviour and identify good/bad behaviour; when the user is inactive, encouraging them to move; when the user is staying up too late, discouraging further activity. Essentially, building a system of reward and punishment based on these data.

Example of herding: Pokemon Go

The game planted Pokémon inside cafés to draw customers in. This is an example of gamification as a way to change behaviour: companies would pay Niantic (the owner of Pokémon Go) to have their premises appear as locations on the virtual game board as a way of driving foot traffic!

“Niantic’s distinctive accomplishment was to manage gamification as a way to guarantee outcomes for its actual customers: companies participating in the behavioral futures markets that it establishes and hosts. Hanke’s game proved that surveillance capitalism could operate in the real world much as it does in the virtual one, using its unilateral knowledge (scale and scope) to shape your behavior now (action) in order to more accurately predict your behavior later.”

3. Should we be worried about Surveillance Capitalism?

Zuboff's answer is a massive "yes". I think all of her arguments boil down to the following two main concerns:

(1) Surveillance capitalism is anti-democratic and anti-egalitarian

The source of this anti-democratic and anti-egalitarian nature is twofold.

  • Asymmetry of knowledge: Surveillance capitalists know a lot about us but we don't know anything about how their operations work. With knowledge comes power, so an asymmetry of knowledge leads to an imbalance of power.
  • Lack of reciprocity between user and company: Recall that in the case of SCs like Google, there is no economic exchange (no price or profit) between the user (you) and the company (Google). The main economic exchange takes place between the surveillance capitalist firm and the other companies that want to make use of the data it has collected. In MC, by contrast, there is a direct exchange between the user and the company (e.g. Ford/GM). Without this feedback between user and company, the structure excludes users from the main economic exchange.

(2) Surveillance capitalism takes away our freedom/rights

Zuboff claims that SC operations are "a challenge to the right to the future tense, our ability to imagine, intend, promise, and construct a future." The idea here is that if SC operations are able to predict what we're going to do and to use its machinery to eventually modify what we're going to do, then our freedom is gradually being stripped away.