Surveillance capitalism: Three perspectives to consider

The technology-evangelists’ goal is to provide products and services that are so compelling, easy to access, and intuitive to use that we can’t help but adopt them.

14 SEPTEMBER 2021 · 13:00 CET

Photo: A. Swacnar, Unsplash, CC0.

(Part 1 of this paper was published last week. Part 3 will be published next week)

 

Three perspectives to consider

 

1 Delusions of purely technological hope

Beguiled by utopian dreams

Technology innovation reflects the values and ideals of its makers. Guy Brandon, in his book Digitally Remastered,[1] writes: ‘a technology like a social media platform is implicitly the expression of the spiritual values of its creators and users.’ What are the ideas, values, and aspirations of the authors of Big Tech, from Jeff Bezos and Steve Jobs to Larry Page and Mark Zuckerberg?

Eric Schmidt, the former CEO and executive chairman of Google, has remarked: ‘Our goal is to change the world…and monetization is a technology to pay for it.’[2] His comments resonate with the quasi-theological visions of the biggest technology companies: Facebook wants to ‘bring the world closer together’, Amazon offers ‘Everything from A to Z’, and Google aims to ‘change the world’. Data is the unit of currency in this new world, fuelling the informational economy with real-time insights, actions, and preferences.

The technology-evangelists’ goal is to provide products and services that are so compelling, easy to access, and intuitive to use that we can’t help but adopt them. They want to offer a form of frictionless living, enabled by their products, and built around their digital architecture, that encourages us to use their services frequently, while enabling them to harvest our data. That data provides insight into human living, which in turn provides the means to exert influence over our lives. Yet these same leaders conflate a form of technicism (including the inevitability of technological advance) with consumerism. Monetisation is no longer a technology to support techno-utopian goals, it has become the goal, driving adoption, revenue, and profit.

The deception of democratised (digital) relationships

We are told that social media ‘give[s] people the power to build community’[3], but research repeatedly demonstrates that social media rapidly and permanently polarises users[4]. Instagram and Facebook encourage parents to create ‘managed’ accounts for children younger than thirteen[5], but industry executives (including former Facebook executives) fiercely shield their children from social media[6]. Research into the harms of social media, particularly on young adults, is reflected in the sites’ own FAQs addressing abuse and eating disorders[7]. Despite this, these firms continue to harvest trillions of behavioural observations from billions of users every day. How do the big technology companies manage to preserve the simulacrum of relationships while being so anti-relational?

Monetisation is no longer a technology to support techno-utopian goals, it has become the goal, driving adoption, revenue, and profit.

The design of social media platforms intentionally redefines common relational paradigms. Facebook transformed ‘friend’ from a noun to a verb: you now ‘friend’ (or ‘unfriend’) someone to open access to curated personal information. To maximise my friending ability, Facebook collates my friends’ information, so I don’t have to digitally ‘go’ anywhere to participate in the relationship. Twitter and Instagram go further, abandoning the pretence of symmetry by defining relationships in terms of ‘followers’, encouraging asymmetry and voyeurism. Unlike symmetric, two-way conversations, social media relationships involve one party posting an artefact (e.g. an image) and a multitude of public recipients asynchronously reacting (e.g. a comment or ‘like’). This is the infrastructure that supports growing addiction.

 

2 The hook model and addiction

Guy Brandon writes: ‘The spiritual danger posed by social media [is] that it almost subconsciously takes precedence over everything else in our lives’.[8]

How is this subliminal addiction achieved with so little resistance? This is the glory and shame of the hook model: having convinced us to accept digital relationships, the platforms now mediate those relationships through designs scientifically tuned to maximise our engagement. Instagram does this by leveraging our strong visual bias. It launched in 2010 as a ‘fast, beautiful, and fun’ way to ‘capture and share the world’s moments’[9] and currently has 1.1 billion users[10]. The basic premise is that you post a picture or video that is pushed to your followers, who get an alert and are encouraged to open the app. Your followers can respond with comments or ‘likes’, which are tracked and prominently displayed. As anyone who has written a letter to a newspaper editor or commented on an online article knows, there is a strong temptation to return and see how people have responded. Instagram makes that feedback nearly instantaneous and gives users almost unlimited capacity to post, generating the temptation to see responses, gratifying it immediately, and inviting the user to post again. All the elements of the hook model are here: trigger, action, variable reward, and further investment.

Photo: F. Chamaki, Unsplash, CC0.

Brandon calls this ‘sensitising the mind to distraction’ and warns that ‘this distractibility compromises our humanity’[11]. This addiction forms early: in the West, the average two-to-four-year-old spends 48 minutes per day on a digital device[12]. Over 70 per cent of all children and nearly 90 per cent of adolescents in the US sleep with a digital device connected to social media, which demonstrably reduces sleep quality and correlates with rises in depression[13]. This particularly affects young girls: the incidence of depression among 13–18-year-olds increased 65 per cent between 2010 and 2017 after decades of decline, directly corresponding with the availability of social media sites like Facebook and Instagram to this age group[14]. Adults are not exempt, though their addiction tends to play out as polarisation[15] and marked declines in the ability to empathise[16].

Beyond these immediate impacts, addiction to distraction opens the door to exploitation.

3 The costs of exploitation

At this point, we ask, ‘Why should I care? I get excellent, helpful services for free, and I never click the adverts. This seems like a good trade.’

In the film The Matrix (The Wachowskis, 1999), humans are crops from which energy is harvested, and the ‘crop’ is kept productive and more or less content inside a dream-like simulation. Most people find this a poor trade, dream-like contentment notwithstanding. Why? What has been taken from us?

The more data they collect, the more refined our avatar, the more accurately they can test and select inputs to manage our responses.

Surveillance capitalists are likewise interested in keeping us ‘content’ and connected so that they can harvest information. Specifically, they observe and record millions of our behaviours and responses to (strategically) varied inputs to create something like an avatar (a virtual representation) of each of us that mimics our responses to given inputs. Creating accurate avatars requires maximising our engagement with the company’s own platforms or with its advertising networks, which uniquely identify and track us around the web even if we don’t hold accounts with those companies.[17]

The more data they collect, the more refined our avatar, the more accurately they can test and select inputs to manage our responses. As an example, and as a piece of self-advertisement, Facebook published studies demonstrating their ability to selectively manipulate voter turnout[18] and user emotions[19] through messages and promoted content.

‘Advertising works by creating patterns of associations…through “low attention processing”.’ As discussed above, social media is designed for distraction, the ‘undirected mental state where images, music, and emotional responses pass into long-term memory without conscious learning.’[20] When these inputs have been refined against my avatar, what chance do I stand against the well-financed effort to nudge my behaviours, emotions, and beliefs in one direction or another? If we insist on defining what is being stolen, we might not be too far off the mark if we point to self-determination, intellectual freedom, choice, and, eventually, responsibility.

Part 3 will be published next week.

 

Jonathan Ebsworth has spent his career working with Information Technology. He has recently established a small consulting practice focused on human-centred technology innovation and co-founded www.TechHuman.org, a website aimed at offering insights into how we can live well in a digitally-dominated world.

Samuel Johns writes on identity, immediacy, and technology in the late-modern world, with a particular interest in human personhood. He studied at the University of Oxford before pursuing a Master of Arts at UBC, Vancouver, on the philosopher Charles Taylor's work The Malaise of Modernity.

Michael Dodson is a PhD candidate at the University of Cambridge, studying Computer Science. His research focuses on security and privacy where digital equipment meets the physical world, such as in water and power utilities, medical devices, and automotive applications.

This paper was first published on the website of the Jubilee Centre and re-published with permission.

 

Notes

1. Guy Brandon, Digitally Remastered: a Biblical Guide to Reclaiming Your Virtual Self (Edinburgh: Muddy Pearl, 2016), p.2.

2. Janet Lowe, Google Speaks: Secrets of the World’s Greatest Billionaire Entrepreneurs, Sergey Brin and Larry Page (Chichester, England: John Wiley & Sons, 2009).

3. Facebook Investor Relations, FAQ (accessed 10 May 2021).

4. Anne Ford, ‘The Surprising Speed with Which We Become Polarized Online’, Kellogg Insight, 6 April 2017 (accessed 10 May 2021).

5. Instagram, Terms of Use (accessed 10 May 2021); Messenger Kids (accessed 10 May 2021).

6. Zameena Mejia, ‘Apple CEO Tim Cook: Don’t let your kids use social media’, CNBC Make It, 23 January 2018 (accessed 10 May 2021).

7. Instagram, Terms of Use (accessed 10 May 2021).

8. Guy Brandon, op. cit., pp.5–6.

9. Mike Isaac, ‘Instagram takes a page from Snapchat, and takes aim at it, too’, The New York Times, 2 August 2016 (accessed 15 February 2021).

10. H. Tankovska, ‘Distribution of Instagram users worldwide as of January 2021, by age group’, Statista, 10 February 2021 (accessed 15 March 2021).

11. Guy Brandon, op. cit.

12. ‘The Common Sense Census: Media Use by Kids Age Zero to Eight, 2017’, Common Sense Media (accessed 10 May 2021).

13. B. Carter, P. Rees, L. Hale, D. Bhattacharjee, M. S. Paradkar, ‘Association Between Portable Screen-Based Media Device Access or Use and Sleep Outcomes: A Systematic Review and Meta-analysis’, JAMA Pediatr., 170, no. 12 (2016), 1202–1208.

14. Jean M. Twenge, Thomas E. Joiner, Megan L. Rogers, and Gabrielle N. Martin, ‘Increases in Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among U.S. Adolescents After 2010 and Links to Increased New Media Screen Time’, Clinical Psychological Science, 6, no. 1 (January 2018), 3–17; ‘Corrigendum’, Clinical Psychological Science, 7, no. 2 (March 2019), 397. The nature of this causation is disputed by other researchers, such as the Oxford Internet Institute: see for example Matti Vuorre, Amy Orben, and Andrew K. Przybylski, ‘There Is No Evidence That Associations Between Adolescents’ Digital Technology Engagement and Mental Health Problems Have Increased’, Clinical Psychological Science (May 2021).

15. Anne Ford, ‘The Surprising Speed with Which We Become Polarized Online’, Kellogg Insight, 6 April 2017 (accessed 10 May 2021).

16. Bernd Lachmann et al., ‘The role of empathy and life satisfaction in internet and smartphone use disorder,’ Frontiers in Psychology, 9 (2018), 398.

17. Gennie Gebhart, ‘Facebook, This Is Not What “Complete User Control” Looks Like’, Electronic Frontier Foundation, 11 April 2018 (accessed 10 May 2021).

18. Robert Bond, Christopher J. Fariss, Jason J. Jones, Adam D. I. Kramer, Cameron Marlow, Jaime Settle, ‘A 61-million-person experiment in social influence and political mobilization’, 13 September 2012 (accessed 10 February 2021).

19. Charles Arthur, ‘Facebook emotion study breached ethical guidelines, researchers say’, Guardian, 30 June 2014 (accessed 10 February 2021).

20. Paul Feldwick, ‘How does advertising work?’, Advertising Association (accessed 10 May 2021).

Published in: Evangelical Focus - Jubilee Centre - Surveillance capitalism: Three perspectives to consider