Your Privacy Is Important to Us! – Restoring Human Dignity in Data-Driven Marketing

Foreword by Eric K. Clemons

Preface, Acknowledgements and Abbreviations

Bibliography


PART I – INTRODUCTION


1. Why this book?
    (#methodology #delimitations #structure)


2. Data-Driven Business Models
    (#surveillancecapitalism #valueextraction #harm)

PART II – LAW


3. Regulating Markets
    (#law #markets #architecture #consumerprotection)


4. Data Protection Law
    (#gdpr #personaldata #lawfulprocessing #legitimatebasis)


5. Marketing Law
    (#ucpd #professionaldiligence #averageconsumer)

PART III – PSYCHOLOGY AND TECHNOLOGY  


6. Human Decision-Making
    (#agency #psychology #boundedrationality #willpower)


7. Persuasive Technology
    (#technology #choicearchitecture #friction #prompts)


8. Manipulation
    (#coercion #deception #subliminalmarketing #paternalism)


9. Transparency
    (#information #communication #complexity #asymmetry)

PART IV – SOCIETY


10. Human Dignity and Democracy
      (#humanwellbeing #privacy #discrimination #proportionality)


PART V – CONCLUSIONS AND BEYOND


11. Conclusions
      (#humandignity #datadrivenmarketing #beinghuman)


12. Next Steps
      (#action #conversations #future)

CHAPTER TWO

Data-Driven Business Models

#surveillancecapitalism  #valueextraction  #harm

Data-driven business models come in many shapes and forms, and are characterised by having the processing of data as a core element. In this context we focus on data-driven business models that rely on data for creating revenue in the guise of data-driven marketing. Marketing is the broader concept that includes ‘advertising’, as discussed in Chapter 5 (marketing law).

In addition to being dependent on the processing of (large quantities of) personal data, data-driven business models offered to consumers often rely on personalisation and the extraction of value from users.1 Often, data-driven business models comprise two separable products, where at least one product is provided free of charge.

These business models exist in both online and offline environments and usually rely on the monetisation of (1) surveillance, (2) attention and (3) behaviour modification by means of personalised marketing, including targeted advertising. The surveillance, attention and behaviour modification capabilities can be made available to third parties, including for both commercial and political purposes.

Loyalty and rewards programs are examples of data-driven business models that are offered both offline and online. This ‘service’ is ancillary to the trader’s selling of products, including goods (e.g. grocery shops) and services (e.g. airlines). The service may consist of providing discounts, bonuses and other benefits to the user. In essence, the program is a marketing technique that connects the consumer more closely to the trader, who may increase revenue and profits by better understanding their consumers—including their behaviour—and offering personalised marketing.

Social media services and other platforms are typical online examples of data-driven business models (the bazaar is an example of an offline platform that is not data-driven). Platforms may monetise attention and data in many ways. Here we focus on revenue from personalised marketing, including targeted advertising. The ancillary product is offered to third parties (advertisers), but reasonable arguments can be made that the targeted advertising is the primary product, i.e. that the users (consumers) are the real product being sold to advertisers.

1.1. Personalisation and surveillance

Personalisation is at the core of many data-driven business models, either because personalisation is required for delivering the primary product, or because it is a ‘convenient’ reason to justify the collection of personal data for the ancillary product, such as personalised marketing.

For instance, loyalty/rewards programs and social media services justify the collection and use of personal data, including the surveillance of behaviour, for one purpose; those data may subsequently be used for personalised marketing.2 A decision to personalise the primary product may thus be driven by a goal of increasing revenue from ancillary services, including data-driven marketing, often framed as ‘enhancing the product’ or ‘improving the user experience’.

1.2. Value extraction and behaviour modification

The idea of monetising attention is far from new,3 and the notion of an attention economy was introduced in the 1970s by Herbert A. Simon:4

‘in an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.’

Traders may monetise the attention of their users by displaying advertising. Advertising and other marketing techniques may be used to further commercial (products), political (parties/interests) and societal (e.g. safety) aims. Advertising constitutes a subset of the behaviour modification techniques encompassed by the broader concept of ‘marketing’.

The value of each user is directly proportional to the amount of exposure. Advertising may be ‘targeted’ by means of personal data, or ‘contextual’ by making assumptions (possibly informed by data, personal or not) based on context. Advertising saddles in a forum for horse riders is an example of contextual advertising.

Advertising is valuable to the advertiser because it may influence the behaviour of the targeted users. The value of an advertisement may be increased by having more detailed information about the user: advertising saddles to horse riders (outside forums for horse riders) is likely to be more effective than targeting motorbike riders.
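
To make the distinction concrete, the following minimal Python sketch contrasts the two approaches: contextual selection looks only at the page being viewed, while targeted selection draws on a profile built from personal data. All names, ads and data structures are invented for illustration and do not reflect any real ad platform:

```python
# Minimal sketch contrasting contextual and targeted ad selection.
# All names (Page, User, ADS) are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class Page:
    topic: str                                    # what the page is about

@dataclass
class User:
    interests: set = field(default_factory=set)   # inferred from personal data

ADS = {
    "saddles": "Ad: hand-crafted horse saddles",
    "helmets": "Ad: motorbike helmets",
}

def contextual_ad(page):
    # Contextual: relevance inferred from the page alone, no personal data.
    return ADS.get(page.topic, "Ad: generic brand campaign")

def targeted_ad(user):
    # Targeted: relevance inferred from accumulated data about the user.
    for interest in user.interests:
        if interest in ADS:
            return ADS[interest]
    return "Ad: generic brand campaign"

print(contextual_ad(Page(topic="saddles")))                  # forum for horse riders
print(targeted_ad(User(interests={"saddles", "dressage"})))  # profiled horse rider
```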

The trader has an economic incentive to increase (1) the number of users, (2) their engagement (amount and nature of attention) and (3) knowledge about individual users (personal data). As we will discuss later, personal data coupled with insights from psychology and technology can be used to (a) increase the value of user experiences (through influence) and (b) increase attention and engagement, including by means of creating addictive behaviours.

We use the terms ‘behaviour modification’5 and ‘influence’ interchangeably and understand them to comprise both persuasion and manipulation. We use persuasion as the lawful form of influencing behaviour, and manipulation, which includes deception and coercion, as unlawful influencing.6 The distinction is normative and not always easy to establish, as we discuss in Chapter 5 (marketing law).

Other types of value extraction include benefiting from the users’ content creation, relationships, anxieties, psychological needs, etc. Some of these aspects are dealt with in Chapter 10 (human dignity and democracy).

1.3. ‘Free’

Often, data-driven business models rely on products that are offered free of charge, i.e. without the consumer having to make a ‘payment’. This makes it more likely that consumers will sign up for something like a loyalty program or a social media service. In Chapter 7 (persuasive technology), we introduce the concept of ‘friction’, which is removed when the user does not have to make a payment.

Some loyalty programs take the shape of a fee-based subscription service—like Amazon Prime—but nevertheless still rely on surveillance and behaviour modification. Other service providers offer paid-for commercial-free versions of their service.

2. Providing access to ourselves . . . and others

We leave traces when we act in social or societal contexts, whether online or offline. The most significant attributes of the traces we leave when technology is involved are the number of data points and the (often unlimited) storage thereof. In addition to the scale, technology also allows for observing behaviour in the private sphere, as we discuss further in Chapter 7 (persuasive technology) and Chapter 10 (human dignity and democracy).

In contrast to traditional landline telephones, mobile phones are usually personal (one user per phone).7 With the advent of ‘smart’ (phones, watches, health devices, televisions, homes, cars, etc.), information technology has turned into a massive web of surveillance, tracking and recording devices. The use of cookies and other tracking techniques on the internet has also allowed for the tracking of individuals across services, platforms, technologies, etc.8
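
The mechanism is simple enough to sketch. In the following minimal Python simulation (all names invented; real trackers are considerably more sophisticated), a tracker embedded on several unrelated sites sets a persistent identifier in the browser on first contact and recognises it on every later visit, stitching the visits into one cross-site profile:

```python
# Minimal sketch of cross-site tracking via a third-party identifier.
import uuid
from collections import defaultdict

class Tracker:
    def __init__(self):
        self.profiles = defaultdict(list)      # identifier -> list of visits

    def on_page_load(self, browser_cookies, site, page):
        # Set a persistent identifier on first contact, reuse it afterwards.
        uid = browser_cookies.setdefault("tracker_uid", str(uuid.uuid4()))
        self.profiles[uid].append((site, page))

tracker = Tracker()
cookies = {}                                   # one user's browser cookie jar
tracker.on_page_load(cookies, "horseforum.example", "/saddles")
tracker.on_page_load(cookies, "news.example", "/economy")
tracker.on_page_load(cookies, "shop.example", "/riding-boots")

# The same identifier now links behaviour across unrelated sites.
print(tracker.profiles[cookies["tracker_uid"]])
```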

By nudging users to access services through apps rather than, e.g., a web browser, the trader may obtain more data and gain more control over the user experience and, possibly, circumvent privacy measures implemented in the browser of the privacy-aware user.

Personal data may be deliberately revealed by the user or derived from behaviour, including interactions with other people. By means of algorithms, even simple data points may reveal much, as expressed by Jamie Bartlett:

‘[algorithms] can take your music preferences or your book preferences and extract from this seemingly innocent information very accurate predictions about your religiosity, leadership potential, political views, personality and so on.’9
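
As a toy illustration of the kind of inference Bartlett describes, a standard classifier can be trained to predict an unrelated trait from ‘innocent’ preferences. The sketch below uses scikit-learn with fabricated data; real systems rely on vastly more data points and far richer models:

```python
# Toy inference of a trait from 'innocent' preferences (fabricated data).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each row: a user's stated music preferences; label: a survey answer,
# e.g. self-reported religiosity. All values are invented.
preferences = ["gospel country folk", "metal electronic punk",
               "gospel soul country", "techno punk metal"]
trait = ["high", "low", "high", "low"]

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(preferences, trait)

# Predict the trait for an unseen user from preferences alone.
print(model.predict(["country gospel blues"]))   # likely 'high'
```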

Whenever we use information technology, we inevitably leave information that is necessary for the communication. Especially when using apps, however, we often provide a plethora of information that is not necessary for the communication. For illustration, when installing Facebook’s Messenger App on an iPhone, one can click through to the following explanation of data used for particular purposes (‘depending on feature use and age’):

Third-Party Advertising

  • Purchases (purchase history)

  • Financial Info (Other Financial Info)

  • Location (Precise location, Coarse Location)

  • Contact Info (Physical Address, Email Address, Name, Phone Number, Other User Contact Info)

  • Contacts (Contacts)

  • User Content (Photos or Videos, Gameplay Content, Other User Content)

  • Search History (Search History)

  • Browsing History (Browsing History)

  • Identifiers (User ID, Device ID)

  • Usage Data (Product Interaction, Advertising Data, Other Usage Data)

  • Diagnostics (Crash Data, Performance Data, Other Diagnostic Data)

  • Other Data (Other Data Types)

Developer’s [Facebook, Inc.] Advertising or Marketing, Product Personalisation, Other Purposes

[Same as under Third-Party Advertising]

Analytics [different from Third-Party Advertising in italics]

  • Health & Fitness (Health, Fitness)

  • Financial Info (Payment Info, Other Financial Info)

  • User Content (Photos or Videos, Audio Data, Gameplay Content, Customer Support, Other User Content)

  • Sensitive Info (Sensitive Info)

App Functionality [different from Third-Party Advertising in italics]

  • Health & Fitness (Health, Fitness)

  • Financial Info (Payment Info, Credit Info, Other Financial Info)

  • User Content (Email or Text Messages, Photos or Videos, Audio Data, Gameplay Content, Customer Support, Other User Content)

  • Sensitive Info (Sensitive Info)

By providing access to a number of features, including contacts, the user indirectly gives access to information about third parties who may not be using the services in question.10 Access to information about others can also be derived from what we share about others via platforms and other services.

We may also learn about how traders process data from their announcements. For instance, in 2017 Google ceased its practice of scanning Gmail messages to sell targeted advertisements.11 And Facebook announced in 2021 that, as of 19 January 2022, it will remove ‘Detailed Targeting options that relate to topics people may perceive as sensitive, such as options referencing causes, organizations, or public figures that relate to health, race or ethnicity, political affiliation, religion, or sexual orientation.’12

On 9 September 2021 Ray-Ban and Facebook launched smart sunglasses (Ray-Ban Stories), resembling the abandoned Google Glass from 2011, that ‘give you an authentic way to capture photos and video, share your adventures, and listen to music or take phone calls—so you can stay present with friends, family, and the world around you.’13 In the launch video, Mark Zuckerberg emphasises that ‘Ray-Ban Stories are an important step toward a future when phones are no longer a central part of our lives and you won’t have to choose between interacting with a device or interacting with the world around you.’ It is argued that the glasses will ‘let people live in the moment’ and ‘make people’s life better’ [by recording and sharing more data].

On 20 April 2021 Apple announced its AppTrackingTransparency feature, which allows users to choose what third-party tracking they will tolerate.14 This may be a (long-awaited) first step for markets to limit the current pervasive surveillance for marketing purposes. The fix applies only to Apple products (iOS), which hold roughly a quarter of the market; Google’s Android platform accounts for almost three quarters of the market for smartphone operating systems.15

2.1. Online and offline

Traders may make efforts to link activities in the online and offline worlds, e.g. by offering traceable discounts, using phone numbers as identifiers and/or involving smartphones in the offline ‘purchase experience’. These efforts may also be disguised as benefits for registering products (extra batteries, extended warranties, etc.) or joining a loyalty/rewards program, and as encouragement to share ‘purchase experiences’ with the trader, friends or other people.

3. Artificial Intelligence (AI)

We use the term ‘Artificial Intelligence’ to include the use of big data, algorithms, machine learning and deep learning. AI systems rely on computer programs that can handle large quantities of data and are designed to achieve (optimise for) particular objectives.

AI systems are designed to interact with their surroundings, including human beings, by receiving and delivering signals. Data is the raw material used by AI systems to achieve their objectives. With the digitisation of our societies, computers have access to vast amounts of data—in terms of both scale and scope—that may be useful for behaviour modification.16

Correlations can be identified through pattern recognition to develop strategies, including in the guise of algorithms, which can be applied, tested and refined in interaction with their surroundings. Causation is important to human intelligence and can be applied to the design of AI systems by humans. AI systems can ‘learn’ from patterns in observations and from feedback on their actions. AI allows for pattern recognition beyond human perception,17 and therefore also includes an element of unpredictability.

Breakthroughs in AI have come from cheap parallel computation, big data and better algorithms,18 and AI systems are often sufficiently powerful to allow for real-time human–computer interaction, including by means of adaptive algorithms. Even though processing power does not equal intelligence,19 it does make it possible to build systems that may pass the ‘Turing test’, i.e. exhibit behaviour indistinguishable from that of a human.
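
A minimal example of such an adaptive algorithm is an epsilon-greedy strategy that learns, purely from feedback, which content variant maximises engagement. The variants and click probabilities below are invented; the actual optimisation systems of real platforms are not public:

```python
# Epsilon-greedy sketch of an adaptive algorithm optimising engagement.
import random

VARIANTS = {"calm_post": 0.05, "outrage_post": 0.12}  # hidden click-through rates
clicks = {v: 0 for v in VARIANTS}
shows = {v: 0 for v in VARIANTS}

def choose(epsilon=0.1):
    # Mostly exploit the best-performing variant, sometimes explore.
    if random.random() < epsilon or not any(shows.values()):
        return random.choice(list(VARIANTS))
    return max(VARIANTS, key=lambda v: clicks[v] / max(shows[v], 1))

for _ in range(10_000):                         # simulated user impressions
    v = choose()
    shows[v] += 1
    clicks[v] += random.random() < VARIANTS[v]  # simulated user feedback

# The system 'learns' to favour whatever maximises engagement.
print({v: round(shows[v] / 10_000, 2) for v in VARIANTS})
```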

In data-driven marketing, AI systems can be used to increase the effectiveness of marketing and to allow targeted advertising based on ‘how [consumers] will behave, what they will buy, and what they will think.’20 Facebook may, for instance, predict future behaviour to allow advertisers to ‘target people on the basis of decisions they haven’t even made yet’, including by means of so-called loyalty prediction to identify people ‘who are “at risk” of jumping ship from one brand to a competitor’.21
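
Such a loyalty-prediction system is, schematically, a churn classifier: it scores customers by their estimated risk of switching so that ‘at risk’ individuals can be targeted pre-emptively. The features and data in this sketch are invented, as the cited Facebook system is not public:

```python
# Schematic 'loyalty prediction' as churn classification (invented data).
from sklearn.ensemble import RandomForestClassifier

# Features per customer: [purchases last month, days since last visit,
#                         competitor page views]
X = [[8, 2, 0], [1, 40, 9], [6, 5, 1], [0, 60, 14], [7, 3, 2], [2, 30, 7]]
y = [0, 1, 0, 1, 0, 1]          # 1 = switched to a competitor (churned)

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Score current customers; high probabilities flag 'at risk' targets.
current = [[5, 4, 1], [1, 45, 10]]
print(model.predict_proba(current)[:, 1])   # churn risk per customer
```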

3.1. A proxy for understanding people

The overarching idea behind data-driven marketing is that models are trained to recognise and draw inferences from information and behaviour that, for instance, may reveal that riders of a particular motorbike brand are actually likely to buy horse saddles. The information may be used to predict not only interests and preferences but also how best to persuade individual consumers, including by building a ‘persuasion profile’, as we discuss in Chapter 7 (persuasive technology).

From big data analyses of personal data, it is possible to obtain knowledge about how people are likely to behave; thus providing traders with more precise information regarding certain aspects of the individual’s preferences or expected behaviour. Such analyses may reveal correlations that are not likely to be discovered through mere logic. Actual causations are not necessarily important as long as the correlations in probabilistic terms provide sufficient insights to influence individuals.

In the 1960s, privacy concerns also revolved around the use of personality tests and other means to invade ‘psychological privacy’.22 It seems fair to assume that the current framework of pervasive surveillance is akin to an automated, real-time personality test of unprecedented scope and scale. The surveillance is likely to expand with new innovative technologies to record and share experiences (e.g. Ray-Ban Stories), measure biological states (smart health devices) and even enable brain-to-text communication,23 which we may voluntarily adopt to improve skills and personal efficiency.

There is likely a need for more popular conversations about these issues, as discussed in Chapter 12 (next steps). The following analyses aim to suggest how marketing law and data protection law can be interpreted in light of fundamental rights to mitigate harms from predatory data-driven business models.

4. Harms from attention extraction and behaviour modification

The Center for Humane Technology has identified and published the following harms from the ‘extractive attention economy’ relying on persuasive technology:24

  1. Digital Addiction: Digital slot machines occupy more and more space in our lives.

  2. Mental Health: We constantly face a battle for our attention, social comparison, and bullying.

  3. Breakdown of Truth: It’s become harder than ever to separate fact from fiction.

  4. Polarization: Stronger ideological rifts make compromise and cooperation more difficult.

  5. Political Manipulation: Creating discord through cyberwarfare is far more cost-effective than military action.

  6. Superficiality: A social system built on likes and shares prioritizes shallowness over depth.

We will revisit these issues in Chapter 10 (human dignity and democracy) and Chapter 12 (next steps). In this vein, it is apt to quote the EDPS:25

‘[…] the Digital Single Market cannot uncritically import the data-driven technologies and business models […]. The internet has evolved in a way that surveillance—tracking people’s behaviour—is considered as the indispensable revenue model for some of the most successful companies. This development calls for critical assessment and search for other options.’

The EDPS is the European Union’s independent data protection authority. Its general mission includes advising EU institutions and bodies on all matters relating to the processing of personal data, including being consulted by the European Commission on proposals for legislation etc., as well as intervening before the CJEU to provide expert advice on interpreting data protection law.26

The importance of the regulation of data-driven business models can also be illustrated by the U.S. Federal Trade Commission settlement with Facebook resolving charges that Facebook deceived consumers by telling them they could keep their information on Facebook private, and then repeatedly allowing it to be shared and made public.27 In July 2019, a violation of this settlement led to a $5 billion penalty and ‘significant requirements to boost accountability and transparency’.28

More recently, the European Parliament has, in the context of the proposed Digital Services Act, emphasised the need to ensure that ‘what is illegal offline is also illegal online’, and encouraged stricter rules on ranking, non-discrimination, understandable explanation of algorithms, and targeted advertising. The latter should favour ‘less intrusive forms of advertising that do not require extensive tracking of user interaction with content’.29

In a society where ‘smart’ equals surveillance, the area of privacy law is facing a renaissance akin to that of intellectual property rights a few decades ago, when the digitalisation of products (music, movies, etc.) took off. This time, it is citizens—including their behaviours and experiences—that are being digitalised, which calls for consideration of human dignity among other fundamental rights as explored in Chapter 10 (human dignity and democracy).

In that chapter, we also take into account Frances Haugen’s revelations documented by The Wall Street Journal in their ‘Facebook Files’.30 This whistleblowing is a commercial equivalent to Edward Snowden’s revelations of government surveillance.31 It may also be compared to revelations about how the tobacco and oil industries have knowingly prioritised profits over public safety by suppressing and disputing evidence of harm.


1. See, e.g., Shoshana Zuboff, The Age of Surveillance Capitalism (Profile Books 2019) emphasising that value is extracted through surveillance and manipulation of users.

2. Joseph Turow, The Aisles Have Eyes (Yale University Press 2017), p. 22. See also Joseph Turow, The Daily You (Yale University Press 2011).

3. See Tim Wu, The Attention Merchants (Alfred A. Knopf 2017).

4. Herbert A. Simon, ‘Designing Organizations for an Information-Rich World’, in Martin Greenberger (ed.), Computers, Communications, and the Public Interest (The Johns Hopkins Press 1971), pp. 40–41.

5. See also Albert Bandura, Principles of behavior modification (Holt, Rinehart and Winston 1969).

6. See also Douglas Rushkoff, Coercion (Riverhead 1999), p. 270; and Allen W. Wood, The Free Development of Each (Oxford University Press 2014), chapter 12 (coercion, manipulation and exploitation).

7. See also Joseph Turow, The Aisles Have Eyes (Yale University Press 2017), p. 101.

8. Chris Jay Hoofnagle, Ashkan Soltani, Nathaniel Good, Dietrich J. Wambach & Mika D. Ayenson, ‘Behavioral Advertising: The Offer You Cannot Refuse’, Harvard Law & Policy Review, 2012, pp. 273–296.

9. Jamie Bartlett, The People Vs Tech (Ebury Press 2018), p. 23.

10. See also EDPB, ‘Binding decision 1/2021 on the dispute arisen on the draft decision of the Irish Supervisory Authority regarding WhatsApp Ireland under Article 65(1)(a) GDPR’ further discussed in Chapter 4 (data protection law).

11. Google Blog, ‘As G Suite gains traction in the enterprise, G Suite’s Gmail and consumer Gmail to more closely align’, 23 June 2017: ‘G Suite’s Gmail is already not used as input for ads personalization, and Google has decided to follow suit later this year in our free consumer Gmail service.’

12. Graham Mudd, ‘Removing Certain Ad Targeting Options and Expanding Our Ad Controls’, Facebook for Business News, 9 November 2021, <https://www.facebook.com/business/news/removing-certain-ad-targeting-options-and-expanding-our-ad-controls>: Emphasising that the interest targeting options are not based on people’s physical characteristics or personal attributes, but on ‘things like people’s interactions with content on our platform’ (emphasis added).

13. Ray-Ban and Facebook introduce Ray-Ban Stories, first-generation smart glasses, <https://tech.fb.com/ray-ban-and-facebook-introduce-ray-ban-stories-first-generation-smart-glasses/>.

14. <https://developer.apple.com/news/?id=ecvrtzt2>. See also Gennie Gebhart & Bennett Cyphers, ‘Apple’s AppTrackingTransparency is Upending Mobile Phone Tracking’, Electronic Frontier Foundation, 27 April 2021, <https://www.eff.org/deeplinks/2021/04/apples-apptrackingtransparency-upending-mobile-phone-tracking>.

15. <https://www.statista.com> (accessed October 2021).

16. See also Council of Europe, ‘Guidelines on Artificial Intelligence and Data Protection’, 25 January 2019.

17. See also Yuval Noah Harari, Homo Deus (Harper 2017), pp. 322–323: ‘Humans have two types of abilities: physical and cognitive. […] The idea that humans will always have a unique ability beyond the reach of non-conscious algorithms is just wishful thinking.’

18. Kevin Kelly, The Inevitable (Viking 2016), pp. 38–40.

19. Rutger Bregman, Utopia for Realists (Bloomsbury 2017, first published 2014), p. 190.

20. Sam Biddle, ‘Facebook Uses Artificial Intelligence to Predict Your Future Actions for Advertisers, Says Confidential Document’, The Intercept, 13 April 2018, <https://theintercept.com/2018/04/13/facebook-advertising-data-artificial-intelligence-ai/>.

21. Ibid.

22. See, e.g., Alan F. Westin, Privacy and Freedom (Atheneum 1967) and Sarah E. Igo, The Known Citizen (Harvard 2018).

23. F.R. Willett, D.T. Avansino, L.R. Hochberg et al., ‘High-performance brain-to-text communication via handwriting’, Nature 593, 2021, pp. 249–254. <https://doi.org/10.1038/s41586-021-03506-2>.

24. <https://humanetech.com/problem/> (visited December 2019). Now (November 2021) replaced with a similarly informative ‘ledger of harms’.

25. EDPS, ‘Opinion 7/2015: Meeting the challenges of big data: a call for transparency, user control, data protection by design and accountability’.

26. <https://edps.europa.eu/>. Note that this authority is different from the EDPB introduced in Chapter 4 (data protection law).

27. <https://www.ftc.gov/news-events/press-releases/2012/08/ftc-approves-final-settlement-facebook>.

28. <https://www.ftc.gov/news-events/press-releases/2019/07/ftc-imposes-5-billion-penalty-sweeping-new-privacy-restrictions>.

29. European Parliament, ‘MEPs spell out their priorities for the Digital Services Act’, press release, 28 September 2020.

30. <https://www.wsj.com/facebookfiles>.

31. See Glenn Greenwald, No Place to Hide (Metropolitan Books 2014).