LittleLaw looks at… Data Protection

Keep your friends close, your enemies closer and keep your data in a locked safe

May 29, 2020

11 min read


What is data?

Many have hailed data as the new oil. In half a century, the internet has rapidly invaded the lives of its 4.33bn users, creating a new widespread commodity: data. According to the World Economic Forum, the planet produces 2.5 quintillion1 bytes of it every day.2

As the capacity of data analytics software has advanced, so has the profitability of collecting large quantities of data. For example, personal data (information including, but not limited to, your health records, banking providers and products, location, beliefs and personal preferences) is worth a lot to advertisers, who can use it to target a desired audience with precision.

Deriving value from data has helped to create powerful world tech leaders. Some of the most valuable companies in the world (by market value) are tech companies like Alphabet (Google’s parent company), Amazon and Facebook, three of the so-called GAFAM companies3. These companies’ business models are, at least in part, driven by the monetisation of data, which they collect and sell to various buyers.

The personal data economy

There are three types of data traders: platforms and publishers, intermediaries, and brands. Data flows between these three groups, which create value from it through targeted advertising.

[Diagram: the personal data economy. Audience, information and product flow between the three groups below; brands pay for advertising to particular audiences (banner ads, sponsored posts etc.).]

Platforms and publishers

They offer an audience to which brands may advertise content.

Intermediaries

They use personal data to help to match brands to their most effective audience/platform/publisher.

(Intermediaries use this data to suggest the best audience for advertisers)

Brands

They want to advertise to those who might want to buy their products.

(Brands pay intermediaries for personal data that can tell them more about their targeted clients)

 

Facebook and Google act as “Intermediaries”, as well as “Platforms and Publishers”, because they have both large audiences and a large amount of information about internet users. This is a potent pairing in the data economy. The fact that few companies can offer such an expansive proposition as Facebook and Google has led to their domination in this sector.

At face value, it seems fair that Google and Facebook can monetise their users’ data given that their services are free. Moreover, we are all used to adverts and most citizens in capitalist economies are quite content with the concept of advertising. Having said this, the flow of information becomes troubling when intermediaries sell the data to organisations such as political parties, security forces, law enforcers and foreign nations, in some cases dealing through third parties.

Data sharing

Cookies – the perennial enemy of internet users – are small files stored by your browser that record your online activity and information.


BRANDS (First-party cookies)

First-party cookies allow the website you are on to collect data about you.

For example, if you look at buying some shoes online, the seller (whose website you are on) can remember information you give it, such as your shopping cart and your username and password. This makes your online experience more efficient.
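The mechanics can be sketched in a few lines. Below is a minimal Python example, using the standard library’s `http.cookies` module, of the kind of `Set-Cookie` header a hypothetical shoe seller’s server might send (the domain and values are invented for illustration):

```python
from http.cookies import SimpleCookie

# A first-party cookie is set by the site you are actually visiting.
# Hypothetical shop remembering your shopping cart between visits:
cookie = SimpleCookie()
cookie["cart_id"] = "abc123"                      # identifies your cart
cookie["cart_id"]["domain"] = "shoeshop.example"  # the site you are on
cookie["cart_id"]["path"] = "/"
cookie["cart_id"]["max-age"] = 7 * 24 * 3600      # remembered for a week

# The server sends this header once; your browser then returns the
# cookie with every later request to shoeshop.example.
header = cookie.output(header="Set-Cookie:")
print(header)
```

Because the cookie’s domain matches the site you are browsing, only that site gets it back – that is what makes it “first-party”.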


INTERMEDIARIES (Third-party cookies)

Third-party cookies allow other websites that partner with the website you are on to access data about you.

This allows advertising companies to detect your online preferences (in this example, an interest in buying shoes), so they can target you with online adverts (for better shoes!).


Companies can collect a lot of information about your online activity over a period of time. This is known as tracking. The most effective trackers are social media companies like Facebook and search engines like Google. The more searches you conduct through Google, and the more Facebook pages and posts you interact with, the more information these intermediaries have about you.
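To make the cross-site side of tracking concrete, here is a toy Python sketch (not any real ad network’s code; the cookie ID and site names are invented) of how a tracker embedded on many sites can link separate visits through a single cookie ID:

```python
# Toy sketch of cross-site tracking (not any real ad network's code;
# the cookie ID and site names are invented for illustration).
tracker_log = {}  # cookie ID -> list of (site, page topic) visits seen

def embed_tracker(tracker_cookie_id, first_party_site, page_topic):
    """Runs each time a page embeds the tracker's script or pixel.
    The browser sends the tracker's third-party cookie along, so the
    tracker can file this visit under the same ID as every other one."""
    tracker_log.setdefault(tracker_cookie_id, []).append(
        (first_party_site, page_topic)
    )

# The same browser (cookie ID "u42") visits two unrelated sites:
embed_tracker("u42", "shoeshop.example", "running shoes")
embed_tracker("u42", "news.example", "sports section")

# The tracker now holds a cross-site interest profile for that browser:
profile = [topic for _site, topic in tracker_log["u42"]]
print(profile)  # ['running shoes', 'sports section']
```

Neither site knows what you did on the other; only the tracker sitting on both can assemble the combined profile, which is exactly the position intermediaries occupy.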

For brands, purchasing this data from an intermediary has therefore become an essential prerequisite to advertising their product, ensuring they reach the largest possible target audience. The result is that a few intermediaries (most prominently Facebook and Google) have become very good at knowing a lot about you, and brands are forced to bargain with them for this crucial data.

Data scandals

Two major privacy scandals alerted people to the danger of data tracking: the Snowden leaks in 2013 and the Cambridge Analytica scandal that was revealed in 2018. 

Edward Snowden infamously blew the whistle on America’s National Security Agency (NSA) and the UK’s GCHQ when he revealed they had used large tech companies’ data to carry out mass surveillance on millions of Americans, Europeans and Latin Americans. This included tapping the phones of many prominent politicians.

Cambridge Analytica, a political consultancy that advised Vote Leave in the Brexit referendum, was revealed to have illegally used Facebook data during its work for the 2016 Trump presidential campaign. The failure of the American privacy activist David Carroll’s subsequent legal case to recover his data from Cambridge Analytica highlighted the unscrupulous use of personal data and the difficulty of enforcing data rights.

Consequently, users – beyond privacy experts – increasingly began wondering, “who is using my data?” and “how is my data being used?”. 

The best legislative response: the EU’s General Data Protection Regulation (GDPR)

The EU’s GDPR, which came into force on 25 May 2018, is widely considered the gold standard for data protection. It does three key things:

  1. Allows EU citizens to order anyone in possession of their data to delete it (the right to be forgotten);
  2. Ensures that when EU citizens’ data goes abroad, it is handled at a level of protection equal to the EU’s (data adequacy for overseas transfers);
  3. Places obligations on companies that manage EU citizens’ data outside of Europe (extraterritorial power).

Many other nations are now following suit. In America, the state of California has led the way with the California Consumer Privacy Act (CCPA), which came into force on 1 January 2020. Data privacy has also been a focus for other states in recent years, including New York, Massachusetts, Texas and Washington. At the federal level, the US Congress is known to be debating and drafting privacy legislation.

The motivation for many other nations to meet EU data privacy standards is to obtain “data adequacy” (see below). This has been the case for countries such as Australia and Canada, which plan to review and improve their data protection laws. Moreover, Brazil has recently passed the Brazilian General Data Protection Law (LGPD), and India’s Personal Data Protection Bill is being scrutinised and could be on track for approval sometime next year.

Globally, half the world’s population is set to be covered by privacy regulations in line with GDPR by 2022, according to Gartner.


The right to be forgotten 

GDPR enshrines the right to be forgotten4, which allows EU citizens to ask a data controller to delete the data it holds about them. This gives EU citizens control over their data and reflects Europe’s right to privacy as enshrined in the European Convention on Human Rights.

Data adequacy for overseas data transfers

GDPR outlaws data transfers to non-EU nations that have not been granted “data adequacy”5 by the EU. Data adequacy means that a country’s level of personal data protection meets the EU’s standard, allowing transfers outside the EU.

Extraterritorial powers

GDPR gives EU citizens rights over their data “regardless of whether the [data] processing takes place in the [European] Union or not”. These extraterritorial powers6 have been put in place to ensure that large tech companies cannot dodge their legal responsibilities if they are headquartered outside the EU. This will help to combat a tactic known as “jurisdiction shopping”, where companies select the most advantageous jurisdictions for their operations. 

Large fines and more aggressive regulators

The most severe penalty is a fine of up to 4% of global annual turnover or €20 million7, whichever is higher. We have not yet seen this maximum dished out, but there have been some record-breaking figures, such as British Airways’ £183m fine by the Information Commissioner’s Office (ICO), the British regulator, covered in an earlier LittleLaw article.
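The “whichever is higher” rule is easy to misread, so here it is as a short Python sketch (the turnover figures are invented for illustration):

```python
# GDPR's maximum fine: the GREATER of €20m and 4% of global annual turnover.
def max_gdpr_fine(global_turnover_eur: float) -> float:
    return max(20_000_000, 0.04 * global_turnover_eur)

# For €100m turnover, 4% is only €4m, so the €20m floor applies:
print(max_gdpr_fine(100_000_000))    # 20000000
# For €1bn turnover, 4% (€40m) exceeds €20m:
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```

In other words, €20m acts as a floor on the maximum: small companies cannot escape a large ceiling merely by having a small turnover.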

Key cases that test GDPR

Case 1: Schrems I and II and La Quadrature du Net – testing “data adequacy”

The data privacy activist Max Schrems has been involved in data privacy lawsuits since 2011 and many consider him and Mark Zuckerberg, who is just three years Schrems’ senior, to be parallel opposites. 

In 2013, Schrems filed a complaint with the Irish Data Protection Commissioner (DPC), arguing that Facebook should not be able to transfer data from Ireland to the US. When the case was referred to the Court of Justice of the European Union (CJEU), he successfully challenged the adequacy of the existing data transfer regime. The case came to be known as Schrems I.

Data adequacy in the pre-GDPR era concerned two key legal articles:

  1. Article 8 of the Charter of Fundamental Rights – which enshrined the protection of personal data as a fundamental freedom.
  2. US-EU Safe Harbour Framework – which was the legal mechanism for transferring data between the EU and the US ensuring that the level of protection of personal data was in accordance with (1).

Schrems argued that the Safe Harbour Framework was an inadequate legal mechanism to enforce Article 8 of the Charter of Fundamental Rights. The case became a perfect storm for Schrems, as the 2013 Snowden leaks substantiated what might otherwise have been dismissed as spurious allegations by the young Austrian. In 2015, the CJEU therefore ruled in favour of the one man prepared to take on the multi-billion-dollar tech giant Facebook. The CJEU’s decision invalidated the Safe Harbour Framework.

In the aftermath of this decision, the US and the EU worked quickly, under great lobby pressure, to negotiate a replacement framework. They came up with the EU-US Privacy Shield Framework that was approved and deemed adequately compliant with EU law by the European Commission in 2016. 

Whilst Schrems was pleased that the court found that mass surveillance is illegal under Articles 7 and 8 of the Charter of Fundamental Rights of the European Union, the new Privacy Shield was the next challenge. Alongside the Privacy Shield, EU-US data transfers could rely on Standard Contractual Clauses (SCCs) to be legitimate under EU law.


Standard Contractual Clauses are legal clauses by which businesses transferring personal data into or out of the EU agree to meet European Privacy Standards.


Following the CJEU’s invalidation of the Safe Harbour principles (a commitment to protect personal data)8, Schrems is now asking the CJEU to clarify how a non-EU country’s data protection laws should be assessed and whether SCCs can “ensure a sufficient level of data protection” for European data rights.

Schrems’ case, known as Schrems II, was heard on 9 July 2019 and the CJEU’s Advocate General Henrik Saugmandsgaard Øe gave his opinion on 19 December 2019. From this, it looks likely that SCCs will not be invalidated, but businesses and regulators may have to take additional measures. This places more emphasis on data exporters to evaluate the security around the data transfer, and encourages data regulators to react quickly.

There has been no progress in this case since December, so we face a wait to see the CJEU’s final verdict. But businesses and the Privacy Shield cannot rest easy yet, because another legal challenge, from the French privacy group La Quadrature du Net, targets the Privacy Shield more directly. The question remains whether non-EU countries can be trusted to process data in line with EU laws and avoid a repeat of the Snowden leaks.

Case 2: Google v the right to be forgotten – testing extraterritorial powers

The right to be forgotten, enshrined in Article 17 of GDPR, promises global privacy for EU citizens owing to GDPR’s extraterritorial powers.

However, it was given a geographical border in the landmark judgement Google Inc. v Commission nationale de l’informatique et des libertés (CNIL) (2019). Google was contesting a €100,000 fine from the French privacy regulator CNIL after the search engine refused to remove links globally. The search engine was vindicated: the ECJ ruled that rights over personal data are not absolute, but rather rights for each society to determine.

Thus, without a legal mechanism to balance the European standard on privacy and personal data with the rest of the world, the ECJ decided the right to be forgotten must remain confined within European borders. 

The judgement by the ECJ provokes questions rather than providing clarity regarding the extraterritorial scope of European data protection. In ruling in favour of Google, the court notes that “EU law does not provide for cooperation instruments and mechanisms as regards the scope of a de-referencing outside the EU”. The ruling is a pointer for European legislators to consider such a mechanism in the future.

But even if the ECJ had ruled against Google, what would have been the reality? One week later, the ECJ defined very different legal borders for European online defamation law (derived from Directive 2000/31/EC) in a case involving Facebook. In Eva Glawischnig-Piesczek v Facebook Ireland Limited, the ECJ ruled that Facebook must search for and remove “identical” or “equivalent” illegal posts worldwide.

But there is a vital footnote to this ruling: the ECJ explained that worldwide injunctions are limited by “the framework of the relevant international law”. In other words, it will consider other nations’ values. Consequently, the effectiveness of the EU’s extraterritorial reach in this case is another battle to be fought. Either the relevant EU state accepts non-EU states’ differences in sovereign values, or it persists in pursuit of global compliance.

The crux of this debate is the point at which it becomes necessary to enforce extraterritorial liability to protect European citizens’ data. GDPR is arguably insufficiently expansive to adequately protect those rights on the global scale of the internet. However, laws with extraterritorial powers risk undermining non-EU nations’ sovereignty by setting a standard for data privacy that was not decided by the society in question.

LittleLaw’s verdict: Regulators, catch me if you can!

It is clear that protecting citizens’ data is an ongoing battle. From issues with current legislation to hopes that the world can develop some form of cohesive global protection, there is much to be done. Nevertheless, a great deal has been achieved thus far. Businesses now have reason to scrutinise their data management for fear of large fines and class action suits. Scandals have powerfully exposed the importance of ethical data handling and the threat its abuse poses to our freedoms and democracies.

And data privacy is just one aspect of the regulatory challenge surrounding new technology. Tax, intellectual property and competition law, among others, are all being forced to adapt to the rapid rise of new technology. The message to regulators is: “catch me if you can!”.

Report written by Will Holmes



Footnotes

  1. That’s 2.5 × 10^18 bytes!
  2. World Economic Forum “The value of data” (weforum.org, 22 September 2017).
  3. Google, Amazon, Facebook, Apple, and Microsoft.
  4. Chapter 3, Section 3, Article 17 “Right to erasure (‘right to be forgotten’)”, Regulation (EU) 2016/679 (GDPR).
  5. Chapter 5 “Transfers of personal data to third countries or international organisations”, Regulation (EU) 2016/679 (GDPR).
  6. Chapter 1, Article 3 “Territorial scope” and Chapter 5 “Transfers of personal data to third countries or international organisations”, Regulation (EU) 2016/679 (GDPR).
  7. Chapter 8 “Remedies, liability and penalties”, Regulation (EU) 2016/679 (GDPR).
  8. 2000/520/EC: Commission Decision of 26 July 2000 pursuant to Directive 95/46/EC of the European Parliament and of the Council on the adequacy of the protection provided by the safe harbour privacy principles and related frequently asked questions issued by the US Department of Commerce (notified under document number C(2000) 2441) (Text with EEA relevance.), Eur-lex, 26 July 2000 (see https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32000D0520).