Computers, Privacy & the Constitution

How the GDPR’s Failure is a Feature, Not a Bug

-- By AvrahamTsikhanovski - 21 July 2024

Introduction

When the European Union adopted the General Data Protection Regulation in 2016, many celebrated it as a milestone in information privacy and human rights, calling it the “world’s strongest set of data protection rules.” There was seemingly a lot to celebrate, as the regulation imposed strict-looking obligations on entities processing the personal data of individuals within the European Union. These obligations included storage limitation (keeping personal data no longer than necessary), confidentiality (restricting access to data to those who need it for processing), and data minimization (collecting and keeping only the data needed to provide a particular service), among others. The GDPR also threatened harsh penalties for anyone violating its terms. The regulation’s perceived significance was apparent even before it took effect: in the two-year period between its adoption and its implementation, companies that fell within its scope complained heavily about the burdens of compliance and about the harsh penalties awaiting violators.

Before long, other governments followed the EU’s lead and passed regulations that either copied or closely resembled the GDPR; examples include Turkey, the United Kingdom, and the State of California. In the last few years, other U.S. states, such as Colorado, Virginia, and Utah, have passed laws similar to the GDPR or to its California counterpart, the California Consumer Privacy Act.

As is often the case when individual states regulate a particular sector, the conversation about federal intervention has reignited. Proponents of stronger privacy laws in the United States argue that the federal government should pass its own privacy legislation, which would protect more people and create nationwide uniformity instead of a patchwork of inconsistent state laws. Although a federal regulation mirroring the GDPR would be an enormous leap forward for privacy regulation in the United States, the prospect of one should prompt us to ask where the GDPR failed to deliver, why it failed, and what a federal privacy law could do to re-envision privacy rights in the United States. This paper will argue that while the GDPR had the optics of a step forward for privacy rights, it was actually designed to let the status quo of systemic and omnipresent data harvesting persist and thrive, and served merely as political theater, duping the privacy-conscious into believing that meaningful change was being enacted. In short, this paper will argue that the GDPR’s shortcomings are a feature, not a bug.

The GDPR's Loopholes

The first piece of evidence pointing to the conclusion that the GDPR was designed as political theater concerns a loophole that companies exploit to harvest data: “dark patterns.” Although the term has no settled legal definition, “dark patterns” are commonly understood as “practices in digital interfaces designed to direct, deceive, coerce or manipulate users into making choices against their best interests.” A user, already overwhelmed by the fine print demanding consent before they can access a website, must also contend with a deceptive interface built to trick them into giving data harvesters permission to use their data. A familiar example is a cookie consent notice with no clear “reject” button. If the GDPR truly wished to grant users agency over where their data goes, wouldn’t it have proscribed this practice from the get-go?
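
To make the mechanism concrete, here is a minimal sketch, in TypeScript, of the asymmetry such an interface creates. Everything in it (the ConsentOption shape, the click counts, the banner contents) is a hypothetical illustration for this essay, not code from any real consent-management tool.

    // Hypothetical illustration of a "dark pattern" consent banner:
    // refusing tracking is made more costly than accepting it.

    interface ConsentOption {
      label: string;
      grantsAllTracking: boolean;
      clicksRequired: number; // interactions needed before the choice takes effect
    }

    // The pattern described above: accepting is one prominent click,
    // while refusing is buried behind a settings screen with no "Reject all" button.
    const darkPatternBanner: ConsentOption[] = [
      { label: "Accept all", grantsAllTracking: true, clicksRequired: 1 },
      { label: "Manage preferences", grantsAllTracking: false, clicksRequired: 5 },
      // note: no single-click "Reject all" option is offered at all
    ];

    // A symmetric banner, where refusal is exactly as easy as consent.
    const symmetricBanner: ConsentOption[] = [
      { label: "Accept all", grantsAllTracking: true, clicksRequired: 1 },
      { label: "Reject all", grantsAllTracking: false, clicksRequired: 1 },
    ];

    // The imbalance is measurable: the cheapest path to refusal.
    function refusalCost(banner: ConsentOption[]): number {
      const refusals = banner.filter(o => !o.grantsAllTracking);
      return refusals.length === 0
        ? Infinity
        : Math.min(...refusals.map(o => o.clicksRequired));
    }

    console.log(refusalCost(darkPatternBanner)); // 5 clicks to refuse, 1 to accept
    console.log(refusalCost(symmetricBanner));   // 1 click either way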

The second loophole that companies exploit is the vagueness of much of the regulation’s language. The GDPR recognizes six lawful bases for processing personal data. One is consent, discussed above; the other five give data harvesters further openings to exploit. For example, the “collection, handling, and/or storage of people’s personal data” is also permitted when there is a “legitimate interest to process someone's personal data.” The vagueness of this basis is ripe for abuse, and companies, armed with armies of lawyers, can invoke it to harvest data that would otherwise be off limits. Again, if the GDPR were truly concerned with creating a new standard and regime for privacy, would it really use language that corporations can so gleefully abuse?

Reimagining Privacy Law in the U.S.

Mark Zuckerberg published an op-ed in 2019 calling for the adoption of GDPR-style regulations in the United States. Why would someone who became one of the richest men in the world by harvesting data call for regulations that would seemingly hurt the very industry he pioneered? It must be because the GDPR and its progeny do little to actually limit or extinguish data harvesting. When reimagining data privacy law in the United States, we need to go far beyond laws that act as political theater. Instead, we need to re-envision privacy rights as human rights and create laws that close the loopholes that enable data harvesting, equip citizens with the knowledge to use the internet in a way that protects the sanctity of their privacy, impose penalties severe enough to destroy companies that violate data privacy laws, and reframe the conversation about data privacy as a battle between freedom and despotism. Anything less will always just be political theater.

I don't quite understand why we should conclude that the GDPR has failed because it has loopholes. On that basis all tax law has always failed. I think that's a red herring. The US doesn't have an absence of data protection law: it has a carefully-engineered no-law zone, a system of immunity and subsidy through reduced legal liability like that benefiting the railroads and other "active users" in the antebellum US economy that Morton Horwitz described nearly half a century ago in The Transformation of American Law. It's not an oversight or a legal failing. It's a political decision coherently maintained for decades and apparently very successful as national industrial and strategic policy. To describe that policy as having the shortcomings of not being yours is a political category error. A draft that granted the current system its intellectual integrity would actually make more headway in showing how it could change.

