Europe’s pending General Data Protection Regulation (GDPR) threatens free expression and access to information on the Internet. The threat comes from erasure requirements that work in ways the drafters may not have intended — and that are not necessary to achieve the Regulation’s data protection purposes.
The GDPR’s “Right to Be Forgotten” or “Erasure” provisions serve an important goal, establishing enforceable rights and procedures to delete personal data held in companies’ back-end storage systems and used for purposes such as profiling. But the GDPR’s streamlined erasure process, which makes sense for this data, can also be used to erase other Internet users’ online expression. The process, and the threat of high penalties, encourage companies to comply with most or all requests — erasing information that, under the law’s own terms, is legitimately processed and should not be deleted. Minor revisions or clarifications in the GDPR could protect Internet users’ online expression from being erased without good reason, and better serve the goal of proportionality under EU law.
Improper removal requests are common.
Protecting online expression from legally invalid removal requests is important, because erroneous or malicious requests to delete online content are very common. Widely reported examples include attempts, using intermediaries’ legal notice and takedown systems, to suppress online information based on religious or political disagreement, or to silence negative consumer reviews.
Data protection-based removal requests are no exception to this pattern of over-reaching claims. According to Google, 58% of the “Right to Be Forgotten” requests it receives do not actually state valid claims under EU law. That figure appears roughly accurate: Data Protection Authorities reviewing these cases have generally agreed with the company’s legal assessment. The many additional intermediaries covered by the GDPR’s expanded “Right to Be Forgotten” will inevitably receive many invalid or abusive requests as well.
Standard tools derived from intermediary liability law can protect online expression from improper removal requests.
The EU already has laws and norms intended to solve the problem of invalid content removal requests. Intermediary liability rules under the eCommerce Directive are designed to facilitate removals for people with legitimate grievances, while preventing abusive or over-reaching requests from succeeding. The most basic protection for Internet users’ rights comes from the eCommerce Directive’s “knowledge” standard for removal, which ensures that intermediaries need not comply with clearly groundless removal demands. More detailed rules in Member State implementing law and in the civil-society-endorsed Manila Principles include penalties for bad-faith removal requests and opportunities for the accused online speaker to defend her rights.
The GDPR encourages intermediaries to comply with legally invalid erasure requests.
The GDPR does not apply established rules and norms from other notice and takedown systems. Instead, it tells intermediaries to follow a new process with minimal checks and balances to protect online expression against groundless accusations. Among other things,
- Intermediaries must take content offline immediately upon receiving a removal request – before even assessing the legal claim asserted.
- Intermediaries are generally barred from notifying the accused speaker or giving her a chance to defend her online expression before it is erased.
- Intermediaries may be compelled to disclose the accused speaker’s personal information to the person seeking removal – surely an unintended suggestion, given the law’s larger pro-privacy purpose.
Companies also have financial incentives to honor most or all removal requests: noncompliance with the GDPR can cost a company up to 5% of its annual global turnover or €100 million per violation. It is unreasonable to expect small companies — or indeed, almost any companies — to take on such risks to defend users’ rights.
GDPR drafters can solve this problem by more clearly incorporating EU intermediary liability standards to balance Internet users’ fundamental rights.
The GDPR’s erasure provisions are probably not intended to work this way, and they don’t have to. Clearly invoking standards from existing European notice and takedown law would improve protections for expression, without undermining the GDPR’s privacy provisions in the process — and without taking a side in long-running debates about how other aspects of data protection and eCommerce law relate to one another.
Alternatively, for erasure requests targeting online expression, lawmakers could eliminate the “restriction” requirement, under which controllers must temporarily take content offline before even assessing the legal claim against it. This provision may be particularly harmful in practice, because it shifts intermediaries’ default behavior toward deletion and moves a large number of requests away from the “knowledge” standard for removal.
If adding further amendments at this late date proves impossible, the best hope for at least some improvement will rest with DPAs, courts, and Member State legislatures as they interpret and implement the law. But it would be far better to fix it now.
Other posts in this series lay out the problems discussed here in greater detail.
- Introduction. Discusses the current convergence between the legal frameworks of data protection and intermediary liability, and includes FAQs covering more detailed concerns.
- GDPR Notice and Takedown Overview. Briefly reviews the GDPR erasure process, and identifies tensions with intermediary liability and free expression principles.
- GDPR Notice and Takedown Details. Provides a deeper dive into GDPR text and operational requirements.
- Drafting Solutions. Proposes drawing on principles of intermediary liability under the eCommerce Directive, without weakening privacy and data protection rights under the GDPR.
- GDPR Free Expression Provisions. Identifies weaknesses in Article 80 as a mechanism to prevent excessive content deletion; discusses disproportionate advantages for privacy/data protection rights, as compared to free expression/information rights, under the GDPR’s regulatory and judicial review processes.
Daphne Keller is Director of Intermediary Liability at The Center for Internet and Society at Stanford Law School.
Disclosure: Daphne Keller previously worked on “Right to Be Forgotten” issues as Associate General Counsel at Google. The version of this post available through Inforrm contains proposed language different from that which appeared in a footnote to the original Stanford CIS post.
Request: Comments and feedback on this analysis are very welcome, here or on Twitter @daphnehk.