As you might remember, a draft version of the European Commission’s Communication on Online Platforms and the Digital Single Market was leaked at the end of April. On a first reading, it seemed clear that the Commission had taken the view that content regulation should be sectoral and that the liability exemptions found in the e-commerce Directive (2000/31/EC) had to be preserved. [See my earlier post here].

Fast forward a month: the official version of the Communication, released on 25 May 2016, is not an exact copy of the leaked document. But is this simply a change of writing style, or has the substance of the Communication also changed? [Substance is usually said to be more durable than form…]

In particular, reading the Communication together with the proposal for a Directive amending the Audiovisual Media Services Directive (AVMSD), could it be that, surreptitiously or indirectly, the scope or the effects of the liability exemptions found in the e-commerce Directive will progressively shrink?

Let’s start with the Communication itself:

  • The European Commission acknowledges the variety of online platforms. “Online platforms come in various shapes and sizes and continue to evolve at a pace not seen in any other sector of the economy”. Online platforms include: “online advertising platforms, marketplaces, search engines, social media and creative content outlets, application distribution platforms, communications services, payment systems, and platforms for the collaborative economy”.
  • Online platforms do good [remember, they are a “magnet for data-driven innovation”] and, as such, it is essential to create “the right framework conditions and the right environment… to retain, grow and foster the emergence of new online platforms in Europe”. It is also essential to have a single EU-wide set of rules rather than 28 different national ones. Harmonisation is thus an appropriate strategy.
  • Nevertheless, online platforms MUST play by the rules. Put differently, “online platforms are subject to existing EU rules in areas [the list is not exactly short] such as competition, consumer protection, protection of personal data and single market freedoms”. Compliance with these rules is the only way to ensure fair play! Therefore, the stick of enforcement [when these rules are broken] is crucial.
  • So much for existing regulatory measures! Future regulatory measures can only target “clearly identified problems relating to a specific type of activity of online platforms”, as this is “in line with better regulation principles”.
  • And here come the 4 key principles which should guide the law-makers when designing future regulatory measures: “a level playing field for comparable digital services; responsible behaviour of online platforms to protect core values; transparency and fairness for maintaining user trust and safeguarding innovation; open and non-discriminatory markets in a data-driven economy”.

What is the second principle? …Once again, “responsible behaviour of online platforms to protect core values”. This is political jargon, but what does it really mean once translated into legalese? The following two paragraphs are crucial to understanding what the European Commission has in mind:

  1. “The present liability regime for intermediary service providers, as set out in the e-Commerce Directive, was designed at a time when online platforms did not have the characteristics and scale they have today. However, it did create a technology-neutral regulatory environment that has considerably facilitated their scaling-up. This is in part due to the harmonisation of the exemption of certain types of online platforms from liability for illegal content and activities, in respect of which they have neither control nor knowledge. While certain concerns were raised on liability issues, the consultation showed broad support for the existing principles of the e-Commerce Directive”.
  2. “Given this background, the Commission will maintain a balanced and predictable liability regime for online platforms. This is crucial for the further development of the digital economy in the EU and for unlocking investments in platform ecosystems. At the same time, a number of specific issues relating to illegal and harmful content and activities online have been identified that need to be addressed to render this approach sustainable”.

What do these two paragraphs mean? Is the intention to reduce the scope of the existing liability exemptions [by excluding a certain number of actors from the category of intermediary providers] or to limit their effects [by curtailing the effects of Article 15]? And how could the liability regime [isn’t it a regime of liability exemptions?] be made ‘sustainable’? In five (3+2) ways, argues the Commission:

  • In relation to content that is harmful to minors and to hate speech, under the proposed amendments to the AVMSD, online video-sharing platform services will have a duty to take appropriate measures under Article 28a, such as: educating users by defining prohibited activities in their terms of use; implementing notice-and-action procedures; putting in place age-verification systems; providing systems allowing users to rate content; and supplying parental-control systems.
  • In relation to the protection of intellectual property rights, fair remuneration of creators should be ensured. This responds to growing concerns that the value generated by some of the new forms of online content distribution may not be fairly shared between distributors and rights-holders. The full list of measures contemplated by the European Commission in pursuit of this stated aim has not yet been revealed. Hence, more to come!
  • In relation to incitement to terrorism, child sexual abuse and hate speech [this is thus the second time that hate speech is mentioned], online platforms will be encouraged to take voluntary action. In fact, voluntary action has already been encouraged: a Code of Conduct on Countering Illegal Hate Speech Online, drafted in collaboration with Facebook, Microsoft, Twitter and YouTube, was released a few days ago. This code follows the work of the German task force against illegal online hate speech. [Note that the reaction of civil rights groups to this has not been that welcoming – see here]. Although IT companies are meant to work in collaboration with Civil Society Organisations in this respect – plus, it would seem, law enforcement agencies (whose role would be to detect hate speech and notify the IT companies) – the notice-and-action procedure to be put in place does not seem to be restricted to a limited number of complainants.
  • In addition, the Commission will monitor existing notice-and-action procedures “to ensure the coherence and efficiency of the intermediary liability” [intermediary liability exemptions, to be more precise once again] and will review the need for formal notice-and-action procedures during the second half of this year.
  • Finally (in the Commission’s list, this is in reality the 4th point and not the last one), the Commission is now trying to give something back. It is promising more clarity in the domain of intermediary liability exemptions since, arguably, if intermediaries engage in effective self-regulatory measures they run the risk of losing the benefit of those exemptions! At first glance, neither the mandatory nor the voluntary measures introduced or encouraged by the European Commission seem to require intermediaries to take the initiative to detect unlawful content. National cases (such as the French Dailymotion case of 2014) indeed seem to suggest that if the moderation/regulation is undertaken at the initiative of the service provider (and not the community of users), the service provider cannot avail itself of the liability exemptions. However, when it comes to incitement to terrorism, child pornography and hate speech (the list is actually longer…), access providers and hosting providers are under a special obligation to contribute to the fight against these criminal activities. While the text of Article 6 of the French Act (Loi sur la confiance dans l’économie numérique) does not seem to require service providers to take the initiative to systematically screen their systems, judges have not always shared this view (see e.g. a 2008 Dailymotion decision).

But the story does not stop here (for good or ill, depending on one’s perspective on such matters), as the Communication on Online Platforms must be read together with the proposed amendments to the AVMSD. What does one find in the proposal? Two things are worth noting:

  • The first has already been mentioned: “the amended Directive would introduce an obligation on Member States to ensure that, within their field of responsibility, video-sharing platform providers put in place, preferably through co-regulation, appropriate measures to: i) protect minors from harmful content; and ii) protect all citizens from incitement to violence or hatred”. As long as these measures do not require these hosting providers to take the initiative to detect unlawful content, there is an argument that such a duty is compatible with Articles 14 and 15 of the e-commerce Directive. The Commission does not use exactly these terms, though, to explain the compatibility between the amended AVMSD and the e-commerce Directive. Here is what it writes: “The system would be compatible with the liability exemption for hosting service providers set out in Article 14 ECD, in as far as that provision applies in a particular case, because these obligations relate to the responsibilities of the provider in the organisational sphere and do not entail liability for any illegal information stored on the platforms as such”.
  • The second thing to note is that the definition of ‘audiovisual media service’ is ‘modernised’ so that it includes services “where the principal purpose of the service or a dissociable section thereof is devoted to providing programmes, under the editorial responsibility of a media service provider, in order to inform, entertain or educate, to the general public by electronic communications networks”… What is the European Commission doing here? Is it indirectly condemning the activity-based approach underlying the e-commerce Directive, which entails that – even if a dissociable section of a service requires editorial control – it is still possible for a service provider to avail itself of Article 14 of the e-commerce Directive, as long as the activity at stake does not require editorial control? And what about hybrid services [where did we last see a hybrid service… although not an audiovisual media service… was it in the Delfi case discussed here?], which would have both edited content and user-generated content? Examining the definition of ‘video-sharing platform service’, it would seem [wouldn’t it?] that the activity-based approach is saved, as a video-sharing platform service is defined as a service the principal purpose of which, or a dissociable section thereof, “is devoted to providing programmes and user-generated videos to the general public, in order to inform, entertain or educate”. [I am wondering why the notion of ‘a dissociable section of a service’ was introduced. Couldn’t ‘service’ be understood in the sense of ‘activity’?]

So in the end, is the e-commerce Directive’s shield really as bright and shiny as it once seemed [at least in the books]?

This post originally appeared on the Peep Beep! blog and is reproduced with permission and thanks.