In September, the European Commission presented a legislative proposal to address the removal of terrorist content online. There has been significant political pressure, particularly as the EU elections of 2019 approach, towards internet companies taking on increased responsibility in the area of terrorist propaganda online. This proposal would be a marked move from various voluntary initiatives taken up by some social media companies in recent times towards a legal responsibility framework for many.
While appreciating the concerns around terrorism, Cloudflare is troubled not only by the late presentation of this proposal – which leaves inadequate time for a thorough review before this EU legislative term expires – but also by much of its substance. Along with others such as CDT, GSMA/ETNO and Mozilla, we have significant concerns around the legal implications, practical application and possible unintended consequences of the proposal, some of which we outline below. Furthermore, we believe that little evidence has been presented as to the necessity of the proposed measures.
Concerns and shortcomings
The Commission’s proposal does not account for the complexity and range of information society services that have a storage component – not all services have the same reach and impact, and so a one-size-fits-all approach is not justified. This has been a concerning trend overall with EU legislative proposals: in wishing to regulate the behaviour of a few large social media platforms, many other providers of differing sizes and types are brought within scope through sweeping and clumsy definitions. The reality is that only the largest providers have the resources and means to address many of the requirements, which further cements their position as gatekeepers and dominant players in the marketplace.
We also have concerns that the proposal will chill legitimate online speech. The overly broad definition of “terrorist content” covers the “inciting or advocating, including by glorifying, the commission of terrorist offences” and other activities related to the encouragement, instruction or promotion of terrorist-related activities. Under this definition, almost any type of terrorism-related content, including of an educational, journalistic or academic-based nature, could be within scope of a removal order. This poses very real risks of removal of legitimate content, with its consequent effects on freedom of expression and information, as companies are incentivized to avoid any possibility of liability. The proposal is simply not robust enough to ensure that legitimate content actually remains online.
Cloudflare would suggest that, given the high risk of unintended consequences, if a proposal like this is to go forward as a result of political pressure, additional due process should be required and the proposal should be significantly narrowed. Judicial authorities alone should be empowered to issue removal orders. From a practical standpoint, having all EU countries, each with multiple potential authorities, issuing orders for content removal seems likely to exacerbate concerns about the effect on freedom of expression. Each EU country should instead designate just one authority to issue the orders, or better still, a one-stop-shop model could be established, with one European entity serving as the single interface for hosting service providers. Given that providers are in turn asked to designate a single point of contact and legal representative, this seems like a practical way to streamline the process and reduce the likelihood of divergent views about what constitutes content deemed appropriate for removal.
The proposal calls for the removal of terrorist content within a one-hour timeframe. Even though a legal assessment of the content is not required, an operational discussion still has to take place within a company as to the appropriate type of removal measure and how and where it is effected. As noted in the Commission’s Impact Assessment, over 90% of European companies are SMEs, and so asking all providers to fulfil the removal requirements within 60 minutes is highly unreasonable.
The privatisation of law enforcement – a troubling trend
A growing number of regulatory and policy initiatives at European level have seen Internet service providers encouraged to proactively decide on the legality and nature of content online, undertake risk assessments along with the balancing of fundamental rights and freedoms, and evaluate any conflicts of law – all while potentially facing liability if they make those assessments incorrectly. This has effectively resulted in a privatisation of law enforcement, with the additional risk that smaller providers will look to crude, untested tools in order to meet compliance. The shifting of the burden of responsibility from the State to the provider is also seen here in the requirement that providers manage complaints from content providers, with no role foreseen for the competent authorities.
Furthermore, this proposal foresees a scenario whereby companies could be asked to take on proactive measures – which could include filtering – if they receive even one removal order. Not only is this disproportionate, but it is also a departure from the well-established legal principle of “no general monitoring” as set out in Article 15 of the EU eCommerce Directive. The idea of internet filtering has been creeping into a range of legislative proposals and should be a concern for all.
Member States are advancing at pace with their review and may reach an agreed position as early as this week. Meanwhile, the European Parliament, which also has to undertake a review, has yet to commence its work in earnest. Cloudflare will be working to ensure that our concerns around this proposal are heard and that due process, legality and proportionality are not sacrificed in the political rush. You can follow the progress of this file here.