Background
Most of the widespread criticism of the legislation centers around two provisions: Article 13 and Article 11.
- Article 13 makes sweeping changes to publisher liability for user posts. Traditionally, internet users are liable for content they post to platforms like YouTube, not the platforms themselves. As in the U.S., the platform is not held liable for copyright infringement or other illegal content so long as it swiftly removes infringing content when notified. With a few exceptions, Article 13 would hold hosting platforms accountable for the content their users post, requiring publishing platforms to ensure that text and photos found in posts do not violate copyrights. This would impact text sites like WordPress as well as photo-sharing sites like Instagram.
- Article 11 would require Twitter and other sites that share snippets of content to pay the publishers of that content or limit the text used in links to a few “individual words.” Though referred to by opponents as a “link tax,” basic links and search engine listings are exempt from these payment mandates—the Article is actually aimed at changing the text that goes along with links. Also of note, non-commercial encyclopedias (like Wikipedia) and “small and micro” platforms have been exempted from the law. (Despite being exempt, Wikipedia still opposes the law.)
Issues with the Legislation
Opponents of Article 13 contend the legislation will force website operators to use “upload filters” to block users from uploading any type of copyrighted work: making platforms responsible for policing all uploaded content, they argue, amounts to a de facto filtering requirement.
Even the gold standard of upload filters—YouTube’s Content ID (with its $60 million price tag)—is not a perfect system. Like all filters, Content ID yields false positives. The software has wrongly tagged unprotected works and rejected them for posting, and YouTube has improperly removed users’ videos. It’s also worth noting that it has become more common for companies and individuals to use copyright takedown notices to silence their critics.
Critics worry that the legislation will limit sharing online because businesses may find filters prohibitively expensive to develop and instead opt to shut down. Similar laws adopted by other European countries have had damaging effects. In Germany, Google stopped including snippets of text from publishers seeking to enforce the relevant law, and those publishers lost traffic to their sites as a result. In response to a similar law in Spain, Google shut down its news service there. If the EU’s proposal becomes final, Google could close its news division altogether.
Proponents, however, contend that the law prevents behemoth tech companies from profiting from ads appearing next to infringing material. In their view, the legislation is required to protect artists whose work is pirated online, as well as newspapers and journalists whose business models risk being undermined by the titans of social media.
Next Steps
The version of the legislation that Parliament initially and overwhelmingly approved has not yet been released to the public. The European Parliament must finalize the legislation with the European Council and European Commission, and that final version goes back before the European Parliament for a final vote in January. From there, each EU member state will have two years to adopt its own law implementing the legislation. Given the overwhelming initial vote in favor of the law—438 to 226—it seems likely the legislation, including Articles 11 and 13, will ultimately be adopted.
Impact
European governments have traditionally been tough on businesses viewed as infringing their citizens’ rights—Europe lacks the tradition of personal freedom so conspicuous in American jurisprudence. Consider, for example, the EU’s sweeping privacy rules, the “right to be forgotten,” and the multi-billion-dollar antitrust fines and tax bills the EU has levied on technology companies. Likewise, Germany passed a law ordering social media companies to delete hate speech within 24 hours of publication.
For now, there is ample gray area as to how exactly this legislation’s finer points will be enforced. Will this effort to get the profits from copyrighted material into the pockets of artists and creators achieve its aim, or, however well meaning, trigger a host of unintended consequences? For the individual online curator or black-belt meme-deployer, there’s little to be done beyond wait and see. (Use those memes while you’ve got them!)
It looks increasingly clear that this latest directive will have far-reaching ramifications for tech giants, mid-sized publishers, small businesses and individuals alike. As a result, everyone with a stake in online content—creators, consumers and all the middlemen in between—has good reason to take note and adjust strategies (and perhaps resource allocation) accordingly.