The stated goals of the EU's proposed Digital Single Market (DSM) Directive are laudable: Who could object to modernizing the EU's digital copyright rules, facilitating cross-border uses of in-copyright materials, promoting growth of the internal market of the EU, and clarifying and harmonizing copyright rules for digital networked environments?
The devil, as always, is in the details. The most controversial DSM proposal is its Article 13, which would require online content-sharing services to use "effective and proportionate" measures to ensure user uploads to their sites are non-infringing. Their failure to achieve this objective would result in their being directly liable for any infringements. This seemingly requires those services to employ monitoring and filtering technologies, which would fundamentally transform the rules of the road under which these firms have long operated.
A more positive part of the DSM Directive is its Article 3. It would require EU member states to adopt a copyright exception to enable research and cultural heritage institutions to engage in text- and data-mining (TDM) for scientific research purposes. This is good so far as it goes, but critics argue that for-profit firms and independent researchers should enjoy similar TDM privileges, and scientific research should not be the only legitimate purpose for TDM.
This column explains the rationales for these new measures, specific terms of concern, and why critics have argued for changes to make the rules more balanced. (Column space limitations preclude attention to other controversial provisions, such as the new press publishers' rights to control online services' displays of press contents.)
Article 13's Changes to Online Service Liability Rules
For approximately the past two decades, the European Union's E-Commerce Directive, like the U.S. Digital Millennium Copyright Act, has provided Internet service providers (ISPs) with "safe harbors" from copyright liability for infringing uses of their services about which the ISPs had neither knowledge nor control.
Under these rules, ISPs must take down infringing materials after copyright owners notify them of the existence and location of those materials. But they do not have to monitor for infringements or use filtering technologies to prevent infringing materials from being uploaded and stored on their sites.
Because online infringements have greatly proliferated, copyright industry groups have strongly urged policymakers in the EU (as well as the U.S.) to impose stronger obligations on ISPs to thwart infringements. Their goal has been the adoption of legal rules requiring ISPs to use monitoring technologies to detect in-copyright materials and filtering technologies to block infringing uploads.
In proposing the DSM Directive, the European Commission has responded to these calls by proposing that certain ISPs should take on greater responsibilities to help prevent infringements. Article 13 is aimed at those ISPs that enable online content sharing (think YouTube).
While not directly requiring the use of monitoring or filtering technologies, Article 13 can reasonably be interpreted as intending to achieve this result.
The DSM Directive states Article 13 is intended to target only those online content-sharing services that play an "important role" in the online content market by competing with other services, such as online audio or video streaming services, for the same customers.
If the "main purpose" (or "one of the main purposes") of the service is to provide access to "large amounts" of copyrighted content uploaded by users and it organizes and promotes those uploads for profit-making purposes, that service will no longer be protected by the E-Commerce safe harbor. It will instead be subjected to the new liability rules.
Concerns about the overbreadth of Article 13 led the Commission to narrow the definition of the online content-sharing services affected by the rules. It now specifically excludes online encyclopedias (think Wikipedia), repositories of scientific or educational materials uploaded by their authors, open source software repositories, cloud services, cyberlockers, and marketplaces engaged in online retail sales of digital copies.
The most significant regulation in Article 13 is its subsection (4):
Member States shall provide that an online content sharing service provider shall not be liable for acts of communication to the public or making available to the public within the meaning of this Article when:
(a) it demonstrates that it has made best efforts to prevent the availability of specific works or other subject matter by implementing effective and proportionate measures ... to prevent the availability on its services of the specific works or other subject matter identified by rightholders and for which the rights holders have provided the service with relevant and necessary information for the application of these measures; and

(b) upon notification by rights holders of works or other subject matter, it has acted expeditiously to remove or disable access to these works or other subject matter and it demonstrates that it has made its best efforts to prevent their future availability through the measures referred to in point (a).
Terminology such as "best efforts" and "effective and proportionate" measures is vague and open to varying interpretations, but it anticipates the use of technologies to demonstrate those "best efforts."
Copyright industry groups can be expected to assert that it is necessary to use monitoring and filtering technologies to satisfy the requirements of Article 13(4). They will also point to an alternative way that online services can avoid liability: by licensing uploaded copyrighted content from their respective rights holders.
Affected online services will have an uphill battle to fend off efforts to interpret the ambiguous terms as imposing monitoring and filtering obligations. It is, of course, impossible for a service to license every copyrighted work that its users might upload to its site. But the big media firms can use this new rule to extract more compensation from platforms.
Critics have raised two major concerns about this proposal. First, it will likely further entrench the market power of the leading platforms that can afford to develop filtering technologies such as YouTube's ContentID, and deter new entry into the online content sharing market. Second, it will undermine user privacy and free speech interests, leading to blockages of many parodies, remixes, fan fiction, and other creative reuses of copyrighted works that would, if examined by a neutral observer, be deemed non-infringing.
When the proposal was pending before the European Council in late May, several members, including representatives from Finland, Germany, and the Netherlands, opposed it and offered some compromise language, so it does not have consensus support. Since then, opponents have mounted a public relations campaign to urge EU residents to contact their Parliamentary representatives telling them to vote no in order to "save the Internet."
Among the many critics of Article 13 is David Kaye, the United Nations' Special Rapporteur for Freedom of Expression. He wrote a nine-page letter explaining why Article 13 is inconsistent with the EU's commitments under international human rights instruments.
In addition, Tim Berners-Lee, Vint Cerf, and 89 other Internet pioneers (plus me) signed an open letter urging the EU Parliament to drop Article 13: "By requiring Internet platforms to perform automatic filtering on all of the content that their users upload, Article 13 takes an unprecedented step toward the transformation of the Internet from an open platform for sharing and innovation, into a tool for the automated surveillance and control of its users."
More than 145 civil society organizations also came out against it. These protests were successful enough to induce a majority of the European Parliament to vote for giving further consideration to the DSM directive. Several stages remain in the EU's elongated process before this directive is finalized, either in its current or some revised form.
Much better news is the proposed new copyright exception to enable nonprofit research and cultural heritage institutions to engage in text- and data-mining (TDM). The European Commission and the Council recognize that digital technologies have opened up significant opportunities for using TDM techniques to make new discoveries by computational analysis of large datasets. These discoveries can advance not only natural but also human sciences in ways that will benefit the information society.
Article 3 would require EU member states to allow research and cultural heritage institutions to reproduce copyrighted works and extract information using TDM technologies, as long as the researchers had lawful access to the contents being mined. These researchers must, however, store such copies in a secure environment and retain the copies no longer than is necessary to achieve their scientific research objectives.
Importantly, rights holders cannot override the TDM exception through contract restrictions. (They can, however, use technology to ensure security and integrity of their networks and databases, which opens the possibility of technology overrides.) Article 3 also calls for rights holders, research organizations, and cultural heritage institutions to agree upon best practices for conducting TDM research.
No TDM Privilege for Profit-Making and Unaffiliated Researchers
The DSM Directive assumes that profit-making firms can and should get a license to engage in TDM research from the owners of the affected IP rights. Although the DSM contemplates the possibility of public-private partnerships, it forbids those in which private entities have control over TDM-related collaborative projects. Unaffiliated researchers (say, independent data scientists or think-tank personnel) cannot rely on the DSM's TDM exception.
Article 3 may put the EU at a disadvantage in AI research because some countries have already adopted less restrictive TDM exceptions. Japan, for instance, allows text- and data-mining without regard to the status of the miner, and does not confine the scope of the exception to nonprofit "scientific research." In the U.S., for-profit firms have been able to rely on fair use to make copies of in-copyright materials for TDM purposes, as in the Authors Guild v. Google case. This ruling did not limit TDM purposes to scientific research.
Commentators on the DSM Directive have expressed several concerns about the restrictions on its TDM exception. First, TDM licenses may not be available on reasonable terms for startups and small businesses in the EU. Second, some EU firms may ship their TDM research offshore to take advantage of less-restrictive TDM rules elsewhere. Third, some non-EU firms may decide not to invest in TDM-related research in the EU because of these restrictions. Finally, in the highly competitive global market for world-class AI and data science researchers, the EU may suffer from "brain drain" if its most talented researchers take job opportunities in jurisdictions where TDM is broadly legal.
The EU's proposed DSM Directive is highly controversial, especially the new obligations it would impose on online content-sharing services to thwart infringing uploads. In early July, the EU Parliament voted against giving approval to the May version of the DSM proposal; it voted in September to approve some amendments to the DSM Directive, which did not significantly change the Article 13 mandate. It will, however, be many months before the final text of the directive is voted on.
Whether Article 13, if adopted as is, will "kill" the Internet as we know it, as some critics have charged, remains to be seen. Yet the prospect of bearing direct liability for the infringing activities of users will likely cause many sharing services to be overly cautious about what their users can upload and new entry will be chilled. In its current form, Article 13 gives copyright enforcement priority over the interests of users in information privacy and fundamental freedoms.
The DSM Directive's proposed exception for TDM research is a welcome development for those who work at research and cultural heritage institutions. However, the unfortunate withholding of the exception from for-profit firms and independent researchers may undermine prospects for the EU's achieving its aspiration to promote innovations in AI and data science industries. It will be difficult for EU-based entities to compete with American and Japanese firms whose laws provide them with much greater freedom to engage in TDM analyses.
Pamela Samuelson (pam@law.berkeley.edu) is the Richard M. Sherman Distinguished Professor of Law and Information at the University of California, Berkeley, and a member of the ACM Council.
Copyright held by author.
Request permission to (re)publish from the owner/author