This week, CDT joins the Association of Research Libraries in celebrating Fair Use/Fair Dealing Week, which is designed to highlight and promote the opportunities presented by fair use and fair dealing, celebrate success stories, and explain these doctrines.
This week, the Legal Affairs Committee (JURI) of the European Parliament voted to approve the text of a provisional agreement on the long-debated Copyright Directive. This directive contains some provisions that, if adopted, will change the nature of the web (at least in the EU). The most troubling is Article 13, AKA “upload filters”, which would require “online content sharing service providers” (OCSSPs) such as YouTube to either obtain licenses from each copyright owner whose works are uploaded or made available through the service, or to put in place measures to prevent the availability of unauthorized uses of copyright-protected works. This provision would reverse course on one of the web’s long-standing legal foundations (shielding intermediaries from liability for copyright infringements by users), and risks damaging a fundamental aspect of copyright policy here in the U.S., fair use.
Chances are, you already know that “fair use” is a judge-made doctrine (later codified in the Copyright Act) that allows people to reproduce some or all of a copyright-protected work without infringing the exclusive rights the Act grants to authors. The relevant statutory provision (17 U.S.C. § 107) describes four factors to consider when determining whether a particular use of a work is “fair”: the purpose and character of the use; the nature of the copyrighted work; the amount and substantiality of the portion used in relation to the work as a whole; and the effect of the use on the potential market for, or value of, the original work.
Although there are various judicial theories about how to weigh these factors, no single factor is necessarily determinative of the outcome. One may cancel out or accentuate another. Two may weigh in favor of fair use while two weigh against. Evaluating the factors requires a fairly comprehensive knowledge of the original work and its potential markets, an awareness of the context in which the new use appears, and the ability to recognize difficult-to-articulate concepts like parody. All this is to say that, although performing a fair use analysis is not impossible, it is rarely a straightforward process.
Now let’s talk briefly about automated content filtering systems. Online services offering hosting for user-uploaded content sometimes employ filtering systems to reduce unauthorized reproductions of copyright-protected works, or as part of a monetization program to compensate copyright owners for uses of their works. For copyright purposes, these systems are built on top of a database containing lists of individual copyrighted works and their owners. The system then scans all material uploaded to identify any part of an upload that matches an entry in the database. Upon finding a match, the system may either block access to the upload, take steps to associate it with a monetization program, or take some other action such as simply notifying the copyright owner.
These systems are sophisticated and fast (also expensive and at least occasionally inaccurate), and capable of processing incredible amounts of uploaded audio and video. However, they are designed only to recognize and flag uploaded content that matches an entry in the database, not to decide whether an unauthorized use of that content might be permissible under a limitation or exception to copyright, such as fair use. That kind of assessment, while it may someday be automatable, for now requires human review.
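The matching workflow described above can be illustrated with a heavily simplified sketch. Real systems use robust perceptual fingerprints that survive re-encoding and editing; here, exact hashes of fixed-size chunks stand in for them, and all names (the database class, the chunk size, the action types) are illustrative, not drawn from any actual filtering product:

```python
import hashlib
from enum import Enum

class Action(Enum):
    """What the copyright owner asked the filter to do on a match."""
    BLOCK = "block"
    MONETIZE = "monetize"
    NOTIFY = "notify"

CHUNK_SIZE = 4096  # bytes per fingerprinted segment (illustrative value)

def fingerprint(data: bytes) -> set[str]:
    # Hash each fixed-size chunk of the content. Real filters use
    # perceptual fingerprints; exact hashes are a stand-in here.
    return {
        hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
        for i in range(0, len(data), CHUNK_SIZE)
    }

class ReferenceDatabase:
    """Maps fingerprints of registered works to (owner, chosen action)."""

    def __init__(self) -> None:
        self._index: dict[str, tuple[str, Action]] = {}

    def register(self, work: bytes, owner: str, action: Action) -> None:
        # Copyright owners submit reference copies of their works.
        for fp in fingerprint(work):
            self._index[fp] = (owner, action)

    def scan(self, upload: bytes) -> list[tuple[str, Action]]:
        # Return the owner/action pair for every matched segment.
        # Note what is absent: no step here can ask whether a match
        # is a parody, a quotation, or otherwise fair -- that judgment
        # lies entirely outside the matching system.
        return [self._index[fp] for fp in fingerprint(upload)
                if fp in self._index]
```

The key point the sketch makes concrete is architectural: the only question the pipeline can answer is “does this upload contain a registered work?”, never “is this use of it lawful?”.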
Enter the European Union. Although the Copyright Directive has some good elements, such as a mandatory exception to copyright for text and data mining, they are overshadowed by the potential negative impacts of Article 13. (Article 11 is also problematic and, if the EU had a fair use doctrine, would completely contradict it.)
Article 13’s instruction to either obtain authorization from all rightsholders or otherwise prevent the availability of unauthorized uses of their works amounts to a choice between an impossible task (licensing literally everything) and one that only the best-funded online companies can achieve (upload filters). Beyond the many glaring flaws of Articles 11 and 13, neither provides meaningful allowances for limitations and exceptions to copyright. For example, although the most recent text of Article 13 contains language to preserve citizens’ ability to take advantage of limitations and exceptions to copyright, such as caricature, parody, and pastiche, it fails to recognize that automated filters are not capable of assessing whether a post falls within an exception.
Now let’s fast forward to a (still-avoidable) future, where the EU Copyright Directive has been approved and implemented by each Member State. For those online services that are capable of compliance, how many will dedicate a separate filtering system, or 27 variations, for their EU-based users? Given the recent history of the GDPR, it seems likely that Article 13’s only-optional-in-theory upload filters will be applied to users and content outside the EU as well.
The biggest sharing sites already employ some version of these filters, but Article 13, which makes service providers liable for unauthorized uses of protected works, may push them to become even more stringent. This is problematic from a fair use perspective because the filters effectively bypass the users’ opportunity to rely on fair use to justify their post as non-infringing. Both the U.S. and the EU (for now) have systems in place so that copyright owners can notify online service providers when they believe individual posts contain unauthorized uses of material for which they hold the copyright (Section 512 of the Digital Millennium Copyright Act and Article 14 of the E-Commerce Directive, respectively). The service providers can avoid liability for any infringing acts of their users as long as they remove the allegedly infringing content after notification. But, and this is crucial, users also have the opportunity to challenge removals if they believe their post to have been non-infringing. Upload filters leave users with no recourse for posts blocked by the filter.
Even if the service provider institutes some mechanism by which users can challenge a filter-blocked post, it would come with none of the legal processes put in place by the DMCA or the ECD to empower users against aggressive or abusive copyright holders. Nor could such a mechanism replace the environment for permissionless creativity and free expression currently preserved by intermediary liability shields, such as Section 512.
CDT hopes that the EU Parliament will reject Articles 11 and 13 in its upcoming plenary session. If it fails to do so, the rest of the world must rely on private companies to ensure that the EU’s misguided copyright policies do not restrict freedoms enjoyed elsewhere in the world. If you live in the EU and want to get involved, check out the work being done by Save Your Internet and Change.org. Just don’t wait too long to do so; the final vote on the Copyright Directive may come as soon as the end of March.