Categories
Copyright Internet Legislation Uncategorized

Separating Fact from Fiction in the Notice and Takedown Debate

By Kevin Madigan & Devlin Hartline

With the Copyright Office undertaking a new study to evaluate the impact and effectiveness of the Section 512 safe harbor provisions, there’s been much discussion about how well the DMCA’s notice and takedown system is working for copyright owners, service providers, and users. While hearing from a variety of viewpoints can help foster a healthy discussion, it’s important to separate rigorous research efforts from overblown reports that offer incomplete data in support of dubious policy recommendations.

Falling into the latter category is Notice and Takedown in Everyday Practice, a recently-released study claiming to take an in-depth look at how well the notice and takedown system operates after nearly twenty years in practice. The study has garnered numerous headlines that repeat its conclusion that nearly 30% of all takedown requests are “questionable” and that echo its suggestions for statutory reforms that invariably disfavor copyright owners. But what the headlines don’t mention is that the study presents only a narrow and misleading assessment of the notice and takedown process that overstates its findings and fails to adequately support its broad policy recommendations.

The study was presumably released to coincide with the deadline for submitting comments to the Copyright Office on the state of Section 512, and its authors claim to have produced “the broadest empirical analysis of the DMCA notice and takedown” system to date. They make bold pronouncements about how “the notice and takedown system . . . meets the goals it was intended to address” and “continues to provide an efficient method of enforcement in many circumstances.” But the goals identified by the authors are heavily skewed towards service providers and users at the expense of copyright owners, and the authors include no empirical analysis of whether the notice and takedown system is actually effective at combating widespread piracy.

The study reads more like propaganda than robust empiricism. It should be taken for what it is: A policy piece masquerading as an independent study. The authors’ narrow focus on one sliver of the notice and takedown process, with no analysis of the systemic results, leads to conclusions and recommendations that completely ignore the central issue of whether Section 512 fosters an online environment that adequately protects the rights of copyright owners. The authors conveniently ignore this part of the DMCA calculus and instead put forth a series of proposals that would systematically make it harder for copyright owners to protect their property rights.

To its credit, the study acknowledges many of its own limitations. For example, the authors recognize that the “dominance of Google notices in our dataset limits our ability to draw broader conclusions about the notice ecosystem.” Indeed, over 99.992% of the individual requests in the dataset for the takedown study were directed at Google, with 99.8% of that dataset directed at Google Search in particular. Of course, search engines do not include user-generated content—the links Google provides are links that Google itself collects and publishes. There are no third parties to alert about the takedowns since Google is taking down its own content. Likewise, removing links from Google Search does not actually remove the linked-to content from the internet.

The authors correctly admit that “the characteristics of these notices cannot be extrapolated to the entire world of notice sending.” A more thorough quantitative study would include data on sites that host user-generated content, like YouTube and Facebook. As it stands, the study gives us some interesting data on one search engine, but even that data is limited to a sample size of 1,826 requests out of 108 million over a six-month period in mid-2013. And it’s not even clear how the sample was randomized: the authors admittedly created “tranches” to ensure the notices collected were “of great substantive interest,” but they provide no details about how those tranches were constructed.

Despite explicitly acknowledging that the study’s data is not generalizable, the authors nonetheless rely on it to make numerous policy suggestions that would affect the entire notice and takedown system and that would tilt the scales further in favor of infringement and against copyright owners. They even identify some of their suggestions as explicitly reflecting “Public Knowledge’s suggestion,” which is a far cry from a reasoned academic approach. The authors do note that “any changes should take into account the interests of . . . small- and medium-sized copyright holders,” but this is mere lip service. Their proposals would hurt copyright owners of all shapes and sizes.

The authors justify their policy proposals by pointing to the “mistaken and abusive takedown demands” that they allegedly uncover in the study. These so-called “questionable” notices are the supposed proof that the entire notice and takedown system needs fixing. A closer look at these “questionable” notices shows that they’re not nearly so questionable. The authors claim that 4.2% of the notices surveyed (about 77 notices) are “fundamentally flawed because they targeted content that clearly did not match the identified infringed work.” This figure includes obvious mismatches, where the titles aren’t even the same. But it also includes ambiguous notices, such as where the underlying work does not match the title or where the underlying page changes over time.

The bulk of the so-called “questionable” notices comes from those notices that raise “questions about compliance with the statutory requirements” (15.4%, about 281 notices) or raise “potential fair use defenses” (7.3%, about 133 notices). As to the statutory requirements issue, the authors argue that these notices make it difficult for Google to locate the material to take down. This claim is severely undercut by the fact that, as they acknowledge in a footnote, Google complies with 97.5% of takedown notices overall. Moreover, the argument wades into the murky waters of whether copyright owners can send service providers a “representative list” of infringing works. Turning to the complaint about potential fair uses, the authors argue that copyright owners are not adequately considering “mashups, remixes, or covers.” But none of these uses are inherently fair, and there’s no reason to think that the notices were sent in bad faith just because someone might be able to make a fair use argument.

The authors claim that their “recommendations for statutory reforms are relatively modest,” but that supposed modesty is absent from their broad list of suggestions. Of course, everything they suggest increases the burdens and liabilities of copyright owners while lowering the burdens and liabilities of users, service providers, and infringers. Having overplayed the data on “questionable” notices, the authors reveal their true biases. And it’s important to keep in mind that they make these broad suggestions that would affect everyone in the notice and takedown system after explicitly acknowledging that their data “cannot be extrapolated to the entire world of notice sending.” Indeed, the study contains no empirical data on sites that host user-generated content, so there’s nothing whatsoever to support any changes for such sites.

The study concludes that the increased use of automated systems to identify infringing works online has resulted in the need for better mechanisms to verify the accuracy of takedown requests, including human review. But the data is limited to small surveys with secret questions and a tiny fraction of notices sent to one search engine. The authors offer no analysis of the potential costs of implementing their recommendations, nor do they consider how it might affect the ability of copyright owners to police piracy. Furthermore, data presented later in the study suggests that increased human review might have little effect on the accuracy of takedown notices. Not only do the authors fail to address the larger problem of whether the DMCA adequately addresses online piracy, but their suggestions aren’t even likely to solve the narrower problem of inaccurate notices that they want to fix.

Worse still, the study almost completely discards the ability of users to contest mistaken or abusive notices by filing counternotices. This is the solution that’s already built into the DMCA, yet the authors inexplicably dismiss it as ineffective and unused. Apart from providing limited answers from a few unidentified survey respondents, the authors offer no data on the frequency or effectiveness of counternotices. The study repeatedly criticizes the counternotice system as failing to offer “due process protection” to users, but that belief is grounded in the notion that a user who fails to send a counternotice has somehow been denied the chance to do so. Moreover, it implies a constitutional right that is not at issue when two parties interact in the absence of government action. The same holds true for the authors’ repeated—and mistaken—invocation of “freedom of expression.”

More fundamentally, the study ignores the fact that the counternotice system is stacked against copyright owners. A user can simply file a counternotice and have the content in question reposted, and most service providers are willing to repost the content following a counternotice because they’re no longer on the hook should the content turn out to be infringing. The copyright owner, by contrast, then faces the choice of allowing the infringement to continue or filing an expensive lawsuit in federal court. The study makes it sound like users are rendered helpless because counternotices are too onerous, but the reality is that the system leaves copyright owners practically powerless to combat bad faith counternotices.

Pretty much everyone agrees that the notice and takedown system needs a tune-up. The amount of infringing content available online today is immense. This rampant piracy has resulted in an incredible number of takedown notices being sent to service providers by copyright owners each day. Undoubtedly, the notice and takedown system should be updated to address these realities. And to the extent that some are abusing the system, they should be held accountable. But in considering changes to the entire system, we should not be persuaded by biased studies based on limited (and secret) datasets that provide little to no support for their ultimate conclusions and recommendations. Such studies may make for evocative headlines, but they don’t make for good policy.

Categories
Copyright Internet Legislation Uncategorized

Middle Class Artists Want a DMCA System That Works

The following guest post comes from Rebecca Cusey, a second year law student at George Mason University School of Law.

By Rebecca Cusey

Mason Law’s Arts & Entertainment Advocacy Clinic filed comments today with the U.S. Copyright Office detailing the frustrations and futilities experienced by everyday artists as they struggle with the DMCA system to protect their copyrights online.

Terrica Carrington and I wrote the comments on behalf of middle class artists, that group of creative professionals who invest in and live off their art yet lack the kind of revenue stream or corporate backing of better-known artists. These photographers, filmmakers, musicians, and other artists are squeezed between infringement that directly affects their ability to pay for things like a mortgage or an orthodontics bill and the exorbitant cost of using the notice and takedown system to fight infringement.

Terrica and I spoke with four artists: filmmaker Ellen Seidler, news photographer Yunghi Kim, musician Blake Morgan, and audiovisual creator David Newhoff. These artists make works of value and have followings, and thus they attract infringement. They make a profession of their art.

A middle class artist must do it all on her own – spend hours searching the web for infringement, compile lists of infringing posts on each site, navigate each site’s confusing DMCA notification system, and send takedown notification after takedown notification. And that’s all just sending the notifications. Monitoring to see whether the infringing content has been removed or simply uploaded in another spot is another job in itself.

The artists with whom we talked said it was not unusual in the least for a song, photograph, or film to be posted illegally in a thousand places, even tens of thousands of places. Finding infringement and sending notices took hundreds or even thousands of hours, hours they could have spent taking photographs, making movies, or writing songs.

After all the time spent fighting infringement online, they felt the task was futile because the content simply reappeared, sometimes in a different place on the same site, other times because a user filed a counternotice with the ISP hosting the content, claiming the right to post it.

These artists felt the notice and takedown system mandated by Section 512 of the Copyright Act was both all-consuming and futile, all-consuming because it ate hours upon hours and futile because it yielded little to no results. Ultimately, all of them decided to stop spending time trying to enforce their copyrights under the procedures of Section 512. It simply was not worth it.

Our comments were filed in response to a request by the U.S. Copyright Office for comments on the effectiveness of Section 512 in fighting infringement online. The Copyright Office wanted to know in particular whether the provisions of Section 512 balance the needs of ISPs to host content with the needs of copyright owners to control their work.

Middle class artists feel the balance is off and the scale is tipped in favor of ISPs. These artists do not object to bearing some responsibility for protecting their copyrights online. They simply want a system that works.

To read our Section 512 comments, please click here.

Categories
Copyright Internet Legislation Uncategorized

Copyright Scholars: Courts Have Disrupted the DMCA’s Careful Balance of Interests

The U.S. Copyright Office is conducting a study of the safe harbors under Section 512 of the DMCA, and comments are due today. Working with Victor Morales and Danielle Ely from Mason Law’s Arts & Entertainment Advocacy Clinic, we drafted and submitted comments on behalf of several copyright law scholars. In our Section 512 comments, we look at one narrow issue that we believe is the primary reason the DMCA is not working as it should: the courts’ failure to properly apply the red flag knowledge standard. We argue that judicial interpretations of red flag knowledge have disrupted the careful balance of responsibilities Congress intended between copyright owners and service providers. Instead of requiring service providers to take action in the face of red flags, courts have allowed them to turn a blind eye and bury their heads in the sand.

Whether Section 512’s safe harbors are working as intended is a hotly contested issue. On the one hand, hundreds of artists and songwriters are calling for changes “to the antiquated DMCA which forces creators to police the entire internet for instances of theft, placing an undue burden on these artists and unfairly favoring technology companies and rogue pirate sites.” On the other hand, groups like the Internet Association, which includes tech giants such as Google and Facebook, claim that the safe harbors are “working effectively” since they “strike a balance between facilitating free speech and creativity while protecting the interests of copyright holders.” The Internet Association even claims that “the increasing number of notice and takedown requests” shows that the DMCA is working.

Of course, it’s utter nonsense to suggest that the more takedown notices sent and processed, the more we know the DMCA is working. The point of the safe harbors, according to the Senate Report on the DMCA, is “to make digital networks safe places to disseminate and exploit copyrighted materials.” The proper metric of success is not the number of takedown notices sent; it’s whether the internet is a safe place for copyright owners to disseminate and exploit their works. The continuing availability of huge amounts of pirated works should tip us off that the safe harbors are not working as intended. If anything, the increasing need for takedown notices suggests that things are getting worse for copyright owners, not better. If the internet were becoming a safer place, the number of takedown notices should be decreasing. It’s not surprising that service providers enjoy the status quo, given that the burden of tracking down and identifying infringement doesn’t fall on them, but this is not the balance that Congress intended to strike.

Our comments to the Copyright Office run through the relevant legislative history to show what Congress really had in mind—and it wasn’t copyright owners doing all of the work in locating and identifying infringement online. Instead, as noted in the Senate Report, Congress sought to “preserve[] strong incentives for service providers and copyright owners to cooperate to detect and deal with copyright infringements that take place in the digital networked environment.” The red flag knowledge standard was a key leverage point to encourage service providers to participate in the effort to detect and eliminate infringement. Unfortunately, courts thus far have interpreted the standard so narrowly that, beyond acting on takedown notices, service providers have little incentive to work together with copyright owners to prevent piracy. Even in cases with the most crimson of flags, courts have failed to strip service providers of their safe harbor protection. Perversely, the current case law incentivizes service providers to actively avoid doing anything when they see red flags, lest they gain actual knowledge of infringement and jeopardize their safe harbors. This is exactly the opposite of what Congress intended.

The Second and Ninth Circuits have interpreted the red flag knowledge standard to require knowledge of specific infringing material before service providers can lose their safe harbors. While tech giants might think this is great, it’s terrible for authors and artists who need service providers to carry their share of the load in combating online piracy. Creators are left in a miserable position where they bear the entire burden of policing infringement across an immense range of services, effectively making it impossible to prevent the deluge of piracy of their works. The Second and Ninth Circuits believe red flag knowledge should require specificity because otherwise service providers wouldn’t know exactly what material to remove when faced with a red flag. We argue that Congress intended service providers with red flag knowledge of infringing activity in general to then bear the burden of locating and removing the specific infringing material. This is the balance of responsibilities that Congress had in mind when it crafted the red flag knowledge standard and differentiated it from the actual knowledge standard.

But all hope is not lost. The Second and Ninth Circuits are but two appellate courts, and there are many others that have yet to rule on the red flag knowledge issue. Moreover, the Supreme Court has never interpreted the safe harbors of the DMCA. We hope that our comments will help expose the underlying problem that hurts so many creators today who are stuck playing the DMCA’s whack-a-mole game when their very livelihoods are at stake. Congress intended the DMCA to be the cornerstone of a shared-responsibility approach to fighting online piracy. Unfortunately, it has become a shield that allows service providers to enable piracy on a massive scale without making any efforts to prevent it beyond acting on takedown notices. The fact that search engines can still index The Pirate Bay—an emblematic piracy site that even has the word “pirate” in its name—without concern of losing their safe harbor protection is a testament to how the courts have turned Congress’ intent on its head. We hope that the Copyright Office’s study will shed light on this important issue.

To read our Section 512 comments, please click here.

Categories
Copyright Internet Uncategorized

Attacking the Notice-and-Staydown Straw Man

Ever since the U.S. Copyright Office announced its study of the DMCA last December, the notice-and-staydown issue has become a particularly hot topic. Critics of notice-and-staydown have turned up the volume, repeating the same vague assertions about freedom, censorship, innovation, and creativity that routinely pop up whenever someone proposes practical solutions to curb online infringement. Worse still, many critics don’t even take the time to look at what proponents of notice-and-staydown are suggesting, choosing instead to knock down an extremist straw man that doesn’t reflect anyone’s view of how the internet should function. A closer look at what proponents of notice-and-staydown are actually proposing reveals that the two sides aren’t nearly as far apart as critics would have us believe. This is particularly true when it comes to the issue of how well notice-and-staydown would accommodate fair use.

For example, Joshua Lamel’s recent piece at The Huffington Post claims that “innovation and creativity are still under attack” by the “entertainment industry’s intense and well-financed lobbying campaign” pushing for notice-and-staydown. Lamel argues that the “content filtering proposed by advocates of a ‘notice and staydown’ system . . . would severely limit new and emerging forms of creativity.” And his parade of horribles is rather dramatic: “Parents can forget posting videos of their kids dancing to music and candidates would not be able to post campaign speeches because of the music that plays in the background. Remix culture and fan fiction would likely disappear from our creative discourse.” Scary stuff, if true. But Lamel fails to cite a single source showing that artists, creators, and other proponents of notice-and-staydown are asking for anything close to this.

Similarly, Elliot Harmon of the Electronic Frontier Foundation (EFF) argues that “a few powerful lobbyists” are pushing for notice-and-staydown such that “once a takedown notice goes uncontested, the platform should have to filter and block any future uploads of the same allegedly infringing content.” Harmon also assumes the worst: “Under the filter-everything approach, legitimate uses of works wouldn’t get the reasonable consideration they deserve,” and “computers would still not be able to consider a work’s fair use status.” Like Lamel, Harmon claims that “certain powerful content owners seek to brush aside the importance of fair use,” but he doesn’t actually mention what these supposed evildoers have to say about notice-and-staydown.

Harmon’s suggestion that the reliance on uncontested takedown notices gives inadequate consideration to fair use is particularly strange as it directly contradicts the position taken by the EFF. Back in October of 2007, copyright owners (including CBS and Fox) and service providers (including Myspace and Veoh) promulgated a list of Principles for User Generated Content Services. These Principles recommend that service providers should use fingerprinting technology to enact notice-and-staydown, with the general caveat that fair use should be accommodated. Two weeks later, the EFF published its own list of Fair Use Principles for User Generated Video Content suggesting in detail how notice-and-staydown should respect fair use.

The EFF’s Fair Use Principles include the following:

The use of “filtering” technology should not be used to automatically remove, prevent the uploading of, or block access to content unless the filtering mechanism is able to verify that the content has previously been removed pursuant to an undisputed DMCA takedown notice or that there are “three strikes” against it:

1. the video track matches the video track of a copyrighted work submitted by a content owner;
2. the audio track matches the audio track of that same copyrighted work; and
3. nearly the entirety (e.g., 90% or more) of the challenged content is comprised of a single copyrighted work (i.e., a “ratio test”).

If filtering technologies are not reliably able to establish these “three strikes,” further human review by the content owner should be required before content is taken down or blocked.

Though not explicitly endorsing notice-and-staydown, the EFF thinks it’s entirely consistent with fair use so long as (1) the content at issue has already been subject to one uncontested takedown notice, or (2) the content at issue is at least a 90% match to a copyrighted work. And the funny thing is that supporters of notice-and-staydown today are actually advocating for what the EFF recognized to be reasonable over eight years ago.
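To make the EFF’s three-part test concrete, here is a minimal sketch in Python of the decision rule quoted above. The data structures, the equality-based fingerprint comparison, and the 90% default threshold are illustrative assumptions only; no actual filtering system exposes this interface.

    from dataclasses import dataclass

    @dataclass
    class Upload:
        video_fingerprint: str
        audio_fingerprint: str
        matched_fraction: float  # share of the upload made up of the matched work

    @dataclass
    class ReferenceWork:
        video_fingerprint: str
        audio_fingerprint: str

    def has_three_strikes(upload: Upload, work: ReferenceWork, ratio: float = 0.90) -> bool:
        """All three of the EFF's 'strikes' must be present."""
        video_match = upload.video_fingerprint == work.video_fingerprint  # strike 1
        audio_match = upload.audio_fingerprint == work.audio_fingerprint  # strike 2
        ratio_match = upload.matched_fraction >= ratio                    # strike 3: the 'ratio test'
        return video_match and audio_match and ratio_match

    def should_auto_block(upload: Upload, work: ReferenceWork,
                          removed_under_undisputed_notice: bool) -> bool:
        # Automatic removal or blocking is appropriate only if the content was
        # already removed under an undisputed takedown notice, or if all three
        # strikes are met; everything else falls through to human review.
        return removed_under_undisputed_notice or has_three_strikes(upload, work)

On this sketch, a thirty-second clip embedded in a longer commentary video fails the ratio test and is routed to human review rather than blocked, which is precisely the fair use accommodation the EFF had in mind.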

While Harmon never explicitly identifies the “powerful lobbyists” he accuses of wanting to trample on fair use, he does link to the Copyright Office’s recently-announced study of the DMCA and suggest that they can be found there. Reading through that announcement, I can only find three citations (in footnote 36) to people advocating for notice-and-staydown: (1) Professor Sean O’Connor of the University of Washington School of Law (and Senior Scholar at CPIP), (2) Paul Doda, Global Litigation Counsel at Elsevier, and (3) Maria Schneider, composer/conductor/producer. These three cites all point to testimonies given at the Section 512 of Title 17 hearing before the House Judiciary Committee in March of 2014, and they show that Harmon is attacking a straw man. In fact, all three of these advocates for notice-and-staydown seek a system that is entirely consistent with the EFF’s own Fair Use Principles.

Sean O’Connor seeks notice-and-staydown only for “reposted works,” that is, “ones that have already been taken down on notice” and that are “simply the original work reposted repeatedly by unauthorized persons.” His proposal only applies to works that “do not even purport to be transformative or non-infringing,” and he specifically excludes “mash-ups, remixes, covers, etc.” This not only comports with the EFF’s recommendations, it goes beyond them. Where the EFF would require either a previously uncontested notice or at least a 90% match, O’Connor thinks there should be both an uncontested notice and a 100% match.

The same is true for Paul Doda of Elsevier, who testifies that fingerprinting technology is “an appropriate and effective method to ensure that only copies that are complete or a substantially complete copy of a copyrighted work are prevented or removed by sites.” Doda explicitly notes that filtering is not suitable for “works that might require more detailed infringement analysis or ‘Fair Use’ analysis,” and he praises YouTube’s Content ID system “that can readily distinguish between complete copies of works and partial copies or clips.” Doda’s vision of notice-and-staydown is also more protective of fair use than the EFF’s Fair Use Principles. While the EFF suggests that a previously uncontested notice is sufficient, Doda instead only suggests that there be a substantial match.

Unlike O’Connor and Doda, Maria Schneider is not a lawyer. She’s instead a working musician, and her testimony reflects her own frustrations with the whack-a-mole problem under the DMCA’s current notice-and-takedown regime. As a solution, Schneider proposes that creators “should be able to prevent unauthorized uploading before infringement occurs,” and she points to YouTube’s Content ID as evidence that “it’s technically possible for companies to block unauthorized works.” While she doesn’t explicitly propose that there be a substantial match before content is filtered, Schneider gives the example of her “most recent album” being available “on numerous file sharing websites.” In other words, she’s concerned about verbatim copies of her works that aren’t possibly fair use, and nothing Schneider recommends contradicts the EFF’s own suggestions for accommodating fair use.

Lamel and Harmon paint a picture of powerful industry lobbyist boogeymen seeking an onerous system of notice-and-staydown that fails to adequately account for fair use, but neither produces any evidence to support these claims. Responses to the Copyright Office’s DMCA study are due on March 21st, and it will be interesting to see whether any of these supposed boogeymen really show up. There’s little doubt, though, that critics will continue attacking the notice-and-staydown straw man. And it’s really a shame, because advocates of notice-and-staydown are quite conscious of the importance of protecting fair use. This is easy to see, but first you have to look at what they’re really saying.

Categories
Copyright Innovation Internet Uncategorized

Notice-and-Staydown and Google Search: The Whack-A-Mole Problem Continues Unabated

After my last post discussing the necessity for notice-and-staydown to help copyright owners with the never-ending game of whack-a-mole under the DMCA, I was asked to clarify how this would work for Google Search in particular. The purpose of my post was to express the need for something better and the hope that fingerprinting technologies offer. But, admittedly, I did not do a good job of separating out how notice-and-staydown would work for hosting platforms as compared to search engines. I think the whack-a-mole problem with hosting sites is indeed different from the one with search engines, and while fingerprinting may work well for the former, it’s probably ill-suited for the latter.

It’s clear enough how fingerprinting technologies can be applied to hosting platforms, and the simple fact is that they are already being deployed. YouTube uses its own proprietary technology, Content ID, while other platforms, such as Facebook and SoundCloud, use Audible Magic. These technologies create digital fingerprints of content that are then compared to user-uploaded content. When there’s a match, the copyright owner can choose to allow, track, mute, monetize, or block the uploaded content.
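As a rough illustration of that match-then-apply-policy workflow, here is a hypothetical Python sketch. The exact-hash fingerprint and the function names are invented stand-ins; Content ID and Audible Magic use proprietary perceptual matching, not anything like this literal code.

    import hashlib
    from enum import Enum

    class Policy(Enum):
        ALLOW = "allow"
        TRACK = "track"
        MUTE = "mute"
        MONETIZE = "monetize"
        BLOCK = "block"

    def fingerprint(media_bytes: bytes) -> str:
        # Stand-in only: real systems use perceptual/acoustic fingerprints that
        # survive re-encoding and clipping, not an exact cryptographic hash.
        return hashlib.sha256(media_bytes).hexdigest()

    reference_db: dict[str, Policy] = {}  # fingerprint -> policy chosen by the owner

    def register_work(media_bytes: bytes, policy: Policy) -> None:
        """A copyright owner submits a reference copy and picks a policy."""
        reference_db[fingerprint(media_bytes)] = policy

    def handle_upload(media_bytes: bytes) -> Policy:
        """On upload, look for a match and apply the owner's chosen policy."""
        return reference_db.get(fingerprint(media_bytes), Policy.ALLOW)

The key design point the paragraph describes is that, on a match, the copyright owner’s chosen policy, not the platform’s judgment, determines what happens to the upload.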

There isn’t a lot of publicly-available information about how accurate these fingerprinting technologies are or how widely copyright owners utilize them. We do know from Google’s Katherine Oyama, who testified to Congress in early 2014, that “more than 4,000 partners” used Content ID at the time and that copyright owners had “‘claimed’ more than 200 million videos on YouTube” with the technology. She also acknowledged that “Content ID is not perfect, sometimes mistakenly ascribing ownership to the wrong content and sometimes failing to detect a match in a video.” Despite these imperfections, the scale of which she didn’t spell out, YouTube continues to offer Content ID to copyright owners.

Oyama also indicated that Content ID does not “work for a service provider that offers information location tools (like search engines and social networks) but does not possess copies of all the audio and video files that it links to.” This scenario is clearly different. When a site hosts content uploaded by its users, it can easily match those uploads to the content it’s already fingerprinted. When a site links to content that’s hosted elsewhere, it may not be possible to analyze that content in the same way. For example, the linked-to site could simply prevent automated crawling. Of course, not all sites block such crawling, but more would probably start doing so if fingerprinting were used in this way.

For Google Search, notice-and-staydown would likely not depend upon fingerprinting technology. Instead, the changes would come from: (1) delisting rogue sites, (2) promoting legitimate content, (3) improving auto-complete, and (4) ceasing to link to the very links that have already been taken down. These suggestions are nothing new, but it’s clear that Google has not done all it can to make them effective. This is not to say that improvements haven’t been made, and Google is to be commended for the work that it’s done so far. But it can and should do more.

Sticking with the example of The Hateful Eight from my prior post, it’s easy to see how Google Search promotes piracy. Using a fresh installation of Chrome so as not to skew the results, I need only type “watch hat” into Google Search before its auto-complete first suggests I search for “watch hateful 8 online.” After following this suggestion, the first seven results are links to obviously-infringing copies of the film. The first link points to the film at the watchmovie.ms site. A quick glance at that site’s homepage shows that it offers numerous (if not only) films that are still in theaters, including Spectre, Star Wars: The Force Awakens, Creed, and The Hateful Eight. In other words, Google’s first search result from its first suggested search term points me to an illicit copy of the film on a site that’s obviously dedicated to infringement.

I’ve never heard of watchmovie.ms, so I checked its WHOIS data. The site was registered on October 14th of last year, and Google’s Transparency Report indicates that it started receiving takedown notices for it just a few days later. To date, Google has received 568 requests to remove 724 infringing links to watchmovie.ms, but its search engine dutifully continues to crawl and index “about 39,000 results” at the site. And, for reasons I simply cannot fathom, Google prefers to send me to that pirate site rather than point me to Google Play (or to any number of other sites) where I can pre-order the film and “watch hateful 8 online” legally.

Making matters worse, at the bottom of the first page of search results for “watch hateful 8 online,” Google links to four DMCA takedown notices it received from copyright owners that resulted in five links being removed from the first page of results. These four notices, in turn, contain a combined total of 499 illicit links to The Hateful Eight that Google has already taken down. This truly boggles the mind. Google takes down five infringing links from one page of search results, consistent with the DMCA, but then it links to those five links along with 494 more such links. And these linked-to links are even better for infringers, since they’ve been vetted by Google as being worthy of taking down.

As the producer of The Hateful Eight, Richard Gladstein, relayed to The Hollywood Reporter, Google told him that it is “not in a position to decide what is legal and what is illegal online.” This is a cop-out. In other venues, Google contends that it’s doing a lot to fight piracy. It submitted comments to the U.S. Intellectual Property Enforcement Coordinator this past October explaining how “changes made to [its] algorithm have been highly effective in demoting sites receiving a high number of takedown notices.” This shows that it is indeed in a position to determine what is “illegal online” and to take action against pirate sites. But simply demoting these sites is not enough—they should be delisted altogether.

Everyone knows that The Pirate Bay is a pirate site, yet Google indexes “about 914,000 results” from just one of its domains. As of today, Google has received 187,540 requests to remove 3,628,242 links to that domain, yet Google doesn’t choose to delist the site from its results. Nor does it even appear to be demoting it. The top three search results for “thepiratebay hateful 8” are links to infringing copies of the film on The Pirate Bay. It’s clear that these links are infringing, yet Google makes copyright owners continue playing whack-a-mole for even the most obvious infringements.

This isn’t how the DMCA is supposed to work. Congress even envisioned this whack-a-mole scenario with search engines when it wrote the DMCA. The legislative history provides: “If, however, an Internet site is obviously pirate, then seeing it may be all that is needed for the service provider [i.e., search engine or other information location tool] to encounter a ‘red flag.’ A provider proceeding in the face of such a ‘red flag’ must do so without the benefit of a safe harbor.” The Pirate Bay is “obviously pirate,” and Google knows as much even without the 3.6 million takedown notices it’s received. It knows the same thing about lots of pirate sites, including the other domain names contained in its list of greatest hits.

Google could be doing more to help copyright owners with the whack-a-mole problem, but so far, it’s only taken a few baby steps. And when defending its refusal to delist obvious pirate sites, Google contends that it’s defending freedom of speech:

[W]hole-site removal sends the wrong message to other countries by favoring over-inclusive private censorship over the rule of law. If the U.S. embraces such an overbroad approach to address domestic law violations (e.g., copyright), it will embolden other countries to seek similar whole-site removal remedies for violations of their laws (e.g., insults to the king, dissident political speech). This would jeopardize free speech principles, emerging services, and the free flow of information online globally and in contexts far removed from copyright.

Delisting The Pirate Bay from Google Search isn’t about favoring “censorship over the rule of law.” It’s about Google favoring the rule of law over blatant criminal infringement and doing its part to be a good citizen in the digital economy where it plays no small role. The comparison of the conduct of criminal infringers to the speech of political dissidents rings hollow, and delisting the most obvious and egregious sites does not threaten free speech. Google already claims to demote pirate sites, yet that doesn’t “jeopardize free speech principles.” Neither will delisting them. And as long as Google consciously decides to index known pirate sites with its search engine, the whack-a-mole problem will only continue unabated for copyright owners.

Categories
Conferences Copyright Innovation Internet Uncategorized

Last Chance to Register for the Copyright and Technology NYC 2016 Conference

Tomorrow is the last chance to register for the Copyright and Technology NYC 2016 Conference. The conference will be held next Tuesday, January 19th, at New York University’s Kimmel Center.

In addition to Matthew Barblan and Devlin Hartline from CPIP, participants will come from the following organizations:

  • ASCAP
  • BMI
  • CBS
  • CCIA
  • Charter Communications
  • Cisco
  • Copyright Alliance
  • Copyright Clearance Center
  • Elsevier
  • Entertainment Software Assn.
  • Facebook
  • Getty Images
  • HBO
  • Imagem
  • NBA (National Basketball Association)
  • NMPA
  • RIAA
  • Scholastic
  • Sony Music Entertainment
  • SoundCloud
  • SoundExchange
  • Thomson Reuters
  • Time Warner
  • Twitter
  • U.S. Copyright Office
  • U.S. Patent and Trademark Office
  • UFC
  • Universal Music Group
  • Vevo
  • Viacom
  • Warner Music Group
  • Warner/Chappell Music

You can meet these people and many more!

To register for the conference, please click here.

Categories
Copyright Innovation Internet Uncategorized

Endless Whack-A-Mole: Why Notice-and-Staydown Just Makes Sense

Producer Richard Gladstein knows all about piracy. As he recently wrote for The Hollywood Reporter, his latest film, The Hateful Eight, was “viewed illegally in excess of 1.3 million times since its initial theatrical release on Christmas Day.” Gladstein is not shy about pointing fingers and naming names. He pins the blame, in no small part, on Google and (its subsidiary) YouTube—the “first and third most trafficked websites on the internet.” While acknowledging that fair use is important, Gladstein argues that it has become “an extremely useful tool for those looking to distract from or ignore the real copyright infringement issue: piracy.” His point is that it’s simply not fair use when someone uploads an entire copyrighted work to the internet, and that service providers are being disingenuous when they claim they can’t tell when something is infringing.

Gladstein questions why Google and YouTube pretend they are “unable to create and apply technical solutions to identify where illegal activity and copyright infringement are occurring and stop directing audiences toward them.” In his estimation, “Google and YouTube have the ability to create a vaccine that could eradicate the disease of content theft.” While Gladstein doesn’t mention the DMCA or its notice-and-takedown provisions specifically, I think what he has in mind is notice-and-staydown. That is, once a service provider is notified that the copyright owner has not authorized a given work to be uploaded to a given site, that service provider should not be able to maintain its safe harbor if it continues hosting or linking to the given work.
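To illustrate the difference in logic between the two regimes, here is a hedged Python sketch: under takedown alone, every re-upload requires a fresh notice, while under staydown the platform remembers what it has already removed. All names and data structures are invented for illustration; fingerprints here stand in for whatever matching technology a platform actually uses.

    noticed_fingerprints: set[str] = set()  # works already subject to a takedown notice
    hosted_content: dict[str, str] = {}     # content_id -> fingerprint

    def process_takedown_notice(content_id: str) -> None:
        """Notice-and-takedown: remove the item and remember its fingerprint."""
        fp = hosted_content.pop(content_id, None)
        if fp is not None:
            noticed_fingerprints.add(fp)

    def handle_reupload(content_id: str, fp: str) -> bool:
        """Notice-and-staydown: refuse re-uploads of already-noticed works,
        so the owner never has to send the same notice twice."""
        if fp in noticed_fingerprints:
            return False  # the work stays down without a new notice
        hosted_content[content_id] = fp
        return True

The whole whack-a-mole complaint reduces to the absence of something like that second function: without it, the same work can be re-uploaded immediately after each removal, and the burden of noticing it falls back on the copyright owner every time.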

No small amount of ink has been spilled pointing out that the DMCA’s notice-and-takedown provisions have led to an endless game of whack-a-mole for copyright owners. Google’s own transparency report boasts that its search engine has received requests to take down over 63 million URLs in the past month alone. And it helpfully tells us that it’s received over 21 million such requests over the past four years for just one site: rapidgator.net. Google’s transparency doesn’t extend to how many times it’s been asked to remove the same work, nor does it tell us anything about takedown requests for YouTube. But there’s no reason to think those numbers aren’t just as frustrating for copyright owners.

The question one should ask is why these numbers aren’t frustrating for Google and YouTube, as they have to deal with the deluge of notices. Apparently, they don’t mind at all. According to the testimony of Google’s Senior Copyright Policy Counsel, Katherine Oyama, the “DMCA’s shared responsibility approach works.” Oyama notes that Google has spent tens of millions of dollars creating the infrastructure necessary to efficiently respond to the increasing number of takedown notices it receives, but many (if not most) copyright owners don’t have those kinds of resources. For them, it’s a daily battle of manually locating infringements across the entire internet and sending takedown notices. For Google, it’s mostly-automated responses to take down content that otherwise brings ad-based revenue.

These struggles hit individual authors and artists the hardest. As the U.S. Copyright Office noted in its recently-announced study of the DMCA, “[m]any smaller copyright owners . . . lack access to third-party services and sophisticated tools to monitor for infringing uses, which can be costly, and must instead rely on manual search and notification processes—an effort that has been likened to ‘trying to empty the ocean with a teaspoon.’” What makes the process so frustrating—and futile—is the fact that the same works get uploaded to the same platforms time and time again. And any time spent sending the same takedown notice to the same service provider is time that is not spent honing one’s craft and creating new works.

Gladstein is correct: Service providers like Google and YouTube could be doing more. And, somewhat ironically, doing more for copyright owners would actually mean that both sides end up doing less. The obvious solution to the whack-a-mole problem is notice-and-staydown—it just makes sense. There’s simply no reason why a copyright owner should have to keep telling a service provider the same thing over and over again.

Those who object to notice-and-staydown often point out that the DMCA process is susceptible to abuse. Indeed, there are some who send notices in bad faith, perhaps to silence unwanted criticism or commentary. But there’s no reason to think that such abuse is the rule and not the exception. Google’s own numbers show that it complied with 97% of notices in 2011 and 99% of notices in 2013. That’s still a potentially-significant amount of abuse from notice-senders, but it’s also certainly a ton of intentional abuse from infringers whose conduct generated the legitimate notices in the first place. And the vast majority of those infringers won’t get so much as a slap on the wrist.

Turning back to Gladstein’s theme, discussions about fair use or takedown abuse are beside the point. The simple fact is that garden-variety copyright infringement involves neither issue. As CPIP Senior Scholar Sean O’Connor testified to Congress, “for many artists and owners the majority of postings are simply straight-on non-transformative copies seeking to evade copyright.” It’s this simple piracy, where entire works are uploaded to the internet for all to take, that concerns copyright owners most. Gladstein cares about the 1.3 million illicit distributions and performances of The Hateful Eight that are obviously infringing, not the commentary of critics that would obviously be fair use. And takedown notices sent because of these illicit uploads are anything but abusive—the abusers are the infringers.

The technology to make notice-and-staydown work already exists. For example, Audible Magic and YouTube both have the technology to create digital fingerprints of copyrighted works. When users later upload these same works to the internet, the digital fingerprints can be matched so that the copyright owner can then control whether to allow, monetize, track, or block the upload altogether. This technology is a great start, but it’s only as good as its availability to copyright owners. The continued proliferation of infringing works on YouTube suggests that this technology isn’t being deployed properly. And Google has no comparable technology available for its search engine, leaving copyright owners with little choice but to continue playing endless whack-a-mole.

Fortunately, the tide has been turning, especially as the technology and content industries continue to merge. And strides are being made in the courts as well. For example, a Court of Appeal in Germany recently held that YouTube has the duty to both take down infringing content and to make sure that it stays down. A quick search of YouTube today shows that The Hateful Eight, which is still in theaters, is legitimately available for pre-order and is illicitly available to be streamed right now. One wonders why YouTube chooses to compete with itself, especially when it has the tool to prevent such unfair competition. Regardless, there is real hope that Gladstein’s call for a “vaccine that could eradicate the disease of content theft” will be just what the doctor ordered—and that “vaccine” is notice-and-staydown.

[Update: This post unintentionally generated confusion as to whether I think notice-and-staydown means that fingerprinting technologies should be used with search engines. I do not think that would work well. I explain how search engines could do more to help copyright owners with the whack-a-mole problem in this follow-up post.]

Categories
Conferences Copyright Innovation Internet Uncategorized

Join Us at the Copyright and Technology NYC 2016 Conference on January 19

Co-produced by GiantSteps, the Copyright Society, and Musonomics, the Copyright and Technology NYC 2016 Conference will be held at New York University’s Kimmel Center on Tuesday, January 19th. CPIP is a proud Media Sponsor of the event.

The conference program is available here, and registration is still open here.

Jacqueline Charlesworth, General Counsel and Associate Register of Copyrights at the U.S. Copyright Office, will be the keynote speaker. The timing is quite fortunate, as the Copyright Office just last week announced a new study to evaluate the effectiveness of the DMCA’s safe harbor provisions in Section 512 of the Copyright Act. Among the issues to be studied are the “costs and burdens of the notice-and-takedown process” and “how successfully section 512 addresses online infringement.” These very issues will be discussed at the conference.

The conference panels will discuss topics including live streaming, notice-and-staydown, copyright management information, safe harbor eligibility, collective licensing, and mass digitization. CPIP’s Executive Director Matthew Barblan will moderate the panel on safe harbor eligibility, and CPIP’s Assistant Director Devlin Hartline will be a panelist discussing notice-and-staydown.

We hope you will join us for an exciting and intellectually rewarding event!

Categories
Administrative Agency Copyright Legislation Uncategorized

Principles and Priorities to Guide Congress’s Ongoing Copyright Review

Last week, CPIP published a new white paper, Copyright Principles and Priorities to Foster a Creative Digital Marketplace, by Sandra Aistars, Mark Schultz, and me, which draws from the testimonies and scholarly writings of CPIP Senior Scholars in order to guide Congress as it continues its comprehensive review of the Copyright Act. The white paper discusses the constitutional origins of copyright protection and offers principles and priorities for Congress to consider as it moves forward with the copyright review process.

The current copyright review began in early 2013, when Register of Copyrights Maria Pallante threw down the gauntlet in her Horace S. Manges lecture by urging Congress to create “the next great copyright act.” While noting that minor legislative tweaks certainly have their place, Register Pallante suggested that it’s time for Congress to do something far more sweeping. Since then, Congress has embarked on a comprehensive review of our copyright laws, conducting over twenty hearings since mid-2013.

CPIP Senior Scholars have been actively engaged in that hearing process. Prof. Sandra Aistars (while she was CEO of the Copyright Alliance) testified on the creative community’s contributions to innovation and suggested several principles for the review process. Prof. Mark Schultz offered testimony on the scope and subject matter of copyright, and Prof. Sean O’Connor gave testimony on the failure of the DMCA’s notice-and-takedown regime.

As we discuss in the white paper, the premise of our copyright system is that copyrights are more than just incentives to create—they’re also rewards to authors for their productive labors. The Founders understood that authors’ rights and the public good are complementary, and they knew that public interests are best served when individual interests are properly secured. That understanding has proved quite prescient, as copyright today drives many innovations that provide remarkable benefits to our economy, society, and culture.

In the white paper, we propose the following organizing principles for any further work reviewing or revising the Copyright Act:

    A. Stay True to Technology-Neutral Principles and Take the Long View
    B. Strengthen the Ability of Authors to Create and to Disseminate Works
    C. Value the Input of Creative Upstarts
    D. Ensure that Copyright Continues to Nurture Free Speech and Creative Freedom
    E. Rely on the Marketplace and Private Ordering Absent Clear Market Failures
    F. Value the Entire Body of Copyright Law

We then note that these principles in turn suggest that Congress prioritize the following areas for action:

    A. Copyright Office Modernization
    B. Registration and Recordation
    C. Mass Digitization and Orphan Works
    D. Small Claims
    E. Notice and Takedown
    F. Streaming Harmonization

The ball is still rolling with the copyright review process. The House Judiciary Committee began a listening tour this fall that kicked off in Nashville and then traveled to Silicon Valley and Los Angeles. Moreover, those who testified at the earlier hearings have been invited back to meet with Committee staff and discuss any further input they might have. And the Committee is open to “any interested party” coming in to discuss their interests.

All told, this lengthy review process places Congress in a good position to take the next step in bringing us closer to Register Pallante’s “next great copyright act.” And to that end, we hope that our white paper will help Congress keep the constitutional premise of copyright protection in mind as it chooses where we go from here.

To read the full white paper, please click here.