
Attacking the Notice-and-Staydown Straw Man

Ever since the U.S. Copyright Office announced its study of the DMCA last December, the notice-and-staydown issue has become a particularly hot topic. Critics of notice-and-staydown have turned up the volume, repeating the same vague assertions about freedom, censorship, innovation, and creativity that routinely pop up whenever someone proposes practical solutions to curb online infringement. Worse still, many critics don’t even take the time to look at what proponents of notice-and-staydown are suggesting, choosing instead to knock down an extremist straw man that doesn’t reflect anyone’s view of how the internet should function. A closer look at what proponents of notice-and-staydown are actually proposing reveals that the two sides aren’t nearly as far apart as critics would have us believe. This is particularly true when it comes to the issue of how well notice-and-staydown would accommodate fair use.

For example, Joshua Lamel’s recent piece at The Huffington Post claims that “innovation and creativity are still under attack” by the “entertainment industry’s intense and well-financed lobbying campaign” pushing for notice-and-staydown. Lamel argues that the “content filtering proposed by advocates of a ‘notice and staydown’ system . . . would severely limit new and emerging forms of creativity.” And his parade of horribles is rather dramatic: “Parents can forget posting videos of their kids dancing to music and candidates would not be able to post campaign speeches because of the music that plays in the background. Remix culture and fan fiction would likely disappear from our creative discourse.” Scary stuff, if true. But Lamel fails to cite a single source showing that artists, creators, and other proponents of notice-and-staydown are asking for anything close to this.

Similarly, Elliot Harmon of the Electronic Frontier Foundation (EFF) argues that “a few powerful lobbyists” are pushing for notice-and-staydown such that “once a takedown notice goes uncontested, the platform should have to filter and block any future uploads of the same allegedly infringing content.” Harmon also assumes the worst: “Under the filter-everything approach, legitimate uses of works wouldn’t get the reasonable consideration they deserve,” and “computers would still not be able to consider a work’s fair use status.” Like Lamel, Harmon claims that “certain powerful content owners seek to brush aside the importance of fair use,” but he doesn’t actually mention what these supposed evildoers have to say about notice-and-staydown.

Harmon’s suggestion that reliance on uncontested takedown notices gives inadequate consideration to fair use is particularly strange, as it directly contradicts a position the EFF itself has taken. Back in October of 2007, copyright owners (including CBS and Fox) and service providers (including Myspace and Veoh) promulgated a list of Principles for User Generated Content Services. These Principles recommend that service providers use fingerprinting technology to enact notice-and-staydown, with the general caveat that fair use should be accommodated. Two weeks later, the EFF published its own list of Fair Use Principles for User Generated Video Content suggesting in detail how notice-and-staydown should respect fair use.

The EFF’s Fair Use Principles include the following:

The use of “filtering” technology should not be used to automatically remove, prevent the uploading of, or block access to content unless the filtering mechanism is able to verify that the content has previously been removed pursuant to an undisputed DMCA takedown notice or that there are “three strikes” against it:

1. the video track matches the video track of a copyrighted work submitted by a content owner;
2. the audio track matches the audio track of that same copyrighted work; and
3. nearly the entirety (e.g., 90% or more) of the challenged content is comprised of a single copyrighted work (i.e., a “ratio test”).

If filtering technologies are not reliably able to establish these “three strikes,” further human review by the content owner should be required before content is taken down or blocked.

Though not explicitly endorsing notice-and-staydown, the EFF thinks it’s entirely consistent with fair use so long as (1) the content at issue has already been subject to one uncontested takedown notice, or (2) the content at issue is at least a 90% match to a copyrighted work. And the funny thing is that supporters of notice-and-staydown today are actually advocating for what the EFF recognized to be reasonable over eight years ago.
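The EFF's test can be sketched as a simple filtering decision. This is only an illustrative sketch of the principles summarized above; the function and parameter names are hypothetical, and the match inputs are assumed outputs from some fingerprinting system rather than any real API:

```python
# Hypothetical sketch of the EFF's Fair Use Principles filtering test.
# The inputs (video_match, audio_match, ratio, prior_uncontested_takedown)
# are assumed outputs of a fingerprinting/notice system; none of these
# names come from an actual platform API.

def should_auto_block(video_match: bool, audio_match: bool, ratio: float,
                      prior_uncontested_takedown: bool) -> bool:
    """Return True only when automated removal or blocking would be
    appropriate under the EFF's principles; anything failing the test
    should instead go to human review by the content owner."""
    if prior_uncontested_takedown:
        return True
    # The "three strikes": video match, audio match, and the ~90% ratio test.
    return video_match and audio_match and ratio >= 0.90
```

Under this sketch, a partial match — say, a clip using 30% of a film — fails the ratio test and falls through to human review rather than automatic blocking.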

While Harmon never explicitly identifies the “powerful lobbyists” he accuses of wanting to trample on fair use, he does link to the Copyright Office’s recently announced study of the DMCA and suggest that they can be found there. Reading through that announcement, I can only find three citations (in footnote 36) to people advocating for notice-and-staydown: (1) Professor Sean O’Connor of the University of Washington School of Law (and Senior Scholar at CPIP), (2) Paul Doda, Global Litigation Counsel at Elsevier, and (3) Maria Schneider, composer/conductor/producer. These three cites all point to testimony given at the Section 512 of Title 17 hearing before the House Judiciary Committee in March of 2014, and they show that Harmon is attacking a straw man. In fact, all three of these advocates for notice-and-staydown seek a system that is entirely consistent with the EFF’s own Fair Use Principles.

Sean O’Connor seeks notice-and-staydown only for “reposted works,” that is, “ones that have already been taken down on notice” and that are “simply the original work reposted repeatedly by unauthorized persons.” His proposal only applies to works that “do not even purport to be transformative or non-infringing,” and he specifically excludes “mash-ups, remixes, covers, etc.” This not only comports with the EFF’s recommendations, it goes beyond them. Where the EFF would require either a previously uncontested notice or at least a 90% match, O’Connor thinks there should be both an uncontested notice and a 100% match.

The same is true for Paul Doda of Elsevier, who testifies that fingerprinting technology is “an appropriate and effective method to ensure that only copies that are complete or a substantially complete copy of a copyrighted work are prevented or removed by sites.” Doda explicitly notes that filtering is not suitable for “works that might require more detailed infringement analysis or ‘Fair Use’ analysis,” and he praises YouTube’s Content ID system “that can readily distinguish between complete copies of works and partial copies or clips.” Doda’s vision of notice-and-staydown is also more protective of fair use than the EFF’s Fair Use Principles. While the EFF suggests that a previously uncontested notice is sufficient, Doda instead only suggests that there be a substantial match.

Unlike O’Connor and Doda, Maria Schneider is not a lawyer. She’s instead a working musician, and her testimony reflects her own frustrations with the whack-a-mole problem under the DMCA’s current notice-and-takedown regime. As a solution, Schneider proposes that creators “should be able to prevent unauthorized uploading before infringement occurs,” and she points to YouTube’s Content ID as evidence that “it’s technically possible for companies to block unauthorized works.” While she doesn’t explicitly propose that there be a substantial match before content is filtered, Schneider gives the example of her “most recent album” being available “on numerous file sharing websites.” In other words, she’s concerned about verbatim copies of her works that aren’t possibly fair use, and nothing Schneider recommends contradicts the EFF’s own suggestions for accommodating fair use.

Lamel and Harmon paint a picture of powerful industry lobbyist boogeymen seeking an onerous system of notice-and-staydown that fails to adequately account for fair use, but neither produces any evidence to support their claims. Responses to the Copyright Office’s DMCA study are due on March 21st, and it will be interesting to see whether any of these supposed boogeymen really show up. There’s little doubt, though, that critics will continue attacking the notice-and-staydown straw man. And it’s really a shame, because advocates of notice-and-staydown are quite conscious of the importance of protecting fair use. This is easy to see, but first you have to look at what they’re really saying.


Notice-and-Staydown and Google Search: The Whack-A-Mole Problem Continues Unabated

After my last post discussing the necessity for notice-and-staydown to help copyright owners with the never-ending game of whack-a-mole under the DMCA, I was asked to clarify how this would work for Google Search in particular. The purpose of my post was to express the need for something better and the hope that fingerprinting technologies offer. But, admittedly, I did not do a good job of separating out how notice-and-staydown would work for hosting platforms as compared to search engines. I think the whack-a-mole problem with hosting sites is indeed different than with search engines, and while fingerprinting may work well for the former, it’s probably ill-suited for the latter.

It’s clear enough how fingerprinting technologies can be applied to hosting platforms, and the simple fact is that they are already being deployed. YouTube uses its own proprietary technology, Content ID, while other platforms, such as Facebook and SoundCloud, use Audible Magic. These technologies create digital fingerprints of content that are then compared to user-uploaded content. When there’s a match, the copyright owner can choose to allow, track, mute, monetize, or block the uploaded content.
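The owner-selected policies just described can be sketched roughly as follows. The policy names mirror the options listed above, but the function and field names are illustrative only, not any platform's actual API, and the fingerprint matching itself is treated as a black box:

```python
# Rough sketch of a hosting platform applying a copyright owner's chosen
# policy to a fingerprint-matched upload. All names are hypothetical;
# real systems (Content ID, Audible Magic) expose different interfaces.

VALID_POLICIES = {"allow", "track", "mute", "monetize", "block"}

def apply_match_policy(upload_id: str, matched_work: str, policy: str) -> dict:
    """Return the outcome for an upload that matched a fingerprinted work."""
    if policy not in VALID_POLICIES:
        raise ValueError(f"unknown policy: {policy!r}")
    return {
        "upload": upload_id,
        "matched_work": matched_work,
        "action": policy,
        "stays_online": policy != "block",      # only "block" keeps it down
        "audio_removed": policy == "mute",
        "ads_revenue_to_owner": policy == "monetize",
    }
```

The point of the design is that the same match can produce very different outcomes depending on what the owner wants — the technology detects; the owner decides.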

There isn’t a lot of publicly available information about how accurate these fingerprinting technologies are or how widely copyright owners utilize them. We do know from Google’s Katherine Oyama, who testified to Congress in early 2014, that “more than 4,000 partners” used Content ID at the time and that copyright owners had “‘claimed’ more than 200 million videos on YouTube” with the technology. She also acknowledged that “Content ID is not perfect, sometimes mistakenly ascribing ownership to the wrong content and sometimes failing to detect a match in a video.” Despite these imperfections, the scale of which she didn’t spell out, YouTube continues to offer Content ID to copyright owners.

Oyama also indicated that Content ID does not “work for a service provider that offers information location tools (like search engines and social networks) but does not possess copies of all the audio and video files that it links to.” This scenario is clearly different. When a site hosts content uploaded by its users, it can easily match those uploads to the content it’s already fingerprinted. When a site links to content that’s hosted elsewhere, it may not be possible to analyze that content in the same way. For example, the linked-to site could simply prevent automated crawling. Of course, not all sites block such crawling, but more would probably start doing so if fingerprinting were used in this way.

For Google Search, notice-and-staydown would likely not depend upon fingerprinting technology. Instead, the changes would come from: (1) delisting rogue sites, (2) promoting legitimate content, (3) improving auto-complete, and (4) ceasing to link to the very links that have already been taken down. These suggestions are not anything new, but it’s clear that Google has not done all it can to make them effective. This is not to say that improvements haven’t been made, and Google is to be commended for the work that it’s done so far. But it can and should do more.

Sticking with the example of The Hateful Eight from my prior post, it’s easy to see how Google Search promotes piracy. Using a fresh installation of Chrome so as not to skew the results, I need only type “watch hat” into Google Search before its auto-complete first suggests I search for “watch hateful 8 online.” After following this suggestion, the first seven results are links to obviously infringing copies of the film. The first link points to the film at the watchmovie.ms site. A quick glance at that site’s homepage shows that it offers numerous films that are still in theaters (and perhaps only such films), including Spectre, Star Wars: The Force Awakens, Creed, and The Hateful Eight. In other words, Google’s first search result from its first suggested search term points me to an illicit copy of the film on a site that’s obviously dedicated to infringement.

I’ve never heard of watchmovie.ms, so I checked its WHOIS data. The site was registered on October 14th of last year, and Google’s Transparency Report indicates that Google started receiving takedown notices for the site just a few days later. To date, Google has received 568 requests to remove 724 infringing links to watchmovie.ms, but its search engine dutifully continues to crawl and index “about 39,000 results” at the site. And, for reasons I simply cannot fathom, Google prefers to send me to that pirate site rather than point me to Google Play (or to any number of other sites) where I can pre-order the film and “watch hateful 8 online” legally.

Making matters worse, at the bottom of the first page of search results for “watch hateful 8 online,” Google links to four DMCA takedown notices it received from copyright owners that resulted in five links being removed from the first page of results. These four notices, in turn, contain a combined total of 499 illicit links to The Hateful Eight that Google has already taken down. This truly boggles the mind. Google takes down five infringing links from one page of search results, consistent with the DMCA, but then it links to those five links along with 494 more such links. And these linked-to links are even better for infringers, since they’ve been vetted by Google as being worthy of taking down.

As the producer of The Hateful Eight, Richard Gladstein, relayed to The Hollywood Reporter, Google told him that it is “not in a position to decide what is legal and what is illegal online.” This is a cop out. In other venues, Google contends that it’s doing a lot to fight piracy. It submitted comments to the U.S. Intellectual Property Enforcement Coordinator this past October explaining how “changes made to [its] algorithm have been highly effective in demoting sites receiving a high number of takedown notices.” This shows that it is indeed in a position to determine what is “illegal online” and to take action against pirate sites. But simply demoting these sites is not enough—they should be delisted altogether.

Everyone knows that The Pirate Bay is a pirate site, yet Google indexes “about 914,000 results” from just one of its domains. As of today, Google has received 187,540 requests to remove 3,628,242 links to that domain, yet Google doesn’t choose to delist the site from its results. Nor does it even appear to be demoting it. The top three search results for “thepiratebay hateful 8” are links to infringing copies of the film on The Pirate Bay. It’s clear that these links are infringing, yet Google makes copyright owners continue playing whack-a-mole for even the most obvious infringements.

This isn’t how the DMCA is supposed to work. Congress even envisioned this whack-a-mole scenario with search engines when it wrote the DMCA. The legislative history provides: “If, however, an Internet site is obviously pirate, then seeing it may be all that is needed for the service provider [i.e., search engine or other information location tool] to encounter a ‘red flag.’ A provider proceeding in the face of such a ‘red flag’ must do so without the benefit of a safe harbor.” The Pirate Bay is “obviously pirate,” and Google knows as much even without having been asked to remove 3.6 million links to the site. It knows the same thing about lots of pirate sites, including the other domain names contained in its list of greatest hits.

Google could be doing more to help copyright owners with the whack-a-mole problem, but so far, it’s only taken a few baby steps. And when defending its refusal to delist obvious pirate sites, Google contends that it’s defending freedom of speech:

[W]hole-site removal sends the wrong message to other countries by favoring over-inclusive private censorship over the rule of law. If the U.S. embraces such an overbroad approach to address domestic law violations (e.g., copyright), it will embolden other countries to seek similar whole-site removal remedies for violations of their laws (e.g., insults to the king, dissident political speech). This would jeopardize free speech principles, emerging services, and the free flow of information online globally and in contexts far removed from copyright.

Delisting The Pirate Bay from Google Search isn’t about favoring “censorship over the rule of law.” It’s about Google favoring the rule of law over blatant criminal infringement and doing its part to be a good citizen in the digital economy where it plays no small role. The comparison of the conduct of criminal infringers to the speech of political dissidents rings hollow, and delisting the most obvious and egregious sites does not threaten free speech. Google already claims to demote pirate sites, yet that doesn’t “jeopardize free speech principles.” Neither will delisting them. And as long as Google consciously decides to index known pirate sites with its search engine, the whack-a-mole problem will only continue unabated for copyright owners.


Last Chance to Register for the Copyright and Technology NYC 2016 Conference

Tomorrow is the last chance to register for the Copyright and Technology NYC 2016 Conference. The conference will be held next Tuesday, January 19th, at New York University’s Kimmel Center.

In addition to Matthew Barblan and Devlin Hartline from CPIP, participants will come from the following organizations:

  • ASCAP
  • BMI
  • CBS
  • CCIA
  • Charter Communications
  • Cisco
  • Copyright Alliance
  • Copyright Clearance Center
  • Elsevier
  • Entertainment Software Assn.
  • Facebook
  • Getty Images
  • HBO
  • Imagem
  • NBA (National Basketball Association)
  • NMPA
  • RIAA
  • Scholastic
  • Sony Music Entertainment
  • SoundCloud
  • SoundExchange
  • Thomson Reuters
  • Time Warner
  • Twitter
  • U.S. Copyright Office
  • U.S. Patent and Trademark Office
  • UFC
  • Universal Music Group
  • Vevo
  • Viacom
  • Warner Music Group
  • Warner/Chappell Music

You can meet representatives from these organizations and many more!

To register for the conference, please click here.


Endless Whack-A-Mole: Why Notice-and-Staydown Just Makes Sense

Producer Richard Gladstein knows all about piracy. As he recently wrote for The Hollywood Reporter, his latest film, The Hateful Eight, was “viewed illegally in excess of 1.3 million times since its initial theatrical release on Christmas Day.” Gladstein is not shy about pointing fingers and naming names. He pins the blame, in no small part, on Google and (its subsidiary) YouTube—the “first and third most trafficked websites on the internet.” While acknowledging that fair use is important, Gladstein argues that it has become “an extremely useful tool for those looking to distract from or ignore the real copyright infringement issue: piracy.” His point is that it’s simply not fair use when someone uploads an entire copyrighted work to the internet, and claims that service providers can’t tell when something is infringing are disingenuous.

Gladstein questions why Google and YouTube pretend they are “unable to create and apply technical solutions to identify where illegal activity and copyright infringement are occurring and stop directing audiences toward them.” In his estimation, “Google and YouTube have the ability to create a vaccine that could eradicate the disease of content theft.” While Gladstein doesn’t mention the DMCA or its notice-and-takedown provisions specifically, I think what he has in mind is notice-and-staydown. That is, once a service provider is notified that the copyright owner has not authorized a given work to be uploaded to a given site, that service provider should not be able to maintain its safe harbor if it continues hosting or linking to the given work.
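The staydown idea just described can be sketched in a few lines. This is a minimal illustration only: the exact-hash "fingerprint" here is a stand-in for the perceptual fingerprinting a real system would use (e.g., Content ID or Audible Magic), and all names are hypothetical:

```python
import hashlib

# Fingerprints of works that have already been the subject of a
# takedown notice. A real system would persist these and use perceptual
# fingerprints; an exact hash is used here purely for illustration.
noticed_fingerprints = set()

def fingerprint(content: bytes) -> str:
    # Stand-in for a perceptual fingerprint of the uploaded file.
    return hashlib.sha256(content).hexdigest()

def record_takedown(content: bytes) -> None:
    """Called once a takedown notice for this work has been processed."""
    noticed_fingerprints.add(fingerprint(content))

def accept_upload(content: bytes) -> bool:
    """Staydown: reject re-uploads of works already taken down."""
    return fingerprint(content) not in noticed_fingerprints
```

In other words, once the service provider has been told a work is unauthorized, the system itself remembers that fact, rather than waiting for the copyright owner to send the same notice again.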

No small amount of ink has been spilled pointing out that the DMCA’s notice-and-takedown provisions have led to an endless game of whack-a-mole for copyright owners. Google’s own transparency report boasts that its search engine has received requests to take down over 63 million URLs in the past month alone. And it helpfully tells us that it’s received over 21 million such requests over the past four years for just one site: rapidgator.net. Google’s transparency doesn’t extend to how many times it’s been asked to remove the same work, nor does it tell us anything about takedown requests for YouTube. But there’s no reason to think those numbers aren’t equally frustrating for copyright owners.

The question one should ask is why these numbers aren’t frustrating for Google and YouTube, as they have to deal with the deluge of notices. Apparently, they don’t mind at all. According to the testimony of Google’s Senior Copyright Policy Counsel, Katherine Oyama, the “DMCA’s shared responsibility approach works.” Oyama notes that Google has spent tens of millions of dollars creating the infrastructure necessary to efficiently respond to the increasing number of takedown notices it receives, but many (if not most) copyright owners don’t have those kinds of resources. For them, it’s a daily battle of manually locating infringements across the entire internet and sending takedown notices. For Google, it’s mostly automated responses to take down content that otherwise brings ad-based revenue.

These struggles hit individual authors and artists the hardest. As the U.S. Copyright Office noted in its recently-announced study of the DMCA, “[m]any smaller copyright owners . . . lack access to third-party services and sophisticated tools to monitor for infringing uses, which can be costly, and must instead rely on manual search and notification processes—an effort that has been likened to ‘trying to empty the ocean with a teaspoon.’” What makes the process so frustrating—and futile—is the fact that the same works get uploaded to the same platforms time and time again. And any time spent sending the same takedown notice to the same service provider is time that is not spent honing one’s craft and creating new works.

Gladstein is correct: Service providers like Google and YouTube could be doing more. And, somewhat ironically, doing more for copyright owners would actually mean that both sides end up doing less. The obvious solution to the whack-a-mole problem is notice-and-staydown—it just makes sense. There’s simply no reason why a copyright owner should have to keep telling a service provider the same thing over and over again.

Those who object to notice-and-staydown often point out that the DMCA process is susceptible to abuse. Indeed, there are some who send notices in bad faith, perhaps to silence unwanted criticism or commentary. But there’s no reason to think that such abuse is the rule and not the exception. Google’s own numbers show that it complied with 97% of notices in 2011 and 99% of notices in 2013. The 1–3% of notices it rejected may represent a significant amount of abuse by notice-senders, but the overwhelming majority it complied with represent intentional abuse by the infringers whose conduct generated those legitimate notices in the first place. And the vast majority of those infringers won’t get so much as a slap on the wrist.

Turning back to Gladstein’s theme, discussions about fair use or takedown abuse are beside the point. The simple fact is that garden-variety copyright infringement involves neither issue. As CPIP Senior Scholar Sean O’Connor testified to Congress, “for many artists and owners the majority of postings are simply straight-on non-transformative copies seeking to evade copyright.” It’s this simple piracy, where entire works are uploaded to the internet for all to take, that concerns copyright owners most. Gladstein cares about the 1.3 million illicit distributions and performances of The Hateful Eight that are obviously infringing, not the commentary of critics that would obviously be fair use. And takedown notices sent because of these illicit uploads are anything but abusive—the abusers are the infringers.

The technology to make notice-and-staydown work already exists. For example, Audible Magic and YouTube both have the technology to create digital fingerprints of copyrighted works. When users later upload these same works to the internet, the digital fingerprints can be matched so that the copyright owner can then control whether to allow, monetize, track, or block the upload altogether. This technology is a great start, but it’s only as good as its availability to copyright owners. The continued proliferation of infringing works on YouTube suggests that this technology isn’t being deployed properly. And Google has no comparable technology available for its search engine, leaving copyright owners with little choice but to continue playing endless whack-a-mole.

Fortunately, the tide has been turning, especially as the technology and content industries continue to merge. And strides are being made in the courts as well. For example, a Court of Appeal in Germany recently held that YouTube has the duty to both take down infringing content and to make sure that it stays down. A quick search of YouTube today shows that The Hateful Eight, which is still in theaters, is legitimately available for pre-order and is illicitly available to be streamed right now. One wonders why YouTube chooses to compete with itself, especially when it has the tool to prevent such unfair competition. Regardless, there is real hope that Gladstein’s call for a “vaccine that could eradicate the disease of content theft” will be just what the doctor ordered—and that “vaccine” is notice-and-staydown.

[Update: This post unintentionally generated confusion as to whether I think notice-and-staydown means that fingerprinting technologies should be used with search engines. I do not think that would work well. I explain how search engines could do more to help copyright owners with the whack-a-mole problem in this follow-up post.]


Join Us at the Copyright and Technology NYC 2016 Conference on January 19

Co-produced by GiantSteps, the Copyright Society, and Musonomics, the Copyright and Technology NYC 2016 Conference will be held at New York University’s Kimmel Center on Tuesday, January 19th. CPIP is a proud Media Sponsor of the event.

The conference program is available here, and registration is still open here.

Jacqueline Charlesworth, General Counsel and Associate Register of Copyrights at the U.S. Copyright Office, will be the keynote speaker. The timing is very fortuitous, as the Copyright Office just last week announced a new study to evaluate the effectiveness of the DMCA’s safe harbor provisions in Section 512 of the Copyright Act. Among the issues to be studied are the “costs and burdens of the notice-and-takedown process” and “how successfully section 512 addresses online infringement.” These very issues will be discussed at the conference.

The conference panels will discuss topics including live streaming, notice-and-staydown, copyright management information, safe harbor eligibility, collective licensing, and mass digitization. CPIP’s Executive Director Matthew Barblan will moderate the panel on safe harbor eligibility, and CPIP’s Assistant Director Devlin Hartline will be a panelist discussing notice-and-staydown.

We hope you will join us for an exciting and intellectually rewarding event!


One Step Closer to Mega Justice: Kim Dotcom Loses Extradition Hearing

Cross-posted from the Law Theories blog.

The news broke last night that Kim Dotcom has suffered a major setback in his bid to avoid standing trial in the Eastern District of Virginia on multiple felony charges relating to his Megaupload website. District Judge N.R. Dawson of the Auckland District Court in New Zealand dropped his 271-page bombshell opinion, holding that Kim Dotcom and his three co-defendants are eligible to be extradited to the United States. It’s been nearly four years since Dotcom was arrested in a dramatic raid by the New Zealand Police in early 2012, and this decision means that the victims of the so-called “Mega Conspiracy” are finally one step closer to justice.

Judge Dawson’s opinion is a remarkable read. He spends much of the opinion (pp. 16–172) going over a “selection” of the evidence and its inferences as proffered by the United States. For people claiming that there is no evidence that Dotcom did anything wrong, this should be required reading. The number of smoking-gun emails and Skype conversations is staggering. Luckily for the victims, the members of the Mega Conspiracy were not very good at covering their tracks.

There’s no doubt that Dotcom and his international team of lawyers have done a good job of stalling the proceedings. I’m sure most criminal defendants would love to receive even half as much legal representation as Dotcom has been able to procure. But at the end of the day, Judge Dawson is completely unimpressed with all of the irrelevant things that Dotcom argued. From the get-go, Dotcom tried to turn this extradition hearing into a trial on the merits under U.S. law—incredibly ironic, given that such a trial is the last thing Dotcom really wants. Judge Dawson thoroughly casts aside Dotcom’s extraneous nonsense and gets right to the heart of the matter.

As a preliminary matter, Judge Dawson rejects all three of Dotcom’s applications to stay the proceedings. The legal issues to be decided under New Zealand’s Extradition Act 1999, he says, are simple:

The Court must be satisfied, [1] that the appropriate supporting documentation has been produced, [2] that the offence is an extradition offence in relation to the extradition country and [3] that there is sufficient evidence relating to the offending that would justify the person’s trial if the conduct constituting the offence had occurred in New Zealand.

Note that Dotcom’s guilt under U.S. law is not something Judge Dawson decides—and rightfully so. That’s a matter for District Judge Liam O’Grady to oversee once Dotcom stands trial in the Eastern District of Virginia. Judge Dawson notes that Dotcom is “entitled to challenge the correctness of the evidence,” but that “any challenge to the evidence must be more than simply an alternative explanation.” And this is a big problem for Dotcom: All of his denials and handwavings are simply alternative explanations, and they don’t undermine the reliability of the evidence. The question, as Judge Dawson sees it, is whether the evidence would support a trial under New Zealand law. U.S. law, he holds, is irrelevant.

Dotcom claimed that he couldn’t conduct a proper defense because he couldn’t hire the experts he wanted to rebut the U.S. government’s read of the applicable law, but Judge Dawson would have none of it: “It is not within the jurisdiction of this Court to rule upon competing views of USA law. That is a matter for trial in the US court.” Furthermore, the fact that Dotcom could offer alternative explanations for the evidence just didn’t matter: “It is not the role of this court to weigh the merits of the applicant’s case against the possible alternative explanations that might be provided by the respondents.”

In one of my favorite passages, Judge Dawson mentions Dotcom’s claim that he’s being railroaded and that “public confidence in the integrity of the criminal justice system would be undermined if a stay was not granted.” Judge Dawson returns fire by pointing out that Dotcom has enjoyed “full access to the New Zealand legal system” and that granting the stay “with total disregard for the law,” as Dotcom would have it, “is more likely to undermine public confidence in the criminal justice system[.]” Touché!

It’s not until page 221 that Judge Dawson finally turns to the merits of whether Dotcom is eligible to be surrendered to the United States. That analysis, under Section 24, has two steps: (1) whether the alleged offense is an “extradition offence,” and (2) whether a prima facie case has been established. Judge Dawson runs through this two-step test for all thirteen counts: conspiracy to commit racketeering, conspiracy to commit criminal copyright infringement, conspiracy to commit money laundering, five counts of criminal copyright infringement, and five counts of fraud by wire. Referencing the 150-plus pages of evidence he laid out earlier in the opinion, Judge Dawson holds that a prima facie case for each of the counts has been established for each of the defendants.

This is a far cry from the affidavit submitted by Harvard’s Larry Lessig, who claimed that there was no prima facie case against any of the defendants on any of the counts.

One issue worth mentioning is Dotcom’s claim that he would be eligible for the DMCA safe harbors in the United States. Judge Dawson has nothing to say about this, as he doesn’t think U.S. law is relevant to this hearing. But he does have this to say about Dotcom’s eligibility for the safe harbors under Section 92B of New Zealand’s Copyright Act:

The purpose and intent of s 92B is to provide a “safe harbour” for ISPs that unintentionally have their storage used by others to store illegitimate materials, a true “dumb pipe”. . . . Protection under s 92B is not available on the evidence adduced at this hearing. There is evidence before the Court sufficient to establish that the respondents were in the course of a business that knowingly uploaded copies of copyright protected material, they kept it with a view to committing acts that would infringe the copyright, they were complicit in promulgating that they possessed such material, and distributed it to large numbers of people for their own financial gain.

In other words, Dotcom would get no safe harbors under New Zealand’s version of the DMCA.

Lastly, Judge Dawson entertains Dotcom’s entertaining claim that a special exception should be made just for him because he is the victim of a politically motivated prosecution orchestrated by “US movie studios executives” and “the Vice President of the United States of America, Mr Joe Biden.” (Seriously, he argued this.) Judge Dawson easily brushes this off:

That Mr Dotcom has a different point of view about the use of the internet from others including the US movie studios does not have the hallmarks of what is ordinarily regarded as political persecution for political beliefs. It is a difference of opinion about a business matter, the use of the internet and the application of copyright law.

No dice.

In the end, Judge Dawson notes that “it is not possible to issue decisions that would be less than encyclopaedic in length in order to cover every minor point alluded to in the hearing.” Regardless, he finds that “[t]here is no need to do this” since none of Dotcom’s arguments “come near to undermining the applicant’s case[.]” As such, Judge Dawson concludes that “[t]he overwhelming preponderance of evidence . . . establishes a prima facie case to answer for all respondents on each of the counts” and that “the respondents are all eligible for surrender on all thirteen counts in the superseding indictment.”

Dotcom will appeal, naturally, and his Twitter feed indicates that he feels good about his chances. I honestly can’t fathom why. It seems to me that his only success so far has been in delaying the inevitable, but that inevitably he’ll have to do the one thing that he doesn’t really want to do—defend his case on the merits under U.S. law when it really counts.

Categories
Copyright Internet Uncategorized

BMG v. Cox: ISP Liability and the Power of Inference

Cross-posted from the Law Theories blog.

As readers are likely aware, the jury verdict in BMG v. Cox was handed down on December 17th. The jury found that BMG had proved by a preponderance of the evidence that Cox’s users were direct infringers and that Cox is contributorily liable for that infringement. The interesting thing, to me at least, about these findings is that they were both proved by circumstantial evidence. That is, the jury inferred that Cox’s users were direct infringers and that Cox had the requisite knowledge to make it a contributory infringer. Despite all the headlines about smoking-gun emails from Cox’s abuse team, the case really came down to a matter of inference.

Direct Infringement of the Public Distribution Right

Section 106(3) grants copyright owners the exclusive right “to distribute copies . . . of the copyrighted work to the public[.]” In the analog days, a copy had to first be made before it could be distributed, and this led to much of the case law focusing on the reproduction right. However, in the digital age, the public distribution usually occurs before the reproduction. In an upload-download scenario, the uploader publicly distributes the work and then the downloader makes the copy. This has brought much more attention to the contours of the public distribution right, and there are some interesting splits in the case law looking at online infringement.

Though from the analog world, there is one case that is potentially binding authority here: Hotaling v. Church of Jesus Christ of Latter-Day Saints. Handed down by the Fourth Circuit in 1997, Hotaling held that “a library distributes a published work . . . when it places an unauthorized copy of the work in its collection, includes the copy in its catalog or index system, and makes the copy available to the public.” The copies at issue in Hotaling were in microfiche form, and they could not be checked out by patrons. This meant that the plaintiff could not prove that the library actually disseminated the work to any member of the public. Guided by equitable concerns, the Fourth Circuit held that “a copyright holder would be prejudiced by a library that does not keep records of public use,” thus allowing the library to “unjustly profit by its own omission.”

Whether this aspect of Hotaling applies in the digital realm has been a point of contention, and the courts have been split on whether a violation of the public distribution right requires actual dissemination. As I’ve written about before, the Nimmer on Copyright treatise now takes the position that “[n]o consummated act of actual distribution need be demonstrated in order to implicate the copyright owner’s distribution right,” but that view has yet to be universally adopted. Regardless, even if actual dissemination is required, Hotaling can be read to stand for the proposition that it can be proved by circumstantial evidence. As one court put it, “Hotaling seems to suggest” that “evidence that a defendant made a copy of a work available to the public might, in conjunction with other circumstantial evidence, support an inference that the copy was likely transferred to a member of the public.”

The arguments made by BMG and Cox hashed out this now-familiar landscape. Cox argued that merely offering a work to the public is not enough: “Section 106(3) makes clear that Congress intended not to include unconsummated transactions.” It then distinguished Hotaling on its facts, suggesting that, unlike the plaintiff there, BMG was “in a position to gather information about alleged infringement, even if [it] chose not to.” In opposition, BMG pointed to district court cases citing Hotaling, as well as to the Nimmer treatise, for the proposition that making available is public distribution simpliciter.

As to Cox’s attempt to distinguish Hotaling on the facts, BMG argued that Cox was the one that failed “to record actual transmissions of infringing works by its subscribers over its network.” Furthermore, BMG argued that “a factfinder can infer that the works at issue were actually shared from the evidence that they were made available,” and it noted that cases Cox had relied on “permit the inference that dissemination actually took place.” In its reply brief, Cox faulted BMG for reading Hotaling so broadly, but it noticeably had nothing to say about the propriety of inferring that dissemination had actually taken place.

In his memorandum opinion issued on December 1st, District Judge Liam O’Grady sided with Cox on the making available issue and with BMG on the permissibility of inference. Reading Hotaling narrowly, Judge O’Grady held that the Fourth Circuit merely “articulated a principle that applies only in cases where it is impossible for a copyright owner to produce proof of actual distribution.” And without the making available theory on the table, “BMG must show an actual dissemination of a copyrighted work.” Nonetheless, Judge O’Grady held that the jury could infer actual dissemination based on the circumstantial evidence collected by BMG’s agent, Rightscorp:

Cox’s argument ignores the fact that BMG may establish direct infringement using circumstantial evidence that gives rise to an inference that Cox account holders or other authorized users accessed its service to directly infringe. . . . Rightscorp claims to have identified 2.5 million instances of Cox users making BMG’s copyrighted works available for download, and Rightscorp itself downloaded approximately 100,000 full copies of BMG’s works using Cox’s service. BMG has presented more than enough evidence to raise a genuine issue of material fact as to whether Cox account holders directly infringed its exclusive rights.

The jury was ultimately swayed by this circumstantial evidence, inferring that BMG had proved that it was more likely than not that Cox’s users had actually disseminated BMG’s copyrighted works. But proving direct infringement is only the first step, and BMG next had to demonstrate that Cox is contributorily liable for that infringement. As we’ll see, this too was proved by inference.

Contributory Infringement of the Public Distribution Right

While the Patent Act explicitly provides circumstances in which someone “shall be liable as a contributory infringer,” the Copyright Act’s approach is much less direct. As I’ve written about before, the entire body of judge-made law concerning secondary liability was imported into the 1976 Act via the phrase “to authorize” in Section 106. Flimsy as this textual hook may be, the Supreme Court held in Sony that nothing precludes “the imposition of liability for copyright infringements on certain parties who have not themselves engaged in the infringing activity.” Indeed, the Court noted that “the concept of contributory infringement is merely a species of the broader problem of identifying the circumstances in which it is just to hold one individual accountable for the actions of another.”

Arguments about when it’s “just” to hold someone responsible for the infringement committed by another have kept lawyers busy for well over a century. The Second Circuit’s formulation of the contributory liability test in Gershwin has proved particularly influential over the past four decades: “[O]ne who, with knowledge of the infringing activity, induces, causes or materially contributes to the infringing conduct of another, may be held liable as a ‘contributory’ infringer.” This test has two elements: (1) knowledge, and (2) inducement, causation, or material contribution. Of course, going after the service provider, as opposed to going after the individual direct infringers, often makes sense. The Supreme Court noted this truism in Grokster:

When a widely shared service or product is used to commit infringement, it may be impossible to enforce rights in the protected work effectively against all direct infringers, the only practical alternative being to go against the distributor of the copying device for secondary liability on a theory of contributory or vicarious infringement.

And this is what BMG has done here by suing Cox instead of Cox’s users. The Supreme Court in Grokster also introduced a bit of confusion into the contributory infringement analysis. The theory at issue there was inducement—the plaintiffs argued that Grokster induced its users to infringe. Citing Gershwin, the Supreme Court stated this test: “One infringes contributorily by intentionally inducing or encouraging direct infringement[.]” Note how this is narrower than the test in Gershwin, which for the second element also permits causation or material contribution. While, on its face, this can plausibly be read to imply a narrowing of the traditional test for contributory infringement, the better read is that the Court merely mentioned the part of the test (inducement) that it was applying.

Nevertheless, Cox argued here that Grokster jettisoned a century’s worth of the material contribution flavor of contributory infringement: “While some interpret Grokster as creating a distinct inducement theory, the Court was clear: Grokster is the contributory standard.” Cox wanted the narrower inducement test to apply here because BMG would have a much harder time proving inducement over material contribution. As such, Cox focused on its lack of inducing behavior, noting that it did not take “any active steps to foster infringement.”

Despite its insistence that “Grokster supplanted the earlier Gershwin formulation,” Cox nevertheless argued that BMG’s anticipated material contribution claim “fails as a matter of law” since the knowledge element could not be proved. According to Cox, “Rightscorp’s notices do not establish Cox’s actual knowledge of any alleged infringement because notices are merely allegations of infringement[.]” Nor does the fact that it refused to receive notices from Rightscorp make it “willfully blind to copyright infringement on its network.” Cox didn’t argue that its service did not materially contribute to the infringement, and rightfully so—the material contribution element here is a no-brainer.

In opposition, BMG focused on Gershwin, declaring it to be “the controlling test for contributory infringement.” BMG noted that “Cox is unable to cite a single case adopting” its narrow “reading of Grokster, under which it would have silently overruled forty years of contributory infringement case law” applying Gershwin. (Indeed, I have yet to see a single court adopt Cox’s restrictive read of Grokster. This hasn’t stopped defendants from trying, though.) Turning to the material contribution element, BMG pointed out that “Cox does not dispute that it materially contributed to copyright infringement by its subscribers.” Again, Cox didn’t deny material contribution because it couldn’t win on this argument—the dispositive issue here is knowledge.

On the knowledge element, BMG proffered two theories. The first was that Cox is deemed “to have knowledge of infringement on its system where it knows or has reason to know of the infringing activity.” Here, BMG had sent Cox “millions of notices of infringement,” and it argued that Cox could not “avoid knowledge by blacklisting, deleting, or refusing” to accept its notices. Moreover, BMG noted that “Cox’s employees repeatedly acknowledged that they were aware of widespread infringement on Cox’s system.” BMG additionally argued that Cox was willfully blind since it “blacklisted or blocked every single notice of copyright infringement sent by Rightscorp on behalf of Plaintiffs, in an attempt to avoid specific knowledge of any infringement.”

In reply, Cox cited Sony for the rule that “a provider of a technology could not be liable for contributory infringement arising from misuse if the technology is capable of substantial noninfringing uses.” And since Cox’s service “is capable of substantial noninfringing uses,” it claimed that it “cannot be liable under Sony.” Of course, as the Supreme Court clarified in Grokster, that is not the proper way to read Sony. Sony merely says that knowledge cannot be imputed because a service has some infringing uses. But BMG here is not asking for knowledge to be imputed based on the design of Cox’s service. It’s asking for knowledge to be inferred from the notices that Cox refused to receive.

Judge O’Grady made short work of Cox’s arguments. He cited Gershwin as the controlling law and rejected Cox’s argument vis-à-vis Grokster: “The Court finds no support for Cox’s reading of Grokster.” In a footnote, he brushed aside any discussion of whether Cox materially contributed to the infringement since Cox failed to raise the point in its initial memorandum. Judge O’Grady then turned to the knowledge element, stating the test as this: “The knowledge requirement is met by a showing of actual or constructive knowledge or by evidence that a defendant took deliberate actions to willfully blind itself to specific infringing activity.” In a footnote, he declined to follow the narrower rule in the Ninth Circuit from Napster that requires the plaintiff to establish “actual knowledge of specific acts of infringement.”

Thus, Judge O’Grady held that three types of knowledge were permissible to establish contributory infringement: (1) actual knowledge (“knew”), (2) constructive knowledge (“had reason to know”), or (3) willful blindness. Rejecting Cox’s theory to the contrary, he held that “DMCA-compliant notices are evidence of knowledge.” The catch here was that Cox refused to receive them, and it even ignored follow-up emails from BMG. And this is where inference came into play: Judge O’Grady held that Cox could have constructive knowledge since “a reasonable jury could conclude that Cox’s refusal to accept Rightscorp’s notices was unreasonable and that additional notice provided to Cox gave it reason to know of the allegedly infringing activity on its network.”

Turning to willful blindness, Judge O’Grady stated that it “requires more than negligence or recklessness.” Citing Global-Tech, he noted that BMG must prove that Cox “took ‘deliberate actions to avoid confirming a high probability of wrongdoing and who can almost be said to have actually known the critical facts.’” The issue here was clouded by the fact that Cox didn’t simply refuse to accept BMG’s notices from Rightscorp, but instead it offered to receive them if certain language offering settlements to Cox’s users was removed. While it would be reasonable to infer that Cox was not “deliberately avoiding knowledge of illegal activity,” Judge O’Grady held that “it is not the only inference available.” As such, he left it for the jury to decide as a question of fact which inference was better.

The jury verdict is now in, and we don’t know whether the jury found for BMG on the constructive knowledge theory or the willful blindness theory—or perhaps even both. Either way, the question boiled down to one of inference, and the jury was able to infer knowledge on Cox’s part. And this brings us back to the power of inference. Cox ended up being found liable as a contributory infringer for its users’ direct infringement of BMG’s public distribution rights, and both of these findings were established with nothing more than circumstantial evidence. That’s the power of inference when it comes to ISP liability.

Categories
Copyright Innovation Internet Remedies Uncategorized

Protecting Artists from Streaming Piracy Benefits Creativity and Technology

Here’s a brief excerpt of an op-ed by Devlin Hartline & Matthew Barblan that was published in The Hill:

In his recent op-ed in The Hill, Mike Montgomery argues that “[m]aking streaming copyright infringement a felony is a terrible idea” that will create “further rifts between tech and entertainment at a time when these two sectors are not only reliant upon one another, but melding.” While it’s true that the line between art and technology has become less discernable, it’s simply false that creating felony penalties for criminal streamers will put a wedge between the two. Instead, protecting artists and authors from such criminal enterprises serves to level the playing field so that honest creators and innovators can work together even more closely.

To read the rest of this op-ed, please visit The Hill.

Categories
Administrative Agency Copyright Legislation Uncategorized

Principles and Priorities to Guide Congress’s Ongoing Copyright Review

Last week, CPIP published a new white paper, Copyright Principles and Priorities to Foster a Creative Digital Marketplace, by Sandra Aistars, Mark Schultz, and me, which draws from the testimonies and scholarly writings of CPIP Senior Scholars in order to guide Congress as it continues its comprehensive review of the Copyright Act. The white paper discusses the constitutional origins of copyright protection and offers principles and priorities for Congress to consider as it moves forward with the copyright review process.

The current copyright review began in early 2013, when Register of Copyrights Maria Pallante threw down the gauntlet in her Horace S. Manges lecture by urging Congress to create “the next great copyright act.” While noting that minor legislative tweaks certainly have their place, Register Pallante suggested that it’s time for Congress to do something far more sweeping. Since then, Congress has embarked on a comprehensive review of our copyright laws, conducting over twenty hearings since mid-2013.

CPIP Senior Scholars have been actively engaged in that hearing process. Prof. Sandra Aistars (while she was CEO of the Copyright Alliance) testified on the creative community’s contributions to innovation and suggested several principles for the review process. Prof. Mark Schultz offered testimony on the scope and subject matter of copyright, and Prof. Sean O’Connor gave testimony on the failure of the DMCA’s notice-and-takedown regime.

As we discuss in the white paper, the premise of our copyright system is that copyrights are more than just incentives to create—they’re also rewards to authors for their productive labors. The Founders understood that authors’ rights and the public good are complementary, and they knew that public interests are best served when individual interests are properly secured. That understanding has proved quite prescient, as copyright today drives many innovations that provide remarkable benefits to our economy, society, and culture.

In the white paper, we propose the following organizing principles for any further work reviewing or revising the Copyright Act:

    A. Stay True to Technology-Neutral Principles and Take the Long View
    B. Strengthen the Ability of Authors to Create and to Disseminate Works
    C. Value the Input of Creative Upstarts
    D. Ensure that Copyright Continues to Nurture Free Speech and Creative Freedom
    E. Rely on the Marketplace and Private Ordering Absent Clear Market Failures
    F. Value the Entire Body of Copyright Law

We then note that these principles in turn suggest that Congress prioritize the following areas for action:

    A. Copyright Office Modernization
    B. Registration and Recordation
    C. Mass Digitization and Orphan Works
    D. Small Claims
    E. Notice and Takedown
    F. Streaming Harmonization

The ball is still rolling with the copyright review process. The House Judiciary Committee began a listening tour this fall that kicked off in Nashville and then traveled to Silicon Valley and Los Angeles. Moreover, those who testified at the earlier hearings have been invited back to meet with Committee staff and discuss any further input they might have. And the Committee is open to “any interested party” coming in to discuss their interests.

All told, this lengthy review process places Congress in a good position to take the next step in bringing us closer to Register Pallante’s “next great copyright act.” And to that end, we hope that our white paper will help Congress keep the constitutional premise of copyright protection in mind as it chooses where we go from here.

To read the full white paper, please click here.

Categories
Copyright Innovation Internet Uncategorized

Protecting Authors and Artists by Closing the Streaming Loophole

We’ve released a new policy brief, Protecting Authors and Artists by Closing the Streaming Loophole, by Devlin Hartline & Matthew Barblan.

They argue that in order to protect authors and artists from having their works repeatedly stolen on the internet, it is long past time to harmonize the remedies for criminal copyright infringement to reflect the ways that copyrighted works are commonly misappropriated these days.

We’ve included the Introduction below. To read the full policy brief, please click here.

Protecting Authors and Artists by Closing the Streaming Loophole

By Devlin Hartline & Matthew Barblan

Introduction

Copyright protects the property rights of authors and artists through both civil and criminal remedies for infringement. While the civil remedies are commonplace, the sections of the Copyright Act that specify which forms of infringement qualify as criminal offenses are less familiar. Unfortunately for authors and artists, the remedies for criminal infringement have not been updated to reflect the realities of how copyrighted works are frequently misappropriated these days. Streaming has become more popular than ever, yet the law treats bad actors who traffic in illicit streams much more kindly than those who traffic in illicit downloads. This results in a loophole that emboldens bad actors and makes it harder for authors and artists to protect their property rights.

Authors and artists deserve better. It shouldn’t matter whether the works are illegally streamed to users or offered for download. From the perspective of a creator whose property rights are being ripped off, the result is exactly the same—the works are supplied to the public without the creator’s permission. Congress has a long history of modernizing copyright law to account for ever-changing technologies. Now that the internet has advanced to where streaming is a dominant method of illicitly disseminating copyrighted works, the time has come to close the streaming loophole and to harmonize the remedies for criminal copyright infringement.