FCC’s Extreme Proposal Threatens the Livelihood of Creators

By Matthew Barblan & Kevin Madigan

Earlier this year, the FCC proposed a new regulatory scheme ostensibly designed to improve the market for pay-TV set-top boxes. Chairman Wheeler claimed that the proposed rules would “tear down the barriers that currently prevent innovators from developing new ways for consumers to access and enjoy their favorite shows and movies on their terms.” But set-top boxes are already on their way out as more and more consumers turn to streaming apps to watch their favorite movies and shows. So what is the FCC up to here? A close look at the proposed rules reveals that this isn’t about set-top boxes at all. Instead, the rules are designed to benefit a handful of companies that want to disseminate pay-TV programs without negotiating with or paying a license to the owners of those programs, undermining the property rights of creators and copyright owners. The creative community is understandably up in arms.

As we explain in comments filed with the FCC, the proposed rules would require pay-TV providers to make copyrighted video content available to third-party companies that have no contractual relationship with either the pay-TV providers or the creators of the video programming. The Commission essentially aims to create a zero-rate compulsory license for these companies. But this zero-rate compulsory license would fundamentally disrupt copyright owners’ ability to pursue the wide variety of business models and licensing arrangements that enable our creative ecosystem to thrive.

A key component of copyright owners’ property interest is the ability to choose to whom they license their works and on what terms. Because their livelihoods depend on the success of their works, copyright owners are particularly well-positioned and incentivized to determine the best way to commercialize them. By conveying copyrighted works to third parties without the consent of copyright owners, the proposed rules trample on the property rights of copyright owners and risk severely damaging our vibrant creative economy.

Adding insult to injury, the proposed rules wouldn’t even require the recipients of this zero-rate compulsory license to abide by the underlying contractual terms between copyright owners and pay-TV providers. Licensing contracts between copyright owners and pay-TV providers often include specific terms detailing the obligations of the provider in distributing the creative works. These terms can include things like channel “neighborhood” assignments, branding requirements, advertising limits, platform restrictions, and the list goes on. While the Commission states that “our goal is to preserve the contractual arrangements” between copyright owners and pay-TV providers, the proposed rules would transfer some, but not all, of the underlying contractual obligations to the third-party recipients of the copyrighted works.

For example, under the Commission’s proposal, third-party recipients of the copyrighted works would not be required to abide by contractual terms about channel placement designed to protect viewer experience and brand value. Similarly, the Commission’s proposal would not require third-party recipients of copyrighted works to abide by contractual terms concerning advertising in the delivery of those works. By allowing third parties to sidestep these terms, the Commission risks reducing the advertising revenue that pay-TV providers can earn from disseminating copyrighted works, thereby reducing the value of the license agreements that copyright owners negotiate with pay-TV providers.

In another thumb-in-the-eye to creators and copyright owners, the Commission’s proposal fails to account for copyright owners who may want to protect their copyrighted works by disseminating them exclusively through proprietary (and not widely licensable) content protection mechanisms. Instead, the Commission proposes to require pay-TV providers “to support at least one content protection system to protect its multichannel video programming that is licensable on reasonable and nondiscriminatory terms by an organization that is not affiliated with [the pay-TV provider].” Thus, the Commission would force copyright owners to risk exposing their property to security threats that may be associated with using widely-licensable content protection mechanisms.

Furthermore, nothing in the Commission’s proposal would prevent third parties from delivering the copyrighted works side-by-side with stolen versions of those same works. It is easy to imagine a search function that aggregates copies of creative works from a variety of platforms and displays the search results side-by-side. In fact, anyone who has run an internet search for a movie or TV show has likely seen results that mix links to both legitimate and stolen works.

Copyright owners’ ability to protect their creative works is essential both to preserve the value of their property and to give them the confidence to enter into arrangements with intermediaries (like pay-TV providers) to disseminate their works to a wide variety of audiences. This is especially true in light of the unique security challenges involved in portable, online, and short-term access to copyrighted works. Any reasonable proposal in this space would help copyright owners move forward in the ongoing battle to prevent the rampant theft and illegal dissemination of their works that has accompanied the rise of the internet. Unfortunately, the Commission’s proposal does just the opposite, limiting copyright owners’ ability to protect their property and pushing them backwards in the ongoing struggle against piracy.

Furthermore, it is entirely unclear where the Commission would draw the legal authority to change the nature of copyright owners’ property rights. The proposed rules simply claim that Section 629 of the Communications Act grants the Commission authority to implement the regulations in order to ensure competition and consumer choice in the navigation device market. In its justification of authority, the Commission repeatedly states that it will broadly interpret ambiguous terms in the Communications Act and that “a broad interpretation is necessary.” But nowhere in its analysis does the Commission cite to language granting it the authority to rewrite copyright law. Even under the broadest of interpretations, it is clear that the Communications Act does not give the Commission the authority to amend the Copyright Act and create a zero-royalty compulsory license out of thin air.

By granting artists and creators property rights in the fruits of their labors, copyright supports a diverse and multifaceted ecosystem that enables the development, distribution, and enjoyment of creative works, and that provides significant economic and cultural benefits to our society. But this ecosystem only works if copyright owners are able to safely and freely deploy their property in the marketplace. Unfortunately, the Commission’s proposal fails to respect the property rights of creators and copyright owners, risking severe disruption to the very same creative marketplace the Commission claims to promote.

Google Image Search and the Misappropriation of Copyrighted Images

Cross-posted from the Mister Copyright blog.

Last week, American visual communications and stock photography agency Getty Images filed a formal complaint in support of the European Union’s investigation into Google’s anti-competitive business practices. The Getty complaint accuses Google of using its image search function to appropriate or “scrape” third-party copyrighted works, thereby drawing users away from the original source of the creative works and preserving its search engine dominance.

Specifically, Getty’s complaint focuses on changes made to Google’s image search functionality in 2013 that led to the appealing image galleries we’re familiar with today. Before the change, users were presented with low-resolution thumbnail versions of images and would be rerouted to the original source website to view a larger, more defined version and to find out how they might legally license or get permission to use the work. But with the current Google Image presentation, users are instantly delivered a large, desirable image and have no need to access the legitimate source. As Getty says in its complaint, “[b]ecause image consumption is immediate, once an image is displayed in high-resolution, large format, there is little impetus to view the image on the original source site.”

According to a study by Define Media Group, in the first year after the changes to Google Image search, image search referrals to original source websites were reduced by up to 80%. The report also provides before and after screenshots of a Google Image search and points out that before 2013, when a thumbnail was clicked, the source site appeared in the background. Not only does the source site not appear in the new version, but an extra click is required to get to the site, adding to the overall disconnect with the original content. Despite Google’s claims to the contrary, the authors of the study conclude that the new image search service is designed to keep users on the Google website.

It’s difficult not to consider Google’s image UI [user interface] change a shameless content grab – one which blatantly hijacks material that has been legitimately licensed by publishers so that Google Image users remain on their site, and are de-incentivized from visiting others.

While Getty’s complaint against Google is based on anticompetitive concerns, it involves the underlying contention that Google Image search enables misappropriation of copyrighted images on a massive scale. Anyone who has run a Google Image search knows that with the click of a mouse, a user is presented with hundreds of images related to their query, and with another simple right click, that user can then copy and paste these images as they please. But Google Image search often returns an abundance of copyright protected images, enabling anyone to copy, display and disseminate images without considering the underlying copyright and existing licenses. And while using the service may be free, make no mistake that Google is monetizing it through advertisements and the mining of users’ personal data.

When users are able to access and copy these full-screen, high-resolution images from Google Image search, not only do third-party image providers lose traffic to their websites, but the photographers and creators behind the images lose the potential income, attribution, and exposure that would come with users accessing the original source. As Getty Images General Counsel Yoko Miyashita explains, “Getty Images represents over 200,000 photojournalists, content creators and artists around the world who rely on us to protect their ability to be compensated for their work.” When Google Image search obviates the need for a user to access the original creative content, these artists and creators are denied a fair marketplace for their images, and their ability and motivation to create future works are jeopardized.

Shortly after Google switched to the new image search, individual photo publishers and image creators took to a Google Forum to voice their concerns over the effects the service was having on their images and personal web pages. A recurring complaint was that the service made it more difficult to find information about images and that users now had to go through more steps to reach the original source website. One commenter, identifying herself as a “small time photo publisher,” said Google’s new practice of hotlinking to high-resolution images made it a “skim engine” rather than a “search engine.” She lamented that not only was Google letting people view her content without visiting her site, but her bandwidth usage (i.e., expense) went up because of the hotlinking of her high-resolution images.

Google Image supporters argue that creators and image providers should simply use hotlink protection to block Google from displaying their content, but Google’s search engine dominance is so absolute that this would further curtail traffic to the original source of the content. Others suggest that image providers stamp their images with watermarks to protect against infringement, but Getty VP Jonathan Lockwood explains that doing so would result in punishment from Google.

They penalise people who try to protect their content. There is then a ‘mismatch penalty’ for the site: you have to show the same one to Google Images that you own. If you don’t, you disappear.

The internet has made sharing creative works and gaining exposure as an artist easier than anyone could have imagined before the digital age, but it has also brought challenges in the form of protecting and controlling creative content. These challenges are particularly burdensome for image creators and providers, whose creative works are subject to unauthorized use the moment they are put online. Over the last few years, Google Image search has contributed to this problem by transforming from a service that provided direction to creative works to a complete substitute for original, licensed content.

With fewer opportunities for image providers and creators to realize a return from their works, whether in the form of payment, attribution, or exposure, creativity and investment in creators will be stifled. Artists and rightsholders deserve fair compensation and credit for their works, and technology needs to work with image providers rather than against them to ensure that great content continues to be created.

Copyright Policy Should Be Based On Facts, Not Rhetoric

Here’s a brief excerpt of a post by Kevin Madigan & Devlin Hartline that was published on IPWatchdog.

After nearly twenty years with the DMCA, the Copyright Office has launched a new study to examine the impact and effectiveness of this system, and voices on both sides of the debate have filed comments expressing their views. For the most part, frustrated copyright owners report that the DMCA has not successfully stemmed the tide of online infringement, which is completely unsurprising to anyone who spends a few minutes online searching for copyrighted works. Unfortunately, some commentators are also pushing for changes that would make things even more difficult for copyright owners.

To read the rest of this post, please visit IPWatchdog.

Separating Fact from Fiction in the Notice and Takedown Debate

By Kevin Madigan & Devlin Hartline

With the Copyright Office undertaking a new study to evaluate the impact and effectiveness of the Section 512 safe harbor provisions, there’s been much discussion about how well the DMCA’s notice and takedown system is working for copyright owners, service providers, and users. While hearing from a variety of viewpoints can help foster a healthy discussion, it’s important to separate rigorous research efforts from overblown reports that offer incomplete data in support of dubious policy recommendations.

Falling into the latter category is Notice and Takedown in Everyday Practice, a recently-released study claiming to take an in-depth look at how well the notice and takedown system operates after nearly twenty years in practice. The study has garnered numerous headlines that repeat its conclusion that nearly 30% of all takedown requests are “questionable” and that echo its suggestions for statutory reforms that invariably disfavor copyright owners. But what the headlines don’t mention is that the study presents only a narrow and misleading assessment of the notice and takedown process that overstates its findings and fails to adequately support its broad policy recommendations.

In the study, which was presumably released to coincide with the deadline for submitting comments to the Copyright Office on the state of Section 512, the authors claim to have produced “the broadest empirical analysis of the DMCA notice and takedown” system to date. They make bold pronouncements about how “the notice and takedown system . . . meets the goals it was intended to address” and “continues to provide an efficient method of enforcement in many circumstances.” But the goals identified by the authors are heavily skewed towards service providers and users at the expense of copyright owners, and the authors include no empirical analysis of whether the notice and takedown system is actually effective at combating widespread piracy.

The study reads more like propaganda than robust empiricism. It should be taken for what it is: A policy piece masquerading as an independent study. The authors’ narrow focus on one sliver of the notice and takedown process, with no analysis of the systemic results, leads to conclusions and recommendations that completely ignore the central issue of whether Section 512 fosters an online environment that adequately protects the rights of copyright owners. The authors conveniently ignore this part of the DMCA calculus and instead put forth a series of proposals that would systematically make it harder for copyright owners to protect their property rights.

To its credit, the study acknowledges many of its own limitations. For example, the authors recognize that the “dominance of Google notices in our dataset limits our ability to draw broader conclusions about the notice ecosystem.” Indeed, over 99.992% of the individual requests in the dataset for the takedown study were directed at Google, with 99.8% of that dataset directed at Google Search in particular. Of course, search engines do not include user-generated content—the links Google provides are links that Google itself collects and publishes. There are no third parties to alert about the takedowns since Google is taking down its own content. Likewise, removing links from Google Search does not actually remove the linked-to content from the internet.

The authors correctly admit that “the characteristics of these notices cannot be extrapolated to the entire world of notice sending.” A more thorough quantitative study would include data on sites that host user-generated content, like YouTube and Facebook. As it stands, the study gives us some interesting data on one search engine, but even that data is limited to a sample size of 1,826 requests out of 108 million over a six-month period in mid-2013. And it’s not even clear how these samples were randomized since the authors admittedly created “tranches” to ensure the notices collected were “of great substantive interest,” but they provide no details about how they created these tranches.

Despite explicitly acknowledging that the study’s data is not generalizable, the authors nonetheless rely on it to make numerous policy suggestions that would affect the entire notice and takedown system and that would stack the deck further in favor of infringement and against copyright owners. They even identify some of their suggestions as explicitly reflecting “Public Knowledge’s suggestion,” which is a far cry from a reasoned academic approach. The authors do note that “any changes should take into account the interests of . . . small- and medium-sized copyright holders,” but this is mere lip service. Their proposals would hurt copyright owners of all shapes and sizes.

The authors justify their policy proposals by pointing to the “mistaken and abusive takedown demands” that they allegedly uncover in the study. These so-called “questionable” notices are the supposed proof that the entire notice and takedown system needs fixing. A closer look at these “questionable” notices shows that they’re not nearly so questionable. The authors claim that 4.2% of the notices surveyed (about 77 notices) are “fundamentally flawed because they targeted content that clearly did not match the identified infringed work.” This figure includes obvious mismatches, where the titles aren’t even the same. But it also includes ambiguous notices, such as where the underlying work does not match the title or where the underlying page changes over time.

The bulk of the so-called “questionable” notices comes from those notices that raise “questions about compliance with the statutory requirements” (15.4%, about 281 notices) or raise “potential fair use defenses” (7.3%, about 133 notices). As to the statutory requirements issue, the authors argue that these notices make it difficult for Google to locate the material to take down. This claim is severely undercut by the fact that, as they acknowledge in a footnote, Google complies with 97.5% of takedown notices overall. Moreover, it wades into the murky waters of whether copyright owners can send service providers a “representative list” of infringing works. Turning to the complaint about potential fair uses, the authors argue that copyright owners are not adequately considering “mashups, remixes, or covers.” But none of these uses are inherently fair, and there’s no reason to think that the notices were sent in bad faith just because someone might be able to make a fair use argument.

The authors claim that their “recommendations for statutory reforms are relatively modest,” but that supposed modesty is absent from their broad list of suggestions. Of course, everything they suggest increases the burdens and liabilities of copyright owners while lowering the burdens and liabilities of users, service providers, and infringers. Having overplayed the data on “questionable” notices, the authors reveal their true biases. And it’s important to keep in mind that they make these broad suggestions that would affect everyone in the notice and takedown system after explicitly acknowledging that their data “cannot be extrapolated to the entire world of notice sending.” Indeed, the study contains no empirical data on sites that host user-generated content, so there’s nothing whatsoever to support any changes for such sites.

The study concludes that the increased use of automated systems to identify infringing works online has resulted in the need for better mechanisms to verify the accuracy of takedown requests, including human review. But the data is limited to small surveys with secret questions and a tiny fraction of notices sent to one search engine. The authors offer no analysis of the potential costs of implementing their recommendations, nor do they consider how doing so might affect the ability of copyright owners to police piracy. Furthermore, data presented later in the study suggests that increased human review might have little effect on the accuracy of takedown notices. Not only do the authors fail to address the larger problem of whether the DMCA adequately addresses online piracy, but their suggestions aren’t even likely to address the narrower problem of inaccurate notices that they want to fix.

Worse still, the study almost completely disregards the ability of users to contest mistaken or abusive notices by filing counternotices. This is the solution that’s already built into the DMCA, yet the authors inexplicably dismiss it as ineffective and unused. Apart from providing limited answers from a few unidentified survey respondents, the authors offer no data on the frequency or effectiveness of counternotices. The study repeatedly criticizes the counternotice system as failing to offer “due process protection” to users, but that criticism is grounded in the notion that a user who fails to send a counternotice has somehow been denied the chance to do so. Moreover, it implies a constitutional right that is not at issue when two parties interact in the absence of government action. The same holds true for the authors’ repeated—and mistaken—invocation of “freedom of expression.”

More fundamentally, the study ignores the fact that the counternotice system is stacked against copyright owners. A user can simply file a counternotice and have the content in question reposted, and most service providers are willing to repost the content following a counternotice because they’re no longer on the hook should the content turn out to be infringing. The copyright owner, by contrast, then faces the choice of allowing the infringement to continue or filing an expensive lawsuit in federal court. The study makes it sound like users are rendered helpless because counternotices are too onerous, but the reality is that the system leaves copyright owners practically powerless to combat bad faith counternotices.

Pretty much everyone agrees that the notice and takedown system needs a tune-up. The amount of infringing content available online today is immense. This rampant piracy has resulted in an incredible number of takedown notices being sent to service providers by copyright owners each day. Undoubtedly, the notice and takedown system should be updated to address these realities. And to the extent that some are abusing the system, they should be held accountable. But in considering changes to the entire system, we should not be persuaded by biased studies based on limited (and secret) datasets that provide little to no support for their ultimate conclusions and recommendations. Such studies may make for evocative headlines, but they don’t make for good policy.

Acknowledging the Limitations of the FTC’s PAE Study

The FTC’s long-awaited case study of patent assertion entities (PAEs) is expected to be released this spring. Drawing on information the FTC gathered from a handful of firms using its subpoena power under Section 6(b), the study promises us a glimpse at their inner workings. But while the results may be interesting, they’ll also be too narrow to support any informed policy changes. And you don’t have to take my word for it—the FTC admits as much. In one submission to the Office of Management and Budget (OMB), which ultimately decided whether the study should move forward, the FTC acknowledges that its findings “will not be generalizable to the universe of all PAE activity.” In another submission to the OMB, the FTC recognizes that “the case study should be viewed as descriptive and probative for future studies seeking to explore the relationships between organizational form and assertion behavior.”

However, this doesn’t mean that no one will use the study to advocate for drastic changes to the patent system. Even before the study’s release, many people—including some FTC Commissioners themselves—have already jumped to conclusions when it comes to PAEs, arguing that they are a drag on innovation and competition. Yet these same people say that we need this study because there’s no good empirical data analyzing the systemic costs and benefits of PAEs. They can’t have it both ways. The uproar about PAEs is emblematic of the broader movement that advocates for the next big change to the patent system before we’ve even seen how the last one panned out. In this environment, it’s unlikely that the FTC and other critics will responsibly acknowledge that the study simply cannot give us an accurate assessment of the bigger picture.

Limitations of the FTC Study

Many scholars have written about the study’s fundamental limitations. As statistician Fritz Scheuren points out, there are two kinds of studies: exploratory and confirmatory. An exploratory study is a starting point that asks general questions in order to generate testable hypotheses, while a confirmatory study is then used to test the validity of those hypotheses. The FTC study, with its open-ended questions to a handful of firms, is a classic exploratory study. At best, the study will generate answers that could help researchers begin to form theories and design another round of questions for further research. Scheuren notes that while the “FTC study may well be useful at generating exploratory data with respect to PAE activity,” it “is not designed to confirm supportable subject matter conclusions.”

One significant constraint with the FTC study is that the sample size is small—only twenty-five PAEs—and the control group is even smaller—a mixture of fifteen manufacturers and non-practicing entities (NPEs) in the wireless chipset industry. Scheuren reasons that there “is also the risk of non-representative sampling and potential selection bias due to the fact that the universe of PAEs is largely unknown and likely quite diverse.” And the fact that the control group comes from one narrow industry further prevents any generalization of the results. Scheuren concludes that the FTC study “may result in potentially valuable information worthy of further study,” but that it is “not designed in a way as to support public policy decisions.”

Professor Michael Risch questions the FTC’s entire approach: “If the FTC is going to the trouble of doing a study, why not get it done right the first time and a) sample a larger number of manufacturers, in b) a more diverse area of manufacturing, and c) get identical information?” He points out that the FTC won’t be well-positioned to draw conclusions because the control group is not even being asked the same questions as the PAEs. Risch concludes that “any report risks looking like so many others: a static look at an industry with no benchmark to compare it to.” Professor Kristen Osenga echoes these same sentiments and notes that “the study has been shaped in a way that will simply add fuel to the anti–‘patent troll’ fire without providing any data that would explain the best way to fix the real problems in the patent field today.”

Osenga further argues that the study is flawed since the FTC’s definition of PAEs perpetuates the myth that patent licensing firms are all the same. The reality is that many different types of businesses fall under the “PAE” umbrella, and it makes no sense to impute the actions of a small subset to the entire group when making policy recommendations. Moreover, Osenga questions the FTC’s “shortsighted viewpoint” of the potential benefits of PAEs, and she doubts how the “impact on innovation and competition” will be ascertainable given the questions being asked. Anne Layne-Farrar expresses similar doubts about the conclusions that can be drawn from the FTC study since only licensors are being surveyed. She posits that it “cannot generate a full dataset for understanding the conduct of the parties in patent license negotiation or the reasons for the failure of negotiations.”

Layne-Farrar concludes that the FTC study “can point us in fruitful directions for further inquiry and may offer context for interpreting quantitative studies of PAE litigation, but should not be used to justify any policy changes.” Consistent with the FTC’s own admissions of the study’s limitations, this is the real bottom line of what we should expect. The study will have no predictive power because it only looks at how a small sample of firms affect a few other players within the patent ecosystem. It does not quantify how that activity ultimately affects innovation and competition—the very information needed to support policy recommendations. The FTC study is not intended to produce the sort of compelling statistical data that can be extrapolated to the larger universe of firms.

FTC Commissioners Put Cart Before Horse

The FTC has a history of bias against PAEs, as demonstrated in its 2011 report that skeptically questioned the “uncertain benefits” of PAEs while assuming their “detrimental effects” in undermining innovation. That report recommended special remedy rules for PAEs, even as the FTC acknowledged the lack of objective evidence of systemic failure and the difficulty of distinguishing “patent transactions that harm innovation from those that promote it.” With its new study, the FTC concedes to the OMB that much is still not known about PAEs and that the findings will be preliminary and non-generalizable. However, this hasn’t prevented some Commissioners from putting the cart before the horse with PAEs.

In fact, the very call for the FTC to institute the PAE study started with its conclusion. In her 2013 speech suggesting the study, FTC Chairwoman Edith Ramirez recognized that “we still have only snapshots of the costs and benefits of PAE activity” and that “we will need to learn a lot more” in order “to see the full competitive picture.” While acknowledging the vast potential benefits of PAEs in rewarding invention, benefiting competition and consumers, reducing enforcement hurdles, increasing liquidity, encouraging venture capital investment, and funding R&D, she nevertheless concluded that “PAEs exploit underlying problems in the patent system to the detriment of innovation and consumers.” And despite the admitted lack of data, Ramirez stressed “the critical importance of continuing the effort on patent reform to limit the costs associated with some types of PAE activity.”

This position is duplicitous: If the costs and benefits of PAEs are still unknown, what justifies Ramirez’s rushed call for immediate action? While benefits have to be weighed against costs, it’s clear that she’s already jumped to the conclusion that the costs outweigh the benefits. In another speech a few months later, Ramirez noted that the “troubling stories” about PAEs “don’t tell us much about the competitive costs and benefits of PAE activity.” Despite this admission, Ramirez called for “a much broader response to flaws in the patent system that fuel inefficient behavior by PAEs.” And while Ramirez said that understanding “the PAE business model will inform the policy dialogue,” she stated that “it will not change the pressing need for additional progress on patent reform.”

Likewise, in an early 2014 speech, Commissioner Julie Brill ignored the study’s inherent limitations and exploratory nature. She predicted that the study “will provide a fuller and more accurate picture of PAE activity” that “will be put to good use by Congress and others who examine closely the activities of PAEs.” Remarkably, Brill stated that “the FTC and other law enforcement agencies” should not “wait on the results of the 6(b) study before undertaking enforcement actions against PAE activity that crosses the line.” Even without the study’s results, she thought that “reforms to the patent system are clearly warranted.” In Brill’s view, the study would only be useful for determining whether “additional reforms are warranted” to curb the activities of PAEs.

It appears that these Commissioners have already decided—in the absence of any reliable data on the systemic effects of PAE activity—that drastic changes to the patent system are necessary. Given their clear bias in this area, there is little hope that they will acknowledge the deep limitations of the study once it is released.

Commentators Jump the Gun

Unsurprisingly, many supporters of the study have filed comments with the FTC arguing that the study is needed to fill the huge void in empirical data on the costs and benefits associated with PAEs. Some even simultaneously argue that the costs of PAEs far outweigh the benefits, suggesting that they have already jumped to their conclusion and just want the data to back it up. Despite the study’s serious limitations, these commentators appear primed to use it to justify their foregone policy recommendations.

For example, the Consumer Electronics Association applauded “the FTC’s efforts to assess the anticompetitive harms that PAEs cause on our economy as a whole,” and it argued that the study “will illuminate the many dimensions of PAEs’ conduct in a way that no other entity is capable.” At the same time, it stated that “completion of this FTC study should not stay or halt other actions by the administrative, legislative or judicial branches to address this serious issue.” The Internet Commerce Coalition stressed the importance of the study of “PAE activity in order to shed light on its effects on competition and innovation,” and it admitted that without the information, “the debate in this area cannot be empirically based.” Nonetheless, it presupposed that the study will uncover “hidden conduct of and abuses by PAEs” and that “it will still be important to reform the law in this area.”

Engine Advocacy admitted that “there is very little broad empirical data about the structure and conduct of patent assertion entities, and their effect on the economy.” It then argued that PAE activity “harms innovators, consumers, startups and the broader economy.” The Coalition for Patent Fairness called on the study “to contribute to the understanding of policymakers and the public” concerning PAEs, which it claimed “impose enormous costs on U.S. innovators, manufacturers, service providers, and, increasingly, consumers and end-users.” And to those suggesting “the potentially beneficial role of PAEs in the patent market,” it stressed that “reform be guided by the principle that the patent system is intended to incentivize and reward innovation,” not “rent-seeking” PAEs that are “exploiting problems.”

The joint comments of Public Knowledge, Electronic Frontier Foundation, & Engine Advocacy emphasized the fact that information about PAEs “currently remains limited” and that what is “publicly known largely consists of lawsuits filed in court and anecdotal information.” Despite admitting that “broad empirical data often remains lacking,” the groups also suggested that the study “does not mean that legislative efforts should be stalled” since “the harms of PAE activity are well known and already amenable to legislative reform.” In fact, they contended not only that “a problem exists,” but that there’s even “reason to believe the scope is even larger than what has already been reported.”

Given this pervasive and unfounded bias against PAEs, there’s little hope that these and other critics will acknowledge the study’s serious limitations. Instead, it’s far more likely that they will point to the study as concrete evidence that even more sweeping changes to the patent system are in order.

Conclusion

While the FTC study may generate interesting information about a handful of firms, it won’t tell us much about how PAEs affect competition and innovation in general. The study is simply not designed to do this. It instead is a fact-finding mission, the results of which could guide future missions. Such empirical research can be valuable, but it’s very important to recognize the limited utility of the information being collected. And it’s crucial not to draw policy conclusions from it. Unfortunately, if the comments of some of the Commissioners and supporters of the study are any indication, many critics have already made up their minds about the net effects of PAEs, and they will likely use the study to perpetuate the biased anti-patent fervor that has captured so much attention in recent years.

A New Librarian of Congress and a New Copyright Office

By Sandra Aistars

With the Senate considering the confirmation of Dr. Carla Hayden as the next Librarian of Congress, I have joined thirteen other intellectual property law professors in an Open Letter suggesting that her confirmation should serve as an important reminder that the U.S. Copyright Office, a department within the Library of Congress, needs to be updated so that it can best serve the needs of the American people in the digital age.

As President Obama stated in his announcement of the nomination, “Dr. Hayden has devoted her career to modernizing libraries so that everyone can participate in today’s digital culture.” Indeed, I share Dr. Hayden’s enthusiasm for modernizing our libraries so that they can better promote and preserve our nation’s collective learning and culture for all to enjoy. This is especially important when it comes to the Library of Congress, the oldest and largest cultural institution in the United States.

However, it’s also important to provide the Copyright Office with the resources and authority it needs to perform its role as the nation’s leading expert on copyright law and a major facilitator of the marketplace for creative and scholarly works. Members of Congress and numerous stakeholders have been calling for modernization of the Copyright Office for years. They recognize that the Office faces significant structural challenges that limit its effectiveness in serving the American public. As the Office notes in its recent strategic plan, “there is no question that it must now modernize to meet current and future needs.”

Even though the Copyright Office and the Library of Congress have significantly different business needs, the Office is stuck using the Library’s information technology (IT) systems, which are not only antiquated, but were designed to serve a different purpose. A recent report by the Government Accountability Office finds that “serious weaknesses” in the Library’s IT systems are thwarting the Office’s ability to fulfill its mission in registering, recording, and depositing copyrighted works. The Office needs funding to improve these essential IT systems and to tailor them to its specific needs.

The legal structure of the Copyright Office should also be updated to best position it for success. There is currently no agency with independent, comprehensive rulemaking authority when it comes to copyright law. The Register of Copyrights can only establish regulations for administering the duties and functions of the Copyright Office, subject to approval by the Librarian of Congress. This inefficient structure burdens the Librarian with responsibilities that more appropriately should fall on the Register—the fully-vetted, leading expert in the Copyright Act.

Moreover, the Register of Copyrights is not appointed as a principal officer of the United States by the President. This means that should the Office take on additional duties like administering a small copyright claims court, the Register could not appoint and oversee the administrative law judges that would run this process. Likewise, the Register currently has no power over the three judges of the Copyright Royalty Board that determine the rates and terms of statutory copyright licenses. Congress has consistently assigned the Office more duties and relied upon its vast proficiency in copyright law, but the authority of the Register has not been updated accordingly.

Although I and the other signatories of the Open Letter do not all agree on every issue in copyright law, including where the Copyright Office should be positioned within our government, we do all agree that the Register of Copyrights should be a principal officer of the government. Only when the Register is appointed by the President and confirmed by the Senate will she have the authority and accountability appropriate to administer our nation’s copyright system.

The confirmation of Dr. Hayden as the new Librarian of Congress promises us the much-needed modernization of the Library of Congress, and it also provides us with the opportunity to reflect upon the needs of the Copyright Office. Both the Librarian and the Register play important roles in promoting and preserving our nation’s culture, but those roles are fundamentally different in nature. These two talented leaders should work closely with each other as partners, each possessing the authority and responsibility suitable for their differing areas of expertise.

The time to modernize the Copyright Office is now. Congress can ensure the continuing success of our copyright system by providing the Office with the resources and authority it needs to better serve us all in the digital age.

To read the full Open Letter, please click here.

Middle Class Artists Want a DMCA System That Works

The following guest post comes from Rebecca Cusey, a second-year law student at George Mason University School of Law.

By Rebecca Cusey

Mason Law’s Arts & Entertainment Advocacy Clinic filed comments today with the U.S. Copyright Office detailing the frustrations and futilities experienced by everyday artists as they struggle with the DMCA system to protect their copyrights online.

Terrica Carrington and I wrote the comments on behalf of middle class artists, that group of creative professionals who invest in and live off their art yet lack the revenue streams or corporate backing of better-known artists. These photographers, filmmakers, musicians, and other artists are squeezed between infringement that directly affects their ability to pay for things like a mortgage or an orthodontics bill and the exorbitant cost of using the notice and takedown system to fight infringement.

Terrica and I spoke with four artists: filmmaker Ellen Seidler, news photographer Yunghi Kim, musician Blake Morgan, and audiovisual creator David Newhoff. These artists make works of value and have followings, and thus they attract infringement. They make a profession of their art.

A middle class artist must do it all on her own – find infringement through hours of searching the web, compile lists of infringing posts on each site, navigate each site’s confusing DMCA notification system, and send takedown notification after takedown notification. And that’s just the sending of notifications. Monitoring whether the infringing content has actually been removed or has simply been re-uploaded somewhere else is a whole other job in itself.

The artists with whom we talked said it was not unusual in the least for a song, photograph, or film to be posted illegally in a thousand places, even tens of thousands of places. Finding infringement and sending notices took hundreds, even thousands, of hours, hours they could have spent taking photographs, making movies, or writing songs.

After all the time spent fighting infringement online, they felt the task was futile because the content simply reappeared, sometimes in a different place on the same site and sometimes because the uploader filed a counternotice with the ISP hosting the content claiming a right to post it.

These artists felt the notice and takedown system mandated by Section 512 of the Copyright Act was both all-consuming and futile, all-consuming because it ate hours upon hours and futile because it yielded little to no results. Ultimately, all of them decided to stop spending time trying to enforce their copyrights under the procedures of Section 512. It simply was not worth it.

Our comments were filed in response to a request by the U.S. Copyright Office for comments on the effectiveness of Section 512 in fighting infringement online. The Copyright Office wanted to know in particular if the provisions of Section 512 balanced the needs of ISPs to host content with the needs of copyright owners to control their work.

Middle class artists feel the balance is off and the scale tipped in favor of ISPs. These artists do not object to bearing some responsibility for protecting their copyrights online. They simply want a system that works.

To read our Section 512 comments, please click here.

Copyright Scholars: Courts Have Disrupted the DMCA’s Careful Balance of Interests

The U.S. Copyright Office is conducting a study of the safe harbors under Section 512 of the DMCA, and comments are due today. Working with Victor Morales and Danielle Ely from Mason Law’s Arts & Entertainment Advocacy Clinic, we drafted and submitted comments on behalf of several copyright law scholars. In our Section 512 comments, we look at one narrow issue that we believe is the primary reason the DMCA is not working as it should: the courts’ failure to properly apply the red flag knowledge standard. We argue that judicial interpretations of red flag knowledge have disrupted the careful balance of responsibilities Congress intended between copyright owners and service providers. Instead of requiring service providers to take action in the face of red flags, courts have allowed them to turn a blind eye and bury their heads in the sand.

Whether Section 512’s safe harbors are working as intended is a hotly contested issue. On the one hand, hundreds of artists and songwriters are calling for changes “to the antiquated DMCA which forces creators to police the entire internet for instances of theft, placing an undue burden on these artists and unfairly favoring technology companies and rogue pirate sites.” On the other hand, groups like the Internet Association, which includes tech giants such as Google and Facebook, claim that the safe harbors are “working effectively” since they “strike a balance between facilitating free speech and creativity while protecting the interests of copyright holders.” The Internet Association even claims that “the increasing number of notice and takedown requests” shows that the DMCA is working.

Of course, it’s utter nonsense to suggest that the more takedown notices sent and processed, the more we know the DMCA is working. The point of the safe harbors, according to the Senate Report on the DMCA, is “to make digital networks safe places to disseminate and exploit copyrighted materials.” The proper metric of success is not the number of takedown notices sent; it’s whether the internet is a safe place for copyright owners to disseminate and exploit their works. The continuing availability of huge amounts of pirated works should tip us off that the safe harbors are not working as intended. If anything, the increasing need for takedown notices suggests that things are getting worse for copyright owners, not better. If the internet were becoming a safer place, the number of takedown notices should be decreasing. It’s not surprising that service providers enjoy the status quo, given that the burden of tracking down and identifying infringement doesn’t fall on them, but this is not the balance that Congress intended to strike.

Our comments to the Copyright Office run through the relevant legislative history to show what Congress really had in mind—and it wasn’t copyright owners doing all of the work in locating and identifying infringement online. Instead, as noted in the Senate Report, Congress sought to “preserve[] strong incentives for service providers and copyright owners to cooperate to detect and deal with copyright infringements that take place in the digital networked environment.” The red flag knowledge standard was a key leverage point to encourage service providers to participate in the effort to detect and eliminate infringement. Unfortunately, courts thus far have interpreted the standard so narrowly that, beyond acting on takedown notices, service providers have little incentive to work together with copyright owners to prevent piracy. Even in cases with the most crimson of flags, courts have failed to strip service providers of their safe harbor protection. Perversely, the current case law incentivizes service providers to actively avoid doing anything when they see red flags, lest they gain actual knowledge of infringement and jeopardize their safe harbors. This is exactly the opposite of what Congress intended.

The Second and Ninth Circuits have interpreted the red flag knowledge standard to require knowledge of specific infringing material before service providers can lose their safe harbors. While tech giants might think this is great, it’s terrible for authors and artists who need service providers to carry their share of the load in combating online piracy. Creators are left in a miserable position where they bear the entire burden of policing infringement across an immense range of services, effectively making it impossible to prevent the deluge of piracy of their works. The Second and Ninth Circuits believe red flag knowledge should require specificity because otherwise service providers wouldn’t know exactly what material to remove when faced with a red flag. We argue that Congress intended service providers with red flag knowledge of infringing activity in general to then bear the burden of locating and removing the specific infringing material. This is the balance of responsibilities that Congress had in mind when it crafted the red flag knowledge standard and differentiated it from the actual knowledge standard.

But all hope is not lost. The Second and Ninth Circuits are but two appellate courts, and there are many others that have yet to rule on the red flag knowledge issue. Moreover, the Supreme Court has never interpreted the safe harbors of the DMCA. We hope that our comments will help expose the underlying problem that hurts so many creators today who are stuck playing the DMCA’s whack-a-mole game when their very livelihoods are at stake. Congress intended the DMCA to be the cornerstone of a shared-responsibility approach to fighting online piracy. Unfortunately, it has become a shield that allows service providers to enable piracy on a massive scale without making any efforts to prevent it beyond acting on takedown notices. The fact that search engines can still index The Pirate Bay—an emblematic piracy site that even has the word “pirate” in its name—without concern of losing their safe harbor protection is a testament to how the courts have turned Congress’ intent on its head. We hope that the Copyright Office’s study will shed light on this important issue.

To read our Section 512 comments, please click here.

Changes to Patent Venue Rules Risk Collateral Damage to Innovators

Advocates for changing the patent venue rules, which dictate where patent owners can sue alleged infringers, have been arguing that their remedy will cure the supposed disease of abusive “trolls” filing suit after suit in the Eastern District of Texas. This is certainly true, but it’s only true in the sense that cyanide cures the common cold. What these advocates don’t mention is that their proposed changes will weaken patent rights across the board by severely limiting where all patent owners—even honest patentees that no one thinks are “trolls”—can sue for infringement. Instead of acknowledging the broad collateral damage their changes would cause to all patent owners, venue revision advocates invoke the talismanic “troll” narrative and hope that nobody will look closely at the details. The problem with their take on venue revision is that it’s neither fair nor balanced, and it continues the disheartening trend of equating “reform” with taking more sticks out of every patent owner’s bundle of rights.

Those pushing for venue revision are working on two fronts, one judicial and the other legislative. On the judicial side, advocates have injected themselves into the TC Heartland case currently before the Federal Circuit. Though it has no direct connection to the Eastern District of Texas, advocates see it as a chance to shut plaintiffs out of that venue. Their argument in that case is so broad that it would drastically restrict where all patentees can sue for infringement—even making it impossible to sue infringing foreign defendants. Yet they don’t mention this collateral damage as they sell the “troll” narrative. On the legislative side, advocates have gotten behind the VENUE Act (S.2733), introduced in the Senate last Thursday. This bill leaves open a few more venues than TC Heartland, though it still significantly limits where all patent owners can sue. Advocates here also repeat the “troll” mantra instead of offering a single reason why it’s fair to change the rules for everyone else.

With both TC Heartland and the VENUE Act, venue revision advocates want to change the meaning of one word: “resides.” The specific patent venue statute, found in Section 1400(b) of Title 28, provides that patent infringement suits may be brought either (1) “in the judicial district where the defendant resides” or (2) “where the defendant has committed acts of infringement and has a regular and established place of business.” On its face, this seems fairly limited, but the key is the definition of the word “resides.” The general venue statute, found in Section 1391(c)(2) of Title 28, defines residency broadly: Any juridical entity, such as a corporation, “shall be deemed to reside, if a defendant, in any judicial district in which such defendant is subject to the court’s personal jurisdiction with respect to the civil action in question.” Taken together, these venue statutes mean that patent owners can sue juridical entities for infringement anywhere the court has personal jurisdiction over the defendant.

The plaintiff in TC Heartland is Kraft Foods, a large manufacturer incorporated in Delaware and headquartered in Illinois that runs facilities and sells products in Delaware. The defendant is TC Heartland, a large manufacturer incorporated and headquartered in Indiana. TC Heartland manufactured the allegedly-infringing products in Indiana and then knowingly shipped a large number of them directly into Delaware. Kraft Foods sued TC Heartland in Delaware on the theory that these shipments established personal jurisdiction—and thus venue—in that district. TC Heartland argued that venue was improper in Delaware, but the district court rejected that argument (see here and here). TC Heartland has now petitioned the Federal Circuit for a writ of mandamus, arguing that the broad definition of “reside” in Section 1391(c)(2) does not apply to the word “resides” in Section 1400(b). On this reading, venue would not lie in Delaware simply because TC Heartland did business there.

TC Heartland mentions in passing that its narrow reading of Section 1400(b) is favorable as a policy matter because it would prevent venue shopping “abuses,” such as those allegedly occurring in the Eastern District of Texas. Notably, TC Heartland doesn’t suggest any policy reasons why Kraft Foods should not be permitted to bring an infringement suit in Delaware, and neither do any of the amici supporting TC Heartland. The amicus brief by the Electronic Frontier Foundation (EFF) et al. argues that Congress could not have intended “to permit venue in just about any court of the patent owner’s choosing.” But why is this hard to believe? The general rule for all juridical entities is that they can be sued in any district where they choose to do business over matters relating to that business. This rule has long been regarded as perfectly fair and reasonable since these entities get both the benefits and the burdens of the law wherever they do business.

The EFF brief goes on for pages bemoaning the perceived ills of forum shopping in the Eastern District of Texas without once explaining its relevance to Kraft Foods. It asks the Federal Circuit to “restore balance in patent litigation,” but its vision of “balance” fails to account for the myriad honest patent owners like Kraft Foods that nobody considers to be “trolls.” The same holds true for the amicus brief filed by Google et al., which discusses the “harm forum shopping causes” without explaining what any of it has to do with Kraft Foods. Worse still, the position being urged by these amici would leave no place for patent owners to sue foreign defendants. If the residency definitions in Section 1391(c) don’t apply to Section 1400(b), as they argue, then a foreign defendant that doesn’t reside or have a regular place of business in the United States can never be sued for patent infringement—an absurd result. But rather than acknowledge this collateral damage, the amici simply sweep it under the rug.

The simple fact is that there’s nothing untoward about Kraft Foods filing suit in Delaware. That’s where TC Heartland purposefully directed its conduct when it knowingly shipped the allegedly-infringing products into the district. It’s quite telling that venue revision advocates are using TC Heartland as a platform for changing the rules generally when they can’t even explain why the rules should be changed in that very case. And this is the problem: If there’s no good reason for keeping Kraft Foods out of Delaware, then these advocates shouldn’t be pushing for changes that would do just that. Keeping patent owners from suing in the Eastern District of Texas is no reason to keep Kraft Foods out of Delaware, and it’s certainly no reason to make it impossible for all patent owners to sue foreign-based defendants that infringe in the United States. Advocates of venue revision tacitly admit as much when they say nothing about this collateral damage. This isn’t fair and balanced; it’s another huge turn of the anti-patent ratchet disguised as “reform.”

The same is true with the VENUE Act, which copies almost verbatim the venue provisions of the Innovation Act. This bill would also severely restrict where all patent owners can sue by providing that a defendant no longer “resides” in every district where a court has personal jurisdiction arising from its allegedly-infringing conduct. To its credit, the VENUE Act does include new provisions allowing suit where an inventor conducted R&D that led to the application for the patent at issue. It also allows suit wherever either party “has a regular and established physical facility” and has engaged in R&D of the invention at issue, “manufactured a tangible product” that embodies that invention, or “implemented a manufacturing process for a tangible good” in which the claimed process is embodied. Furthermore, the bill makes the same venue rules applicable to patent owners suing for infringement and accused infringers filing for a declaratory judgment, and it solves the problem of foreign-based defendants by stating that the residency definition in Section 1391(c)(3) applies in that situation.

While the proposed changes in the VENUE Act aren’t as severe as those sought by venue revision advocates in TC Heartland, they nevertheless take numerous venues off the table for patentees and accused infringers alike. But rather than acknowledge these wide-sweeping changes and offer reasons for implementing them, advocates of the VENUE Act simply harp on the narrative of “trolls” in Texas. For example, Julie Samuels at Engine argues that the “current situation in the Eastern District of Texas makes it exceedingly difficult for defendants” to enforce their rights and that we need to “level the playing field.” Likewise, Elliot Harmon at the EFF Blog suggests that the VENUE Act will “finally address the egregious forum shopping that dominates patent litigation” and “bring a modicum of fairness to a broken patent system.” Yet neither Samuels nor Harmon explains why we should change the rules for all patent owners and accused infringers—especially the ones that aren’t forum shopping in Texas.

The VENUE Act would simply take a system that is perceived to favor plaintiffs and replace it with one that definitely favors defendants. For instance, an alleged infringer with continuous and systematic contacts in the Eastern District of Virginia can currently be sued there, but the VENUE Act would take away this option since that venue rests on mere general jurisdiction. Likewise, the current venue rules allow suits anywhere the court has specific jurisdiction over the defendant—potentially in every venue for a nationwide enterprise—yet the VENUE Act would make dozens of these venues improper. Furthermore, patentees can now bring suits against multiple defendants in a single forum, saving time and money for all involved, but the VENUE Act would make this far less likely.

The “troll” narrative employed by venue revision advocates may sound appealing on the surface, but it quickly becomes clear that they either haven’t considered or don’t care about how their proposed changes would affect everyone else. If we’re going to talk about abusive litigation practices that need to be addressed, we should talk about where they’re occurring across the entire patent system. This discussion should include the practices of both patent owners and alleged infringers, and we should directly confront the systemic collateral damage that any proposed changes would cause. As it stands, there’s little hope that the current myopic focus on “trolls” will lead to any true reform that’s fair and balanced for everyone.

Categories
Innovation Inventors Legislation Patent Law Uncategorized

No Consensus That Broad Patent ‘Reform’ is Necessary or Helpful

Here’s a brief excerpt of an op-ed by Adam Mossoff & Devlin Hartline that was published in The Hill:

Two recent op-eds published in The Hill argue that broad patent legislation—misleadingly labeled “reform”—is needed because the U.S. patent system is fundamentally broken. In the first, Timothy Lee contends that opponents “cannot with a straight face” argue that we don’t need wide-sweeping changes to our patent system. In the second, Michele Boldrin and David K. Levine maintain that there is “consensus among academic researchers” that the system is “failing.”

Both op-eds suggest that there are no principled reasons, whether legal or economic, to object to the overhaul of the patent system included in the Innovation Act. Both op-eds are wrong.

To read the rest of this op-ed, please visit The Hill.