
The PTAB’s Regulatory Overreach and How it Cripples the Innovation Economy

On August 14, 2017, the Regulatory Transparency Project of the Federalist Society published a new white paper, Crippling the Innovation Economy: Regulatory Overreach at the Patent Office. This white paper examines how an administrative tribunal created in 2011—the Patent Trial and Appeal Board (PTAB)—has become “a prime example of regulatory overreach.” Several CPIP scholars are members of the Intellectual Property Working Group in the Regulatory Transparency Project that produced the white paper, including Professors Adam Mossoff, Kristen Osenga, Erika Lietzan, and Mark Schultz, and several are listed as co-authors.

Among the sweeping changes to the U.S. patent system included in the America Invents Act (AIA) was the creation of the PTAB, a new administrative body within the U.S. Patent and Trademark Office (USPTO). The PTAB hears petitions challenging already-issued patents as defective. Anyone can file a petition to have a patent declared invalid. The original idea was that this would help to weed out “bad patents,” i.e., patents that should not have been issued in the first place.

In the past five years, however, it has become clear that the PTAB is an example of an administrative tribunal that has gone too far. Lacking the proper procedural and substantive restraints that constrain courts and even other agencies in respecting the rights of citizens brought before them, the PTAB is now “killing large numbers of patents and casting a pall of uncertainty for inventors and investors.”

In just a few years, the laudable goal of the PTAB in providing a cheaper, faster way to invalidate “bad patents” has led to a situation in which all patents now have a shroud of doubt around them, undermining the stable and effective property rights that serve as the engine of the innovation economy. The former chief judge of the court that hears all patent appeals recently said that the PTAB is a “patent death squad,” and confirming that this is not extreme rhetoric, the first chief judge of the PTAB responded to this criticism by embracing it: “If we weren’t, in part, doing some ‘death squadding,’ we would not be doing what the [AIA] statute calls on us to do.”

The white paper briefly discusses the history and purpose of the U.S. patent system and describes the PTAB and how it operates. The substance of the white paper details extensively the procedural and substantive problems in how the PTAB has failed to respect both the basic requirements of the rule of law and the rights of patent owners. The concern is that this undermines the stable and effective platform that patent rights provide as the engine of the innovation economy.

To read the white paper, please click here.


Letter on FCC Set-Top Box Regulation Once Again Confuses the Issue

Last week, a group of law professors wrote a letter to the acting Librarian of Congress in which they claim that the current FCC proposal to regulate cable video navigation systems does not deprive copyright owners of the exclusive rights guaranteed by the Copyright Act. The letter repeats arguments from response comments they filed along with the Electronic Frontier Foundation (EFF), accusing the Copyright Office of misinterpreting the scope of copyright law and once again bringing up Sony v. Universal to insist that copyright owners are overstepping their bounds. Unfortunately, the IP professors’ recurring reliance on Sony is misplaced, as the 30-year-old case does not address the most significant and troubling copyright violations that will result from the FCC’s proposed rules.

In 1984, the Supreme Court in Sony held that recording television shows on a personal VCR was an act of “time-shifting” and therefore did not constitute copyright infringement. The Court also ruled that contributory liability requires knowledge, and such knowledge will not be imputed to a defendant based solely on the characteristics or design of a distributed product if that product is “capable of substantial noninfringing uses.” But while this precedent remains good law today, it does not apply to the real concerns creators and copyright owners have with the FCC’s attempt to redistribute their works without authorization.

The FCC’s proposed rules would require pay-TV providers to send copyrighted, licensed TV programs to third parties, even if the transmission would violate the agreements that pay-TV providers carefully negotiated with copyright owners. A different group of IP scholars recently explained to the FCC that by forcing pay-TV providers to exceed the scope of their licenses, the proposed rules effectively create a zero-rate compulsory license and undermine the property rights of creators and copyright owners. The compulsory license would benefit third-party recipients of the TV programs who have no contractual relationship with either the copyright owners or pay-TV providers, depriving creators and copyright owners of the right to license their works on their own terms.

This unauthorized siphoning and redistribution of copyrighted works would occur well before the programming reaches the in-home navigational device, a fact that the authors of the recent letter to the Librarian of Congress either don’t understand, or choose to ignore. Creators and copyright owners are not attempting to “exert control over the market for video receivers,” as the letter suggests. The manufacture and distribution of innovative devices that allow consumers to access the programming to which they subscribe is something that copyright owners and creators embrace, and a thriving market for such devices already exists.

As more consumers resort to cord-cutting, countless options have become available in terms of navigational devices and on-demand streaming services. Apple TV, Roku, Nexus Player and Amazon Fire Stick are just a few of the digital media players consumers can choose from, and more advanced devices are always being released. But while the creative community supports the development of these devices, it is the circumvention of existing licenses and disregard for the rights of creators to control their works that has artists and copyright owners worried.

Sony has become a rallying cry for those arguing that copyright owners are attempting to control and stymie the development of new devices and technologies, but these critics neglect the substantial problems presented by the transmission of digital media that Sony couldn’t predict and does not address. In the era of the VCR, there was no Internet over which television was broadcast into the home. In 1984, once a VCR manufacturer sold a unit, it ceased to have any control over the use of the machine. Consumers could use VCRs to record television or play video cassettes as they pleased, and the manufacturer wouldn’t benefit from their activity either way.

The difference with the current navigation device manufacturers is that they will receive copyrighted TV programs that they’ll have unbridled liberty to repackage and control before sending them to the in-home navigation device. The third-party device manufacturers will not only be able to tamper with the channel placement designed to protect viewer experience and brand value, they will also be able to insert their own advertising into the delivery of the content, reducing pay-TV ad revenue and the value of the license agreements that copyright owners negotiate with pay-TV providers. The FCC’s proposal isn’t really about navigational devices; it’s about the control of creative works and the building of services around TV programs that the FCC plans to distribute to third parties free of any obligation to the owners and creators of those programs.

The authors of the letter conflate two distinct issues, misleading influential decision makers who may not be as well versed in the intricacies of copyright law. By stubbornly comparing the copyright issues surrounding the FCC’s proposed rules to those considered by the Supreme Court in Sony, they craftily try to divert attention away from the real matter at hand: not what consumers do with the creative works they access in the privacy of their own homes, but how those works are delivered to consumers’ homes in the first place.

It’s curious that after many rounds of back-and-forth comments discussing the FCC’s proposal, proponents of the rules still refuse to address this primary copyright concern that has been continuously raised by creators and copyright owners in corresponding comments, articles, and letters (see also here, here, here, and here). Perhaps the authors of the recent letter simply do not grasp the real implications of the FCC’s plan to seize and redistribute copyrighted content, but given their years of copyright law experience, that is unlikely. More probable is that they recognize the complications inherent in the proposal, but do not have a good answer to the questions raised by the proposal’s critics, so they choose instead to cloud the issue with a similar-sounding but separate issue. But if they truly want to make progress in the set-top box debate and clear the way for copyright-compliant navigational devices, they’ll need to do more than fall back on the same irrelevant arguments.


IP Scholars to FCC: It’s Not About "The Box"

This past April, we joined other IP scholars in explaining how the FCC’s proposed set-top box rules would undermine the property rights of creators and copyright owners. In reply comments filed last month, the EFF and a group of IP academics argued that the proposed rules would not implicate any copyright owners’ exclusive rights. Yesterday, we filed an ex parte letter with the Commissioners pointing out why this is wrong. The full letter is copied below.


Dear Chairman Wheeler and Commissioners Clyburn, Rosenworcel, Pai and O’Rielly:

On April 22, 2016, the undersigned intellectual property law scholars submitted comments to the FCC in response to the notice of proposed rulemaking in the matter of “Expanding Consumers’ Video Navigation Choices; Commercial Availability of Navigation Devices.” Our comments expressed concerns with the proposed rules’ harmful impact on the property rights of creators and copyright owners. Specifically, we warned that the Commission’s proposed rules would undermine the exclusive property rights guaranteed to copyright owners under the Copyright Act by severely limiting their ability to determine whom to license their property rights to and on what terms. In so doing, the proposed rules risk fundamentally disrupting the vibrant creative ecosystem that those property rights support.

Together with the Electronic Frontier Foundation (EFF), another group of intellectual property law academics submitted reply comments framed as a rebuttal to the many comments that dutifully explained how the Commission’s proposed rules would violate copyright law and imperil the property rights of creators and copyright owners. The EFF/professors offer “observations” on copyright law and conclude that “the proposed rules are consistent with copyright in both letter and spirit.” To reach this conclusion, the EFF/professors broadly defend navigational devices, arguing that “products and services that touch copyrighted works do not infringe copyright, and do not require a license,” and that the “devices and services under the proposed rules” would be non-infringing just like televisions and VCRs.

Surprisingly, despite claiming that the proposed rules are consistent with copyright law, the EFF/professors fail to address the primary copyright concern raised by us and by many other commenters. By focusing on the navigational devices themselves, rather than on how creative works are delivered to those devices, the EFF/professors perform a sleight of hand that masks the real problem. The issue is not what consumers do with the creative works they receive in the privacy of their own homes—the issue is how those creative works are delivered to consumers’ homes in the first place.

The creative works that pay-TV consumers watch on their televisions come from multiple sources, including satellite, cable, and telephony-based transmissions. These transmissions are public performances or public distributions, and as a result, they must be licensed. Ignoring this simple principle of copyright law, the Commission would require pay-TV providers to send copyrighted works to third parties even if doing so exceeds the scope of pay-TV providers’ licenses with copyright owners. By forcing pay-TV providers to exceed the scope of their licenses, the proposed rules would undermine the property rights of creators and copyright owners, effectively creating a zero-rate compulsory license to the benefit of third parties that have no contractual relationship with either copyright owners or the pay-TV providers to whom copyright owners license their works. Furthermore, the Commission seeks to create this zero-rate compulsory license despite lacking any authority to do so; the Communications Act certainly does not give the Commission authority to amend the Copyright Act and create a new compulsory license for copyrighted works.

The reply comments of the EFF/professors do not address this concern at all. Committing the bulk of their reply to a broad discussion articulating their principles for copyright law, the EFF/professors fail to respond to the distinct copyright issues that inevitably result from this newly-created compulsory license. It is unclear why the EFF/professors do not address this issue, as it was echoed again and again in the comments to which they purport to respond. Because the proposed rules would brazenly undercut copyright owners’ property rights, we believe it is important to call attention to the inability of the EFF/professors to even mention this fundamental problem in their response.

Put simply, the proposed rules would take away the ability of creators and copyright owners to license their works on their own terms. They would give third parties all of the benefits afforded to pay-TV providers by their agreements with copyright owners without the burdens of paying a license or agreeing to the underlying contract terms. This isn’t about “the box,” and it isn’t about what consumers do with the creative works they receive in their homes. The issue is what goes into “the box,” and more importantly, how it gets there. That the EFF/professors ignore this primary issue speaks volumes. The fact that third parties currently need a license from copyright owners to do the very things the proposed rules would countenance demonstrates that the rules would undermine the property rights of creators and copyright owners.

The EFF/professors properly note that the “ultimate goal” of copyright is to benefit the public good. What they fail to understand is that by securing to artists and creators property rights in the fruits of their labors, copyright serves the interests of creators and the public alike, fulfilling its constitutional purpose and forming the bedrock of our creative economy. We urge the Commission to consider and address—as the EFF/professors do not—how the proposed rules inappropriately interfere with the property rights of creators and copyright owners and the damage they stand to cause to our diverse and vibrant creative marketplace.

To download our letter to the FCC, please click here.


FCC’s Extreme Proposal Threatens the Livelihood of Creators

By Matthew Barblan & Kevin Madigan


Earlier this year, the FCC proposed a new regulatory scheme ostensibly designed to improve the market for pay-TV set-top boxes. Chairman Wheeler claimed that the proposed rules would “tear down the barriers that currently prevent innovators from developing new ways for consumers to access and enjoy their favorite shows and movies on their terms.” But set-top boxes are already on their way out as more and more consumers turn to streaming apps to watch their favorite movies and shows. So what is the FCC up to here? A close look at the proposed rules reveals that this isn’t about set-top boxes at all. Instead, the rules are designed to benefit a handful of companies that want to disseminate pay-TV programs without negotiating with or paying a license to the owners of those programs, undermining the property rights of creators and copyright owners. The creative community is understandably up in arms.

As we explain in comments filed with the FCC, the proposed rules would require pay-TV providers to make copyrighted video content available to third-party companies that have no contractual relationship with either the pay-TV providers or the creators of the video programming. The Commission essentially aims to create a zero-rate compulsory license for these companies. But this zero-rate compulsory license would fundamentally disrupt copyright owners’ ability to pursue the wide variety of business models and licensing arrangements that enable our creative ecosystem to thrive.

A key component of copyright owners’ property interest is the ability to choose to whom they license their works and on what terms. Because their livelihoods depend on the success of their works, copyright owners are particularly well-positioned and incentivized to determine the best way to commercialize them. By conveying copyrighted works to third parties without the consent of copyright owners, the proposed rules trample on the property rights of copyright owners and risk severely damaging our vibrant creative economy.

Adding insult to injury, the proposed rules wouldn’t even require the recipients of this zero-rate compulsory license to abide by the underlying contractual terms between copyright owners and pay-TV providers. Licensing contracts between copyright owners and pay-TV providers often include specific terms detailing the obligations of the provider in distributing the creative works. These terms can include things like channel “neighborhood” assignments, branding requirements, advertising limits, platform restrictions, and the list goes on. While the Commission states that “our goal is to preserve the contractual arrangements” between copyright owners and pay-TV providers, the proposed rules would transfer some, but not all, of the underlying contractual obligations to the third-party recipients of the copyrighted works.

For example, under the Commission’s proposal, third-party recipients of the copyrighted works would not be required to abide by contractual terms about channel placement designed to protect viewer experience and brand value. Similarly, the Commission’s proposal would not require third-party recipients of copyrighted works to abide by contractual terms concerning advertising in the delivery of those works. By allowing third parties to sidestep these terms, the Commission risks reducing the advertising revenue that pay-TV providers can earn from disseminating copyrighted works, thereby reducing the value of the license agreements that copyright owners negotiate with pay-TV providers.

In another thumb-in-the-eye to creators and copyright owners, the Commission’s proposal fails to account for copyright owners who may want to protect their copyrighted works by disseminating them exclusively through proprietary (and not widely licensable) content protection mechanisms. Instead, the Commission proposes to require pay-TV providers “to support at least one content protection system to protect its multichannel video programming that is licensable on reasonable and nondiscriminatory terms by an organization that is not affiliated with [the pay-TV provider].” Thus, the Commission would force copyright owners to risk exposing their property to security threats that may be associated with using widely-licensable content protection mechanisms.

Furthermore, nothing in the Commission’s proposal would prevent third parties from delivering the copyrighted works side-by-side with stolen versions of those same works. It is easy to imagine a search function that aggregates copies of creative works from a variety of platforms and displays the search results side-by-side. In fact, anyone who has run an internet search for a movie or TV show has likely seen results that mix links to both legitimate and stolen works.

Copyright owners’ ability to protect their creative works is essential both to preserve the value of their property and to give them the confidence to enter into arrangements with intermediaries (like pay-TV providers) to disseminate their works to a wide variety of audiences. This is especially true in light of the unique security challenges involved in portable, online, and short-term access to copyrighted works. Any reasonable proposal in this space would help copyright owners move forward in the ongoing battle to prevent the rampant theft and illegal dissemination of their works that has accompanied the rise of the internet. Unfortunately, the Commission’s proposal does just the opposite, limiting copyright owners’ ability to protect their property and pushing them backwards in the ongoing struggle against piracy.

Furthermore, it is entirely unclear where the Commission would draw the legal authority to change the nature of copyright owners’ property rights. The proposed rules simply claim that Section 629 of the Communications Act grants the Commission authority to implement the regulations in order to ensure competition and consumer choice in the navigation device market. In its justification of authority, the Commission repeatedly states that it will broadly interpret ambiguous terms in the Communications Act and that “a broad interpretation is necessary.” But nowhere in its analysis does the Commission cite to language granting it the authority to rewrite copyright law. Even under the broadest of interpretations, it is clear that the Communications Act does not give the Commission the authority to amend the Copyright Act and create a zero-royalty compulsory license out of thin air.

By granting artists and creators property rights in the fruits of their labors, copyright supports a diverse and multifaceted ecosystem that enables the development, distribution, and enjoyment of creative works, and that provides significant economic and cultural benefits to our society. But this ecosystem only works if copyright owners are able to safely and freely deploy their property in the marketplace. Unfortunately, the Commission’s proposal fails to respect the property rights of creators and copyright owners, risking severe disruption to the very same creative marketplace the Commission claims to promote.


Acknowledging the Limitations of the FTC’s PAE Study

The FTC’s long-awaited case study of patent assertion entities (PAEs) is expected to be released this spring. Using its subpoena power under Section 6(b) to gather information from a handful of firms, the study promises us a glimpse at their inner workings. But while the results may be interesting, they’ll also be too narrow to support any informed policy changes. And you don’t have to take my word for it—the FTC admits as much. In one submission to the Office of Management and Budget (OMB), which ultimately decided whether the study should move forward, the FTC acknowledges that its findings “will not be generalizable to the universe of all PAE activity.” In another submission to the OMB, the FTC recognizes that “the case study should be viewed as descriptive and probative for future studies seeking to explore the relationships between organizational form and assertion behavior.”

However, this doesn’t mean that no one will use the study to advocate for drastic changes to the patent system. Even before the study’s release, many people—including some FTC Commissioners themselves—have already jumped to conclusions when it comes to PAEs, arguing that they are a drag on innovation and competition. Yet these same people say that we need this study because there’s no good empirical data analyzing the systemic costs and benefits of PAEs. They can’t have it both ways. The uproar about PAEs is emblematic of the broader movement that advocates for the next big change to the patent system before we’ve even seen how the last one panned out. In this environment, it’s unlikely that the FTC and other critics will responsibly acknowledge that the study simply cannot give us an accurate assessment of the bigger picture.

Limitations of the FTC Study

Many scholars have written about the study’s fundamental limitations. As statistician Fritz Scheuren points out, there are two kinds of studies: exploratory and confirmatory. An exploratory study is a starting point that asks general questions in order to generate testable hypotheses, while a confirmatory study is then used to test the validity of those hypotheses. The FTC study, with its open-ended questions to a handful of firms, is a classic exploratory study. At best, the study will generate answers that could help researchers begin to form theories and design another round of questions for further research. Scheuren notes that while the “FTC study may well be useful at generating exploratory data with respect to PAE activity,” it “is not designed to confirm supportable subject matter conclusions.”

One significant constraint with the FTC study is that the sample size is small—only twenty-five PAEs—and the control group is even smaller—a mixture of fifteen manufacturers and non-practicing entities (NPEs) in the wireless chipset industry. Scheuren reasons that there “is also the risk of non-representative sampling and potential selection bias due to the fact that the universe of PAEs is largely unknown and likely quite diverse.” And the fact that the control group comes from one narrow industry further prevents any generalization of the results. Scheuren concludes that the FTC study “may result in potentially valuable information worthy of further study,” but that it is “not designed in a way as to support public policy decisions.”

Professor Michael Risch questions the FTC’s entire approach: “If the FTC is going to the trouble of doing a study, why not get it done right the first time and a) sample a larger number of manufacturers, in b) a more diverse area of manufacturing, and c) get identical information?” He points out that the FTC won’t be well-positioned to draw conclusions because the control group is not even being asked the same questions as the PAEs. Risch concludes that “any report risks looking like so many others: a static look at an industry with no benchmark to compare it to.” Professor Kristen Osenga echoes these same sentiments and notes that “the study has been shaped in a way that will simply add fuel to the anti–‘patent troll’ fire without providing any data that would explain the best way to fix the real problems in the patent field today.”

Osenga further argues that the study is flawed since the FTC’s definition of PAEs perpetuates the myth that patent licensing firms are all the same. The reality is that many different types of businesses fall under the “PAE” umbrella, and it makes no sense to impute the actions of a small subset to the entire group when making policy recommendations. Moreover, Osenga questions the FTC’s “shortsighted viewpoint” of the potential benefits of PAEs, and she doubts how the “impact on innovation and competition” will be ascertainable given the questions being asked. Anne Layne-Farrar expresses similar doubts about the conclusions that can be drawn from the FTC study since only licensors are being surveyed. She posits that it “cannot generate a full dataset for understanding the conduct of the parties in patent license negotiation or the reasons for the failure of negotiations.”

Layne-Farrar concludes that the FTC study “can point us in fruitful directions for further inquiry and may offer context for interpreting quantitative studies of PAE litigation, but should not be used to justify any policy changes.” Consistent with the FTC’s own admissions of the study’s limitations, this is the real bottom line of what we should expect. The study will have no predictive power because it only looks at how a small sample of firms affect a few other players within the patent ecosystem. It does not quantify how that activity ultimately affects innovation and competition—the very information needed to support policy recommendations. The FTC study is not intended to produce the sort of compelling statistical data that can be extrapolated to the larger universe of firms.

FTC Commissioners Put Cart Before Horse

The FTC has a history of bias against PAEs, as demonstrated in its 2011 report that skeptically questioned the “uncertain benefits” of PAEs while assuming their “detrimental effects” in undermining innovation. That report recommended special remedy rules for PAEs, even as the FTC acknowledged the lack of objective evidence of systemic failure and the difficulty of distinguishing “patent transactions that harm innovation from those that promote it.” With its new study, the FTC concedes to the OMB that much is still not known about PAEs and that the findings will be preliminary and non-generalizable. However, this hasn’t prevented some Commissioners from putting the cart before the horse with PAEs.

In fact, the very call for the FTC to institute the PAE study started with its conclusion. In her 2013 speech suggesting the study, FTC Chairwoman Edith Ramirez recognized that “we still have only snapshots of the costs and benefits of PAE activity” and that “we will need to learn a lot more” in order “to see the full competitive picture.” While acknowledging the vast potential benefits of PAEs in rewarding invention, benefiting competition and consumers, reducing enforcement hurdles, increasing liquidity, encouraging venture capital investment, and funding R&D, she nevertheless concluded that “PAEs exploit underlying problems in the patent system to the detriment of innovation and consumers.” And despite the admitted lack of data, Ramirez stressed “the critical importance of continuing the effort on patent reform to limit the costs associated with some types of PAE activity.”

This position is duplicitous: If the costs and benefits of PAEs are still unknown, what justifies Ramirez’s rushed call for immediate action? While benefits have to be weighed against costs, it’s clear that she’s already jumped to the conclusion that the costs outweigh the benefits. In another speech a few months later, Ramirez noted that the “troubling stories” about PAEs “don’t tell us much about the competitive costs and benefits of PAE activity.” Despite this admission, Ramirez called for “a much broader response to flaws in the patent system that fuel inefficient behavior by PAEs.” And while Ramirez said that understanding “the PAE business model will inform the policy dialogue,” she stated that “it will not change the pressing need for additional progress on patent reform.”

Likewise, in an early 2014 speech, Commissioner Julie Brill ignored the study’s inherent limitations and exploratory nature. She predicted that the study “will provide a fuller and more accurate picture of PAE activity” that “will be put to good use by Congress and others who examine closely the activities of PAEs.” Remarkably, Brill stated that “the FTC and other law enforcement agencies” should not “wait on the results of the 6(b) study before undertaking enforcement actions against PAE activity that crosses the line.” Even without the study’s results, she thought that “reforms to the patent system are clearly warranted.” In Brill’s view, the study would only be useful for determining whether “additional reforms are warranted” to curb the activities of PAEs.

It appears that these Commissioners have already decided—in the absence of any reliable data on the systemic effects of PAE activity—that drastic changes to the patent system are necessary. Given their clear bias in this area, there is little hope that they will acknowledge the deep limitations of the study once it is released.

Commentators Jump the Gun

Unsurprisingly, many supporters of the study have filed comments with the FTC arguing that the study is needed to fill the huge void in empirical data on the costs and benefits associated with PAEs. Some even simultaneously argue that the costs of PAEs far outweigh the benefits, suggesting that they have already jumped to their conclusion and just want the data to back it up. Despite the study’s serious limitations, these commentators appear primed to use it to justify their foregone policy recommendations.

For example, the Consumer Electronics Association applauded “the FTC’s efforts to assess the anticompetitive harms that PAEs cause on our economy as a whole,” and it argued that the study “will illuminate the many dimensions of PAEs’ conduct in a way that no other entity is capable.” At the same time, it stated that “completion of this FTC study should not stay or halt other actions by the administrative, legislative or judicial branches to address this serious issue.” The Internet Commerce Coalition stressed the importance of the study of “PAE activity in order to shed light on its effects on competition and innovation,” and it admitted that without the information, “the debate in this area cannot be empirically based.” Nonetheless, it presupposed that the study will uncover “hidden conduct of and abuses by PAEs” and that “it will still be important to reform the law in this area.”

Engine Advocacy admitted that “there is very little broad empirical data about the structure and conduct of patent assertion entities, and their effect on the economy.” It then argued that PAE activity “harms innovators, consumers, startups and the broader economy.” The Coalition for Patent Fairness called on the study “to contribute to the understanding of policymakers and the public” concerning PAEs, which it claimed “impose enormous costs on U.S. innovators, manufacturers, service providers, and, increasingly, consumers and end-users.” And to those suggesting “the potentially beneficial role of PAEs in the patent market,” it stressed that “reform be guided by the principle that the patent system is intended to incentivize and reward innovation,” not “rent-seeking” PAEs that are “exploiting problems.”

The joint comments of Public Knowledge, Electronic Frontier Foundation, & Engine Advocacy emphasized the fact that information about PAEs “currently remains limited” and that what is “publicly known largely consists of lawsuits filed in court and anecdotal information.” Despite admitting that “broad empirical data often remains lacking,” the groups also suggested that the study “does not mean that legislative efforts should be stalled” since “the harms of PAE activity are well known and already amenable to legislative reform.” In fact, they contended not only that “a problem exists,” but that there’s even “reason to believe the scope is even larger than what has already been reported.”

Given this pervasive and unfounded bias against PAEs, there’s little hope that these and other critics will acknowledge the study’s serious limitations. Instead, it’s far more likely that they will point to the study as concrete evidence that even more sweeping changes to the patent system are in order.

Conclusion

While the FTC study may generate interesting information about a handful of firms, it won’t tell us much about how PAEs affect competition and innovation in general. The study is simply not designed to do this. It instead is a fact-finding mission, the results of which could guide future missions. Such empirical research can be valuable, but it’s very important to recognize the limited utility of the information being collected. And it’s crucial not to draw policy conclusions from it. Unfortunately, if the comments of some of the Commissioners and supporters of the study are any indication, many critics have already made up their minds about the net effects of PAEs, and they will likely use the study to perpetuate the biased anti-patent fervor that has captured so much attention in recent years.


A New Librarian of Congress and a New Copyright Office

By Sandra Aistars

With the Senate considering the confirmation of Dr. Carla Hayden as the next Librarian of Congress, I have joined thirteen other intellectual property law professors in an Open Letter suggesting that her confirmation should serve as an important reminder that the U.S. Copyright Office, a department within the Library of Congress, needs to be updated so that it can best serve the needs of the American people in the digital age.

As President Obama stated in his announcement of the nomination, “Dr. Hayden has devoted her career to modernizing libraries so that everyone can participate in today’s digital culture.” Indeed, I share Dr. Hayden’s enthusiasm for modernizing our libraries so that they can better promote and preserve our nation’s collective learning and culture for all to enjoy. This is especially important when it comes to the Library of Congress, the oldest and largest cultural institution in the United States.

However, it’s also important to provide the Copyright Office with the resources and authority it needs to perform its role as the nation’s leading expert on copyright law and a major facilitator of the marketplace for creative and scholarly works. Members of Congress and numerous stakeholders have been calling for modernization of the Copyright Office for years. They recognize that the Office faces significant structural challenges that limit its effectiveness in serving the American public. As the Office notes in its recent strategic plan, “there is no question that it must now modernize to meet current and future needs.”

Even though the Copyright Office and the Library of Congress have significantly different business needs, the Office is stuck using the Library’s information technology (IT) systems, which are not only antiquated, but were designed to serve a different purpose. A recent report by the Government Accountability Office finds that “serious weaknesses” in the Library’s IT systems are thwarting the Office’s ability to fulfill its mission in registering, recording, and depositing copyrighted works. The Office needs funding to improve these essential IT systems and to tailor them to its specific needs.

The legal structure of the Copyright Office should also be updated to best position it for success. There is currently no agency with independent, comprehensive rulemaking authority when it comes to copyright law. The Register of Copyrights can only establish regulations for administering the duties and functions of the Copyright Office, subject to approval by the Librarian of Congress. This inefficient structure burdens the Librarian with responsibilities that more appropriately should fall on the Register—the fully-vetted, leading expert in the Copyright Act.

Moreover, the Register of Copyrights is not appointed as a principal officer of the United States by the President. This means that should the Office take on additional duties like administering a small copyright claims court, the Register could not appoint and oversee the administrative law judges that would run this process. Likewise, the Register currently has no power over the three judges of the Copyright Royalty Board that determine the rates and terms of statutory copyright licenses. Congress has consistently assigned the Office more duties and relied upon its vast proficiency in copyright law, but the authority of the Register has not been updated accordingly.

Although I and the other signatories of the Open Letter do not all agree on every issue in copyright law, including where the Copyright Office should be positioned within our government, we do all agree that the Register of Copyrights should be a principal officer of the government. Only when the Register is appointed by the President and confirmed by the Senate will she have the authority and accountability appropriate to administer our nation’s copyright system.

The confirmation of Dr. Hayden as the new Librarian of Congress promises us the much-needed modernization of the Library of Congress, and it also provides us with the opportunity to reflect upon the needs of the Copyright Office. Both the Librarian and the Register play important roles in promoting and preserving our nation’s culture, but those roles are fundamentally different in nature. These two talented leaders should work closely with each other as partners, each possessing the authority and responsibility suitable for their differing areas of expertise.

The time to modernize the Copyright Office is now. Congress can ensure the continuing success of our copyright system by providing the Office with the resources and authority it needs to better serve us all in the digital age.

To read the full Open Letter, please click here.


#AliceStorm: When It Rains, It Pours…

The following guest post from Robert R. Sachs, Partner at Fenwick & West LLP, first appeared on the Bilski Blog, and it is reposted here with permission.

By Robert R. Sachs

Last year I christened the post-Alice impact on patents #Alicestorm, riffing on the hashtag #hellastorm used to refer to the Pineapple Express storms that drenched the Bay Area in December 2014. This year we have El Niño bringing day after day of rain, and so too we have Alice decisions coming down in sheets. Here is a comparison of the number of decisions per month since Alice.

[Table 1: Quarterly Numbers]

On average, we’re seeing about eleven Section 101 decisions per month in the federal courts. The overall success rate has been holding in the low 70% range, currently 72.1% (down from 73% in October). Similarly, motions on the pleadings continue with an equally strong success rate of 71.7% (down from 71.9%):

[Table 2: Summary]

The number of patents invalidated has increased dramatically from 354 as of October 2015 to over 400, while the number of invalidated claims is now over 12,000. The courts routinely invalidate all of the claims of a patent based on a single “representative” claim, including all dependent claims regardless of their level of specificity. This seems entirely contrary to the notion that dependent claims necessarily narrow the broad and presumably abstract independent claims, and likely provide at least some features that are “significantly more” than the abstract idea, as well as recitations of non-generic technology that provides some “improvement” over the art. Is it really possible that so many thousands of dependent claims had no merit? If dependent claims are hedges against invalidity under Section 103, why do they have so little bearing under Section 101?

PTAB continues to be the points leader on the board, with the institution rate on Section 101 based Covered Business Method petitions climbing to 84.8% (up from 83.7%) and an unbroken string of 38 final decisions on Section 101 finding the challenged patent ineligible.

The motion analysis remains consistent with what we’ve seen before:

[Table: Motions]

The most active courts and judges, by number of Section 101 decisions, are in Delaware with 35 decisions (Andrews, Robinson, Stark, Sleet, Burke) and Texas with 29 (Gilstrap, Payne, Mitchell, Schroeder).

[Table: Judges]

The number of new patent cases filed climbed in 2015, including a large filing spike at the end of November, 2015 due to the elimination of Form 18, which required only the basic allegation that the plaintiff owned the patent and that the defendant infringed, without any substantive allegations. As a result, we expect to see the continued stream of Section 101 motions and ineligibility outcomes.


Principles and Priorities to Guide Congress’s Ongoing Copyright Review

Last week, CPIP published a new white paper, Copyright Principles and Priorities to Foster a Creative Digital Marketplace, by Sandra Aistars, Mark Schultz, and me, which draws from the testimonies and scholarly writings of CPIP Senior Scholars in order to guide Congress as it continues its comprehensive review of the Copyright Act. The white paper discusses the constitutional origins of copyright protection and offers principles and priorities for Congress to consider as it moves forward with the copyright review process.

The current copyright review began in early 2013, when Register of Copyrights Maria Pallante threw down the gauntlet in her Horace S. Manges lecture by urging Congress to create “the next great copyright act.” While noting that minor legislative tweaks certainly have their place, Register Pallante suggested that it’s time for Congress to do something far more sweeping. Since then, Congress has embarked on a comprehensive review of our copyright laws, conducting over twenty hearings since mid-2013.

CPIP Senior Scholars have been actively engaged in that hearing process. Prof. Sandra Aistars (while she was CEO of the Copyright Alliance) testified on the creative community’s contributions to innovation and suggested several principles for the review process. Prof. Mark Schultz offered testimony on the scope and subject matter of copyright, and Prof. Sean O’Connor gave testimony on the failure of the DMCA’s notice-and-takedown regime.

As we discuss in the white paper, the premise of our copyright system is that copyrights are more than just incentives to create—they’re also rewards to authors for their productive labors. The Founders understood that authors’ rights and the public good are complementary, and they knew that public interests are best served when individual interests are properly secured. That understanding has proved quite prescient, as copyright today drives many innovations that provide remarkable benefits to our economy, society, and culture.

In the white paper, we propose the following organizing principles for any further work reviewing or revising the Copyright Act:

    A. Stay True to Technology-Neutral Principles and Take the Long View
    B. Strengthen the Ability of Authors to Create and to Disseminate Works
    C. Value the Input of Creative Upstarts
    D. Ensure that Copyright Continues to Nurture Free Speech and Creative Freedom
    E. Rely on the Marketplace and Private Ordering Absent Clear Market Failures
    F. Value the Entire Body of Copyright Law

We then note that these principles in turn suggest that Congress prioritize the following areas for action:

    A. Copyright Office Modernization
    B. Registration and Recordation
    C. Mass Digitization and Orphan Works
    D. Small Claims
    E. Notice and Takedown
    F. Streaming Harmonization

The ball is still rolling with the copyright review process. The House Judiciary Committee began a listening tour this fall that kicked off in Nashville and then traveled to Silicon Valley and Los Angeles. Moreover, those who testified at the earlier hearings have been invited back to meet with Committee staff and discuss any further input they might have. And the Committee is open to “any interested party” coming in to discuss their interests.

All told, this lengthy review process places Congress in a good position to take the next step in bringing us closer to Register Pallante’s “next great copyright act.” And to that end, we hope that our white paper will help Congress keep the constitutional premise of copyright protection in mind as it chooses where we go from here.

To read the full white paper, please click here.


#AliceStorm for Halloween: Was it a Trick or a Treat?

The following guest post from Robert R. Sachs, Partner at Fenwick & West LLP, first appeared on the Bilski Blog, and it is reposted here with permission.

By Robert R. Sachs

Alice has been busy the last two months, continuing to haunt the federal courts and the Knox and Randolph buildings at the USPTO. Here are the latest #AliceStorm numbers through the end of October 2015:

There have been 34 district court decisions in the past two months, but the percentage of invalidity decisions is holding constant at 70.5%. The number of patent claims invalidated is now over 11,000, but also holding steady at around 71%.

There have been no new Federal Circuit Section 101 decisions, but we’re going to see a flurry of activity in the next couple of months, as the court has recently heard oral argument in a number of patent eligibility cases, and more are on calendar for November.

Motions on the pleadings have soared, with 23 in the past two months alone, and the success rate is up a tick from 70.1% to 71.4%.

PTAB is a bit mixed: the CBM institution rate is down from 86.2% to 83.7%, but the final decision rate is still 100%, with 6 decisions in the past two months invalidating the patents in suit.

Turning to the motion analysis, the motions on the pleadings are the second scariest thing to a patent holder after the specter of attorney fees under Octane Fitness:

The Delaware district court continues as the graveyard of business methods and software patents, with 31 eligibility decisions, up from 19 just two months ago, and their invalidity rate is up from 86.4% to 90.3%.

Jumping into second place is the Eastern District of Texas, with 23 decisions total (up from 16). Contrary to the rest of the bench, their invalidity rate is 34.8%. The Northern District of California edged up from 75% to 78.9% invalidity, and C.D. Cal is up almost 2%.

And finally, here is the rundown on all district court judges with two or more Section 101 decisions.

With today’s blog, I’m introducing some entirely new types of data, looking at the characteristics of the patents that have been subject to Section 101 motions.

As expected, business method patents are the most heavily litigated and invalidated:

The distribution of patents in terms of earliest priority dates shows that a very large fraction of the invalidated patents were first filed in 2000:

Now compare that to the distribution of patent classes with respect to priority year as well:

Here too we see a very large number of the business method patents filed in 2000. I’ve coded all of the software-related technologies as blue to group them visually.

Why the cluster around 2000? State Street Bank, which held that there was no rule against business method patents, was decided in mid-1998. As those of us who were practicing then remember, it took about two years before the impact of the decision was widespread. This was also the time of the Dotcom bubble, when it seemed that just about everyone was starting up a business on the Internet. Those two factors resulted in a surge of patent filings.

Of all the patents that have been thus challenged under Alice, only a handful have post-Bilski priority dates:

  • 8,447,263, Emergency call analysis system, filed in 2011, and litigated in Boar’s Head Corp. v. DirectApps, Inc., 2015 U.S. Dist. LEXIS 98502 (E.D. Cal., 7/28/2015). The court granted DirectApps’ motion to dismiss, finding the patent invalid.
  • 8,938,510, On-demand mailbox synchronization and migration system, filed in 2010, and litigated in BitTitan, Inc. v. SkyKick, Inc., 2015 U.S. Dist. LEXIS 114082 (W.D. Wash, 8/27/2015). BitTitan’s motion for preliminary injunction was denied in part because SkyKick successfully argued that BitTitan was not likely to succeed on the merits due to Alice problems.
  • 8,604,943; 9,070,295; 9,082,097; 9,082,098; and 9,087,313, all of which claim priority to March 2012, and were invalidated just last week in MacroPoint LLC v. FourKites Inc., Case No. 1:15-cv-01002 (N.D. Ohio, Nov. 5, 2015). The court invalidated all 94 claims in these patents as being directed to the abstract idea of “tracking freight.” While the last four patents were issued in June and July 2015, none of them overcame an Alice rejection, and the court noted that “Nothing in the Reasons for Allowance dictate a finding that these concepts are inventive on the issue of patent-eligible subject matter.”

Over time we’ll see more post-Bilski patents being litigated, and then eventually a true test: a business method patent granted after Alice that overcame an Alice rejection. By my count, there are about 80 such patents thus far, and about another 90 that have been allowed. It will not be too long then before one of these patents is challenged under Section 101.

In my next column, I’ll review some very disturbing decisions coming out of the Delaware district court.


Overview of Comments on the USPTO's July 2015 Update to the Interim Examination Guidance

The following guest post from Robert R. Sachs, Partner at Fenwick & West LLP, first appeared on the Bilski Blog, and it is reposted here with permission.

By Robert R. Sachs

In late July, the USPTO issued its July 2015 Update to the 2014 Interim Section 101 Patent Eligibility Guidance (IEG). The July 2015 Update addresses a number of the issues and concerns raised in the public comments to the IEG and is supposed to assist examiners in applying the 2014 IEG during the patent examination process. The July 2015 Update also includes a new set of examples of claims involving abstract ideas and sample analysis under the Mayo framework. The USPTO is seeking public comments on the July 2015 Update, and comments are due on October 28, 2015, via email at 2014_interim_guidance@uspto.gov.

Here is an overview of what I think are the key issues and concerns with the July 2015 Update. Feel free to use any of my analysis in your comments to the USPTO.

1. Requirements of Prima Facie Case and the Role of Evidence

A significant number of the public comments on the 2014 IEG noted that examiners have the burden to make the prima facie case that a patent claim is ineligible, and that the Administrative Procedure Act (APA) and Federal Circuit case law require that this determination be made based on “substantial evidence,” and not examiner opinion. In particular, all of the public comments that addressed this issue stated that examiners should have to provide documentary evidence to support a conclusion that a claim is directed to a judicial exception or that claim limitations are well understood, routine, and conventional.

In the July 2015 Update, the USPTO responded by stating that whether a claim is ineligible is a question of law and courts do not rely on evidence to establish that a claim is directed to a judicial exception, and therefore examiners likewise do not need to rely on any evidence that a particular concept is abstract, or a fundamental economic concept, or even a law of nature. The USPTO’s reliance on the judicial model is legally incorrect. First, examiners are bound by the APA and judges are not. Second, that eligibility is a question of law does not mean that there are not factual issues, as well—it merely determines whether the court or a jury is to make the finding. Obviousness is likewise a question of law, but there are clearly factual issues involved. Third, when judges take judicial notice, they are making a finding of fact, and they must do so under the requirements of Federal Rules of Evidence, Rule 201, which states that “The court may judicially notice a fact that is not subject to reasonable dispute because it: … can be accurately and readily determined from sources whose accuracy cannot reasonably be questioned.” This requirement is similar to the requirements of Official Notice set forth in MPEP 2144.03: “Official notice unsupported by documentary evidence should only be taken by the examiner where the facts asserted to be well-known, or to be common knowledge in the art are capable of instant and unquestionable demonstration as being well-known.” Thus, by its own logic, examiners should comply with the requirements of MPEP 2144.03.

As to the role of evidence, again the public comments that discussed this issue all took the position that examiners must cite authoritative documentary evidence, such as textbooks or similar publications, to support a conclusion that a claim recites a judicial exception or that certain practices are well known, conventional, or routine. The public comments on this issue all made the same argument: that the Supreme Court in Bilski and Alice cited references in support of its conclusions that the claims were ineligible.

In response to this uniform opinion, the USPTO maintained its position that citation of references was not necessary because the references in Bilski and Alice were technically not “evidence,” since the Court is an appellate court, and further that the references were not necessarily prior art. This argument misses the point. Regardless of whether the references were evidence under the Federal Rules of Evidence, the Court felt it necessary and proper to cite them. Further, the Court did not cite the references as prior art or suggest that they needed to be prior art; rather, the Court cited the references as an authoritative basis to show that the claims were directed to longstanding, well-known concepts. That the Court did this not once, but twice, is strong guidance that the USPTO should follow suit.

Similarly, examiners should be instructed to accept and give substantial weight to documentary evidence submitted by applicants rebutting the examiner’s conclusions under either Step 2A or Step 2B of the Mayo framework. This includes declarations from the inventor or others showing that particular limitations are not considered judicial exceptions by a person of ordinary skill in the relevant technical or scientific community, or that claim limitations would be considered “significantly more” by such a person, or that the claim limitations provide improvements to the art.

2. The Role of Preemption in the Mayo Framework

The majority of public comments stated that preemption is the core concern underlying the judicial exceptions to Section 101, and that the examiner should be required to establish that a claim preempts a judicial exception in order to find the claim ineligible. The USPTO again took an opposing view to this consensus interpretation, asserting that questions of preemption are inherently addressed in the two-part Mayo test. The USPTO also stated that “while a preemptive claim may be ineligible, the absence of complete preemption does not guarantee that a claim is eligible.” This has effectively eliminated arguments made by applicants that their claims were patent eligible because they did not preempt other practical applications of the judicial exception. Neither the Supreme Court nor the Federal Circuit has endorsed the concept that preemption does not matter given the Mayo framework. Instead, the courts continue to evaluate patent claims with respect to preemption even after the Mayo framework has been applied.

More significantly, the USPTO’s argument fails to address the more likely situation: that a claim blocks (preempts) only a narrow range of applications or implementations of the identified judicial exception. This is not merely a case of an absence of complete preemption; it is the absence of any significant degree of preemption at all. The Supreme Court recognized that preemption is a matter of degree and held that a claim is ineligible where there is a disproportionate risk that the judicial exception is fully preempted. In Alice, the Court stated:

The former [claims on fundamental building blocks] “would risk disproportionately tying up the use of the underlying” ideas, and are therefore ineligible for patent protection. The latter [claims with limitations that provide practical applications] pose no comparable risk of pre-emption, and therefore remain eligible for the monopoly granted under our patent laws. 134 S.Ct. at 2354 (emphasis added).

Since by definition a claim must preempt something, it is only where the scope of the claim covers the full scope of the judicial exception that the claim is rendered ineligible. Judge Lourie, whose explanation of the Mayo framework in CLS v. Alice was directly adopted by the Supreme Court, put it this way:

Rather, the animating concern is that claims should not be coextensive with a natural law, natural phenomenon, or abstract idea; a patent-eligible claim must include one or more substantive limitations that, in the words of the Supreme Court, add “significantly more” to the basic principle, with the result that the claim covers significantly less. See Mayo 132 S. Ct. at 1294. Thus, broad claims do not necessarily raise § 101 preemption concerns, and seemingly narrower claims are not necessarily exempt. What matters is whether a claim threatens to subsume the full scope of a fundamental concept, and when those concerns arise, we must look for meaningful limitations that prevent the claim as a whole from covering the concept’s every practical application.

Thus, both the Supreme Court and the Federal Circuit use preemption as the mechanism to evaluate whether a claim is eligible or not by applying it on both sides of the question: ineligible if preemptive, eligible if not preemptive. In addition, over 100 district court decisions since Alice have expressly considered whether the claims preempt, even after applying the Mayo framework. If the Mayo framework inherently addressed the preemption issue as the USPTO asserts, there would be no reason for the courts to address it. Finally, by removing preemption from the Mayo framework, the USPTO has turned the framework into the sole test for patent eligibility—directly contrary to the Supreme Court’s holding in Bilski that there is no one sole test for eligibility.

Lourie’s statement that a claim is patent eligible when it includes “substantive limitations…with the result that the claim covers significantly less” than the judicial exception provides a simple and expedient basis for using preemption as part of the streamlined analysis, something the USPTO has resisted in the July 2015 Update. Examiners are well trained to evaluate the scope of a claim based on its express limitations. Accordingly, they can typically determine for the majority of claims that, whatever the claim covers, it has limitations that prevent it from covering the full scope of some judicial exception. If the point of the streamlined analysis is to avoid the unnecessary burden of the Mayo framework, then a preemption analysis provides the best way to achieve that goal.

Finally, to suggest that the Mayo framework is precise enough to be a definitive test is to ignore the obvious: both steps of the framework are undefined. See McRO, Inc. v. Sega of America, Inc., No. 2:12-cv-10327, 2014 WL 4749601, at *5 (C.D. Cal. Sept. 22, 2014) (Wu, J.) (“[T]he two-step test may be more like a one step test evocative of Justice Stewart’s most famous phrase [‘I know it when I see it’].”). The Court refused to define the scope of abstract ideas in Alice (Step 2A), and Step 2B entails evaluating the subjective requirement of “significantly more” or “enough.” What is left, then, is analysis by analogy and example, and both common sense and life experience tell us that these approaches very often lead to mistakes. Analogies can be good or bad, and most examples can be argued either way. Preemption serves as a way of evaluating whether the outcome from such analysis is consistent with the underlying rationale for the judicial exceptions in the first place.

3. Abstract Ideas Must be Prevalent and Longstanding in the Relevant Community

The majority of public comments on the IEG argued that to establish that an idea is abstract, an examiner must show that the idea is “fundamental” in the sense of being “long-standing” and “prevalent,” following the statements of the Supreme Court. Various commentators suggested specific rules for examiners, such as requiring evidence that the idea has been known and used in practice for a period of 25 or more years. Even those who supported a restrictive view of patent eligibility suggested that examiners should look to “basic textbooks” to identify abstract ideas.

The USPTO responded in the July 2015 Update by asserting that abstract ideas need not be prevalent and longstanding to be fundamental, arguing that even novel abstract ideas are ineligible: “examiners should keep in mind that judicial exceptions need not be old or long-prevalent, and that even newly discovered judicial exceptions are still exceptions.” The USPTO stated that “The term ‘fundamental’ is used in the sense of being foundational or basic.” This analysis begs the question. An idea is foundational or basic because it is widely accepted and adopted in the relevant community; it is fundamental to the practices of the community. Indeed, any textbook on the “foundations” of a particular scientific field would explain the principles and concepts that are long-standing and widely accepted by scientists in that field. It would not be a significant burden on the examiner to cite to such publications to support a finding under Step 2A. In fact, the inability of an examiner to do so would be strong evidence that a claim is not directed to a foundational or basic practice.

4. USPTO Reliance on Non-Precedential Federal Circuit Decisions

Public comments noted that the 2014 IEG included citations and discussions of non-precedential Federal Circuit cases, such as Planet Bingo, LLC v. VKGS LLC and SmartGene, Inc. v. Advanced Biological Labs, and indicated that because the cases are non-precedential, they should not be cited and relied upon by the USPTO as the basis of its guidance to examiners. Further, the comments pointed out that the 2014 IEG mischaracterizes the abstract ideas at issue in these cases.

For example, the USPTO characterizes SmartGene as holding that “comparing new and stored information and using rules to identify options” is an abstract idea. The Federal Circuit’s actual holding was much more specific: that “the claim at issue here involves a mental process excluded from section 101: the mental steps of comparing new and stored information and using rules to identify medical options.” The court itself unambiguously limited the scope of its decision: “[o]ur ruling is limited to the circumstances presented here, in which every step is a familiar part of the conscious process that doctors can and do perform in their heads.” Thus, the USPTO’s characterization removed key aspects of the court’s expressly limited holding: that the comparing steps were inherently mental steps (not computer steps) performed by a doctor considering medical rules (not any type of rules) to evaluate medical options (not other types of options). The court’s ruling cannot be generalized to all types of comparisons on all types of information using all types of rules. The improper generalization of the court’s holding has resulted in examiners applying SmartGene to find many claims for computer-implemented inventions ineligible. This is because many, if not most, computer processes can be characterized as comparing stored and new information and applying a decision rule to produce a useful result. For example, most automobiles use computers and embedded software to monitor vehicle sensors and take actions. A typical fuel management computer compares a current measure of fuel (the new value) with a predefined minimum amount of fuel (the stored information) and determines whether to turn on a low fuel light (using rules to identify options). Under the USPTO’s characterization of SmartGene, a claim to such a process would be deemed an abstract idea, an obviously incorrect outcome.
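To make the consequence concrete, here is a minimal sketch of such a low-fuel routine, written in Python purely for illustration; the function name, threshold value, and sensor readings are hypothetical and are not drawn from SmartGene, the July 2015 Update, or any patent:

# Hypothetical illustration: an ordinary embedded-software task that can be
# described as "comparing new and stored information and using rules to
# identify options" under the USPTO's generalized reading of SmartGene.

LOW_FUEL_THRESHOLD_LITERS = 5.0  # "stored information": a predefined minimum amount of fuel

def low_fuel_light_on(current_fuel_liters: float) -> bool:
    """Compare the new sensor reading against the stored threshold and apply
    a simple rule to decide whether the low fuel light should be on."""
    return current_fuel_liters <= LOW_FUEL_THRESHOLD_LITERS  # the "rule" identifying the option

print(low_fuel_light_on(4.2))   # True: the light turns on
print(low_fuel_light_on(12.0))  # False: the light stays off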

The USPTO did not address any of the problems identified by the public comments regarding non-precedential cases. Instead, the July 2015 Update simply states that the “2014 IEG instructs examiners to refer to the body of case law precedent in order to identify abstract ideas by way of comparison to concepts already found to be abstract,” and makes multiple other references to precedent. Even so, the July 2015 Update relies repeatedly on non-precedential Federal Circuit decisions, such as Dietgoal Innovations LLC v. Bravo Media LLC, Fuzzysharp Technologies Inc. v. Intel Corporation, Federal Home Loan Mortgage Corp. aka Freddie Mac v. Graff/Ross Holdings LLP, Gametek LLC v. Zynga, Inc., Perkinelmer, Inc. v. Intema Limited, and Cyberfone Systems, LLC v. CNN Interactive Group, Inc.

The USPTO should eliminate any discussion of or reliance upon non-precedential decisions. In the alternative, the USPTO should at minimum explain to examiners that such decisions are limited to their specific facts and are not to be generalized into controlling examples or rules.

5. There is No Separate Category for Methods of Organizing Human Activity

Public comments to the 2014 IEG pointed out various issues with the category of “methods of organizing human activities” as a basis of abstract ideas, and in particular requested clarification as to which types of methods would fall within the category. Here too there was broad agreement among the commentators as to the proper interpretation of Bilski and Alice: the Court found that the claims in Alice and Bilski were abstract ideas because they were directed to a fundamental economic practice, not because the claims were methods of organizing human activity. The Court noted that Bilski’s claims were methods of organizing human activity only to rebut Alice’s arguments that abstract ideas must always be “fundamental truths.” The Court’s analysis does not logically imply that methods of organizing human activity are inherently abstract ideas.

The USPTO responded by broadly interpreting the scope of the category, stating that many different kinds of methods of organizing human activity can also be abstract ideas, but providing no explanation (other than examples) to determine when this is the case and when it is not. The USPTO then mapped various Federal Circuit cases into this category, even where the court itself did not expressly rely upon such categorization. For example, the USPTO listed buySAFE, DealerTrack, Bancorp, PlanetBingo, Gametek, and Accenture as examples of cases dealing with methods of organizing human activity. However, none of these cases actually held that the methods in suit were methods of organizing human activity. Instead, every single one of these cases held that the claims were abstract as either mental steps or fundamental economic practices. Attempting to map Federal Circuit cases into this category is confusing to examiners and the public alike, and it is unnecessary.

The USPTO should remove this category from the Guidance until such time as the Federal Circuit or the Supreme Court provides a clear definition of its bounds.

6. There is No Separate Category for “An Idea of Itself”

Public comments noted that this is a catch-all category that the courts have mentioned only in passing without ever defining its contours, and further suggested that the USPTO clarify that this is not a distinct category of abstract ideas.

In response, once again the USPTO broadly described the category and linked various Federal Circuit cases to it as examples, even though the court itself never so characterized the inventions. The USPTO lists in this category cases in which the court held the claims ineligible on other grounds, such as mental steps (Cybersource, Smartgene*, Classen*, Perkinelmer*, Ambry, Myriad CAFC*, Content Extraction); mathematical algorithms (In re Grams, Digitech); and economic activities (Ultramercial) (*indicates non-precedential decision). In fact, no precedential Federal Circuit or Supreme Court case has defined “an idea of itself” as a distinct category. It is only mentioned in dicta, never in a holding.

By sorting cases into multiple undefined categories, the USPTO makes it more difficult, not easier, for examiners to properly determine which types of claims fall within which category. Further, where an examiner asserts that a claim falls into multiple categories (a simple assertion to make, since most inventions deal with multiple different concepts), the applicant is forced to rebut each categorization.

7. “Mathematical Algorithms” Are Limited to Solutions to Problems in Pure Mathematics

This category, more than any other, reflects the USPTO’s failure to substantively and meaningfully analyze the issues and provide clear guidance. Public comments to the 2014 IEG provided extensive analysis of the case law and the problems arising from mathematical algorithms being considered abstract ideas. The USPTO did not respond to the substantive analysis at all. Instead, the July 2015 Update merely lists cases that have held claims invalid as mathematical algorithms, without explanation. This is inadequate for several reasons.

First, the USPTO must clarify that the presence of a mathematical algorithm in the specification or claims is not a per se indication that the claims are directed to an abstract idea. In Alice, the Court expressly stated that “[o]ne of the claims in Bilski reduced hedging to a mathematical formula, but the Court did not assign any special significance to that fact, much less the sort of talismanic significance petitioner claims.” Equally so, examiners must not assign any special significance to the presence of a mathematical formula either in the disclosure or in the claim. What matters is the underlying concept, not how it is expressed, whether in words or in mathematical symbols; the form of expression carries “no special significance.”

Second, the presence of a mathematical formula or equation does not make an invention abstract for a very simple reason: mathematics is a language that allows for the very precise and formal description of certain types of ideas. All modern engineering, including civil, mechanical, electrical, chemical, and computer engineering, as well as all of the physical sciences, relies on mathematical analysis for design and formulation. Using a mathematical equation is simply one way, albeit a highly precise one, of expressing concepts, which may be either patent-eligible or not. Thus, the presence of a mathematical equation does not by itself imply or suggest anything about the underlying concept, and should not be relied upon by examiners as automatic evidence of an ineligible abstract idea. While mathematics may be used to describe abstract ideas like the laws of mathematics, it can equally be used to describe entirely mundane and non-abstract ideas like fuel-efficient aircraft approach procedures (U.S. Patent No. 8,442,707), compressing video for transmission on cell phones (U.S. Patent No. 8,494,051), efficiently allocating farming resources (U.S. Patent No. 6,990,459), or calculating golf handicaps and the difficulty of golf courses (U.S. Patent No. 8,282,455).
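As a simple illustration of the last point, consider the standard USGA handicap differential formula, offered here only as a familiar example of applied mathematics describing a mundane, non-abstract activity; it is not quoted from the cited patent:

Handicap Differential = ((Adjusted Gross Score − Course Rating) × 113) / Slope Rating

The formula is nothing more than a precise statement of an everyday golf computation, yet under a per se rule its mere presence in a claim or specification could be treated as a signal of abstractness.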

The correct interpretation of “mathematical algorithms,” as the term is used by the Supreme Court, is that they are algorithms that are solutions to inherently mathematical problems. This was the specific definition used by the Supreme Court in Benson, and confirmed in Diehr. In Benson, the Court stated:

A procedure for solving a given type of mathematical problem is known as an “algorithm.” The procedures set forth in the present claims are of that kind; that is to say, they are a generalized formulation for programs to solve mathematical problems of converting one form of numerical representation to another.

Later, in Diehr, the Court stated that in Benson “we defined ‘algorithm’ as a ‘procedure for solving a given type of mathematical problem,’” noting that “our previous decisions regarding the patentability of ‘algorithms’ are necessarily limited to the more narrow definition employed by the Court.” The Court expressly rejected a broader definition that covered any “sequence of formulas and/or algebraic/logical steps to calculate or determine a given task; processing rules.”

The USPTO should clarify that this more limited definition of mathematical algorithms is to be used. This approach beneficially distinguishes inventions in pure mathematics, which as the Court stated are precisely those that pose a disproportionate risk of preemption because they can be used in an unlimited number of different fields, from inventions in applied mathematics, the mathematics used in the engineering and physical sciences. Examiners, by virtue of their formal scientific and technical training, are well equipped to distinguish between claims to these two types of inventions when they make use of mathematical formulas and equations.

8. Identifying Whether a Claim Limitation Recites a Conventional, Routine, and Well-Understood Function of a Computer

The public comments to the 2014 IEG discussed the problems resulting from considering the normal operations of a computer to be merely “generic” functions that are conventional, well-understood, and routine, and therefore by definition insufficient to support eligibility of a patent claim.

In response, the USPTO again ignored the substantive arguments, instead simply stating that examiners may rely on what the courts have recognized as “well understood, routine, and conventional functions” of computers, including “performing repetitive calculations,” “receiving, processing, and storing data,” and “receiving or transmitting data over a network.” The July 2015 Update goes on to state that “This listing is not meant to imply that all computer functions are well-understood, routine and conventional.”

This caveat is hardly sufficient, since the list essentially wipes out all computing operations as they are typically claimed. Just as claims for mechanical processes use verbs and gerunds that describe well-known mechanical operations, so too do claims for computer-based inventions necessarily describe the operations of a computer: receive, transmit, store, retrieve, determine, compare, process, and so forth. There is no other way to claim the operations of a computer except to use such terminology.

Accordingly, since the Supreme Court did not hold that all software and computer-implemented inventions are per se ineligible, the proper interpretation of the Court’s discussion of the generic functions of a computer is more narrowly focused. Specifically, it is necessary to consider the entirety of each claim limitation, not merely the gerund or verb that introduces a method step. A claim limitation should be treated as merely generic only if the limitation as a whole recites nothing more than generic functions. When considering computer processing steps performed on computer data, the limitations as to the source of the data, the types of data, the operations performed on the data, how the output is generated, and where the data is stored or transmitted must all be considered. This is because it is these limitations that distinguish the merely generic operations (“receiving a data input and determining an output”) from particular applications.
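To illustrate the distinction, here is a rough sketch in Python; both functions are hypothetical and are not taken from any actual claim or from the July 2015 Update. It contrasts a step that recites nothing beyond generic computer functions with one whose data source, data types, and operations tie it to a particular application:

# Hypothetical illustrations only; neither example is drawn from an actual patent claim.

# A "generic" limitation: receive a data input and determine an output.
# Nothing about the source, type, or processing of the data narrows it.
def generic_step(data):
    return bool(data)  # receive data, determine an output

# A particularized limitation: the source (a temperature sensor reading in degrees
# Celsius), the stored calibration offset, the specific computation, and the rule
# applied to the result narrow the step to one concrete application.
def calibrated_overheat_check(sensor_celsius: float, calibration_offset: float,
                              limit_celsius: float = 90.0) -> bool:
    corrected = sensor_celsius + calibration_offset  # specific operation on specific data
    return corrected > limit_celsius                 # specific rule tied to the application

print(generic_step([1, 2, 3]))               # True
print(calibrated_overheat_check(87.5, 3.1))  # True (90.6 exceeds the 90.0 limit)

The first function could describe almost any computer process; the second describes only one narrow application, which is precisely the kind of limitation the analysis above asks examiners to weigh.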