
One Step Closer to Mega Justice: Kim Dotcom Loses Extradition Hearing

Cross-posted from the Law Theories blog.

The news broke last night that Kim Dotcom has suffered a major setback in his bid to avoid standing trial in the Eastern District of Virginia on multiple felony charges relating to his Megaupload website. District Judge N.R. Dawson of the Auckland District Court in New Zealand dropped his 271-page bombshell opinion, holding that Kim Dotcom and his three co-defendants are eligible to be extradited to the United States. It’s been nearly four years since Dotcom was arrested in a dramatic raid by the New Zealand Police in early 2012, and this decision means that the victims of the so-called “Mega Conspiracy” are finally one step closer to justice.

Judge Dawson’s opinion is a remarkable read. He spends much of the opinion (pp. 16–172) going over a “selection” of the evidence and its inferences as proffered by the United States. For people claiming that there is no evidence that Dotcom did anything wrong, this should be required reading. The number of smoking-gun emails and Skype conversations is staggering. Luckily for the victims, the members of the Mega Conspiracy were not very good at covering their tracks.

There’s no doubt that Dotcom and his international team of lawyers have done a good job of stalling the proceedings. I’m sure most criminal defendants would love to receive even half as much legal representation as Dotcom has been able to procure. But at the end of the day, Judge Dawson is completely unimpressed with all of the irrelevant things that Dotcom argued. From the get-go, Dotcom tried to turn this extradition hearing into a trial on the merits under U.S. law—incredibly ironic, given that such a trial is the last thing Dotcom really wants. Judge Dawson thoroughly casts aside Dotcom’s extraneous nonsense and gets right to the heart of the matter.

As a preliminary matter, Judge Dawson rejects all three of Dotcom’s applications to stay the proceedings. The legal issues to be decided under New Zealand’s Extradition Act 1999, he says, are simple:

The Court must be satisfied, [1] that the appropriate supporting documentation has been produced, [2] that the offence is an extradition offence in relation to the extradition country and [3] that there is sufficient evidence relating to the offending that would justify the person’s trial if the conduct constituting the offence had occurred in New Zealand.

Note that Dotcom’s guilt under U.S. law is not something Judge Dawson decides—and rightfully so. That’s a matter for District Judge Liam O’Grady to oversee once Dotcom stands trial in the Eastern District of Virginia. Judge Dawson notes that Dotcom is “entitled to challenge the correctness of the evidence,” but that “any challenge to the evidence must be more than simply an alternative explanation.” And this is a big problem for Dotcom: All of his denials and handwavings are simply alternative explanations, and they don’t undermine the reliability of the evidence. The question, as Judge Dawson sees it, is whether the evidence would support a trial under New Zealand law. U.S. law, he holds, is irrelevant.

Dotcom claimed that he couldn’t conduct a proper defense because he couldn’t hire the experts he wanted to rebut the U.S. government’s read of the applicable law, but Judge Dawson would have none of it: “It is not within the jurisdiction of this Court to rule upon competing views of USA law. That is a matter for trial in the US court.” Furthermore, the fact that Dotcom could offer alternative explanations for the evidence just didn’t matter: “It is not the role of this court to weigh the merits of the applicant’s case against the possible alternative explanations that might be provided by the respondents.”

In one of my favorite passages, Judge Dawson mentions Dotcom’s claim that he’s being railroaded and that “public confidence in the integrity of the criminal justice system would be undermined if a stay was not granted.” Judge Dawson returns fire by pointing out that Dotcom has enjoyed “full access to the New Zealand legal system” and that granting the stay “with total disregard for the law,” as Dotcom would have it, “is more likely to undermine public confidence in the criminal justice system[.]” Touché!

It’s not until page 221 that Judge Dawson finally turns to the merits of whether Dotcom is eligible to be surrendered to the United States. That analysis, under Section 24, has two steps: (1) whether the alleged offense is an “extradition offence,” and (2) whether a prima facie case has been established. Judge Dawson runs through this two-step test for all thirteen counts: conspiracy to commit racketeering, conspiracy to commit criminal copyright infringement, conspiracy to commit money laundering, five counts of criminal copyright infringement, and five counts of fraud by wire. Referencing the 150-plus pages of evidence he laid out earlier in the opinion, Judge Dawson holds that a prima facie case for each of the counts has been established for each of the defendants.

This is a far cry from the affidavit submitted by Harvard’s Larry Lessig, who claimed that there was no prima facie case against any of the defendants on any of the counts.

One issue worth mentioning is Dotcom’s claim that he would be eligible for the DMCA safe harbors in the United States. Judge Dawson has nothing to say about this, as he doesn’t think U.S. law is relevant to this hearing. But he does have this to say about Dotcom’s eligibility for the safe harbors under Section 92B of New Zealand’s Copyright Act:

The purpose and intent of s 92B is to provide a “safe harbour” for ISPs that unintentionally have their storage used by others to store illegitimate materials, a true “dumb pipe”. . . . Protection under s 92B is not available on the evidence adduced at this hearing. There is evidence before the Court sufficient to establish that the respondents were in the course of a business that knowingly uploaded copies of copyright protected material, they kept it with a view to committing acts that would infringe the copyright, they were complicit in promulgating that they possessed such material, and distributed it to large numbers of people for their own financial gain.

In other words, Dotcom would get no safe harbors under New Zealand’s version of the DMCA.

Lastly, Judge Dawson entertains Dotcom’s entertaining claim that a special exception should be made just for him because he is the victim of a politically motivated prosecution orchestrated by “US movie studios executives” and “the Vice President of the United States of America, Mr Joe Biden.” (Seriously, he argued this.) Judge Dawson easily brushes this off:

That Mr Dotcom has a different point of view about the use of the internet from others including the US movie studios does not have the hallmarks of what is ordinarily regarded as political persecution for political beliefs. It is a difference of opinion about a business matter, the use of the internet and the application of copyright law.

No dice.

In the end, Judge Dawson notes that “it is not possible to issue decisions that would be less than encyclopaedic in length in order to cover every minor point alluded to in the hearing.” Regardless, he finds that “[t]here is no need to do this” since none of Dotcom’s arguments “come near to undermining the applicant’s case[.]” As such, Judge Dawson concludes that “[t]he overwhelming preponderance of evidence . . . establishes a prima facie case to answer for all respondents on each of the counts” and that “the respondents are all eligible for surrender on all thirteen counts in the superseding indictment.”

Dotcom will appeal, naturally, and his Twitter feed indicates that he feels good about his chances. I honestly can’t fathom why. It seems to me that his only success so far has been in delaying the inevitable, but that inevitably he’ll have to do the one thing that he doesn’t really want to do—defend his case on the merits under U.S. law when it really counts.


BMG v. Cox: ISP Liability and the Power of Inference

Cross-posted from the Law Theories blog.

As readers are likely aware, the jury verdict in BMG v. Cox was handed down on December 17th. The jury found that BMG had proved by a preponderance of the evidence that Cox’s users were direct infringers and that Cox is contributorily liable for that infringement. The interesting thing, to me at least, about these findings is that they were both proved by circumstantial evidence. That is, the jury inferred that Cox’s users were direct infringers and that Cox had the requisite knowledge to make it a contributory infringer. Despite all the headlines about smoking-gun emails from Cox’s abuse team, the case really came down to a matter of inference.

Direct Infringement of the Public Distribution Right

Section 106(3) grants copyright owners the exclusive right “to distribute copies . . . of the copyrighted work to the public[.]” In the analog days, a copy had to first be made before it could be distributed, and this led to much of the case law focusing on the reproduction right. However, in the digital age, the public distribution usually occurs before the reproduction. In an upload-download scenario, the uploader publicly distributes the work and then the downloader makes the copy. This has brought much more attention to the contours of the public distribution right, and there are some interesting splits in the case law looking at online infringement.

Though from the analog world, there is one case that is potentially binding authority here: Hotaling v. Church of Jesus Christ of Latter-Day Saints. Handed down by the Fourth Circuit in 1997, Hotaling held that “a library distributes a published work . . . when it places an unauthorized copy of the work in its collection, includes the copy in its catalog or index system, and makes the copy available to the public.” The copies at issue in Hotaling were in microfiche form, and they could not be checked out by patrons. This meant that the plaintiff could not prove that the library actually disseminated the work to any member of the public. Guided by equitable concerns, the Fourth Circuit held that “a copyright holder would be prejudiced by a library that does not keep records of public use,” thus allowing the library to “unjustly profit by its own omission.”

Whether this aspect of Hotaling applies in the digital realm has been a point of contention, and the courts have been split on whether a violation of the public distribution right requires actual dissemination. As I’ve written about before, the Nimmer on Copyright treatise now takes the position that “[n]o consummated act of actual distribution need be demonstrated in order to implicate the copyright owner’s distribution right,” but that view has yet to be universally adopted. Regardless, even if actual dissemination is required, Hotaling can be read to stand for the proposition that it can be proved by circumstantial evidence. As one court put it, “Hotaling seems to suggest” that “evidence that a defendant made a copy of a work available to the public might, in conjunction with other circumstantial evidence, support an inference that the copy was likely transferred to a member of the public.”

The arguments made by BMG and Cox hashed out this now-familiar landscape. Cox argued that merely offering a work to the public is not enough: “Section 106(3) makes clear that Congress intended not to include unconsummated transactions.” It then distinguished Hotaling on its facts, suggesting that, unlike the plaintiff there, BMG was “in a position to gather information about alleged infringement, even if [it] chose not to.” In opposition, BMG pointed to district court cases citing Hotaling, as well as to the Nimmer treatise, for the proposition that making available is public distribution simpliciter.

As to Cox’s attempt to distinguish Hotaling on the facts, BMG argued that Cox was the one that failed “to record actual transmissions of infringing works by its subscribers over its network.” Furthermore, BMG argued that “a factfinder can infer that the works at issue were actually shared from the evidence that they were made available,” and it noted that cases Cox had relied on “permit the inference that dissemination actually took place.” In its reply brief, Cox faulted BMG for reading Hotaling so broadly, but it noticeably had nothing to say about the propriety of inferring that dissemination had actually taken place.

In his memorandum opinion issued on December 1st, District Judge Liam O’Grady sided with Cox on the making available issue and with BMG on the permissibility of inference. Reading Hotaling narrowly, Judge O’Grady held that the Fourth Circuit merely “articulated a principle that applies only in cases where it is impossible for a copyright owner to produce proof of actual distribution.” And without the making available theory on the table, “BMG must show an actual dissemination of a copyrighted work.” Nonetheless, Judge O’Grady held that the jury could infer actual dissemination based on the circumstantial evidence collected by BMG’s agent, Rightscorp:

Cox’s argument ignores the fact that BMG may establish direct infringement using circumstantial evidence that gives rise to an inference that Cox account holders or other authorized users accessed its service to directly infringe. . . . Rightscorp claims to have identified 2.5 million instances of Cox users making BMG’s copyrighted works available for download, and Rightscorp itself downloaded approximately 100,000 full copies of BMG’s works using Cox’s service. BMG has presented more than enough evidence to raise a genuine issue of material fact as to whether Cox account holders directly infringed its exclusive rights.

The jury was ultimately swayed by this circumstantial evidence, inferring that BMG had proved that it was more likely than not that Cox’s users had actually disseminated BMG’s copyrighted works. But proving direct infringement is only the first step, and BMG next had to demonstrate that Cox is contributorily liable for that infringement. As we’ll see, this too was proved by inference.

Contributory Infringement of the Public Distribution Right

While the Patent Act explicitly provides circumstances in which someone “shall be liable as a contributory infringer,” the Copyright Act’s approach is much less direct. As I’ve written about before, the entire body of judge-made law concerning secondary liability was imported into the 1976 Act via the phrase “to authorize” in Section 106. Despite having only this flimsy textual hook, the Supreme Court held in Sony that nothing precludes “the imposition of liability for copyright infringements on certain parties who have not themselves engaged in the infringing activity.” Indeed, the Court noted that “the concept of contributory infringement is merely a species of the broader problem of identifying the circumstances in which it is just to hold one individual accountable for the actions of another.”

Arguments about when it’s “just” to hold someone responsible for the infringement committed by another have kept lawyers busy for well over a century. The Second Circuit’s formulation of the contributory liability test in Gershwin has proved particularly influential over the past four decades: “[O]ne who, with knowledge of the infringing activity, induces, causes or materially contributes to the infringing conduct of another, may be held liable as a ‘contributory’ infringer.” This test has two elements: (1) knowledge, and (2) induce, cause, or materially contribute. Of course, going after the service provider, as opposed to going after the individual direct infringers, often makes sense. The Supreme Court noted this truism in Grokster:

When a widely shared service or product is used to commit infringement, it may be impossible to enforce rights in the protected work effectively against all direct infringers, the only practical alternative being to go against the distributor of the copying device for secondary liability on a theory of contributory or vicarious infringement.

And this is what BMG has done here by suing Cox instead of Cox’s users. The Supreme Court in Grokster also introduced a bit of confusion into the contributory infringement analysis. The theory at issue there was inducement—the plaintiffs argued that Grokster induced its users to infringe. Citing Gershwin, the Supreme Court stated this test: “One infringes contributorily by intentionally inducing or encouraging direct infringement[.]” Note how this is narrower than the test in Gershwin, which for the second element also permits causation or material contribution. While, on its face, this can plausibly be read to imply a narrowing of the traditional test for contributory infringement, the better read is that the Court merely mentioned the part of the test (inducement) that it was applying.

Nevertheless, Cox argued here that Grokster jettisoned a century’s worth of the material contribution flavor of contributory infringement: “While some interpret Grokster as creating a distinct inducement theory, the Court was clear: Grokster is the contributory standard.” Cox wanted the narrower inducement test to apply here because BMG would have a much harder time proving inducement over material contribution. As such, Cox focused on its lack of inducing behavior, noting that it did not take “any active steps to foster infringement.”

Despite its insistence that “Grokster supplanted the earlier Gershwin formulation,” Cox nevertheless argued that BMG’s anticipated material contribution claim “fails as a matter of law” since the knowledge element could not be proved. According to Cox, “Rightscorp’s notices do not establish Cox’s actual knowledge of any alleged infringement because notices are merely allegations of infringement[.]” Nor does the fact that it refused to receive notices from Rightscorp make it “willfully blind to copyright infringement on its network.” Cox didn’t argue that its service did not materially contribute to the infringement, and rightfully so—the material contribution element here is a no-brainer.

In opposition, BMG focused on Gershwin, declaring it to be “the controlling test for contributory infringement.” BMG noted that “Cox is unable to cite a single case adopting” its narrow “reading of Grokster, under which it would have silently overruled forty years of contributory infringement case law” applying Gershwin. (Indeed, I have yet to see a single court adopt Cox’s restrictive read of Grokster. This hasn’t stopped defendants from trying, though.) Turning to the material contribution element, BMG pointed out that “Cox does not dispute that it materially contributed to copyright infringement by its subscribers.” Again, Cox didn’t deny material contribution because it couldn’t win on this argument—the dispositive issue here is knowledge.

On the knowledge element, BMG proffered two theories. The first was that Cox is deemed “to have knowledge of infringement on its system where it knows or has reason to know of the infringing activity.” Here, BMG had sent Cox “millions of notices of infringement,” and it argued that Cox could not “avoid knowledge by blacklisting, deleting, or refusing” to accept its notices. Moreover, BMG noted that “Cox’s employees repeatedly acknowledged that they were aware of widespread infringement on Cox’s system.” BMG additionally argued that Cox was willfully blind since it “blacklisted or blocked every single notice of copyright infringement sent by Rightscorp on behalf of Plaintiffs, in an attempt to avoid specific knowledge of any infringement.”

In reply, Cox cited Sony for the rule that “a provider of a technology could not be liable for contributory infringement arising from misuse if the technology is capable of substantial noninfringing uses.” And since Cox’s service “is capable of substantial noninfringing uses,” it claimed that it “cannot be liable under Sony.” Of course, as the Supreme Court clarified in Grokster, that is not the proper way to read Sony. Sony merely says that knowledge cannot be imputed because a service has some infringing uses. But BMG here is not asking for knowledge to be imputed based on the design of Cox’s service. It’s asking for knowledge to be inferred from the notices that Cox refused to receive.

Judge O’Grady made short work of Cox’s arguments. He cited Gershwin as the controlling law and rejected Cox’s argument vis-à-vis Grokster: “The Court finds no support for Cox’s reading of Grokster.” In a footnote, he brushed aside any discussion of whether Cox materially contributed to the infringement since Cox failed to raise the point in its initial memorandum. Judge O’Grady then turned to the knowledge element, stating the test as this: “The knowledge requirement is met by a showing of actual or constructive knowledge or by evidence that a defendant took deliberate actions to willfully blind itself to specific infringing activity.” In a footnote, he declined to follow the narrower rule in the Ninth Circuit from Napster that requires the plaintiff to establish “actual knowledge of specific acts of infringement.”

Thus, Judge O’Grady held that three types of knowledge were permissible to establish contributory infringement: (1) actual knowledge (“knew”), (2) constructive knowledge (“had reason to know”), or (3) willful blindness. Rejecting Cox’s theory to the contrary, he held that “DMCA-compliant notices are evidence of knowledge.” The catch here was that Cox refused to receive them, and it even ignored follow-up emails from BMG. And this is where inference came into play: Judge O’Grady held that Cox could have constructive knowledge since “a reasonable jury could conclude that Cox’s refusal to accept Rightscorp’s notices was unreasonable and that additional notice provided to Cox gave it reason to know of the allegedly infringing activity on its network.”

Turning to willful blindness, Judge O’Grady stated that it “requires more than negligence or recklessness.” Citing Global-Tech, he noted that BMG must prove that Cox “took ‘deliberate actions to avoid confirming a high probability of wrongdoing and who can almost be said to have actually known the critical facts.’” The issue here was clouded by the fact that Cox didn’t simply refuse to accept BMG’s notices from Rightscorp, but instead it offered to receive them if certain language offering settlements to Cox’s users was removed. While it would be reasonable to infer that Cox was not “deliberately avoiding knowledge of illegal activity,” Judge O’Grady held that “it is not the only inference available.” As such, he left it for the jury to decide as a question of fact which inference was better.

The jury verdict is now in, and we don’t know whether the jury found for BMG on the constructive knowledge theory or the willful blindness theory—or perhaps even both. Either way, the question boiled down to one of inference, and the jury was able to infer knowledge on Cox’s part. And this brings us back to the power of inference. Cox ended up being found liable as a contributory infringer for its users’ direct infringement of BMG’s public distribution rights, and both of these verdicts were established with nothing more than circumstantial evidence. That’s the power of inference when it comes to ISP liability.


Protecting Artists from Streaming Piracy Benefits Creativity and Technology

Here’s a brief excerpt of an op-ed by Devlin Hartline & Matthew Barblan that was published in The Hill:

In his recent op-ed in The Hill, Mike Montgomery argues that “[m]aking streaming copyright infringement a felony is a terrible idea” that will create “further rifts between tech and entertainment at a time when these two sectors are not only reliant upon one another, but melding.” While it’s true that the line between art and technology has become less discernible, it’s simply false that creating felony penalties for criminal streamers will put a wedge between the two. Instead, protecting artists and authors from such criminal enterprises serves to level the playing field so that honest creators and innovators can work together even more closely.

To read the rest of this op-ed, please visit The Hill.


Principles and Priorities to Guide Congress’s Ongoing Copyright Review

Last week, CPIP published a new white paper, Copyright Principles and Priorities to Foster a Creative Digital Marketplace, by Sandra Aistars, Mark Schultz, and me, which draws from the testimonies and scholarly writings of CPIP Senior Scholars in order to guide Congress as it continues its comprehensive review of the Copyright Act. The white paper discusses the constitutional origins of copyright protection and offers principles and priorities for Congress to consider as it moves forward with the copyright review process.

The current copyright review began in early 2013, when Register of Copyrights Maria Pallante threw down the gauntlet in her Horace S. Manges lecture by urging Congress to create “the next great copyright act.” While noting that minor legislative tweaks certainly have their place, Register Pallante suggested that it’s time for Congress to do something far more sweeping. Since then, Congress has embarked on a comprehensive review of our copyright laws, conducting over twenty hearings since mid-2013.

CPIP Senior Scholars have been actively engaged in that hearing process. Prof. Sandra Aistars (while she was CEO of the Copyright Alliance) testified on the creative community’s contributions to innovation and suggested several principles for the review process. Prof. Mark Schultz offered testimony on the scope and subject matter of copyright, and Prof. Sean O’Connor gave testimony on the failure of the DMCA’s notice-and-takedown regime.

As we discuss in the white paper, the premise of our copyright system is that copyrights are more than just incentives to create—they’re also rewards to authors for their productive labors. The Founders understood that authors’ rights and the public good are complementary, and they knew that public interests are best served when individual interests are properly secured. That understanding has proved quite prescient, as copyright today drives many innovations that provide remarkable benefits to our economy, society, and culture.

In the white paper, we propose the following organizing principles for any further work reviewing or revising the Copyright Act:

    A. Stay True to Technology-Neutral Principles and Take the Long View
    B. Strengthen the Ability of Authors to Create and to Disseminate Works
    C. Value the Input of Creative Upstarts
    D. Ensure that Copyright Continues to Nurture Free Speech and Creative Freedom
    E. Rely on the Marketplace and Private Ordering Absent Clear Market Failures
    F. Value the Entire Body of Copyright Law

We then note that these principles in turn suggest that Congress prioritize the following areas for action:

    A. Copyright Office Modernization
    B. Registration and Recordation
    C. Mass Digitization and Orphan Works
    D. Small Claims
    E. Notice and Takedown
    F. Streaming Harmonization

The ball is still rolling with the copyright review process. The House Judiciary Committee began a listening tour this fall that kicked off in Nashville and then traveled to Silicon Valley and Los Angeles. Moreover, those who testified at the earlier hearings have been invited back to meet with Committee staff and discuss any further input they might have. And the Committee is open to “any interested party” coming in to discuss their interests.

All told, this lengthy review process places Congress in a good position to take the next step in bringing us closer to Register Pallante’s “next great copyright act.” And to that end, we hope that our white paper will help Congress keep the constitutional premise of copyright protection in mind as it chooses where we go from here.

To read the full white paper, please click here.


How IP-Fueled Innovations in Biotechnology Have Led to the Gene Revolution

We’ve released a new issue paper, The Gene Revolution, by Amanda Maxham, a research associate and writer at the Ayn Rand Institute.

Dr. Maxham explores how innovations in biotechnology, enabled by the intellectual property rights that protect them, have led to the “Gene Revolution,” where scientists use genetic engineering to dramatically improve human life. In order to combat widespread misinformation about genetically modified organisms (GMOs), she traces mankind’s long history of improving plants, animals, and microorganisms to better serve our needs.

We’ve included the Executive Summary below. To read the full issue paper, please click here.

The Gene Revolution

By Amanda Maxham

Executive Summary

Mankind has been improving plants and animals for millennia. Simply by selecting and breeding those they liked best, our ancestors radically improved upon wild species. Today’s biological inventors, with a deeper understanding of genetics, breeding, and heredity, and with the protection of intellectual property rights, are using the technology of genetic engineering to start a “Gene Revolution.”

In the field of medicine, custom-built genetically engineered microorganisms are brewing up rivers of otherwise rare human hormones, life-saving medicines, and much-needed vaccines. In agriculture, scientists are combining their understanding of plant genetics with laboratory techniques of modern molecular biology to “unlock” the DNA of crop plants. By inserting genes from other plants or even common microorganisms, they are able to give plants desirable traits, solving problems that farmers have faced for millennia—faster and more precisely than ever before.

But despite its successes and a bright future, biotechnology is under attack by activists who spread misinformation and foster consumer mistrust. They have been directly responsible for onerous regulations and other hurdles to innovation that are threatening to stifle what could and should be the “third industrial revolution.”

In an effort to combat this misinformation, this paper situates genetic engineering within mankind’s long history of food improvement and then highlights how genetic engineering has dramatically improved human life. In it, you’ll find 29 plants, animals, and microorganisms, from insulin-secreting E. coli to engineered cotton, from cheese-making fungus to chestnut trees, that represent the promise and possibilities that the Gene Revolution holds, if we hold precious and continue to protect the freedom to invent and the power of scientific innovation.


#AliceStorm for Halloween: Was it a Trick or a Treat?

The following guest post from Robert R. Sachs, Partner at Fenwick & West LLP, first appeared on the Bilski Blog, and it is reposted here with permission.

By Robert R. Sachs

Alice has been busy the last two months, continuing to haunt the federal courts and the Knox and Randolph buildings at the USPTO. Here are the latest #AliceStorm numbers through the end of October 2015:

There have been 34 district court decisions in the past two months, but the percentage of invalidity decisions is holding constant at 70.5%. The number of patent claims invalidated is now over 11,000, but also holding steady at around 71%.

There have been no new Federal Circuit Section 101 decisions, but we’re going to see a flurry of activity in the next couple of months, as the court has recently heard oral argument in a number of patent eligibility cases, and more are on calendar for November.

Motions on the pleadings have soared, with 23 in the past two months alone, and the success rate is up a tick from 70.1% to 71.4%.

PTAB is a bit mixed: the CBM institution rate is down from 86.2% to 83.7%, but the final decision rate is still 100%, with 6 decisions in the past two months invalidating the patents in suit.

Turning to the motion analysis, the motions on the pleadings are the second scariest thing to a patent holder after the specter of attorney fees under Octane Fitness:

The Delaware district court continues as the graveyard of business methods and software patents, with 31 eligibility decisions, up from 19 just two months ago, and their invalidity rate is up from 86.4% to 90.3%.

Jumping into second place is the Eastern District of Texas, with 23 decisions total (up from 16). Contrary to the rest of the bench, their invalidity rate is 34.8%. The Northern District of California edged up from 75% to 78.9% invalidity, and C.D. Cal. is up almost 2%.

And finally, here is the rundown on all district court judges with two or more Section 101 decisions.

With today’s blog, I’m introducing some entirely new types of data, looking at the characteristics of the patents that have been subject to Section 101 motions.

As expected, business method patents are the most heavily litigated and invalidated (click to see full size):

The distribution of patents in terms of earliest priority dates shows that a very large fraction of the invalidated patents were first filed in 2000:

Now compare that to the distribution of patent classes with respect to priority year as well:

Here too we see a very large number of the business method patents filed in 2000. I’ve coded all of the software related technologies as blue to group them visually.

Why the cluster around 2000? State Street Bank, which held that there was no rule against business method patents, was decided in mid-1998. As those of us who were practicing then remember, it took about two years before the impact of the decision was widespread. This was also the time of the dot-com bubble, when it seemed that just about everyone was starting up a business on the Internet. Those two factors resulted in a surge of patent filings.

Of all the patents thus far challenged under Alice, only a handful have post-Bilski priority dates:

  • 8447263, Emergency call analysis system, filed in 2011, and litigated in Boar’s Head Corp. v. DirectApps, Inc., 2015 U.S. Dist. LEXIS 98502 (E.D. Cal., 7/28/2015). The court granted DirectApps’ motion to dismiss, finding the patent invalid.
  • 8938510, On-demand mailbox synchronization and migration system, filed in 2010, and litigated in BitTitan, Inc. v. SkyKick, Inc., 2015 U.S. Dist. LEXIS 114082 (W.D. Wash., 8/27/2015). BitTitan’s motion for preliminary injunction was denied in part because SkyKick successfully argued that BitTitan was not likely to succeed on the merits due to Alice problems.
  • 8,604,943; 9,070,295; 9,082,097; 9,082,098 and 9,087,313, all of which claim priority to March 2012, and were invalidated just last week in MacroPoint LLC v. FourKites Inc., Case No. 1:15-cv-01002 (N.D. Ohio, Nov. 5, 2015). The court invalidated all 94 claims in these patents as being directed to the abstract idea of “tracking freight.” While the last four patents were issued in June and July 2015, none of them overcame an Alice rejection, and the court noted that “Nothing in the Reasons for Allowance dictate a finding that these concepts are inventive on the issue of patent-eligible subject matter.”

Over time we’ll see more post-Bilski patents being litigated, and then eventually a true test: a business method patent granted after Alice that overcame an Alice rejection. By my count, there are about 80 such patents thus far, and about another 90 that have been allowed. It will not be too long then before one of these patents is challenged under Section 101.

In my next column, I’ll review some very disturbing decisions coming out of the Delaware district court.

Categories
Innovation Legislation Trade Secrets Uncategorized

Debunking Myths About the Proposed Federal Trade Secrets Act

By Mark Schultz

Today, CPIP is proud to release a paper authored by the nation’s preeminent expert on trade secret law, James Pooley. Mr. Pooley’s paper explains the arguments in favor of the Defend Trade Secrets Act of 2015 (“DTSA”), which is currently being considered by Congress. To download the paper, please click here.

The DTSA would create a federal cause of action for trade secret misappropriation. The legislation has been proposed via identical House (H.R.3326) and Senate (S.1890) bills. While trade secret theft has been a federal crime since 1996 pursuant to the Economic Espionage Act, civil claims have been left to state laws. The new bill would provide nationwide federal jurisdiction, while retaining the parallel state laws.

Trade secrets have become increasingly important at the same time they have become more vulnerable. Research in the US and Europe shows that trade secrets are the kind of IP most widely and universally relied upon by businesses. They are particularly important to small businesses. However, they can be stolen more easily than ever. Vital proprietary information that once would have resided in file cabinets and that would have taken days to copy now can be downloaded at the speed of light.

The DTSA is needed to improve the speed and efficiency of trade secret protection in the US. By some measures, as my own research for the OECD with my co-author Doug Lippoldt showed, the US has the strongest laws protecting trade secrets in the world. However, the multi-jurisdictional approach taken by the US presents a unique challenge to enforcing trade secrets quickly and efficiently. Investigating claims, conducting discovery, and enforcing claims in multiple states takes time. In an ordinary tort or contract case, such delays are usually manageable. In a trade secret case, even small delays can make the difference between rescuing a multi-million dollar secret and seeing its value destroyed utterly.

The proposed DTSA has enjoyed broad support from a coalition of large and small businesses. The bill has been largely uncontroversial, except among some legal academics. We have become accustomed to reflexive academic skepticism of improving IP rights, but some of the arguments against the DTSA have been truly puzzling.

The most puzzling academic argument against the bill is the claim that adding federal jurisdiction to trade secret enforcement will give rise to a new class of trade secret “troll.” It’s hard to see this claim as anything other than a mere rhetorical attempt to piggyback on the (largely specious) patent “troll” issue. According to research conducted for the European Commission, as well as widespread anecdotal evidence, firms routinely forego litigating trade secret claims for fear of revealing their proprietary information. It is thus hardly credible that they would expose their secrets in order to “troll,” especially merely because they now have easier access to federal courts.

Mr. Pooley’s paper explains the benefits of the DTSA while carefully refuting the “troll” myth and other arguments against the bill. The article includes a timely response to an academic letter released today expressing opposition to the DTSA.

Categories
Copyright Innovation Internet Uncategorized

Protecting Authors and Artists by Closing the Streaming Loophole

We’ve released a new policy brief, Protecting Authors and Artists by Closing the Streaming Loophole, by Devlin Hartline & Matthew Barblan.

They argue that in order to protect authors and artists from having their works repeatedly stolen on the internet, it is long past time to harmonize the remedies for criminal copyright infringement to reflect the ways that copyrighted works are commonly misappropriated these days.

We’ve included the Introduction below. To read the full policy brief, please click here.

Protecting Authors and Artists by Closing the Streaming Loophole

By Devlin Hartline & Matthew Barblan

Introduction

Copyright protects the property rights of authors and artists through both civil and criminal remedies for infringement. While the civil remedies are commonplace, the sections of the Copyright Act that specify which forms of infringement qualify as criminal offenses are less familiar. Unfortunately for authors and artists, the remedies for criminal infringement have not been updated to reflect the realities of how copyrighted works are frequently misappropriated these days. Streaming has become more popular than ever, yet the law treats bad actors who traffic in illicit streams much more kindly than those who traffic in illicit downloads. This results in a loophole that emboldens bad actors and makes it harder for authors and artists to protect their property rights.

Authors and artists deserve better. It shouldn’t matter whether the works are illegally streamed to users or offered for download. From the perspective of a creator whose property rights are being ripped off, the result is exactly the same—the works are supplied to the public without the creator’s permission. Congress has a long history of modernizing copyright law to account for ever-changing technologies. Now that the internet has advanced to where streaming is a dominant method of illicitly disseminating copyrighted works, the time has come to close the streaming loophole and to harmonize the remedies for criminal copyright infringement.

Categories
Administrative Agency Innovation Inventors Patent Law Patent Litigation Uncategorized

Overview of Comments on the USPTO's July 2015 Update to the Interim Examination Guidance

The following guest post from Robert R. Sachs, Partner at Fenwick & West LLP, first appeared on the Bilski Blog, and it is reposted here with permission.

By Robert R. Sachs

In late July, the USPTO issued its July 2015 Update to the 2014 Interim Section 101 Patent Eligibility Guidance (IEG). The July 2015 Update addresses a number of the issues and concerns raised in the public comments to the IEG and is supposed to assist examiners in applying the 2014 IEG during the patent examination process. The July 2015 Update also includes a new set of examples of claims involving abstract ideas and sample analysis under the Mayo framework. The USPTO is seeking public comments on the July 2015 Update, and comments are due on October 28, 2015, via email at 2014_interim_guidance@uspto.gov.

Here is an overview of what I think are the key issues and concerns with the July 2015 Update. Feel free to use any of my analysis in your comments to the USPTO.

1. Requirements of Prima Facie Case and the Role of Evidence

A significant number of the public comments on the 2014 IEG noted that examiners have the burden to make the prima facie case that a patent claim is ineligible, and that the Administrative Procedure Act (APA) and Federal Circuit case law require that this determination be made based on “substantial evidence,” not examiner opinion. In particular, all of the public comments that addressed this issue stated that examiners should have to provide documentary evidence to support a conclusion that a claim is directed to a judicial exception or that claim limitations are well understood, routine, and conventional.

In the July 2015 Update, the USPTO responded by stating that whether a claim is ineligible is a question of law and courts do not rely on evidence to establish that a claim is directed to a judicial exception, and therefore examiners likewise do not need to rely on any evidence that a particular concept is abstract, or a fundamental economic concept, or even a law of nature. The USPTO’s reliance on the judicial model is legally incorrect. First, examiners are bound by the APA and judges are not. Second, that eligibility is a question of law does not mean that there are not factual issues, as well—it merely determines whether the court or a jury is to make the finding. Obviousness is likewise a question of law, but there are clearly factual issues involved. Third, when judges take judicial notice, they are making a finding of fact, and they must do so under the requirements of Federal Rules of Evidence, Rule 201, which states that “The court may judicially notice a fact that is not subject to reasonable dispute because it: … can be accurately and readily determined from sources whose accuracy cannot reasonably be questioned.” This requirement is similar to the requirements of Official Notice set forth in MPEP 2144.03: “Official notice unsupported by documentary evidence should only be taken by the examiner where the facts asserted to be well-known, or to be common knowledge in the art are capable of instant and unquestionable demonstration as being well-known.” Thus, by its own logic, examiners should comply with the requirements of MPEP 2144.03.

As to the role of evidence, again the public comments that discussed this issue all took the position that examiners must cite authoritative documentary evidence, such as textbooks or similar publications to support a conclusion that a claim recites a judicial exception or that certain practices are well known, conventional or routine. The public comments on this issue all made the same argument: that the Supreme Court in Bilski and Alice cited references in support of their conclusions that the claims were ineligible.

In response to this uniform opinion, the USPTO maintained its position that citing references was not necessary because the references in Bilski and Alice were technically not “evidence,” since the Court is an appellate court, and further that the references were not necessarily prior art. This argument misses the point. Regardless of whether the references were evidence under the Federal Rules of Evidence, the Court felt it necessary and proper to cite them. Further, the Court did not cite the references as prior art or suggest that they needed to be prior art—rather, the Court cited the references as an authoritative basis to show that the claims were directed to longstanding, well-known concepts. That the Court did this not once, but twice, is strong guidance that the USPTO should follow suit.

Similarly, examiners should be instructed to accept and give substantial weight to documentary evidence submitted by applicants rebutting the examiner’s conclusions under either Step 2A or 2B of the Mayo framework. This includes declarations from the inventor or others showing that particular limitations are not considered judicial exceptions by a person of ordinary skill in the relevant technical or scientific community, or that claims limitations would be considered “significantly more” by such person, or that the claim limitations provide improvements to the art.

2. The Role of Preemption in the Mayo Framework

The majority of public comments stated that preemption is the core concern underlying the judicial exceptions to Section 101, and that the examiner should be required to establish that a claim preempts a judicial exception in order to find the claim ineligible. The USPTO again took an opposing view to this consensus interpretation, asserting that questions of preemption are inherently addressed in the two-part Mayo test. The USPTO also stated that “while a preemptive claim may be ineligible, the absence of complete preemption does not guarantee that a claim is eligible.” This has effectively eliminated arguments made by applicants that their claims were patent eligible because they did not preempt other practical applications of the judicial exception. Neither the Supreme Court nor the Federal Circuit has endorsed the concept that preemption does not matter given the Mayo framework. Instead, the courts continue to evaluate patent claims with respect to preemption even after the Mayo framework has been applied.

More significantly, the USPTO’s argument fails to address the more likely situation: that a claim blocks (preempts) only a narrow range of applications or implementations of the identified judicial exception. This is not merely a case of an absence of complete preemption; it is the absence of any significant degree of preemption at all. The Supreme Court recognized that preemption is a matter of degree and held that a claim is ineligible where there is a disproportionate risk that the judicial exception is fully preempted. In Alice, the Court stated:

The former [claims on fundamental building blocks] “would risk disproportionately tying up the use of the underlying” ideas, and are therefore ineligible for patent protection. The latter [claims with limitations that provide practical applications] pose no comparable risk of pre-emption, and therefore remain eligible for the monopoly granted under our patent laws. 134 S.Ct. at 2354 (emphasis added).

Since by definition a claim must preempt something, it is only where the scope of the claim covers the full scope of the judicial exception that the claim is rendered ineligible. Judge Lourie, whose explanation of the Mayo framework in CLS v. Alice was directly adopted by the Supreme Court, put it this way:

Rather, the animating concern is that claims should not be coextensive with a natural law, natural phenomenon, or abstract idea; a patent-eligible claim must include one or more substantive limitations that, in the words of the Supreme Court, add “significantly more” to the basic principle, with the result that the claim covers significantly less. See Mayo 132 S. Ct. at 1294. Thus, broad claims do not necessarily raise § 101 preemption concerns, and seemingly narrower claims are not necessarily exempt. What matters is whether a claim threatens to subsume the full scope of a fundamental concept, and when those concerns arise, we must look for meaningful limitations that prevent the claim as a whole from covering the concept’s every practical application.

Thus, both the Supreme Court and the Federal Circuit use preemption as the mechanism to evaluate whether a claim is eligible or not by applying it on both sides of the question: ineligible if preemptive, eligible if not preemptive. In addition, over 100 district court decisions since Alice have expressly considered whether the claims preempt, even after applying the Mayo framework. If the Mayo framework inherently addressed the preemption issue as the USPTO asserts, there would be no reason for the courts to address it. Finally, by removing preemption from the Mayo framework, the USPTO has turned the framework into the sole test for patent eligibility—directly contrary to the Supreme Court’s holding in Bilski that there is no one sole test for eligibility.

Lourie’s statement that a claim is patent eligible when it includes “substantive limitations…with the result that the claim covers significantly less” than the judicial exception provides a simple and expedient basis for using preemption as part of the streamlined analysis, something the USPTO has resisted in the July 2015 Update. Examiners are well trained to evaluate the scope of a claim based on its express limitations. Accordingly, they can typically determine for the majority of claims that, whatever the claim covers, it has limitations that prevent it from covering the full scope of some judicial exception. If the point of the streamlined analysis is to avoid the unnecessary burden of the Mayo framework, then a preemption analysis provides the best way to achieve that goal.

Finally, to suggest that the Mayo framework is precise enough to be a definitive test is to ignore the obvious: both steps of the framework are undefined. See McRO, Inc. v. Sega of America, Inc., No. 2:12-cv-10327, 2014 WL 4749601, at *5 (C.D. Cal. Sept. 22, 2014) (Wu, J.) (“[T]he two-step test may be more like a one step test evocative of Justice Stewart’s most famous phrase [‘I know it when I see it’].”). The Court refused to define the scope of abstract ideas in Alice (Step 2A), and Step 2B entails evaluating the subjective requirement of “significantly more” or “enough.” What is left, then, is analysis by analogy and example—and both common sense and life experience tell us that these approaches very often lead to mistakes. Analogies can be good or bad, and most examples can be argued either way. Preemption serves as a way of evaluating whether the outcome of such analysis is consistent with the underlying rationale for the judicial exceptions in the first place.

3. Abstract Ideas Must be Prevalent and Longstanding in the Relevant Community

The majority of public comments on the IEG argued that to establish that an idea is abstract, an examiner must show that the idea is “fundamental” in the sense of being “long-standing” and “prevalent,” following the statements of the Supreme Court. Various commentators suggested specific rules for examiners, such as requiring evidence that the idea has been known and used in practice for 25 or more years. Even those who supported a restrictive view of patent eligibility suggested that examiners should look to “basic textbooks” to identify abstract ideas.

The USPTO responded in the July 2015 Update by asserting that abstract ideas need not be prevalent and longstanding to be fundamental, arguing that even novel abstract ideas are ineligible: “examiners should keep in mind that judicial exceptions need not be old or long-prevalent, and that even newly discovered judicial exceptions are still exceptions.” The USPTO stated that “[t]he term ‘fundamental’ is used in the sense of being foundational or basic.” This analysis begs the question. An idea is foundational or basic because it is widely accepted and adopted in the relevant community—it is fundamental to the practices of the community. Indeed, any textbook on the “foundations” of a particular scientific field would explain the principles and concepts that are long-standing and widely accepted by scientists in that field. It would not be a significant burden on the examiner to cite such publications to support a finding under Step 2A. Indeed, the inability of an examiner to do so would be strong evidence that a claim is not directed to a foundational or basic practice.

4. USPTO Reliance on Non-Precedential Federal Circuit Decisions

Public comments noted that the 2014 IEG included citations and discussions of non-precedential Federal Circuit cases, such as Planet Bingo, LLC v VKGS LLC, and SmartGene, Inc. v Advanced Biological Labs, and indicated that because the cases are non-precedential, they should not be cited and relied upon by the USPTO as the basis of its guidance to examiners. Further, it was pointed out that the 2014 IEG mischaracterizes the abstract ideas at issue in these cases.

For example, the USPTO characterizes SmartGene as holding that “comparing new and stored information and using rules to identify options” is an abstract idea. The Federal Circuit’s actual holding was much more specific: that “the claim at issue here involves a mental process excluded from section 101: the mental steps of comparing new and stored information and using rules to identify medical options.” The court itself unambiguously limited the scope of its decision: “[o]ur ruling is limited to the circumstances presented here, in which every step is a familiar part of the conscious process that doctors can and do perform in their heads.” Thus, the USPTO’s characterization removed key aspects of the court’s expressly limited holding: that the comparing steps were inherently mental steps (not computer steps) performed by a doctor considering medical rules (not any type of rules) to evaluate medical options (not other types of options). The court’s ruling cannot be generalized to all types of comparisons on all types of information using all types of rules. The improper generalization of the court’s holding has resulted in examiners applying SmartGene to find many claims for computer-implemented inventions ineligible. This is because many, if not most, computer processes can be characterized as comparing stored and new information and applying a decision rule to produce a useful result. For example, most automobiles use computers and embedded software to monitor vehicle sensors and take actions. A typical fuel management computer compares a current measure of fuel (new value) with a predefined minimum amount of fuel (stored information) and determines whether to turn on a low fuel light (using rules to identify options). Under the USPTO’s characterization of SmartGene, a claim to such a process would be deemed an abstract idea, an obviously incorrect outcome.
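The fuel-light logic described above can be sketched in a few lines. This is a toy illustration (the function name and threshold value are invented for the example, not drawn from any patent), showing how trivially ordinary embedded software satisfies the generalized “compare stored and new information and apply a rule” pattern:

```python
# Stored information: a predefined minimum fuel level (illustrative value).
LOW_FUEL_THRESHOLD_GALLONS = 2.0

def should_light_low_fuel(current_gallons: float,
                          threshold: float = LOW_FUEL_THRESHOLD_GALLONS) -> bool:
    """Compare the new sensor reading (current_gallons) against the stored
    minimum (threshold) and apply a decision rule to pick the option:
    light on (True) or off (False)."""
    return current_gallons <= threshold
```

Under the USPTO's generalized reading of SmartGene, a claim reciting exactly this kind of sensor-monitoring step could be dismissed as “comparing new and stored information and using rules to identify options.”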

The USPTO did not address any of the problems identified by the public comments regarding non-precedential cases. Instead, the July 2015 Update simply states that the “2014 IEG instructs examiners to refer to the body of case law precedent in order to identify abstract ideas by way of comparison to concepts already found to be abstract,” and makes multiple other references to precedent. Even so, the July 2015 Update relies repeatedly on non-precedential Federal Circuit decisions, such as Dietgoal Innovations LLC v. Bravo Media LLC, Fuzzysharp Technologies Inc. v. Intel Corporation, Federal Home Loan Mortgage Corp. aka Freddie Mac v. Graff/Ross Holdings LLP, Gametek LLC v. Zynga, Inc., Perkinelmer, Inc. v. Intema Limited, and Cyberfone Systems, LLC v. CNN Interactive Group, Inc.

The USPTO should eliminate any discussion of or reliance upon non-precedential decisions. In the alternative, the USPTO should at minimum explain to examiners that such decisions are limited to their specific facts and are not to be generalized into controlling examples or rules.

5. There is No Separate Category for Methods of Organizing Human Activity

Public comments to the 2014 IEG pointed out various issues with the category of “methods of organizing human activities” as a basis of abstract ideas, and in particular requested clarification as to which types of methods would fall within the category. Here too there was broad agreement among the commentators on the proper interpretation of Bilski and Alice: the Court found that the claims in Alice and Bilski were abstract ideas because they were directed to a fundamental economic practice, not because the claims were methods of organizing human activity. The Court noted that Bilski’s claims were methods of organizing human activity only to rebut Alice’s arguments that abstract ideas must always be “fundamental truths.” The Court’s analysis does not logically imply that methods of organizing human activity are inherently abstract ideas.

The USPTO responded by broadly interpreting the scope of the category, stating that many different kinds of methods of organizing human activity can also be abstract ideas, but providing no explanation (other than examples) for determining when this is the case and when it is not. The USPTO then mapped various Federal Circuit cases into this category, even where the court itself did not expressly rely upon such categorization. For example, the USPTO listed buySAFE, DealerTrack, Bancorp, PlanetBingo, Gametek, and Accenture as examples of cases dealing with methods of organizing human activity. However, none of these cases actually held that the methods in suit were methods of organizing human activity. Instead, every single one of these cases held that the claims were abstract as either mental steps or fundamental economic practices. Attempting to map Federal Circuit cases into this category is confusing to examiners and the public, and unnecessary.

The USPTO should remove this category from the Guidance until such time as the Federal Circuit or the Supreme Court provides a clear definition of its bounds.

6. There is No Separate Category for “An Idea of Itself”

Public comments noted that this is a catch-all category that the courts have mentioned in passing without ever defining its contours, and further suggested that the USPTO clarify that this is not a distinct category of abstract ideas.

In response, once again the USPTO broadly described the category and linked various Federal Circuit cases to it as examples, even where the court itself never so characterized the invention. The USPTO lists in this category cases that the court held ineligible under other categories, such as mental steps (Cybersource, Smartgene*, Classen*, Perkinelmer*, Ambry, Myriad CAFC*, Content Extraction); mathematical algorithms (In re Grams, Digitech); and economic activities (Ultramercial) (*indicates non-precedential decision). In fact, no precedential Federal Circuit or Supreme Court case has defined “an idea of itself” as a distinct category. It is only mentioned in dicta, never in a holding.

The result of the USPTO’s categorization of cases into multiple, different undefined categories is to make it more difficult, not less, for examiners to properly determine which types of claims are within which category. Further, where an examiner asserts that a claim falls into multiple categories (which is a simple assertion to make, since most inventions deal with multiple different concepts), the applicant is forced to rebut each categorization.

7. “Mathematical Algorithms” Are Limited to Solutions to Problems in Pure Mathematics

This category, more than any other, reflects the USPTO’s failure to substantively and meaningfully analyze the issues and provide clear guidance. Public comments to the 2014 IEG provided extensive analysis of the case law and the problems arising from mathematical algorithms being considered abstract ideas. The USPTO did not respond to the substantive analysis at all. Instead, the July 2015 Update merely lists cases that have held claims invalid as mathematical algorithms, without explanation. This is inadequate for several reasons.

First, the USPTO must clarify that the presence of a mathematical algorithm in the specification or claims is not a per se indication that the claims are directed to an abstract idea. In Alice, the Court expressly stated that “[o]ne of the claims in Bilski reduced hedging to a mathematical formula, but the Court did not assign any special significance to that fact, much less the sort of talismanic significance petitioner claims.” Equally so, examiners must not assign any special significance to the presence of a mathematical formula either in the disclosure or in the claim. What matters is the underlying concept, not how it is expressed (e.g. “no special significance”), whether in words or mathematical symbols.

Second, the presence of a mathematical formula or equation does not make an invention abstract for a very simple reason: mathematics is a language that allows for the very precise and formal description of certain types of ideas. All modern engineering, including civil, mechanical, electrical, chemical, computer, etc., as well as all of the physical sciences, relies on mathematical analysis for design and formulation. Using a mathematical equation is simply one—albeit highly precise—way of expressing concepts, which may be either patent-eligible or not. Thus, the presence of a mathematical equation does not by itself imply or suggest anything about the underlying concept, and should not be relied upon by examiners as automatic evidence of an ineligible abstract idea. While mathematics may be used to describe abstract ideas like the laws of mathematics, it can equally be used to describe entirely mundane and non-abstract ideas like fuel-efficient aircraft approach procedures (U.S. Patent No. 8,442,707), compressing video for transmission on cell phones (U.S. Patent No. 8,494,051), efficiently allocating farming resources (U.S. Patent No. 6,990,459), or calculating golf handicaps and the difficulty of golf courses (U.S. Patent No. 8,282,455).

The correct interpretation of “mathematical algorithms” as used by the Supreme Court is that they are algorithms that are solutions to inherently mathematical problems. This was the specific definition used by the Supreme Court in Benson, and confirmed in Diehr. In Benson, the Court stated:

A procedure for solving a given type of mathematical problem is known as an “algorithm.” The procedures set forth in the present claims are of that kind; that is to say, they are a generalized formulation for programs to solve mathematical problems of converting one form of numerical representation to another.

Later, in Diehr, the Court stated that in Benson “we defined ‘algorithm’ as a ‘procedure for solving a given type of mathematical problem,’” noting that “our previous decisions regarding the patentability of ‘algorithms’ are necessarily limited to the more narrow definition employed by the Court.” The Court expressly rejected a broader definition that covered any “sequence of formulas and/or algebraic/logical steps to calculate or determine a given task; processing rules.”

The USPTO should clarify that this more limited definition of mathematical algorithms is the one to be used. This approach beneficially distinguishes inventions in pure mathematics—which, as the Court stated, are precisely those that pose a disproportionate risk of preemption because they can be used in an unlimited number of different fields—from inventions in applied mathematics, the mathematics used in engineering and the physical sciences. Examiners are well accustomed, by their formal scientific and technical training, to distinguishing between claims to these two types of inventions making use of mathematical formulas and equations.
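For concreteness, the claims at issue in Benson covered converting binary-coded decimal (BCD) numerals into pure binary—a procedure for solving an inherently mathematical problem, applicable in any field. A minimal illustrative sketch of that kind of procedure (the function name and the shift-and-add implementation are ours, not the claim language):

```python
def bcd_to_binary(bcd_digits):
    """Convert a sequence of BCD digits (one decimal digit each, 0-9)
    into the equivalent pure binary integer.

    Classic shift-and-add: for each decimal digit, multiply the
    running value by ten and add the digit.
    """
    value = 0
    for digit in bcd_digits:
        if not 0 <= digit <= 9:
            raise ValueError("not a BCD digit: %r" % digit)
        value = value * 10 + digit
    return value

# The decimal number 53 in BCD is the digit sequence [5, 3];
# its pure binary representation is 0b110101.
print(bin(bcd_to_binary([5, 3])))  # -> 0b110101
```

The point is that such a procedure, untethered to any application, is pure mathematics of the kind Benson held ineligible—unlike, say, the rubber-curing process in Diehr, which merely used an equation.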

8. Identifying Whether a Claim Limitation Recites a Conventional, Routine, and Well-Understood Function of a Computer

The public comments to the 2014 IEG discussed the problems resulting from considering the normal operations of a computer to be merely “generic” functions that are conventional, well-understood, and routine, and therefore by definition insufficient to support eligibility of a patent claim.

In response, the USPTO again ignored the substantive arguments, instead simply stating that examiners may rely on what the courts have recognized as “well understood, routine, and conventional functions” of computers, including “performing repetitive calculations,” “receiving, processing, and storing data,” and “receiving or transmitting data over a network.” The July 2015 Update goes on to state that “This listing is not meant to imply that all computer functions are well‐understood, routine and conventional.”

This caveat is hardly sufficient, since the list essentially wipes out all computing operations as they are typically claimed. Just as claims for mechanical processes use verbs and gerunds that describe well-known mechanical operations, so too do claims for computer-based inventions necessarily describe the operations of a computer: receive, transmit, store, retrieve, determine, compare, process, and so forth. There is no other way to claim the operations of a computer except to use such terminology.

Accordingly, since the Supreme Court did not hold that all software and computer-implemented inventions are per se ineligible, the proper interpretation of the Court’s discussion of the generic functions of a computer is more narrowly focused. Specifically, it is necessary to consider the entirety of each claim limitation, not merely the gerund or verb that introduces a method step. Only if the claim limitation as a whole recites nothing more than generic functions should it be treated as such. When considering computer processing steps on computer data, limitations as to the source of the data, the types of data, the operations performed on the data, how the output is generated, and where the data is stored or transmitted must all be considered. It is these limitations that distinguish merely generic operations (“receiving a data input and determining an output”) from particular applications.
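The distinction can be illustrated with a hypothetical contrast (both functions, the accelerometer scenario, and the threshold value are ours, invented purely for illustration). The introductory verbs are identical—receive, determine—but the limitations as a whole differ:

```python
def generic_step(data):
    # "Receiving a data input and determining an output" --
    # nothing beyond a bare, generic computer function.
    return data

def particular_step(accel_magnitudes, threshold=1.2):
    # Receiving accelerometer magnitude samples (the source and
    # type of data) and determining a step count by detecting
    # threshold crossings (the particular operation performed on
    # the data). Same verbs, but a specific application.
    steps = 0
    above = False
    for magnitude in accel_magnitudes:
        if magnitude > threshold and not above:
            steps += 1
            above = True
        elif magnitude <= threshold:
            above = False
    return steps

print(particular_step([1.0, 1.5, 1.0, 1.6, 1.1]))  # -> 2
```

An examiner reading only the verbs would call both steps “receiving” and “determining”; it is the remaining limitations that separate the generic from the particular.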

Categories
Copyright History of Intellectual Property Innovation Inventors Trade Secrets Trademarks Uncategorized

Strong IP Protection Provides Inventors and Creators the Economic Freedom to Create

Here’s a brief excerpt of a post by Terrica Carrington that was published on IPWatchdog.

CPIP went against the grain with this conference, and showed us, bit by bit, what our world might look like today without intellectual property rights. Music wouldn’t sound the same. Movies wouldn’t look the same. You wouldn’t be reading this on your smartphone or have access to the cutting-edge biopharma and healthcare products that you rely on. And some of our greatest artists and inventors might be so busy trying to make ends meet that they would never create the amazing artistic works and inventions that we all enjoy. In short, CPIP explored how intellectual property rights work together as a platform that enables us to innovate, share, and collaborate across industries to develop incredible new products and services at an astounding rate.

To read the rest of this post, please visit IPWatchdog.