Privacy Law Considerations of AI and Big Data – In the U.S. & Abroad

By Kathleen Wills, Esq.*

Kathleen Wills is a graduate of Antonin Scalia Law School and former C-IP2 RA.

Artificial Intelligence and Big Data

While many of us have come to rely on biometric data when we unlock our phones with Apple’s “Face ID,” speak to Amazon’s Alexa, or scan our fingerprints to access something, it’s important to understand some of the legal implications of the Big Data feeding artificial intelligence (AI) algorithms. While “Big Data” refers to the processing of large-scale and complex data,[1] “biometric data” refers to the physical characteristics of humans that can be extracted for recognition.[2] AI and biometrics work together in dynamics like those exemplified above, since AI is a data-driven technology and personal data has become propertised.[3] The type and sensitivity of the personal data used by AI depend on the application, and not all applications trace details back to a specific person.[4] The already-active field of Big Data analysis of biometrics working with AI continues to grow, promising both challenges and opportunities for consumers, governments, and companies.

A. How AI Uses Big Data

AI works with Big Data to accomplish several different outcomes. For example, AI can use Big Data to recognize, categorize, and find relationships within the data.[5] AI can also work with Big Data to adapt to patterns and identify opportunities so that the data can be understood and put into context. For organizations looking to improve efficiency and effectiveness, AI can leverage Big Data to predict the impact of various decisions. In fact, AI can work with algorithms to suggest actions before they have been deployed, assess risk, and provide feedback in real time from Big Data pools. When AI works with Big Data and biometrics, it can perform various types of human recognition for applications in every industry.[6] In other words, the more data AI can process, the more it can learn. Thus, the two rely on each other to keep pushing the bounds of technological innovation and machine learning.

B. How AI Relates to Privacy Laws

Since AI involves analyzing and understanding Big Data, often the kind involving biometrics or other personal information, there are privacy considerations and interests to protect. Further, since businesses want access to consumer data in order to optimize the market, governments are placing limits on the use and retention of such data. For some sectors, the boundary between privacy and AI becomes an ethical one. One can immediately imagine the importance of keeping biometric health data private, calling to mind the purpose of the Health Insurance Portability and Accountability Act (HIPAA),[7] even though AI can help doctors better understand patterns in their patients’ health, diagnoses, and even surgeries.

I. United States Privacy Law

A. Federal Privacy Law

Even as concerns grow about the privacy and security of data used in AI, there is currently no comprehensive federal privacy law in the United States. Senators Jeff Merkley and Bernie Sanders proposed the National Biometric Information Privacy Act in 2020, which was not passed into law; it contained provisions such as requiring consent from individuals before collecting information, providing a private right of action for violations, and imposing an obligation to safeguard identifying information.[8] The act would also have required private entities to draft public policies and implement mechanisms for destroying information, limit collection of information to valid business reasons, inform individuals that their information is stored, and obtain written releases before disclosure.

B. State Privacy Laws

A few states have passed their own privacy laws or amended existing laws to include protections for biometric data, including Illinois, Texas, Washington, California, New York, Arkansas, Louisiana, Oregon, and Colorado. Other states have pending bills or have tried, and so far failed, to pass biometric protection regulation.

The first, and most comprehensive, biometric regulation was enacted in 2008: the Illinois Biometric Information Privacy Act (BIPA), which governs the collection and storage of biometric information.[9] The law applies to all industries and private entities but exempts the State and any local government agency.[10] BIPA requires entities to inform individuals in writing that their information is being collected and stored and why, and it restricts selling, leasing, trading, or otherwise profiting from such information. There is a right of action for “any person aggrieved by a violation” in state circuit court, or as a supplemental claim in federal district court, that can yield $1,000 for each negligent violation and $5,000 for each intentional or reckless violation, as well as attorneys’ fees and equitable relief. In 2018-2019, over 200 lawsuits were reported under BIPA, often as class actions against employers.[11]

Texas’s regulation, Chapter 503: Biometric Identifiers, varies greatly from Illinois’s act.[12] Under this chapter, a person can’t commercialize another’s biometric identifier unless they inform the person and receive consent; once consent is obtained, one can’t sell, lease, or disclose that identifier to anyone else unless the individual consents to the financial transaction or such disclosure is permitted by a federal or state statute. The chapter suggests a timeframe for destroying identifiers, sets a maximum civil penalty of $25,000 per violation, and is enforced by the state attorney general. Washington’s legislation, Chapter 19.375: Biometric Identifiers, is similar to Texas’s regulation in that the attorney general enforces it; however, Washington carved out an exception for security purposes from the notice and consent procedures usually required before collecting, capturing, or enrolling identifiers.[13]

California enacted the California Consumer Privacy Act of 2018 (CCPA), which provides a broader definition of “biometric data” and gives consumers the right to know which information is collected and how it’s used, to delete that information, and to opt out of the sale of that information.[14] The law applies even to entities without a physical presence in the state if they (a) have gross annual revenue of over $25 million, (b) buy, receive, or sell the personal information of 50,000 or more California residents, households, or devices, or (c) derive 50% or more of their annual revenue from selling California residents’ personal information.[15] The CCPA was amended and expanded by the California Privacy Rights Act (CPRA), which becomes effective January 1, 2023.[16] One expansion under the CPRA is a new category of “sensitive personal information,” which encompasses government identifiers; financial information; geolocation; race; ethnicity; religious or philosophical beliefs; genetic, biometric, and health information; sexual orientation; nonpublic communications like email and text messages; and union membership. The CPRA also adds new consumer privacy rights, including the right to restrict the use of sensitive information, and creates a new enforcement authority. Thus, the CPRA brings California’s privacy law closer to the European Union’s General Data Protection Regulation.[17]

New York amended its existing data breach notification law so that the definition of “private information” now encompasses biometric information.[18] Like California’s law, the SHIELD Act applies to all companies holding residents’ data; unlike California’s law, however, it outlines various procedures companies should implement for administrative, technical, and physical safeguards. New York also passed limited biometric legislation for employers, though it provides no private right of action.[19] Similarly, Arkansas amended its Personal Information Protection Act so that “personal information” now includes biometric data. Louisiana amended its Database Security Breach Notification Law to do the same and added data security and destruction requirements for entities.[20] Finally, Oregon amended its Consumer Information Protection Act to include protections for biometric data alongside consumer privacy and data rights.

Most recently, on July 8, 2021, Colorado enacted the Colorado Privacy Act (CPA) when the Governor signed the bill into law.[21] The state Attorney General explains that the law “creates personal data privacy rights” and applies to any person, commercial entity, or governmental entity that maintains personal identifying information. Like consumers in California, consumers in Colorado can opt out of certain provisions of the Act, but not all; residents cannot opt out of the unnecessary and irrelevant collection of information, and controllers must receive a resident’s consent before processing sensitive personal information. As for remedies, the CPA provides a 60-day cure period to fix noncompliance with the Act, after which controllers face civil penalties, but consumers do not have a private right of action under this law.

II. International Privacy Law 

Other countries have pioneered data privacy regulation, as exemplified by the European Union’s (EU’s) General Data Protection Regulation (GDPR).[22] Since 2018, this regulation has been enforced against companies that operate within any EU member state in order to protect “natural persons with regard to the processing of personal data” and to set “rules relating to the free movement of personal data.” The GDPR “protects fundamental rights and freedoms of natural persons,” particularly the right to protection of personal data. The regulation is quite comprehensive, with chapters on the rights of data subjects, transfers, remedies, and even provisions for particular processing situations such as freedom of expression and information. There are several carve-outs or “exceptions” to the regulation, such as where a citizen gives consent for a specific purpose or the data are necessary for preventive or occupational medicine. Citizens also have “the right to be forgotten,” may withdraw consent at any time, and can lodge a complaint for violations or seek a judicial remedy, compensation, or administrative fines.

Since the GDPR protects the data of EU citizens and residents wherever it is processed, it has an extraterritorial effect. In January 2021, the European Data Protection Board (EDPB), jointly with the European Data Protection Supervisor, adopted written opinions on new standard contractual clauses under the GDPR. One set of clauses will govern the transfer of personal data to third countries outside of the EU.[23] The transfer of personal data to a third country or international organization may take place only if certain conditions are met, namely compliance with certain safeguards of European data protection law. However, enforcement of the GDPR is taking time, and Ireland’s data protection commissioner, Helen Dixon, has explained that enforcement goes beyond issuing fines. Interestingly, because Apple, Facebook, Google, LinkedIn, and Twitter have their European headquarters in Ireland, that country takes the lead in investigating those companies.[24]

The GDPR has influenced other countries’ privacy laws. For example, Canada has a federal privacy law, the Personal Information Protection and Electronic Documents Act, as well as provincial laws that protect personal information in the private sector, all heavily influenced by the GDPR.[25] Argentina has begun the legislative process to update its national data protection regime, and a resolution to that effect was passed in January 2019.[26] Further, Brazil’s General Data Protection Law replicates portions of the GDPR and includes extraterritoriality provisions, but it also allows for additional flexibility. The GDPR has likewise shaped Israeli regulatory enforcement, and Israel has been recognized by the European Commission as an adequate jurisdiction for processing personal information. While the list of countries affected by, or taking notes from, the GDPR is quite extensive, it’s important to note that protecting the privacy of consumers when handling biometrics and Big Data, and when using them in AI, is a global challenge and opportunity.

III. Why the Legal Considerations for AI Matter

AI and the use of Big Data and biometric information in everyday life affect a multitude of individuals and entities. AI can use a consumer’s personal, and often highly sensitive, information, and misappropriation or misuse of that information is enforced against business entities. Governments all over the globe are working to determine which regulations, if any, to pass to govern AI and data, and what the scope of such rules should be. In the U.S., some states require the Attorney General to enforce state privacy laws, while other state laws provide individuals with a private right of action. Interestingly, given the role AI plays in innovation and technology, venture capitalists (VCs) might also play a role as the law develops, since VC firms can work with policymakers and lobbyists to assess potential market failures, risks, and the benefits of protecting AI and data.[27]

In addition to the individuals, governments, entities, and industries affected by AI and Big Data biometric analysis, there are broader legal implications. While this article discusses, at a high level, the international and national privacy law considerations raised by AI, other constitutional and consumer protection laws are implicated as well. AI and other uses of Big Data and biometric information have quickly become ingrained in our everyday lives since IBM created the first smartphone in 1992. As laws all over the world continue to be discussed, drafted, killed, adopted, or amended, it’s essential to understand AI and the data it uses.


* The information in this article does not, nor is it intended to, constitute legal advice, and has been made available for general information purposes only.

[1] Shafagat Mahmudova, Big Data Challenges in Biometric Technology, 5 J. Education and Management Engineering 15-23 (2016).

[2] Ryan N. Phelan, Data Privacy Law and Intellectual Property Considerations for Biometric-Based AI Innovations, Security Magazine (June 12, 2020).

[3] Gianclaudio Malgieri, Property and (Intellectual) Ownership of Consumers’ Information: A New Taxonomy for Personal Data, 4 Privacy in Germany 133 ff (April 20, 2016).

[4] Jan Grijpink, Privacy Law: Biometrics and privacy, 17 Computer Law & Security Review 154-160 (May 2001).

[5] Jim Sinur and Ed Peters, AI & Big Data; Better Together, Forbes, https://www.forbes.com/sites/cognitiveworld/2019/09/30/ai-big-data-better-together/?sh=5c8ed5f360b3 (Sept. 30, 2019).

[6] Joshua Yeung, What is Big Data and What Artificial Intelligence Can Do?, Towards Data Science, https://towardsdatascience.com/what-is-big-data-and-what-artificial-intelligence-can-do-d3f1d14b84ce (Jan. 29, 2020).

[7] David A. Teich, Artificial Intelligence and Data Privacy – Turning a Risk into a Benefit, Forbes, https://www.forbes.com/sites/davidteich/2020/08/10/artificial-intelligence-and-data-privacy–turning-a-risk-into-a-benefit/?sh=5c4959626a95 (Aug. 10, 2020).

[8] Joseph J. Lazzarotti, National Biometric Information Privacy Act, Proposed by Sens. Jeff Merkley and Bernie Sanders, National Law Review, https://www.natlawreview.com/article/national-biometric-information-privacy-act-proposed-sens-jeff-merkley-and-bernie (Aug. 5, 2020).

[9] Natalie A. Prescott, The Anatomy of Biometric Laws: What U.S. Companies Need to Know in 2020, National Law Review (Jan. 15, 2020).

[10] Biometric Information Privacy Act, 740 ILCS 14 (2008).

[11] Supra note 9.

[12] Tex. Bus. & Com. Code § 503.001 (2009).

[13] Wash. Rev. Code Ann. § 19.375.020 (2017).

[14] California Consumer Privacy Act (CCPA), State of California Department of Justice, https://oag.ca.gov/privacy/ccpa (last accessed May 22, 2021).

[15] Rosenthal et al., Analyzing the CCPA’s Impact on the Biometric Privacy Landscape, https://www.law.com/legaltechnews/2020/10/14/analyzing-the-ccpas-impact-on-the-biometric-privacy-landscape/ (Oct. 14, 2020).

[16] Brandon P. Reilly and Scott T. Lashway, Client Alert: The California Privacy Rights Act has Passed, Manatt, https://www.manatt.com/insights/newsletters/client-alert/the-california-privacy-rights-act-has-passed (Nov. 11, 2020).

[17] Peter Banyai et al., California Consumer Privacy Act 2.0 – What You Need to Know, JDSupra, https://www.jdsupra.com/legalnews/california-consumer-privacy-act-2-0-93257/ (Nov. 27, 2020).

[18] Samantha Ettari, New York SHIELD Act: What New Data Security Requirements Mean for Your Business, JDSupra, (June 1, 2020).

[19] Supra note 9, referring to N.Y. Lab. Law §201-a.

[20] Kristine Argentine & Paul Yovanic, The Growing Number of Biometric Privacy Laws and the Post-COVID Consumer Class Action Risks for Businesses, JDSupra, https://www.jdsupra.com/legalnews/the-growing-number-of-biometric-privacy-2648/ (June 9, 2020).

[21] The Colorado Privacy Act: Explained, Beckage, https://www.beckage.com/privacy-law/the-colorado-privacy-act-explained/ (last accessed July 13, 2021); see also Phil Weiser: Colorado Attorney General, Colorado’s Consumer Data Protection Laws: FAQ’s for Business and Government Agencies, https://coag.gov/resources/data-protection-laws/ (last accessed July 13, 2021).

[22] General Data Protection Regulation (GDPR), https://gdpr-info.eu/ (last accessed May 22, 2021).

[23] Update on European Data Protection Law, National Law Review, https://www.natlawreview.com/article/update-european-data-protection-law (Feb. 24, 2021).

[24] Adam Satariano, Europe’s Privacy Law Hasn’t Shown Its Teeth, Frustrating Advocates, New York Times, https://www.nytimes.com/2020/04/27/technology/GDPR-privacy-law-europe.html (April 28, 2020).

[25] Eduardo Soares et al., Regulation of Artificial Intelligence: The Americas and the Caribbean, Library of Congress Legal Reports, https://www.loc.gov/law/help/artificial-intelligence/americas.php (Jan. 2019).

[26] Ius Laboris, The Impact of the GDPR Outside the EU, Lexology, https://www.lexology.com/library/detail.aspx?g=872b3db5-45d3-4ba3-bda4-3166a075d02f (Sept. 17, 2019).

[27] Jacob Edler et al., The Intersection of Intellectual Property Rights and Innovation Policy Making – A Literature Review, WIPO (July 2015).


Panel Discussion: Vaccines, Intellectual Property, and Global Equity

The following post comes from Colin Kreutzer, a 2E at Scalia Law and a Research Assistant at C-IP2.

The COVID-19 pandemic has shone a spotlight on the role of intellectual property in modern medicine and on the complex social questions surrounding a system that grants exclusive rights over life-or-death products. On the one hand, there is clearly a difference between public access to lifesaving medicines and access to other patented goods, such as consumer electronics. On the other hand, creating these drugs required billion-dollar investments and enormous risk, made feasible only by the promise of IP rights. Wouldn’t taking that promise away harm the future development of new medicines? As the world considers a waiver of IP rights over COVID-19 vaccines and other technologies, experts are analyzing not only what’s right and what’s wrong, but also what works and what doesn’t.

On June 10, 2021, C-IP2 and the Smithsonian’s Lemelson Center for the Study of Invention and Innovation held a panel discussion on vaccines, intellectual property, and global equity. With opening remarks by Lemelson Director Arthur Daemmrich, and moderated by C-IP2 Faculty Director Professor Sean O’Connor, the panel featured Dan Laster, Director of the Washington State COVID-19 Vaccine Action Command and Coordination System (VACCS) Center; Professor Arti K. Rai, Elvin R. Latty Professor of Law and Co-Director of the Duke Law Center for Innovation Policy; and Eric Aaronson, Senior Vice President and Chief Counsel, Corporate Affairs, Intellectual Property and Intellectual Property Enforcement, Pfizer Inc.

Opening Remarks

Mr. Daemmrich began with a historical perspective of medical developments in this country, as well as the social, economic, and regulatory issues that would invariably be tangled up within them. His tale foretold many of the conflicts we see today—going from a time when most modern medicines didn’t exist, and high mortality was a fact of life, to a time when vaccines and other treatments existed, but access depended partly on wealth. In between those two periods, we saw rapid growth in IP protection that helped move society from one to the other. But whether in the form of religious opposition to smallpox inoculation, regulatory reforms after tragedies from bad medicine, or protests from a marginalized community during the AIDS crisis, legal and social issues have always played a prominent role in the story of medical science.

Building on this historical base, Mr. Daemmrich posed the problem now facing us: compared to other medicines, there are relatively few vaccines. On a grand scale, the entire field of vaccination is still at an early stage of development, with great potential for growth. The question is how best to stimulate that growth, or rather, how to ensure the greatest access to already-developed vaccines without stifling the creation of new ones.

Prof. O’Connor then led the panel through a series of questions. He began by asking about the difference between two classes of medicine. Vaccines are generally thought of as biologics, treatments derived from living cells, whereas pharmaceuticals belong to the class of “small-molecule” drugs, which are primarily chemical compounds rather than biological products.

Q: From an IP perspective, are vaccines different from small molecule pharmaceuticals? What role does IP play in making vaccines available?

Prof. Rai responded that vaccines are indeed very different from small-molecule drugs. From an IP perspective, the two classes derive their greatest protection from different sources.

Small molecule drugs can be produced without the need for company trade secrets. All the most critical information can be found within the text of the patent. So, the greatest protection comes from the patent itself, which grants its owner the right to exclude others from making or using the drug, and from data exclusivity, which prevents other companies from using the original developer’s clinical data to obtain regulatory approval of its own product.

Vaccines, on the other hand, cannot be quickly copied solely by reading the patent. There is a great deal of “know-how” involved in the manufacturing process. Because of this, trade secrets can be just as important to vaccine protection as the patent.

The role of IP in vaccine access, she said, is an interesting question. While public funding exists in the world of small molecule drugs, it has a “heavier footprint” in vaccine development, which then has some impact on the incentive model as it applies to vaccines.

Mr. Laster said the role of public funding was critical to his prior work at PATH, an organization devoted to advancing global healthcare equity through public-private partnerships and other initiatives. Public funding has a “de-risking” effect in that the high costs and uncertainty of clinical trials are not borne entirely by the private sector. And because vaccine development typically requires cooperation among many parties, it is valuable to have different types of incentives in play (i.e., “pull”-type incentives, such as patent grants, as well as “push”-type incentives, such as public funding). But from an IP perspective, exclusivity can pose a challenge to those cooperative efforts.

Additionally, he said that the detailed know-how involved with vaccines makes technology transfer incredibly difficult. If the intended receiver in a developing nation lacks the capacity to utilize the technology, how can effective tech transfer work in real-world practice? The question is less about whether we should be transferring vaccine technology to developing nations than it is about whether we can.

Mr. Aaronson said that a key piece of our IP system is that it does allow for greater cooperation by providing a means of transferring technology among partners while preventing that technology from being used for unauthorized purposes. He credits that cooperative system for enabling Pfizer to partner with BioNTech, producing a vaccine in record time. He added that this vaccine is currently supplied in 116 countries and counting, that they have committed to supplying at least 2.5 billion doses, and that they have just struck a purchase agreement with the United States for 500 million doses to supply lower-middle income nations. The required research, discovery, and development would not have been possible without a strong IP system that provides the right incentives and enables secure technology sharing among a large host of players.

Q: While we don’t know what final form the waiver might take, do you see it playing a necessary role in actually increasing vaccine supply and access in the coming year or two? Are there potential downsides to an IP waiver that should be considered?

Prof. Rai said that the biggest effect of a waiver would likely be its “symbolic” value, as other factors will have a much greater impact on vaccine access. But even if there were no substantive effect, it would be good for high-income nations to demonstrate an interest in global health issues. However, she considered the waiver issue “a little bit of a sideshow,” saying it likely would be “neither as bad as opponents fear nor as good as proponents hope.”

Prof. O’Connor noted that this is a particularly difficult question to answer when nobody knows what form any potential waiver would eventually take.

Mr. Laster based his perspective on his ten years of negotiating vaccine development and distribution efforts with PATH, saying he is “not sure [the waiver] aligns well” with what’s needed. Recognizing the importance of trade secrets and the complexity of the partnerships involved, he says a successful system must encourage willing cooperation. Simply waiving IP rights won’t necessarily do that. He also cautioned against taking a “static view” of the problem by taking for granted that the vaccine already exists rather than considering the IP system that helped create it, and failing to ensure that the same system is incentivizing new vaccines in the future. That said, the threat of a waiver might provide enough encouragement to bring about voluntary participation before an actual waiver becomes a reality. He credits this threat with already having a noticeable effect on pricing and other strategies.

Mr. Aaronson added that we are dealing with multiple vaccines based on very different technologies. Concentrating “a little more on the practical versus the theoretical,” he noted that the impacts of an IP waiver can vary greatly from one technology to another. The mRNA vaccine is the first drug of its type to ever receive approval. Much of the necessary tech transfer would not be limited to COVID-19, but could apply to the entire mRNA technology platform, drastically impacting its value. Waiving the rights to a groundbreaking technology could reduce the incentive to explore uncharted technological fields.

He also said it’s not certain that waiving IP rights would yield a net increase in the number of doses produced. The existing developers are producing large amounts of the vaccine. Opening the supply chain up to new entrants who may not be able to effectively utilize those supplies could yield a net decrease in production.

Prof. O’Connor also took audience questions for the panel. Some are listed below, starting with a “great foundational question.”

Q: How would it be ethical to allow lifesaving medicines and vaccines to be patented?

Prof. O’Connor began by addressing the purely legal perspective—that such patents are allowed under U.S. law, although there have been exceptions in some other countries at certain times because of this complex ethical question.

Mr. Aaronson said it’s important to think about patents as a part of a broader incentive structure. Are we putting the incentives in place to get someone to get up every morning and put in the work, money, and risk to create a product? We need an incentive structure, or there won’t be anyone making those lifesaving medicines. A patent system is one way to achieve this.

Q: If patent disclosures cannot teach producers how to make a vaccine without also getting corresponding know-how, how can they satisfy the disclosure requirement for patentability?

Prof. Rai has written multiple articles about this question (see one here) and offered several reasons. Some of the know-how is not easily written down. The need for shared know-how could possibly be satisfied by depositing biological materials with the Patent Office, but this is unlikely to happen. Another reason is that the final product that emerges from a years-long regulatory approval process is not always identical to the product described in the patent. There is also a mistaken view that patents and trade secrets cannot protect the same product. It is true that a singular feature cannot be both patented and kept as a trade secret, but a single product may have different features that are protected under one regime or the other.

Mr. Aaronson also pointed out that a single drug may be protected by many patents. Some of the know-how simply involves knowing how to properly combine the patented technologies.

Q: If most of the medical innovations occur in wealthy nations, IP laws will lock developing nations out, at least initially. Is there a way to include developing nations earlier in the innovation process?

All panelists agreed on the importance of this issue, as well as on the fact that it’s much easier said than done. Prof. Rai said that every nation must begin to create its own manufacturing capacity to avoid reliance on others, but this requires large amounts of human capital and infrastructure. The problem really goes beyond medicine to the balance of rich and poor nations generally. Mr. Laster said this is the sort of thing he was working on with PATH, which has created some networks, but there is a long way to go. Building the required skillsets and infrastructure locally takes time, but public-private partnerships can help. Mr. Aaronson said that it’s essentially like asking a nation to stop being a low-income country. It’s a somewhat circular issue, in which money is required to build infrastructure, but infrastructure is required to make money. However, this is where IP is not the problem; it is the solution. A strong IP system can create the necessary investment incentives to begin building a better future in any nation.

Closing Remarks

In closing, Prof. Rai said that “regrettably, the public debate on the . . . waiver has been very simplistic.” She hoped that the panel had “shed some light” on the issue and thanked her fellow panelists for a respectful and productive dialogue. Mr. Laster agreed that “it is a complex topic” but said that “it’s not about the waiver; I do think there are mechanisms that can lead more likely to the outcomes we want.” Mr. Aaronson finished by saying that “we all have the same goal, to figure out ways to bring medicines and vaccines to patients, no matter where they are in the world. We’re fortunate and thrilled that our vaccine has had that potential to change lives, and our goal is to continue . . . to ensure access” to both this and to future vaccines.

A recording of the panel is available here.