
Coders Who Brought Lawsuit Against GitHub, Microsoft & OpenAI Seek $9 Billion in Damages

by Legal PDF, September 22nd, 2023

Too Long; Didn't Read

Two anonymous detractors bring this lawsuit, claiming “software piracy on an unprecedented scale,” and seeking penalties in excess of $9 billion.

GitHub's Motion to Dismiss court filing, retrieved on January 26, 2023, is part of HackerNoon's Legal PDF Series. You can jump to any part in this filing here. This part is 2 of 26.

MEMORANDUM OF POINTS AND AUTHORITIES

INTRODUCTION AND SUMMARY OF ISSUES


Since the advent of open source software, developers have been free—indeed, encouraged—to use, study, change, and share source code for the broader good. It’s an extraordinary idea, one that has fostered an immense body of public knowledge, ever-evolving and ever-available for the next generation of developers to build, collaborate, and progress. The open source model is essential to collaborative and communal software development. GitHub was founded on the basis of these ideals, and when Microsoft invested billions to acquire GitHub in 2018, it cemented its commitment to them. The transformative technology at issue in this case, Copilot, reflects GitHub and Microsoft’s ongoing dedication and commitment to this profound human project. Copilot is a coding assistant tool that crystallizes the knowledge gained from billions of lines of public code, harnessing the collective power of open source software and putting it at every developer’s fingertips.


Two anonymous detractors bring this lawsuit, claiming “software piracy on an unprecedented scale,” and seeking penalties in excess of $9 billion. Despite that rhetoric, they do not advance a copyright infringement claim at all—doubtless an attempt to evade the limitations on the scope of software copyright and the progress-protective doctrine of fair use. Copilot withdraws nothing from the body of open source code available to the public. Rather, Copilot helps developers write code by generating suggestions based on what it has learned from the entire body of knowledge gleaned from public code. In so doing, Copilot advances the very values of learning, understanding, and collaboration that animate the open source ethic. With their demand for an injunction and a multi-billion dollar windfall in connection with software that they willingly share as open source, it is Plaintiffs who seek to undermine those open source principles and to stop significant advancements in collaboration and progress.


Plaintiffs’ Complaint fails on two intrinsic defects: lack of injury and lack of an otherwise viable claim. Plaintiffs do not allege that this extraordinary new tool has harmed them in any way. They do not explain how teaching Copilot about their code has taken anything from them. They do not allege that Copilot has done anything improper with their code, nor that GitHub or Microsoft did anything improper with their personal information. They seem to say that Copilot could theoretically suggest a snippet of code that matches something they have published, and do so without giving them proper attribution. How that series of hypothetical events could harm them is also unexplained. There is no case or controversy here—only an artificial lawsuit brought by anonymous Plaintiffs built on a remote possibility that they will fail to be associated with an AI-generated code snippet that could in theory be connected to copyrightable aspects of their source code. The Complaint’s failure to make out a case for actual injury to named people requires dismissal under Rules 10 and 12(b)(1). Part I, infra.


Additionally, Plaintiffs do not state a claim. Their Complaint cycles through twelve purported counts, some of which embrace multiple theories, in search of a hook for their abstract grievance against Copilot. Despite their attempt to make a federal case out of it, none of these counts plausibly establishes that training or using Copilot violates any legal right. Part II, infra.



Continue Reading Here.


About HackerNoon Legal PDF Series: We bring you the most important technical and insightful public domain court case filings.


This court case, 4:22-cv-06823-JST, retrieved on September 11, 2023, from documentcloud.org, is part of the public domain. The court-created documents are works of the federal government and, under copyright law, are automatically placed in the public domain and may be shared without legal restriction.