Over the weekend, President Donald Trump terminated Shira Perlmutter, the Register of Copyrights and leader of the U.S. Copyright Office, just one day after the office published a contentious report on artificial intelligence. This report, which questions the legality of AI companies using copyrighted materials for training their models under the fair use doctrine, may have significant repercussions for ongoing litigation involving major tech entities like OpenAI and Meta.
Although the report is advisory, its findings could influence future court rulings. It was issued as a “pre-publication version” late on Friday, May 9, without the typical fanfare. The following day, Perlmutter was relieved of her duties. One day before the report's release, on Thursday, May 8, President Trump also removed Dr. Carla Hayden, the Librarian of Congress, who oversaw the Copyright Office. The White House stated that her dismissal was related to the Library’s diversity, equity, and inclusion (DEI) initiatives.
The swift progression of events—the ousting of Dr. Hayden, the abrupt release of the AI report, and the later dismissal of Perlmutter—has sparked concerns among legal professionals and copyright experts. Cornell H. Winston, the President of the American Association of Law Libraries, voiced “deep concern” regarding the firings in a communication to members, although he did not reference the AI report specifically.
President Trump has expressed a clear desire to cultivate a business-friendly atmosphere for AI progress. In April, he enacted two executive orders aimed at enhancing U.S. leadership in the AI field.
The report at issue is the final installment of a three-part series from the U.S. Copyright Office analyzing the intersection of copyright law and artificial intelligence. Titled “Copyright and Artificial Intelligence Part 3: Generative AI Training,” it evaluates whether using copyrighted books, films, articles, and images to train AI models qualifies as fair use.
The report suggests that such practices may not be protected under fair use, a finding that could weaken the legal defenses of companies like Meta and OpenAI. It also underscores potential financial harm to artists whose works are imitated by AI-generated content: they may lose licensing revenue if their work is used without compensation.
Notably, the report was released quietly and in an incomplete “pre-publication” format, which has drawn scrutiny from legal analysts. Blake E. Reid, a copyright attorney and law professor at the University of Colorado, speculated on social media that the timing of the release and the dismissals could point to a broader shakeup within the Copyright Office. “The ‘Pre-Publication’ status is quite odd and suspiciously timed,” Reid remarked, suggesting that the office may have rushed the report out ahead of further personnel changes.
In a communication to Mashable, a representative for the U.S. Copyright Office confirmed that Perlmutter was informed of her termination through email on Saturday, May 10, and declined to provide additional comments regarding the timing of the report’s release. The White House has not yet addressed inquiries related to the firings or the report.
Reid characterized the report as a “straight-ticket loss for the AI companies,” arguing that its findings are unlikely to bolster their legal positions in court. “The AI firms were counting on the Office to provide them some lifelines,” he stated. “Instead, the report indicates that certain uses of copyrighted material do not fall within fair use parameters.”
Pamela Samuelson, a law professor and copyright authority, reiterated this view in a post on Bluesky, emphasizing that the report undermines fair use defenses and supports a new theory of market harm.
While the timing of the report and the dismissals has raised eyebrows, there is no conclusive evidence linking the two events. The pre-publication version of the report is available for public viewing on the U.S. Copyright Office website.