What your business clients need to know about generative AI risks

By David Gambrill | May 4, 2023 | Last updated on October 30, 2024

Commercial brokers and insurers should alert their business clients to new liability risks around intellectual property (IP), copyright and licensing associated with using generative artificial intelligence (AI) technologies like ChatGPT, Dall-E and Bard.

“Generative AI is a type of artificial intelligence technology that can produce various types of content including text, imagery, audio, and synthetic data,” as defined by George Lawton in a blog for TechTarget.com.

One of the key issues with generative AI in its nascent form, Lawton observes, is that the technology doesn’t always identify its sources of information. That creates several legal issues, including plagiarism (breaking copyright law), generating seemingly original designs based on material protected by intellectual property laws, and violating licensing restrictions (for example, some open-source content may be covered by a license that excludes “commercial use”).

“Generative AI systems are trained on large datasets that can include works that are themselves protected under…copyright law,” Goodwin Procter LLP partner Stephen D. Carroll writes in a Mondaq blog posted for his U.S. law firm. “That raises the possibility that outputs from generative AI tools could infringe on the copyrights of those underlying works….

“Understanding this is particularly important for businesses that use generative AI tools to write software code for use in products.”

In the future, Lawton writes, the use cases of generative AI in business might include, among other things:

  • writing email responses, business reports, research/white papers, or news releases,
  • creating photorealistic art in a particular style,
  • improving product demonstration videos,
  • suggesting new drug compounds to test,
  • designing physical products and buildings,
  • optimizing new chip designs,
  • writing and producing music in a specific style or tone.

But when businesses use generative AI to aid in creating products or material, they must keep in mind “the output from a generative AI tool does not have to be an exact replica to infringe on the copyright of the original work,” Carroll notes. “The output may infringe if it is deemed derivative.

“For example, if someone prompted a generative AI tool to use the characters and themes from the first seven Harry Potter novels to write an eighth novel in the series, the resulting work would almost certainly be considered a derivative work under the Copyright Act and constitute infringement.”

For businesses to protect themselves when using generative AI tools to write software code, Carroll recommends three things:

  • “Get a license or a representation and warranty from the provider of the generative AI tool ensuring that the source works on which the tool is trained are licensed — and that the license extends to you, the user.”
  • “Run a source code audit program to analyze any code you create using generative AI tools to determine whether it is similar to any other code, open source or otherwise. If it is, you can take steps to comply with the relevant open-source license or excise the code. Importantly, running a source code audit program can itself be evidence against a claim of willfulness in a copyright action.” (A minimal sketch of this kind of check appears after this list.)
  • “Conduct due diligence on the provider of the generative AI tool to understand what source materials it uses. Some generative AI tools may give users a degree of choice in determining what training materials are included when they use the tool.”
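To illustrate what the second recommendation might look like in practice, here is a minimal Python sketch of an automated similarity check. The file paths and the reference directory (reference_corpus) are hypothetical placeholders, and the 80% threshold is purely illustrative; a real audit program would rely on a dedicated scanning tool and a license database rather than a simple text comparison.

```python
# Minimal sketch of a source-code similarity check, assuming you keep a local
# directory of open-source reference files to compare AI-generated code against.
# Real audit programs are far more thorough; this only illustrates the kind of
# comparison described in the second recommendation above.

from difflib import SequenceMatcher
from pathlib import Path

SIMILARITY_THRESHOLD = 0.8  # flag anything 80%+ similar (illustrative value only)


def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two pieces of source text."""
    return SequenceMatcher(None, a, b).ratio()


def audit(generated_file: str, reference_dir: str) -> list[tuple[str, float]]:
    """Compare one AI-generated file against every reference file and
    return the matches that exceed the threshold, highest first."""
    generated = Path(generated_file).read_text()
    flagged = []
    for ref in Path(reference_dir).rglob("*.py"):
        score = similarity(generated, ref.read_text())
        if score >= SIMILARITY_THRESHOLD:
            flagged.append((str(ref), score))
    return sorted(flagged, key=lambda item: item[1], reverse=True)


if __name__ == "__main__":
    # Hypothetical paths; point these at your own generated code and reference corpus.
    for path, score in audit("generated/module.py", "reference_corpus"):
        print(f"{score:.0%} similar to {path} -- review the applicable license")
```

Even a rough check like this can surface near-verbatim matches early, before the code ships in a product, though it is no substitute for the license review and due diligence Carroll describes.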


Feature image courtesy of iStock.com/Marcio Binow Da Silva

David Gambrill