AI can process vast quantities of data and, with little meaningful human intervention, transform it into AI-generated output. The discussion of how to treat intellectual property rights arising in both the materials used to train the AI (input) and the results created by the AI (output) is still in its early days. To keep these issues comprehensible, our report focuses on questions of copyright law, but the same concepts are likely to apply to other forms of protected IP.
Generative AI systems both ingest and generate large amounts of data, including images, text, speech, video, code, business plans and technical formulae. Training, testing, uploading, analysing, consulting or otherwise processing such input and output data calls for varying levels of protection. Our report examines some of the main challenges organisations face when using personal and confidential data in Generative AI systems, as well as the measures they can adopt to mitigate the associated legal risks.
Across jurisdictions, there are several common personal data principles and protections that are highly affected by Generative AI systems. When using Generative AI, organisations should pay specific attention to the following aspects of the solutions they use: transparency, data minimisation, lawfulness, sensitive data and individual rights.
Given the legal risks associated with using Generative AI in a business context, careful consideration of the contract terms under which a Generative AI solution is licensed or otherwise procured is essential. A number of key points, from liability to AI regulations, will likely need to be addressed and understood.
While the competitive advantage of Generative AI is enticing, adopting this powerful, differentiating technology demands attention to the risks that could imperil an enterprise’s brand, reputation, stakeholder trust or, critically, its compliance with legal and regulatory obligations.