Artificial intelligence (AI) tools continue to proliferate, with many aiming to automate processes and increase productivity. But customers of these tools, or customers of vendors who use them, must understand what’s going on behind the scenes with respect to access to and use of the tool, as well as how customer data will be used.
In some cases, customers may need to read between the lines of the tech contract to determine whether an AI tool is being used by the vendor. Last month we looked at handling AI licensing and related data from the perspective of the vendor. This month we’ll turn to issues facing customers when they leverage AI technology.
AI Tool Access
When a customer seeks to use an AI tool offered by a vendor, a frank conversation about how the customer will access the tool should occur before any contract review takes place. Customers may want to ask whether the tool will be installed locally at the customer’s place of business or hosted on a remote server and accessed on a software-as-a-service (SaaS) basis. If the AI tool can be accessed only through the customer’s own systems, the customer’s information security infrastructure will help protect the confidentiality of any information, data, or content ingested by the tool.
But local installations of AI tools are more the exception than the rule. Instead, for scaling purposes, AI tools are typically offered on a SaaS basis. For confidentiality and security, when AI tools are offered via SaaS, customers should ask whether they will receive a dedicated instance of the AI tool (on a server), or whether the tool will be a shared resource used by other customers and their data. Note that a dedicated instance will likely provide greater security because the customer’s use and data will be isolated, though much still depends on the hosting entity’s information security program.
Data and Rights to Use Data
Regardless of whether a customer is seeking to use an AI tool or a vendor would like to use one, figuring out how customer data will be used is critical. As a first step, customers will want to review the data they will provide to the AI tool or vendor. If the data is personal, sensitive, confidential, privileged, etc., then customers should consider the data use rights the vendor will have under its contract to help ensure compliance with any applicable privacy or other laws, as well as confidentiality obligations customers may owe to any third party.
To this end, customers should ask the vendor what rights it will have to use the data. For example, consider whether the data use rights will be limited to performance of the services and provision of the AI tool, or whether the vendor may use the data for AI training, marketing, or other commercial purposes not directly related to the customer’s receipt or use of the services or tool. This is especially important because vendors, in many cases, will try to shift legal liability to customers via their tech contracts for the customer’s failure to obtain appropriate consents or permissions covering the vendor’s rights to use customer data as permitted in the contract.
Unknown AI Tool Use by Vendors
Some vendors may not explicitly disclose in their tech contracts that they are using AI tools to process customer data. Rather, vendors may have broad (and seemingly vague) rights built into their tech contract package that permit the vendor to use customer data to, for example, “improve the services.” Vendors may interpret this phrase as allowing use of customer data for AI training purposes, subject to other confidentiality obligations in the contract. So, customers should ask the vendor what it means by this or any similar phrase. Then, based on the type of customer data that will be provided to the vendor, customers may negotiate adjustments to the phrase and other terms accordingly.
These customer issues are not intended to be exhaustive; additional considerations will arise depending on the use of the AI tool, the data involved, and the vendor’s tech contract terms.