Staying Secure: Best Practices for AI in eDiscovery

Summary: Cloud-based large language models and strong AI governance practices are critical to keeping your eDiscovery program secure when using AI. We outline some key security considerations and benefits of these approaches.

Security is a hot topic for legal teams considering AI. The technology is new to many legal and non-legal teams, it uses and generates data in new ways, and the consequences of mishandling data are sky high.

Two of the primary security concerns with AI for eDiscovery are:

  1. The use of AI on collected data
  2. Using generative AI to create content that could be subject to discovery

We will focus on the first: The use of AI on collected data.

By making informed choices about hosting, storage, and other aspects of AI integration and data management, you and your organization can satisfy your security and privacy needs and reap the efficiency and other benefits of AI for eDiscovery.

Cloud vs on-prem

Naturally, the cloud plays an important role in using AI: large language models (LLMs) need the computing power of platforms like AWS, Google Cloud, or Azure to deliver the analysis and scale that best-in-class solutions offer. While there are some on-prem AI solutions, there are a few things to consider before going that route.

Cloud LLMs offer innovation and scale

Cloud LLMs evolve and improve quickly because the companies behind them have enormous resources to invest in staying competitive. That evolution includes ongoing updates to security. On-prem AI is likely to be more resource constrained, relying on smaller teams to keep both the technology and its security up to date.

Working in the cloud also allows solutions to scale, while on-prem deployments are more limited. This is important because LLMs work best at scale: the more data they have access to, the more effective they are and the more value they provide. As this work evolves, it helps improve LLM scalability in legal technology overall.

Securing the cloud for AI

Although AI is regulated and governed differently from the cloud itself, the processes developed to secure the cloud-based solutions you use every day have laid a good foundation for securing the cloud for AI. In other words, if your organization already practices good cloud security hygiene, you’re in good shape to use a cloud-based LLM.

Given the sensitive nature of legal data, there was initial hesitation about the cloud. However, as other corporate business units have successfully integrated cloud-based software into their work, legal leaders are starting to recognize the value and robust security measures of cloud platforms. We now benefit from the comprehensive security infrastructure provided by industry leaders like Microsoft, in addition to our own rigorous security measures, offering our clients the highest levels of data protection.

Maintaining good cloud hygiene

One example of good hygiene is reading the fine print about how a cloud solution shares your data outside your organization and finding a way to opt out. For example, Microsoft's Azure OpenAI Service abuse monitoring policy allows authorized Microsoft employees to retain and review data submitted to the service, which could include privileged client information.

While this may be appropriate in some instances, it is not acceptable for eDiscovery use cases where high-risk, proprietary, or confidential information is at play.

When building AI solutions, I work with our security team to apply our standard process for evaluating cloud services. Some key steps include:

  • Read the entire documentation, including the data privacy agreement and any available data protection impact assessments.
  • Have security experts evaluate the setup and call out concerns, requirements, and constraints.
  • Make sure your data governance practices and cloud policies are up to date.
  • Work with our Azure partners.

If you follow your standard process, with these best practices built in, you will reach the right conclusions, just as you always do.

Conclusion

Legal teams are right to be cautious and prudent when bringing on new technology.

To maintain security when using AI, you can choose a cloud-based LLM and follow your best practices for governing AI in eDiscovery. It’s also important to work with the right partners who provide the transparency you need to feel confident adding new technology to your workflows.


Written by:

Lighthouse