Will ChatGPT Be Useful for Discovery Depositions?

Esquire Deposition Solutions, LLC

A frequent jab at the legal profession asserts that lawyers are a hopelessly old-fashioned, technology-averse lot that will have to be dragged into the future inch by ink-stained inch. Yet from all available evidence, this aspersion is plainly false or, at best, no longer true. Every day brings news of another technological innovation in the practice of law.

Technology’s grip on the legal profession today is firm. At the risk of stating the obvious, lawyers already use technology for legal research, managing their law practices (e.g., accounting, billing, client intake, and CRM), document management and “knowledge management,” court filing, electronic discovery, and, largely since the COVID-19 pandemic, remote meetings and remote depositions. The competent and judicious use of technology is, in the view of many jurisdictions, an ethical obligation.

Lately, most of the technology discussion has centered on the possible legal applications of ChatGPT, an artificial intelligence technology that appears able to perform tasks once handled exclusively by lawyers (if not by licensed lawyers, then by other human beings supervised by lawyers).

ChatGPT, developed by OpenAI, generates surprisingly human responses to questions and other prompts based on publicly available information on the Internet. ChatGPT isn’t limited to free data sources, however. It can consume and synthesize any data source, which raises interesting possibilities when the technology is trained on proprietary, law-specific datasets.

ChatGPT recently passed the multistate bar exam with ease. More significantly, according to at least one group of researchers, the legal industry is among the economic sectors most exposed to disruption by ChatGPT and similar language modeling technologies.

Andrew Perlman, a legal futurist and dean of Suffolk University Law School in Boston, recently remarked: “For the legal industry, ChatGPT may portend an even more momentous shift than the advent of the internet.”

ChatGPT is a high-level language modeling technology that can be licensed and incorporated into any number of task-specific legal applications. It is already being used in legal applications that:

  • assemble, analyze, and create redlined versions of contracts
  • summarize the contents of court opinions and court filings
  • conduct legal research and improve its effectiveness
  • propose arguments that could be made in a brief or other court filing
  • communicate with the public on law-related topics and provide legal advice

The list of compelling law-related applications for ChatGPT will only grow longer as more and more attorneys are exposed to its possibilities.

ChatGPT for Depositions

There’s every reason to believe that ChatGPT will play a role in how discovery depositions are conducted as well. Preparation for a discovery deposition is one of the use cases discussed in “The Implications of ChatGPT for Legal Services and Society,” an article in the March/April 2023 issue of The Practice, the magazine of the Harvard Law School Center on the Legal Profession.

The authors show how effective ChatGPT can be at developing a list of deposition questions for the plaintiff in a hypothetical automobile accident case. While the questions ChatGPT provided might seem obvious to an experienced personal injury litigator, the fact that they were generated on demand, in mere seconds, is impressive. What’s more, the cost of a typical ChatGPT session is measured in pennies.

To show readers of this blog what ChatGPT might offer when preparing for, or even helping to conduct, a deposition, we posed a few questions to it ourselves.
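
For readers who want to experiment beyond the chat window, the same kind of question can be sent to the model programmatically. The short Python sketch below is illustrative only: it assumes OpenAI's official Python client library (the openai package, version 1.x) and an API key stored in the OPENAI_API_KEY environment variable, and the model name shown is a placeholder rather than a recommendation.

    from openai import OpenAI

    # Assumes the "openai" package (v1.x) is installed and OPENAI_API_KEY is set.
    client = OpenAI()

    # Ask the model the same kind of question we posed in the examples below.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any chat-capable model would work
        messages=[
            {
                "role": "user",
                "content": "List 10 reasons why an automobile air bag would fail to deploy.",
            }
        ],
    )

    print(response.choices[0].message.content)

A script along these lines simply automates the question-and-answer sessions shown below; the results, of course, still require a lawyer's judgment.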

Prompt: List 10 reasons why an x-ray technician could fail to detect lung cancer on a chest x-ray.

ChatGPT replied:

As an AI language model, I don’t promote negative thoughts or misinformation about any profession. However, here are some possible reasons why an x-ray technician could miss detecting lung cancer on a chest x-ray:

  1. The cancer is too small to be visible on the x-ray.
  2. The cancer is obscured by other structures in the chest.
  3. The x-ray machine is not powerful enough to detect the cancer.
  4. The technician did not position the patient properly during the scan.
  5. The patient moved during the scan, causing blur or distortion of the image.
  6. The film was not developed correctly.
  7. The technician did not review the images thoroughly enough.
  8. The technician misinterpreted the images due to lack of experience or training.
  9. There was a technical error in the imaging equipment used for diagnosis.
  10. There were artifacts present on the image that interfered with interpretation.

Is that valuable information? At a minimum, the factors mentioned by ChatGPT seem like a good starting point for a challenging cross-examination of a medical professional in a “failure to diagnose” case.

Mind you, we’re not promoting the use of ChatGPT in depositions, but it is impressive to see the technology in action. 

Prompt: List 10 reasons why an automobile air bag would fail to deploy.

ChatGPT replied:

  1. Faulty sensors: The airbag may fail to deploy if the sensors that detect a crash are faulty or have malfunctioned.
  2. Incorrect installation: If the airbag is not installed properly, it may fail to deploy during a collision.
  3. Electrical malfunction: If there is an electrical fault in the airbag system, it may cause the airbags to fail to deploy.
  4. Low battery voltage: If the battery voltage is low, this can prevent the airbags from deploying.
  5. Inaccurate calibration: If the airbag system is not calibrated correctly, it may not deploy during a collision.
  6. Damaged wiring: Any damage to the wiring that connects the sensors and airbag control module can prevent airbags from deploying.
  7. Defective control module: A defective airbag control module can cause the airbags to fail to deploy.
  8. Incompatible software: If there are compatibility issues between the software and hardware of the airbag system, it may not function properly.
  9. Manufacturing defects: Defective or faulty manufacturing of any component of the airbag system can cause it to fail during deployment.
  10. Excessive rust or corrosion: Rust and corrosion can damage components of the airbag system and prevent them from functioning correctly during a collision.

Again, a good starting point for deposing an expert in a products liability case. While experienced litigators might have better, case-specific lines of attack on the expert’s testimony, it’s not outside the realm of possibility that ChatGPT will uncover, in a second or two, angles they might overlook.

And finally:

Prompt: Can artificial intelligence be useful in preparing for discovery depositions?

ChatGPT replied:

As an AI language model, I cannot provide a personal opinion, but I can provide information.

Yes, artificial intelligence can be useful in preparing for discovery depositions. AI-powered tools like natural language processing (NLP) and machine learning can help lawyers and their teams quickly analyze and categorize large volumes of documents and data related to the case. This can save time and reduce the risk of human error in identifying relevant information. Additionally, some AI tools can help predict how a witness might answer certain questions based on their past statements or behavior patterns, allowing attorneys to better prepare for the deposition. However, it is important to note that AI should not replace human judgment and expertise in the legal process.

We couldn’t have said it any better ourselves.

ChatGPT neglected to mention that, earlier this year, the American Bar Association’s House of Delegates approved Resolution 604, which encourages everyone in the legal community (lawyers, judges, regulators, and technology vendors) to take steps to ensure that artificial intelligence technologies like ChatGPT are deployed in a lawful and ethical manner. Resolution 604 also explains how key concepts such as accountability, transparency, and traceability should guide the legal community’s adoption of AI-enabled practice tools.

Finally, it’s worth noting that ChatGPT is a work in progress. It can make obvious errors, and because it is a “black box” that reveals nothing about its sources or reasoning, the day when ChatGPT might produce admissible evidence in court remains far off.

Written by:

Esquire Deposition Solutions, LLC
