
This article is part of "Build IT," a series about digital-tech trends disrupting industries.

In December, Michael Cohen, the lawyer who gained notoriety working for Donald Trump, asked a federal judge to overlook his latest transgression: citing court cases fabricated by generative AI. Cohen had used Google Bard, a predecessor of Google Gemini, to find the citations, not realizing the cases didn't exist. He claimed ignorance, saying he had misunderstood the chatbot "to be a supercharged search engine."

Cohen wasn't the only lawyer to make this mistake. A federal judge last year fined two lawyers $5,000 for citing fictitious cases. And in February, a court imposed $10,000 in sanctions over an appeal that cited nearly two dozen fake cases.

These failures could suggest AI has no place in the practice of law. But some lawyers and legal experts told BI that isn't always the case. Generative AI's penchant for error can make it a minefield, but the legal industry's increasing complexity has many lawyers turning to it for help.

Danielle Benecke, the head of machine-learning practice at the international law firm Baker McKenzie, said AI models were getting good at "interpreting and generating complex legal language," a core part of the business.

Danielle Benecke, the head of machine-learning practice at Baker McKenzie.

A lawyer's copilot

Founded in 1949, Baker McKenzie has over 6,500 lawyers working in 70 offices worldwide. Benecke said the firm's interest in AI predated generative AI, but the recent arrival of large language models, or LLMs, kicked off a wave of innovation. The firm's work building generative AI to produce legal draft advice for high-volume employment-law questions recently won an award from Law.com.

Benecke said AI tools were especially useful for handling the legal fallout from common issues like cybersecurity incidents. Even a minor incident can overwhelm a company with regulatory-compliance requirements that necessitate several days of work from a small team of lawyers, racking up steep fees.


Benecke said the firm's tools were designed to provide accurate advice while significantly reducing the time lawyers spend navigating a client's regulatory requirements.

Benecke stressed that the firm's goal was quality, not efficiency. She said the time saved sorting through regulatory requirements was better spent strategizing on the client's response to the incident.

Cecilia Ziniti, the CEO and cofounder of GC AI, predicted this dynamic would come to dominate discussions of AI in the legal profession. "The pinnacle of AI application in the next five to 10 years is going to be empowering lawyers," she said. "It's a lawyer copilot."

Popular media often focuses on the most romantic aspects of law, like a prosecutor grilling a defendant on the stand or a hardworking lawyer crafting a novel legal strategy. But the reality, Ziniti said, is often less glamorous, as the legal industry spans a "very long tail" of tedious tasks.

Cecilia Ziniti, the CEO and cofounder of GC AI.

Ziniti, like Benecke, gave the example of regulatory requirements. In January, the Federal Trade Commission sent a request for information to five companies, including Microsoft and OpenAI, asking for things like emails that might span hundreds of documents. The request was made as part of the FTC's investigation into competition in the AI industry.

Replying to such a request can require hundreds of hours of work as lawyers sift through documents to find relevant information. It's important work — a failure to comply may be met with stiff penalties and further scrutiny — but it's also repetitive, dull, and time-consuming.

Ziniti said an AI "copilot" allowed lawyers "to do what we call practicing at the top of our license," meaning "we can do the things that we are most capable of doing, that are the fun part."

GPT-4 enters the courtroom

The allure of a tool that tirelessly digs through documents on a lawyer's behalf is significant but shadowed by AI's biggest bugbear: hallucination.

IBM describes a hallucination as what happens when an AI tool's underlying LLM perceives nonexistent patterns and generates "outputs that are nonsensical or altogether inaccurate." As Cohen discovered, this can happen when a chatbot is prompted with a specific query that's not well represented in its training data.

It might come as a surprise that AI tools built for lawyers generally don't use models specifically trained for the industry. Most rely on the same generalized LLMs anyone can access, and OpenAI's GPT is by far the most popular. "There's not a model out there more powerful than GPT-4 right now," Ziniti said.

CoCounsel, an AI legal-assistant product, says it takes several steps to reduce hallucinations. It uses retrieval-augmented generation, a technique to ground an AI's response in documents provided to it, in combination with prompted instructions for the LLM to keep its responses focused on documents' contents.
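CoCounsel's actual pipeline isn't public, but the retrieval-augmented generation pattern it describes can be illustrated in a few lines. The sketch below is hypothetical: it uses naive keyword-overlap retrieval (real products use embedding search and rerankers), and the function names and sample documents are invented for the example. The key idea is the same, though: fetch relevant passages first, then instruct the model to answer only from them.

```python
def retrieve(query, documents, top_k=2):
    """Toy retrieval step: score each document by how many query words
    it shares with the query, and return the top_k best matches."""
    words = set(query.lower().split())
    scored = sorted(documents, key=lambda d: -len(words & set(d.lower().split())))
    return scored[:top_k]

def build_grounded_prompt(query, documents):
    """Assemble an LLM prompt that grounds the answer in retrieved
    passages and instructs the model to stay within them."""
    passages = retrieve(query, documents)
    context = "\n\n".join(f"[Document {i + 1}]\n{p}" for i, p in enumerate(passages))
    return (
        "Answer the question using ONLY the documents below. "
        "If the answer is not in the documents, say so.\n\n"
        f"{context}\n\nQuestion: {query}"
    )

# Invented sample corpus for illustration.
docs = [
    "The employment agreement requires 30 days' written notice of termination.",
    "The lease term is five years with an option to renew.",
    "Data breach notifications must be filed within 72 hours of discovery.",
]
print(build_grounded_prompt("How long is the notice period for termination?", docs))
```

The resulting prompt would then be sent to an LLM; because the relevant passage is included and the instructions forbid answering beyond it, the model has far less room to invent a nonexistent case or clause.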

OpenAI operates a set of servers dedicated to CoCounsel, giving CoCounsel's engineers more control over the model's output. That also helps with regulatory compliance, as information provided to CoCounsel isn't shared more widely.

Jake Heller, the head of product for CoCounsel at Thomson Reuters, said Thomson Reuters had established a "trust team" of lawyers and AI engineers to ensure CoCounsel "is getting the right answer." The AI assistant also provides citation links to alleviate accuracy concerns.

Jake Heller, the head of product for CoCounsel at Thomson Reuters.

AI won't replace lawyers

There's another fear likely to push lawyers toward AI: other lawyers.

Heller said all law firms and lawyers exist within a "competitive dynamic." Law firms fight over a limited pool of clients, and plaintiffs and defendants compete to win cases. Ziniti described the practice of law as an "adversarial system" meant to push each lawyer to present the best case possible on behalf of their client.

Because of this, it's unlikely AI will eliminate lawyers' jobs. Instead, AI could be viewed as an extension of trends rooted in the dawn of the computer age.

"We used to physically review every single document in every case," Heller said. "Even every potentially relevant email, we would physically print them out, and they'd be sitting in banker boxes in a basement."


Times have changed. Electronic review has replaced manual review wherever practical. The legal industry has an entire subfield, electronic discovery, dedicated to finding and sorting electronic documents.

Lawyers might also turn to AI to address a force meant to tame AI: regulation. Benecke said the complexity of government regulation was "on an exponential curve," adding, "In a way, you have the problem advancing in tandem with the solution." This is especially relevant for an international firm, like Baker McKenzie, that advises clients in dozens of countries.

Ultimately, the adoption of AI in the legal industry comes down to a fact of life: There are only so many hours in a day. Manually reviewing every document that could be relevant to a case may sound ideal, but it's often not the best use of a lawyer's time.

"I think in three to five years, not using AI for legal work will be tantamount to refusing to use online search for legal work today," Ziniti said.

She added that lawyers have a professional responsibility to avoid inflating billable hours. That responsibility is codified by many legal organizations, including the American Bar Association.

Read the original article on Business Insider