Generative AI Rules for Lawyers
The steep rise in the popularity of, and consumer access to, artificial intelligence (AI) has prompted questions in the legal community. Can lawyers and legal professionals use AI? Should they? If they do, must they disclose their use? Are there any limits to what a lawyer can do with AI?
Traditional AI and Generative AI
First, a brief overview of traditional AI and generative AI. Traditional artificial intelligence has been around in some form since the 1950s. It is trained on large datasets to learn patterns and uses predetermined algorithms and rules to make predictions and perform tasks, such as translating a book from one language to another. It can appear very smart because it is trained to do a specific job well, but it cannot generate something new. Other examples of traditional AI in action include email spam filters, voice assistants, or computerized chess opponents.
Generative AI, in contrast, is capable of producing new content, like text and images. One of the more well-known generative AI tools is ChatGPT, which allows users to interact with an AI model in natural language conversations. There are also generative AI tools, such as DALL-E, that create images and art based on a user’s natural language prompts. Like traditional AI, generative AI is trained on large datasets, but generative AI takes the patterns it learns from those datasets to come up with something new.
Generative AI Use in the Legal Community
There are a host of potential benefits to using generative AI. It can be more efficient than performing the same task manually. It can also help users expand their skills and capabilities. In the legal profession, generative AI can be used to help communicate with clients and other parties, draft pleadings, perform legal research, draft contract language, complete discovery, develop trial strategies, and market a lawyer’s firm.
Generative AI legal research tools, for example, not only allow legal professionals to query using natural language prompts but may also draft a legal research memorandum based on the research results. Using generative AI to draft contracts can help lawyers produce new contract language in a matter of seconds. Asking the right AI tool to help craft a convincing opening statement can inspire the persuasive thread the lawyer will weave throughout their argument. However, anything generated by AI should be closely scrutinized.
AI can get the law or the facts wrong, resulting in an inaccurate legal research memo that a lawyer who does not carefully check the AI’s work might then use to draft a problematic brief. A lawyer who does not check other sources or have personal knowledge about a particular contract provision drafted by AI might be in hot water for approving language that does not protect their client. A lawyer who depends on AI to draft client communications might run afoul of Rule 1.4 of the Rules of Professional Conduct, which requires lawyers to keep clients reasonably informed about the status of a matter and sufficiently explain a matter so that their client can make informed decisions.
Regulations for Generative AI Use in the Legal Community
As some lawyers began to use generative AI in their practice, judges began issuing standing orders regarding the use of AI in court filings. Judge Brantley Starr of the U.S. District Court for the Northern District of Texas was the first to require attorneys and pro se litigants to file a certificate attesting that either they would not use generative AI to draft any portion of any filing, or that language drafted using generative AI would be checked for accuracy by a human being using print reporters and traditional legal databases.
Judge Starr is not the only judge to have issued a standing order. Other standing orders range from a simple warning about generative AI’s potential complications to a prohibition against generative AI use outside of search engines. Concerns raised in standing orders include issues of accuracy, hallucinations, biases, confidentiality, and improper disclosures. In some cases, a lawyer may be required to disclose the use of AI in research as well as drafting.
For this reason, it is important for lawyers and litigants to check a judge’s standing orders periodically. The use of AI in the legal space and courts’ opinions and regulations of such use are sure to evolve.
It is possible that generative AI regulations will start to appear in local rules, as well. Some argue that using local rules rather than individualized standing orders may lessen confusion and allow others to comment on the rules, sometimes revealing unintended consequences.
Advisory Opinions and Guidelines for Generative AI Use
In November 2023, the Florida Bar and the California Bar took additional steps to regulate AI use in the legal field. The Florida Bar issued a proposed advisory opinion requiring lawyers who use generative AI to take reasonable precautions to protect confidential client information, develop policies for oversight of generative AI use, ensure that fees and costs are reasonable, and adhere to related ethics and advertising regulations.
The California Bar’s AI guidance document outlines the potential risks associated with using artificial intelligence in legal practice and the professional obligations, such as the Rules of Professional Conduct, that lawyers must continue to comply with. These rules include the duty of confidentiality, the duties of competence and diligence, and the duty to supervise lawyers and nonlawyers, among others.
The New Jersey Bar, the Texas Bar, and the American Bar Association have created task forces to study how the legal profession may be impacted by AI. Others will surely follow suit.