AI can draft a contract. It can’t defend one in court.
If you’re using artificial intelligence to act as your business lawyer, you could end up paying more than if you had hired an actual attorney. AI platforms like ChatGPT, Gemini, and Claude are not legal experts, and they are not licensed to practice law.
Take these two cases of AI legal mistakes, for example. In South Korea, Changhan Kim, the CEO of the gaming company Krafton, attempted to avoid a $250 million contract by relying on a ChatGPT-generated strategy rather than on his legal team. The court rejected his AI approach and reinstated the contract, exposing the company to significant financial and legal consequences.
Likewise, in a recent lawsuit filed by Nippon Life Insurance Company of America in Illinois, OpenAI was accused of acting as an unlicensed lawyer after ChatGPT provided legal advice.
As the founder of a Century City law firm at the intersection of technology, AI, entertainment, and media, my team and I are seeing this play out in real time, almost daily.
Law is judgment, not information.
Practicing law isn’t like playing chess or like the Chinese strategy game Go, where a computer can win by learning the rules. Legal practice requires judgment and accountability. It involves human insight.
Legal analysis requires understanding how facts evolve, how opposing counsel will respond, how a judge will likely evaluate a position, and how a decision will impact broader business objectives. In short, practicing law involves understanding the real world.
AI counsel is often misguided.
The problem with AI-generated legal analysis is that it is often overconfident, imprecise, and materially incomplete. The output is longer than necessary, poorly tailored to the facts, and riddled with errors that are not immediately obvious.
Practitioners see these consistent failure points:
- Misstated legal standards
- Wrong jurisdiction applied
- Overly broad conclusions
- Strategies that collapse under scrutiny
- Confident answers that ignore real-world consequences
These issues are not minor grammatical errors. They are outcome-determinative mistakes with legal consequences for your business.
AI legal counsel increases costs.
What your business expects to save on legal fees often ends up costing you more. What looks like efficiency turns into rework.
My team and I now spend additional time unwinding and correcting AI-generated strategies from clients before beginning actual legal work. That increases costs, delays decisions, and introduces avoidable exposure. Across the profession, experienced attorneys are reporting the same issue from their business clients.
Use AI as a tool, not as your lawyer.
So, should you avoid using AI for legal work entirely? Of course not. Attorneys use AI all the time as a tool to accelerate research, assist with drafting, and improve efficiency.
AI is a tool, not legal counsel. Business clients who benefit most from AI use it to ask better legal questions, which they can then ask their counsel. AI platforms can also organize your information, summarize data, and surface issues quickly. However, they are not a substitute for legal judgment.
In high-stakes matters, the difference between a surface-level answer and real expertise is measured in what you pay, both legally and financially.
This article was originally published by Inc. on April 11, 2026.