
Don’t Ever Rely on AI to Handle Your Personal Injury Case

Artificial intelligence tools like ChatGPT have changed how people search for information online. Media stories claim that AI will eventually put all of us out of work because the technology can perform many of the tasks that professionals do daily. Increasingly, people are even using ChatGPT and other AI programs to handle their own personal injury lawsuits.

Unfortunately, as we explain below, AI is not a substitute for hiring an experienced personal injury lawyer. You will benefit from confidential, individualized advice about your legal rights.

What’s Wrong with AI?

Although AI can handle certain simple tasks, the technology is not sophisticated enough to perform legal work. Some of the problems with AI:

  • Wrong legal information. AI programs still give inaccurate information. Indeed, ChatGPT carries a disclaimer warning users that its answers may not be accurate. Anyone relying on AI could submit a claim too late or make some other error with catastrophic legal consequences. We see this constantly when potential clients contact us with summaries that were clearly written by AI programs. These inquiries are riddled with legal inaccuracies and misstatements of the law. Please do not use ChatGPT or any other AI program to summarize your situation for a potential lawyer. Use your own words; they are far more powerful than anything an unreliable program could write.
  • Invented cases. If you file a lawsuit and have to litigate any single issue, you will need to reference cases for Oregon law. ChatGPT and other AI programs are notorious for inventing cases. This is called “hallucination,” and many litigants have been sanctioned for presenting hallucinated cases in support of their legal pleadings.
  • No substitute for legal strategy. Anyone injured in a car or truck accident can potentially file a lawsuit. However, you might be better off negotiating a settlement. Whether to settle, and for how much, are critical questions. A lawyer relies on experience to determine what a fair settlement looks like and which strategy serves you best.
  • Lack of privacy. Typing your questions into ChatGPT or another program, or uploading your medical records, could lead to your information being disclosed. Most generative AI programs require you to consent to their retaining any information you provide. The risk is even greater if you are searching on a public network, such as at a library.
  • No oversight. Artificial intelligence is still largely unregulated. If you get inaccurate information, you might have no recourse. By contrast, if you hire a lawyer who does a bad job, you can complain to the state bar association, which can investigate the lawyer. The legal profession has more accountability.

Suppose you try to negotiate a settlement with an insurer without hiring a lawyer. The insurance company knows you are at a disadvantage. It also knows that AI will not give you fully accurate information, so it can play hardball and make a low offer. The insurer might even drag out the case because it is not afraid you will sue in court.

Get Human, Accurate Legal Advice: Contact Us Today

Rosenbaum Law Group, PC, has provided legal services for decades. We stay abreast of all legal changes and believe in empowering our clients to make sensible choices. There is no reason to rely on ChatGPT when you can ask us any question and receive accurate, timely legal advice. Call our office at 503-288-8000 to speak with a Portland personal injury lawyer today. We never charge a fee unless we win your case, so there is no risk in reaching out.

Sources:

reason.com/volokh/2025/09/12/10k-sanction-for-ai-hallucination-in-appellate-brief/

thomsonreuters.com/en-us/posts/technology/genai-hallucinations/
