
The Danger of Substituting Artificial Intelligence for Your Attorney

“Don’t confuse your Google search with my law degree.” This is a popular saying found on attorney coffee mugs these days, and the sentiment does not seem so far-fetched as we live in a world increasingly dependent on automation and artificial intelligence (“AI”).

It’s rare to see people without their phones glued to their hands, and far less often for one-on-one conversations than for information. Phones have become a portal to vast amounts of information. People regularly rely on search engines like Google, virtual assistants like Siri, and even social media outlets for answers, all of which use AI algorithms to determine the results they return. The hype around AI only increased with the release of ChatGPT in late 2022. ChatGPT is praised for its ability to generate sophisticated, human-like responses to the questions posed to it.

But how much trust should one put in the accuracy of these results? Algorithms, which are essentially processes or sets of rules to be followed in calculations or other problem-solving operations (especially by a computer), are the foundation of AI. Behind every AI system is data gathered from human beings. AI platforms are designed to take in data, synthesize it, and produce results based on a predetermined algorithm that continuously gathers new data, favors the majority response, and adjusts its answers according to its ongoing evaluation of that data.

How much weight can be placed on results driven primarily by the most popular response? There have been countless examples throughout history where accepted norms have been challenged based not only on new data but on human judgment and emotion, something that computers and algorithms do not have.

Further, given that AI is ultimately built on a mass collection of human data, AI is naturally subject to the biases of the individuals from whom that data is collected. This can be particularly problematic with regard to the influence of AI on the legal field. If the practice of law were simply a matter of reciting legal definitions, AI such as ChatGPT could certainly serve that purpose. However, legal analysis requires more than a mere scan of what federal, state, and local laws, and case law, say about a given subject. Proper legal analysis is subjective, and the legal question must be analyzed in light of the particular circumstances and other materials such as executed agreements or governing documents.

Despite the growing advancements in AI, we are not at a point where AI can provide accurate legal advice. If you ask ChatGPT for the definition of a legal term contained in a statute, ChatGPT will likely perform very well, correctly providing an answer in line with how the term has been defined in state and federal law. However, if the same AI is asked for a legal opinion applying Florida law to a different legal question, it is likely to give you an inaccurate answer. Although the AI will likely present its answer in an authoritative-sounding manner, any reliance on the conclusion reached would be grossly misplaced, with potentially disastrous consequences for the unwary user.

Obtaining a free and instant response in place of a legal opinion, rather than paying an hourly rate and waiting for the results, may be quite tempting, but obtaining the wrong instant response and relying on it can be far more costly in the long run. As history has shown, new laws and amendments are regularly enacted based on new data and information, yet AI relies only on information that has already been fed into the system to generate a response. An attorney, by contrast, will request additional information, such as written agreements, association governing documents, applicable local ordinances, and upcoming changes in the law being discussed in Tallahassee, along with other relevant facts, prior to formulating such an opinion. Therefore, as you can see, the legal opinion process goes beyond a review of the currently applicable state and federal law and involves a further, complex evaluation of ethical, moral, and societal issues that computers and AI are not able to fully understand or perform.

All board members have a fiduciary obligation to ensure that they seek and obtain sound legal advice regarding legal situations facing an association. Given that AI does not provide legal advice from a licensed attorney, it is unlikely that reliance on a computer-generated legal opinion, as a substitute for the legal analysis and opinion of a licensed attorney, would afford any protection under the business judgment rule.

Therefore, the next time you consider “asking Google” instead of your attorney, consider the risks that you are taking, knowing that Google or ChatGPT is generating the most popular response based on generic information, as opposed to a legal opinion based on your particular circumstances.
