
Issues with Automated Guidance

Posted on July 6, 2021

Following my post from last month, I write again about academic research related to the IRS’s technology plans, this time with an additional focus on some of the potential pitfalls and concerns. The government provides resources, information, and assistance to help the public understand and apply the law. Technology has allowed the government to automate its guidance-giving function through online tools, including artificial intelligence (“AI”)-powered virtual assistants, or chatbots.

Articles written by Prof. Joshua Blank and Prof. Leigh Osofsky have distinguished them as experts in this area. This post focuses primarily on their 2020 article, Automated Legal Guidance (available here), but Legal Calculators and the Tax System and Simplexity: Plain Language and the Tax Law are also relevant and thought-provoking.

Luckily for us, their research and insight will have a real impact in this area because they have recently been selected by the Administrative Conference of the United States (“ACUS”) to build upon their existing work and study the use of automated legal guidance by federal agencies. ACUS is an independent federal agency that gathers experts from the public and private sectors to recommend improvements to administrative process and procedure.

Profs. Blank and Osofsky have found that automation brings risks of over-simplification and other issues. In Automated Legal Guidance, they argue that taxpayers may rely more heavily on automated guidance since it is seemingly tailored to their specific circumstances, unlike other forms of IRS guidance, such as notices, press releases, and publications.

As seen in the Taxpayer First Act Report to Congress, and discussed in my last post, the IRS plans to utilize technology to a greater degree, especially on the assistance and information side. Most immediately relevant to Profs. Blank and Osofsky’s article is the IRS’s plan to use a chatbot. The IRS has indicated that this chatbot will attempt to answer questions or direct taxpayers to helpful information or to their online account. This technology will potentially build upon the infrastructure that already exists for the IRS’s Interactive Tax Assistant (“ITA”).

Profs. Blank and Osofsky are quite familiar with the ITA. Their article includes a case study on it that highlights several scenarios in which the ITA provided taxpayers with incorrect answers. The scenarios involved common questions about the permissibility of certain types of deductions (for example, the cost of work clothes, medical expenses, and charitable contributions). All of the details of Blank and Osofsky’s ITA study are in their article, but in short they found that the ITA over-simplified the rules, didn’t ask enough questions, and didn’t incorporate all of the necessary legal requirements, which resulted in some answers that were inconsistent with the law.

Artificial intelligence is better able to develop scripts and algorithms when the law is expressed in simple terms. This results in “simplexity,” a term Blank and Osofsky have coined in their research, in which the law is presented “as simpler than it is, leading to less precise advice, and potentially inaccurate legal positions.” Although not specific to automated guidance, the Plain Writing Act of 2010 has also shaped this area, since it requires the federal government to explain the law to members of the public using plain language. Several issues arise from this over-simplification.

In light of these and other concerns related to automated guidance, Profs. Blank and Osofsky make three recommendations to policymakers: 1) the government should prevent automated legal guidance from widening the gap between high- and low-income individuals with respect to access to legal advice; 2) the government should introduce a more robust oversight and review process for automated legal guidance; and 3) the government should allow individuals to avoid certain penalties and sanctions if they detrimentally rely upon automated legal guidance.

The Legal Advice Access Gap

Automated guidance can mislead low- or average-income taxpayers about its reliability and correctness: because it requires users to input personal details and responds with personalized language, it may convince users that the information has directly addressed their inquiries.

By contrast, other forms of guidance (such as IRS publications) are less specific, use second person pronouns, identify exceptions along with the general information, and have other indicators that the guidance is written for every reader in a generalized way.

Profs. Blank and Osofsky argue that the “personalized, non-qualified and immediate” nature of automated guidance makes it more powerful and pervasive, and that low- and average-income taxpayers may not have the resources available to verify its correctness.

In order to ensure that the gap in access to legal advice isn’t widened, Profs. Blank and Osofsky recommend that the government take “time to better understand the audience that may utilize or more heavily rely upon AI and tailor the outcomes for that audience.” This can be accomplished by asking questions that are designed to gauge a user’s level of legal sophistication, or alternatively, “to the extent that simplexity remains a feature of automated legal guidance, it may make sense to offer automated legal guidance specifically for those populations for whom understanding the law is likely to be overly burdensome” by developing guidance on legal topics that affect them most often.

Some of these potential gaps may also be narrowed through IRS partnerships with academic institutions and non-profit organizations, which Prof. Afield suggested in his article (covered in my last post) and which the IRS is already exploring.

At the very least, the IRS has proposed developing automated guidance that can identify times when more information is necessary and direct taxpayers to additional assistance and live support.

An Oversight and Review Process for Automated Guidance

All forms of guidance play a crucial role in informing the public about what the law is and how it might apply in a given situation. Most guidance is not subject to formal promulgation requirements, nor is it generally subject to judicial review.

IRS publications have historically been the primary communication that the agency uses to explain the tax law in clear and simple terms. Publications are relied upon by tax accountants, tax lawyers and developers of commercial tax preparation software, as well as the IRS itself when training assistors and designing its automated taxpayer guidance systems. Changes to publications are extensively reviewed, discussed and documented by groups within the IRS.

Due to the powerful and pervasive nature of automated guidance, discussed above, Profs. Blank and Osofsky recommend that the government implement a robust oversight and review process for it: “[h]owever automated legal guidance evolves, it is essential to consider what the administrative process around such guidance will be. When agencies offer automated legal guidance, they are inevitably making decisions about what the law is, or at least how it is going to be represented to the public, in a variety of situations. The question is how to ensure that such decisions are infused with legitimating values such as transparency, accountability, and non-arbitrariness.”

Decisions that direct automated guidance are also being made by computer coders in a way that legal officials within the IRS, much less the public, may not be fully equipped to understand.

Profs. Blank and Osofsky find that the use of automated guidance shines a light on the “already endemic problem of ensuring appropriate process around agency statements of the law.” A promising reform may be to subject all automated legal guidance to some form of centralized oversight, review, and public comment “which may not only be a salutary, but also a critical way of assuring that agency guidance is instilled with legitimacy.”

Penalties

Even if automated guidance is subject to a more legitimate review process, it still won’t be error-proof. The law allows taxpayers to potentially avoid penalties if they can show they relied upon expert advice. As it currently exists, the ITA disclaims that its “[a]nswers do not constitute written advice in response to a specific written request of the taxpayer within the meaning of section 6404(f) of the Internal Revenue Code,” a provision that deals with avoiding penalties based on provision of written advice by the IRS.

The inability to rely upon guidance made available to the non-expert public is not specific to automated guidance; taxpayers are also not allowed to rely upon responses to telephone inquiries or answers from customer service centers to avoid penalties.

This creates inequity between taxpayers who pay for private, expert advice and those who do not. To remedy this, Profs. Blank and Osofsky argue that a more nuanced approach to penalties is necessary in an era of legal automation. The law should allow individuals to avoid certain penalties and sanctions if they can prove they reasonably relied upon automated guidance.

They suggest that the Treasury Department could revise the regulations that govern defenses against accuracy-related tax penalties to allow taxpayers to assert a reasonable basis defense based on guidance they received from the ITA, if they disclose this guidance to the IRS when they file their tax returns. Alternatively, the Treasury Department could consider adding “answers provided to ITA” to the list of sources upon which a taxpayer can state they’ve relied in order to claim a tax position and avoid a penalty for disregard of rules and regulations.

Profs. Blank and Osofsky don’t ignore the risk that automated guidance could be manipulated to generate answers after the fact to avoid penalties, but they counter that the reasonable basis penalty defense still requires a showing of reasonableness, and that the IRS and courts would retain the ability to question whether a taxpayer reasonably relied upon a statement from the ITA in good faith.

Further, if the ITA, chatbot, or other forms of automated guidance could easily reproduce a written record of every input and its ultimate answer, taxpayers could produce that record to establish certain tax penalty defenses. The IRS could also potentially use the records to improve the automated guidance by identifying circumstances in which confusing or incorrect information is provided. In the Taxpayer First Act Report to Congress, the IRS stated it is open to learning and adapting based on the data and information it collects.
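
As a purely illustrative sketch of that idea (nothing here reflects an actual IRS system; every name and field is hypothetical), such a written record could be little more than an appendable log of each question asked, the facts supplied, and the answer returned:

```python
# Hypothetical illustration only: a minimal record of one automated-guidance
# exchange, of the kind a taxpayer could later produce with a return.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class GuidanceExchange:
    """One question-and-answer exchange with an automated guidance tool."""
    timestamp: str     # when the exchange occurred (UTC, ISO 8601)
    user_inputs: dict  # facts the taxpayer supplied
    question: str      # the question the tool was asked
    answer: str        # the answer the tool returned


def append_to_record(exchange: GuidanceExchange,
                     path: str = "guidance_record.jsonl") -> None:
    """Append the exchange to a plain-text log file."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(exchange)) + "\n")


if __name__ == "__main__":
    append_to_record(GuidanceExchange(
        timestamp=datetime.now(timezone.utc).isoformat(),
        user_inputs={"expense_type": "work clothes", "employer_required": True},
        question="Can I deduct the cost of my work clothes?",
        answer="Example answer returned by the tool.",
    ))
```

A plain-text trail along these lines is something a taxpayer could attach to a return to support a penalty defense, and something the agency could review for exchanges that produced confusing or incorrect answers.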

For additional and ongoing information on this topic, ACUS has launched a public website for Profs. Blank and Osofsky’s project, which will be updated with content over the next 18 months.
