
TIGTA Cites Flaws in IRS Correspondence Audit Review System

SEP. 20, 2013

2013-30-099

Citations: 2013-30-099
Actions Are Needed to Strengthen the National Quality Review System for Correspondence Audits

 

September 20, 2013

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

 

HIGHLIGHTS

 

 

Final Report issued on September 20, 2013

 

 

Highlights of Reference Number: 2013-30-099 to the Internal Revenue Service Commissioner for the Small Business/Self-Employed Division.

IMPACT ON TAXPAYERS

An audit is one of the primary enforcement tools the IRS uses to address noncompliance with the tax laws. Because problems with correspondence audits are not always recognized and reported, the IRS may be missing opportunities to reduce the noncompliance that contributes to the Tax Gap and promote tax system fairness among the tax-paying public.

WHY TIGTA DID THE AUDIT

This audit was initiated to determine the accuracy of the results from the National Quality Review System (NQRS) and how management uses the feedback to enhance the quality of correspondence audits. The review is part of our Fiscal Year 2013 Annual Audit Plan and addresses the major management challenge of Tax Compliance Initiatives.

WHAT TIGTA FOUND

The NQRS is designed to provide IRS managers at all levels with estimates of audit quality from a sample of audits to use in identifying areas in which corrective actions are needed. However, TIGTA identified areas that could be strengthened to increase the accuracy of NQRS review results, enhance the ability of managers to identify and address quality problems with correspondence audits, and ensure that the NQRS sample is selected at random.

The auditing standards and NQRS quality measures need to be better aligned. The auditing standards, including the consideration given to significant issues, contain key requirements not evaluated under the NQRS. This can create inconsistencies in how examiners conduct audits and in how the NQRS evaluates the quality of those audits to identify errors.

For example, TIGTA evaluated a statistical sample of 127 of 2,913 correspondence audits that had been reviewed by the NQRS during an 18-month period and found errors with penalty determinations in 65 of the audits (51 percent) that had not been detected and reported by NQRS quality reviewers.

IRS executives and stakeholders should be provided with a more comprehensive snapshot of audit quality so that needed corrective actions can be timely recognized and taken. Only one overall measure of audit quality is currently reported quarterly by the NQRS to IRS executives and other key stakeholders even though as many as 71 items are reviewed.

Finally, the random selection of audits for NQRS review could not be verified. As such, TIGTA was not able to confirm the statistical validity of the NQRS results.

WHAT TIGTA RECOMMENDED

TIGTA recommended that the IRS ensure that 1) the auditing standards align with the NQRS quality measures, 2) a more complete picture of correspondence audit quality is provided to NQRS customers on a regular basis, and 3) audits are selected randomly for NQRS review.

In their response to the report, IRS management agreed with the first two recommendations and disagreed with the third, indicating that they do not have a cost-effective way to allow the randomness of the NQRS case selection process to be verified. Because the IRS's conclusion was reached after the draft report was issued, the underlying details supporting the conclusion were not evaluated. If the sample selection process cannot be verified, the IRS cannot be assured of the statistical validity of NQRS results.

 

September 20, 2013

 

 

MEMORANDUM FOR COMMISSIONER, SMALL BUSINESS/SELF-EMPLOYED DIVISION

FROM:           Michael E. McKenney
                Acting Deputy Inspector General for Audit

SUBJECT:        Final Audit Report -- Actions Are Needed to Strengthen the
                National Quality Review System for Correspondence Audits
                (Audit # 201130027)

This report presents the results of our review to determine the accuracy of results from the National Quality Review System and how management uses the feedback to enhance the quality of correspondence audits. The review is included in our Fiscal Year 2013 Annual Audit Plan and addresses the major management challenge of Tax Compliance Initiatives.

Management's complete response to the draft report is included as Appendix VII. Copies of this report are also being sent to Internal Revenue Service managers affected by the report.

If you have any questions, please contact me or Nancy Nakamura, Assistant Inspector General for Audit (Compliance and Enforcement Operations).

                          Table of Contents

 Background

 Results of Review

      A Comprehensive Quality Review System Has Been Established to
      Measure the Quality of Correspondence Audits

      Steps Can Be Taken to Strengthen the National Quality Review
      System

           Recommendation 1:

           Recommendations 2 and 3:

 Appendices

      Appendix I    -- Detailed Objectives, Scope, and Methodology
      Appendix II   -- Major Contributors to This Report
      Appendix III  -- Report Distribution List
      Appendix IV   -- Quality Attributes Considered by National
                       Quality Review System Reviewers
      Appendix V    -- Results From the Correspondence Audit Fiscal
                       Year 2013 Business Performance Review
                       (1st Quarter)
      Appendix VI   -- Glossary of Terms
      Appendix VII  -- Management's Response to the Draft Report

                            Abbreviations

      IRS       Internal Revenue Service
      NQRS      National Quality Review System
      SB/SE     Small Business/Self-Employed
      TIGTA     Treasury Inspector General for Tax Administration

 

Background

 

 

The Internal Revenue Service (IRS) estimates that, of the $450 billion in taxes that should have been reported and paid on time but were not, $235 billion is attributable to individuals underreporting their income tax liabilities. An audit is one of the primary enforcement tools the IRS uses to address the noncompliance that contributes to the Tax Gap,1 and the cornerstone of the IRS audit efforts is the correspondence audit program.

In Fiscal Years 2008 through 2012, IRS statistics show that it conducted almost 5.7 million correspondence audits and, in the process, recommended approximately $40.4 billion in additional taxes. This represents about 77 percent of all audits the IRS conducted of individual income tax returns and about 56 percent of the estimated $72.4 billion in recommended additional taxes resulting from those audits. The responsibility for conducting correspondence audits rests largely with the IRS's Small Business/Self-Employed (SB/SE) Division, which handles complex individual tax returns, and its Wage and Investment Division, which handles simple tax returns filed by individuals reporting wages, interest, dividends, and other investment income.

In contrast to the more detailed and lengthy face-to-face audit at an IRS office or in the field at a taxpayer's place of business, the correspondence audit process is less intrusive, more automated, and conducted by examiners who are trained to deal with less complex tax issues. Because of its automated features and less complex tax issues, the correspondence audit process enables the IRS to reach more taxpayers at a lower cost. The IRS currently conducts correspondence audits in approximately 37 program areas.

Regardless of the program, the audit process typically begins with the IRS mailing a computer-generated letter from one of its campuses to a taxpayer. The letter outlines the examination process, identifies one or more items on the tax return being questioned, and requests supporting information to resolve the questionable items. Once the requested information is returned, examiners review it to determine whether it resolves the questions. If the questions can be sufficiently answered by the information provided, the audit is generally closed without any changes to the tax; if not, the taxpayer is sent a letter requesting more information or indicating a recommended change to the tax.

The taxpayer at this point can do one of the following:

  • Agree with the examiner.

  • Provide the examiner with clarifying information.

  • Appeal the decision to the IRS's Office of Appeals.

 

In instances where the taxpayer does not respond to IRS letters, the examiner's recommended tax changes are assessed by default and the taxpayer will generally have to petition the court system to contest the assessment.
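
The flow just described can be summarized schematically. The sketch below is an illustrative simplification only; the actual IRS process involves many more states, letter types, and appeal paths, and the names used here are invented:

```python
# Schematic of the correspondence audit flow described above
# (illustrative only, not an IRS system).
from enum import Enum, auto

class Outcome(Enum):
    NO_CHANGE = auto()            # documentation resolved the questions
    MORE_INFO_REQUESTED = auto()  # follow-up letter sent to the taxpayer
    DEFAULT_ASSESSMENT = auto()   # no response; recommended change assessed

def next_step(taxpayer_responded: bool, issues_resolved: bool) -> Outcome:
    if not taxpayer_responded:
        return Outcome.DEFAULT_ASSESSMENT
    return Outcome.NO_CHANGE if issues_resolved else Outcome.MORE_INFO_REQUESTED
```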

Prior Treasury Inspector General for Tax Administration (TIGTA) audits

In June 2010,2 TIGTA reported that penalties were not considered and assessed in accordance with IRS policies and procedures for 211 (92 percent) of 229 sample audits closed in Fiscal Year 2008.3 As a result, opportunities were missed to promote the preparation and submission of complete and correct information on tax returns, impose an economic cost on those who chose not to voluntarily comply with Federal tax laws, and increase revenue for the Department of the Treasury by an estimated $17.5 million.

In February 2010, TIGTA reported that tests of a statistical sample of 298 correspondence audits involving sole proprietors closed in Fiscal Year 2007 identified that significant issues were not addressed in 129 (43 percent) of the audits.4 As a result, it was estimated that taxpayers may have avoided tax and interest assessments totaling approximately $83 million.5

This review was performed at the IRS National Headquarters in Washington, D.C., the SB/SE Division Headquarters in Lanham, Maryland, and the IRS Campus in Ogden, Utah, during the period September 2011 through April 2013. This performance audit was conducted in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Detailed information on our audit objectives, scope, and methodology is presented in Appendix I. Major contributors to the report are listed in Appendix II.

 

Results of Review

 

 

A Comprehensive Quality Review System Has Been Established to Measure the Quality of Correspondence Audits

To ensure that correspondence audits are conducted in a quality manner, the IRS uses a comprehensive quality review system. The system includes a statistical sampling of correspondence audits. The IRS has also integrated various internal controls into its quality review system as outlined in the Standards for Internal Control in the Federal Government.6 These help the IRS determine the effectiveness and efficiency of its correspondence audits and whether examiners are ensuring that taxpayers are complying with laws and regulations.

Establishment of standards and performance measures

The IRS has established seven auditing quality standards. Each standard has key elements that elaborate on and further define the overall standard. For example, one of the key elements for Adequate Consideration of Significant Issues instructs examiners to consider and/or pursue audits of the prior and/or subsequent year returns when they contain the same issues as in the year examined. Appendix IV contains additional details on the key elements associated with each quality standard. Figure 1 lists each standard and provides a brief overview of it.

    Figure 1: Quality Auditing Standards for Correspondence Audits

 _____________________________________________________________________

 Standard                     Overview
 _____________________________________________________________________

 Adequate Consideration of    Measures whether consideration was given
 Significant Issues           to large, unusual, or questionable items.

 Examination Depth and        Determines whether the issues were pursued
 Conclusions Reached          to the extent needed to determine the
                              substantially correct tax.

 Workpapers Support           Addresses whether the audit case file
 Conclusions                  documents the audit procedures followed
                              and conclusions reached.

 Report Writing Procedures    Assesses the written presentation of
 Followed                     audit findings in terms of content,
                              format, and accuracy.

 Penalties Properly           Rates whether applicable penalties were
 Considered                   considered and applied correctly.

 Timely Actions               Addresses the timeliness of actions taken
                              during the audit.

 Case Administration          Verifies whether administrative
                              procedures were followed such as those
                              related to Power of Attorney privileges.
 _____________________________________________________________________

 Source: Internal Revenue Manual 4.19.13.2.

 

 

Accurate and timely recording of events

The Internal Revenue Manual, quality standards, and examiner training courses emphasize the need for examiners to document their decisions and the actions taken during each audit in the audit case file. For example, the standard for considering penalties requires that examiners document their decisions to assess or not assess all applicable penalties, along with the basis for that decision. Such documentation is an important control activity because it helps ensure that examiners have considered all aspects of an audit and provides the information needed in subsequent reviews that evaluate whether examiners made the correct decisions.

Management reviews at the activity level

First-line managers review the documentation for a sample of audit case files to identify and correct quality problems in conjunction with evaluating the performance of the examiners they supervise. Managers rate examiner actions against established attributes based on the type of audit being reviewed. Attributes are collected into specific groups that follow the progression of a case and are mapped to the critical job elements for examiner performance evaluations. In addition to rating whether the examiner action met the attribute, managers may also enter narrative comments.

Management reviews at the functional level

While first-line manager reviews serve as the initial mechanism for identifying and correcting quality problems, each of the five IRS campus sites that conduct correspondence audits also performs quality reviews of audits as part of the IRS-wide National Quality Review System (NQRS).7 Unlike first-line manager reviews that are generally used to evaluate a specific examiner's performance, NQRS reviews focus on providing IRS managers at all levels with statistically reliable estimates of audit quality by evaluating a small sample of closed audits.

Steps Can Be Taken to Strengthen the National Quality Review System

The NQRS data related to correspondence audits is not accurate. Although our tests of three of the four quality attributes in a statistical sample of correspondence audits identified no significant concerns, tests of the accuracy-related penalty determinations identified a significant number of errors that NQRS quality reviewers had not detected and reported. The errors were caused, in large part, by inconsistencies between the NQRS attributes and the auditing standards.

Our audit tests included a review of statistical samples of 66 discretionary8 correspondence audits and 61 Earned Income Tax Credit correspondence audits (127 total audits) from a population of 2,913 audits reviewed by the NQRS between October 1, 2009, and March 31, 2011. Auditors' case reviews focused on four technical quality attributes that determined whether:

 

(1) Audits properly addressed all large, unusual, and questionable issues.

(2) Certification techniques were used to verify that required documentation was received from taxpayers.

(3) Penalties were properly considered and assessed when warranted.

(4) Tax account information residing on IRS automated data systems was used when appropriate to resolve large, unusual, and questionable issues.

 

Limiting the review to these four attributes enabled the auditors to focus on how well quality reviewers were detecting and reporting areas determined in the past by TIGTA to be problematic. These four quality attributes are critical to addressing underreporting noncompliance.

Except for a few instances, TIGTA did not identify any concerns with how NQRS reviewers rated the quality of the large, unusual, and questionable issues that examiners were assigned to audit. In addition, nearly all the closed audit case files reviewed contained the required tax account information from IRS automated systems that examiners could use to help address the large, unusual, and questionable issues they were assigned to audit. However, we did find a concern with penalty determinations.

Penalty determination errors were undetected

Sixty-five (51 percent) of the 127 total correspondence audits in our two statistical samples contained errors in accuracy-related penalty determinations that NQRS quality reviewers had not detected and reported. As such, the reported NQRS quality measure of 95 percent appears to be significantly overstated.
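
To put the 51 percent figure in perspective, here is a minimal sketch of the estimate and its sampling error, assuming a simple normal approximation at the 90 percent confidence level stated in Appendix I (TIGTA's actual projection methodology may differ, for example by applying finite-population corrections per stratum):

```python
# Observed error rate in the combined TIGTA sample and an approximate
# 90 percent confidence interval (normal approximation; illustrative only).
import math

errors, n = 65, 127
p_hat = errors / n                              # ~0.512 observed error rate
z = 1.645                                       # 90 percent confidence
margin = z * math.sqrt(p_hat * (1 - p_hat) / n)
print(f"{p_hat:.1%} +/- {margin:.1%}")          # ~51.2% +/- 7.3%
```

Even at the low end of that interval, the error rate is far above the roughly 5 percent implied by the reported 95 percent quality measure.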

During audits, examiners are responsible for considering accuracy-related penalties when recommending adjustments to tax liabilities. These penalties, which typically include negligence and substantial understatement penalties in correspondence examinations, are designed to promote the preparation and submission of complete and correct information on tax returns as well as impose an economic cost on taxpayers who choose not to comply with the tax law.

 

The negligence penalty can be assessed when a taxpayer fails to make a reasonable attempt to comply with the tax law, exercise ordinary and reasonable care in preparing his or her return, or keep adequate books and records.

The substantial understatement penalty can be assessed when an understatement exceeds the greater of 10 percent of the tax required to be shown on the tax return or $5,000.

The penalty for both is 20 percent of the underpayment.
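
The thresholds above can be illustrated with a short sketch. The code below is a simplified model of the stated rules only, not the IRS's actual penalty computation; real determinations also involve reasonable-cause exceptions and examiner judgment, and the function names are invented for illustration:

```python
# Simplified model of the accuracy-related penalty rules described above.

def substantial_understatement(understatement: float, correct_tax: float) -> bool:
    """True if the understatement exceeds the greater of 10 percent of the
    tax required to be shown on the return or $5,000."""
    return understatement > max(0.10 * correct_tax, 5_000)

def accuracy_related_penalty(underpayment: float) -> float:
    """The penalty is 20 percent of the underpayment."""
    return 0.20 * underpayment

# Example: a $6,000 understatement against $40,000 of correct tax crosses
# the $5,000 threshold, so a $1,200 penalty would apply.
if substantial_understatement(6_000, 40_000):
    print(accuracy_related_penalty(6_000))  # 1200.0
```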

 

The cases identified as having errors in penalty determinations involved straightforward issues, which indicated that reasonable care was not exercised in preparing the tax return, but the case files did not provide reasons why the penalties should not have been applied. For example, in nine cases the examiners disallowed itemized deductions in excess of $20,000 due to lack of documentation but did not assess the negligence penalty as required for the taxpayers' failure to keep adequate books and records. In four of the nine audits, the taxpayers understated their tax liabilities by more than $4,000, but penalties were not considered.

There are inconsistencies between the auditing standards and the NQRS attributes and job aids used to evaluate correspondence audits

The auditing standards laid out in the IRS's Internal Revenue Manual and the attributes used for NQRS rating are not all aligned. Two inconsistencies we identified involve penalty determinations and multiyear examinations.

Auditing standards require that accuracy-related penalties, as well as all other applicable penalties, be considered in all audits and assessed when warranted. The standards also specify that penalties are to be accurately computed if they are recommended for assessment.

However, the NQRS job aids call the penalty quality measure "Penalty Computation" and instruct quality reviewers to rate the measure any time a penalty is calculated or determined by the examiner. Four of the seven quality reviewers we interviewed advised us that their interpretation of this NQRS quality measure meant limiting the review to checking if penalties were accurately computed, not whether a penalty should have been considered. Three quality reviewers' responses were in line with audit standards because they evaluated if applicable penalties were considered and computed accurately when recommended for assessment.

There is also inconsistency and confusion over when, or if, the scope of single-year audits should be expanded to include the prior and/or subsequent year returns. As of January 1, 2013, the auditing standards clearly indicate that the scope of a single-year audit is to be expanded to the prior and/or subsequent year returns when they contain the same or similar large, unusual, or questionable issues as the year examined. In contrast, the NQRS quality attribute that deals with audit scopes and large, unusual, and questionable issues does not discuss considering and pursuing these issues in prior/subsequent year returns. However, it does instruct quality reviewers to assess whether all related tax years were properly considered in conjunction with addressing all large, unusual, and questionable issues. This slightly different wording led several quality reviewers to review cases differently from what the auditing standards require.

During interviews with NQRS quality reviewers in December 2012, TIGTA auditors received a variety of answers when asked how the reviewers rate the attribute "Are all related tax years considered?" One reviewer was unaware of the requirement or that any other quality attribute required examiners to evaluate related tax returns. Some reviewers stated that looking at related tax returns depends on the type of issues that were identified for audit. Other reviewers stated that prior and/or subsequent year tax returns are never considered in the context of single-year audits.

IRS officials stated that correspondence examiners, unlike examiners who conduct face-to-face audits, are not required to perform Required Filing Checks of other tax years during correspondence audits. Instead, the IRS relies on its audit sources and return selection process for correspondence audits to determine if a prior and/or subsequent year return should be audited. Required Filing Checks are used during face-to-face audits, in part, to determine whether the same pattern of noncompliance identified on the audited tax return is present on the prior and/or subsequent year tax returns and if those tax returns also warrant auditing. Therefore, the NQRS attribute that asks if related tax years were considered is not valid for correspondence audits.

If the NQRS attributes do not measure quality according to the auditing requirements, the NQRS results will be incorrect and could be misleading. The NQRS attributes need to be adjusted to align with the actual requirements for correspondence examinations.

A more complete picture of audit quality could be provided to NQRS customers

The ability of IRS executive managers and other NQRS customers to identify and correct quality concerns with correspondence audits could be facilitated if the NQRS reporting mechanisms provided not just information on overall audit quality but also specific information on the rating of the various quality attributes evaluated by NQRS reviewers during their case reviews. Considering the potential effect that poor audit quality can have on addressing underreporting noncompliance, such information could be very useful for isolating problem areas so that corrective actions can be taken. It would also help ensure that quality problems in key areas are not masked by reporting one overall quality score.

The various IRS functional areas summarize high-level performance information on a quarterly basis through the IRS's Business Performance Review reporting process. The process is designed to provide the opportunity for IRS executive managers to share accomplishments and compliance concerns with the IRS Commissioner, Deputy Commissioner, and IRS Oversight Board. However, because the Business Performance Review reports are intended to present high-level summary information, just one overall quality score is typically included in the quarterly review even though as many as 71 items are reviewed.9

Although not specifically required by the Business Performance Review reporting process, quality reviewers in the Large Business and International Division supplement the reports each quarter by providing executive managers with a detailed analysis of the attributes within each auditing standard for which improvements can be made. For example, the analysis for the first quarter of Fiscal Year 2013 showed that although Large Business and International Division examiners achieved an overall quality rating of 90 percent, at least two attributes were rated below 60 percent. When shared with examiners and their managers, such analyses can serve as a means to communicate and reinforce the importance of adhering to IRS audit quality guidelines and directives while focusing attention on areas of the audit process that need improvement. This practice should also be considered for SB/SE Division executives.

The random selection of audits for NQRS review cannot be verified

The NQRS is designed to provide statistically reliable estimates of audit quality nationwide by evaluating a relatively small sample of audits. The results, in turn, provide IRS management with data that can be used to identify training and educational needs and improve audit quality. To help ensure that the sample audits reviewed by the NQRS provide measures of audit quality that are representative of all correspondence audits and can be relied upon to identify areas that need improvement, IRS statisticians design and provide sampling plans for each campus. Every three months, the statisticians reassess the sampling plans and provide the quality reviewers in each campus with a memorandum that specifies the number of closed audits that are to be randomly selected at specific intervals for review.
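
The interval-based selection the sampling plans call for is a systematic (every "Nth" case) sample. The sketch below illustrates the general technique with invented numbers; it is not the IRS's actual procedure, which operates on physical case files:

```python
# Systematic sampling sketch: a random start followed by a fixed skip
# interval, applied to one day's closed audits (illustrative values).
import random

def select_for_review(closed_audits: list, required_reviews: int) -> list:
    interval = max(len(closed_audits) // required_reviews, 1)  # skip interval
    start = random.randrange(interval)      # random start keeps the sample
    return closed_audits[start::interval]   # a probability sample

day_closures = [f"case-{i:04d}" for i in range(500)]
sample = select_for_review(day_closures, 25)   # interval of 20 -> 25 cases
```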

Audit tests of the SB/SE Division campuses' quality review data showed only minor differences between the number of reviews IRS statisticians designated for the SB/SE Division sampling plans and the number of reviews completed by quality reviewers at all SB/SE Division campuses. Figure 2 shows the results of our analysis comparing the number of NQRS quality reviews planned to the actual number of quality reviews completed from October 2009 through March 2011.

         Figure 2: Comparison of the Number of Quality Reviews Planned
            to the Actual Number Reviewed From October 2009 Through
                                March 2011

 _____________________________________________________________________

                          Reviews Required                  Difference
                          by NQRS Sampling  Completed NQRS  (Completed
 Fiscal Year Quarters     Plan              Reviews         -- Required)
 _____________________________________________________________________

        Earned Income Tax Credit Audits and Discretionary Audits

 October-December 2009           502              528             26
 January-March 2010              521              483            (38)
 April-June 2010                 498              574             76
 July-September 2010             490              450            (40)
 October-December 2010           506              578             72
 January-March 2011              490              495              5

 Total Audits                  3,007            3,108            101
 _____________________________________________________________________

 Source: Our analysis of IRS data.

 

 

We also consulted with our statistician, who verified that the sample size was sufficient to provide statistically reliable results. Additionally, the differences between the number of reviews required by the NQRS sampling plan and the number of reviews actually completed should have had no material impact on the estimates of audit quality.

Although the requisite number of audits called for in the sampling plans was reviewed, we were not able to verify or replicate the sampling process to determine if the audits were randomly selected for review at the specified intervals because such documentation was not maintained. Therefore, we were not able to confirm the statistical validity of the NQRS results.

According to published statistical sampling procedures from both within and outside of the Federal Government, random selection is critical to ensure that samples are representative of the population from which they were selected and to eliminate personal bias or subjective considerations from the selection process. Moreover, the Office of Management and Budget guidance on statistical surveys specifies that sample documentation should include information necessary to allow a third party to replicate and evaluate statistical sampling results.
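
As a hedged illustration of what replication-ready documentation could look like, the sketch below records the seed and the selected positions alongside the sample, so rerunning it with the same inputs reproduces the same selection. This is one possible design for a list-based process, not a description of any existing IRS system:

```python
# Reproducible sample selection: the seed and selected positions are
# logged so a third party can replicate and evaluate the draw.
import json
import random

def documented_sample(population_ids: list, n: int, seed: int) -> list:
    rng = random.Random(seed)                  # seeded generator = replayable
    picks = sorted(rng.sample(range(len(population_ids)), n))
    record = {"seed": seed,
              "population_size": len(population_ids),
              "selected_positions": picks}
    with open("sample_log.json", "w") as f:    # retained as audit evidence
        json.dump(record, f)
    return [population_ids[i] for i in picks]
```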

Recommendations

The Commissioner, SB/SE Division, should ensure that:

Recommendation 1: The auditing standards are better aligned with the NQRS attributes and quality measures.

 

Management's Response: IRS management agreed with this recommendation and will determine if discrepancies exist between the auditing standards and the quality attribute coding and adjust the process as needed.

Office of Audit Comment: As discussed in the report, we identified two discrepancies between IRS auditing standards and the attributes used by NQRS for evaluating correspondence audits. These discrepancies involve the consideration of penalties and the requirement for examiners to perform filing checks on other tax years. IRS management should ensure that, at a minimum, these two attributes are corrected in the NQRS.

 

Recommendation 2: A more complete picture of correspondence audit quality is provided to NQRS customers on a regular basis.

 

Management's Response: IRS management agreed with this recommendation and will determine if providing their executives and program managers additional quality data with greater frequency would be helpful and feasible, and make any appropriate changes.

Office of Audit Comment: In their response, IRS management stated that they regularly share more specific information about program performance with executives and managers, who use it to monitor, evaluate, and take action to improve the programs. However, IRS management did not provide us with copies of these reports during the audit and did not specify the additional quality data it would consider providing to executives and program managers. As such, we do not know if IRS management's corrective action will include the types of quality errors we identified in this audit.

 

Recommendation 3: The audits selected for NQRS review are randomly selected and documentation is maintained that will allow IRS management and third parties to verify that the sampling plan was properly implemented.

 

Management's Response: IRS management does not agree with this recommendation. The IRS selects cases based upon a statistically valid methodology using a skip interval of every "Nth" case to ensure randomness. The IRS has explored the feasibility of using a reproducible listing process to select cases for review. It does not have the capability to pull a daily automated listing using the current systems. This automation change would take funding and years to implement. The manual gathering of this information would be extremely labor intensive and subject to error; therefore, it would not be feasible to implement the recommendation at this time.

Office of Audit Comment: Because the IRS's conclusion was reached after our audit work was completed and the draft report issued, we did not review the underlying details supporting the IRS's conclusion. As a result, we cannot confirm the limitations and feasibility of verifying the IRS's sample selection process. If documentation of the sample selection process is not maintained and cannot be verified or replicated later, the IRS cannot be assured of the statistical validity of the NQRS results.

FOOTNOTES

 

 

1 See Appendix VI for a glossary of terms.

2 TIGTA, Ref. No. 2010-30-059, Accuracy-Related Penalties Are Seldom Considered Properly During Correspondence Audits (Jun. 2010).

3 The projection was made using a confidence level of 95 percent, precision rate of ± 5 percent, and expected occurrence rate (error rate) of 79 percent.

4 TIGTA, Ref. No. 2010-30-024, Significant Tax Issues Are Often Not Addressed During Correspondence Audits of Sole Proprietors (Feb. 2010).

5 The projection was made using a confidence level of 95 percent, precision rate of ± 5.32 percent, and expected occurrence rate (error rate) of 43.29 percent.

6 Government Accountability Office (formerly known as the General Accounting Office), GAO/AIMD-00-21.3.1, Standards for Internal Control in the Federal Government (Nov. 1999).

7 Appendix IV provides an overview of the quality attributes considered by NQRS reviewers.

8 Discretionary audits focus on a variety of tax issues other than Earned Income Tax Credits and are identified from a number of sources, including the Discriminant Index Function (an automated system for scoring individual tax returns according to their audit potential), studies/research projects, third-party document matching, and Federal and State Government agency referrals.

9 Appendix V contains the correspondence audit portion of the IRS's campuses Business Performance Review report for the first quarter of Fiscal Year 2013.

 

END OF FOOTNOTES

 

 

* * * * *

 

 

Appendix I

 

 

Detailed Objectives, Scope, and Methodology

 

 

The objectives of this review were to determine the accuracy of results from the NQRS and how management uses the feedback to enhance the quality of correspondence audits.

To accomplish these objectives, we:

I. Evaluated the controls and procedures for the NQRS program when considering correspondence audits.

 

A. Reviewed Internal Revenue Manual sections, management directives, quality reviewer training materials, and notices that provide controls and procedures for NQRS correspondence audits.

B. Interviewed officials and conducted a walkthrough of the NQRS process to evaluate the controls and procedures for quality reviewers to follow when reviewing correspondence audits.

 

II. Evaluated the accuracy of the sampling process for the NQRS program to determine whether NQRS quality reviewers are accurately identifying deficiencies for correspondence audits.

 

A. Determined whether the NQRS case sampling process in Fiscal Years 2010 and 2011 (through March 2011) for correspondence audits used an acceptable statistical sampling method.

 

1. Evaluated the sampling process by consulting with an outside statistician.

2. Evaluated NQRS management reports displaying sample sizes and review results in Fiscal Years 2010 and 2011 (through March 2011) for correspondence audits to determine whether any anomalies existed in the sampling process.

 

B. Reviewed a statistically valid sample of Fiscal Years 2010 and 2011 (through March 2011) NQRS cases for correspondence audits to determine the accuracy of actions by quality reviewers.

 

1. Obtained NQRS computer data for correspondence audits in Fiscal Years 2010 and 2011 (through March 2011). We validated the data to ensure that we could reasonably rely on the completeness and accuracy of the data provided by comparing NQRS management report totals to the totals from the data provided, conducting scans of the provided data fields and elements, and judgmentally researching cases on the Integrated Data Retrieval System and/or Audit Information Management System to confirm a correspondence audit took place.

2. Selected two statistically valid attribute samples of NQRS correspondence audits for Earned Income Tax Credit "Business" Programs and Discretionary "Non-Business" Programs closed during Fiscal Years 2010 and 2011 (through March 2011), using a confidence level of 90 percent, precision rate of ± 10 percent, and expected occurrence rate (error rate) of 50 percent. A statistical sample was taken because we wanted to estimate the number of correspondence audits with deficiencies. The total number of NQRS correspondence audits was 3,098 records for Fiscal Year 2010 (2,039 records) and Fiscal Year 2011 through March 2011 (1,059 records). After eliminating invalid Taxpayer Identification Numbers, the total population was 3,091 records, and after eliminating duplicate Taxpayer Identification Numbers, the review population was 2,913 records. We oversampled by ordering 99 Discretionary and 93 Earned Income Tax Credit cases. Our final sample sizes were 61 audits for Earned Income Tax Credit "Business" Programs and 66 audits for Discretionary "Non-Business" Programs. We shared our sampling methodology with an outside statistician, who confirmed its accuracy. (A sketch of this sample-size computation appears after this list.)

3. Obtained and evaluated the sample correspondence audit case files to determine if quality deficiencies existed and if the quality reviewer also identified the deficiencies. We confirmed our exceptions with a designated IRS employee.

4. Estimated the number of correspondence audit deficiencies that should have been identified through the NQRS and compared that with NQRS management reports to determine the potential number of unreliable records in the NQRS.

5. Determined if quality reviewers missed deficiencies that could lead to taxes being avoided by the taxpayer and estimated the potential number of correspondence audits with the amount of taxes that the NQRS did not identify.

6. Determined if quality reviewers missed deficiencies that could lead to taxpayer burden and estimated the potential number of correspondence audits with taxpayer burden that the NQRS did not identify.
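
As referenced in Step II.B.2 above, the sample sizes follow from the stated parameters. A worked sketch, using an illustrative stratum size of roughly half the 2,913-record review population (TIGTA's exact stratum inputs and rounding may differ):

```python
# Attribute sample size at 90 percent confidence, +/-10 percent precision,
# and a 50 percent expected error rate, with a finite-population correction.
import math

def attribute_sample_size(N: int, z: float = 1.645,
                          e: float = 0.10, p: float = 0.50) -> int:
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)     # ~67.7 before correction
    return math.ceil(n0 / (1 + (n0 - 1) / N))  # finite-population correction

print(attribute_sample_size(1_456))  # ~65, near the final sizes of 61 and 66
```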

 

C. Determined whether NQRS quality reviewers were receiving consistent training and feedback.

 

1. Evaluated the extent of NQRS training that quality reviewers received by reviewing the training records of those quality reviewers included in our case reviews.

2. Determined how often the quality reviewers' work is reviewed and feedback is provided by NQRS management.

3. Interviewed NQRS officials to evaluate how review results are being used to improve future NQRS quality review activities.

III. Evaluated how effectively management uses feedback from NQRS results to enhance the quality of correspondence audits.

 

A. Obtained copies of Fiscal Years 2008 through 2012 NQRS quality review reports and other NQRS feedback provided to correspondence audit managers to identify the top 10 reported quality concerns for correspondence audits. We also assessed the corrective actions taken in response to the concerns and whether any follow-up was performed to ensure that the corrective actions were effective.

B. Determined if concerns and corrective actions reported by the NQRS were reported in SB/SE Division business performance reports.

C. Reviewed TIGTA and Government Accountability Office reports to identify weaknesses and recommended corrective actions for correspondence audits and whether they were similar to concerns identified by the NQRS.

 

IV. Evaluated the risk for fraud, waste, and abuse to obtain reasonable assurance that widespread improprieties do not exist by considering actions and/or trends within our sample case review of closed correspondence examinations.

Internal controls methodology

Internal controls relate to management's plans, methods, and procedures used to meet their mission, goals, and objectives. Internal controls include the processes and procedures for planning, organizing, directing, and controlling program operations. They include the systems for measuring, reporting, and monitoring program performance. We determined the following internal controls were relevant to our audit objectives: IRS policies, procedures, and practices for determining whether examiners are meeting certain NQRS attributes during correspondence audits. We evaluated these controls by reviewing source materials, interviewing management, and reviewing a sample of 127 closed correspondence audits that had been previously reviewed by the NQRS.

 

* * * * *

 

 

Appendix II

 

 

Major Contributors to This Report

 

 

Nancy A. Nakamura, Assistant Inspector General for Audit (Compliance and Enforcement Operations)

Augusta R. Cook, Acting Assistant Inspector General for Audit (Compliance and Enforcement Operations)

Carl Aley, Acting Assistant Inspector General for Audit (Compliance and Enforcement Operations)

Frank Dunleavy, Director

Michelle Philpott, Acting Director

Robert Jenness, Audit Manager

Debra Mason, Lead Auditor

Donna Saranchak, Senior Auditor

William Tran, Senior Auditor

Ali Vaezazizi, Auditor

 

* * * * *

 

 

Appendix III

 

 

Report Distribution List

 

 

Acting Commissioner

Office of the Commissioner -- Attn: Chief of Staff C

Office of the Deputy Commissioner for Services and Enforcement SE

Deputy Commissioner, SB/SE Division SE:S

Director, Campus Compliance Services, SB/SE Division SE:S:CCS

Director, Communications, Liaison, and Disclosure, SB/SE Division SE:S:CSO

Director, Examination, SB/SE Division SE:S:E

Director, Campus Reporting Compliance, SB/SE Division SE:S:CCS:CRC

Director, Exam Planning and Delivery, SB/SE Division SE:S:E:EPD

Director, Exam Policy, SB/SE Division SE:S:E:EP

Chief Counsel CC

National Taxpayer Advocate TA

Director, Office of Legislative Affairs CL:LA

Director, Office of Program Evaluation and Risk Analysis RAS:O

Office of Internal Control OS:CFO:CPIC:IC

Audit Liaison: Commissioner, SB/SE Division SE:S

 

* * * * *

 

 

Appendix IV

 

 

Quality Attributes Considered by National Quality Review System Reviewers

 

 

For NQRS quality reporting purposes, attributes are grouped into the following categories:

 

 _____________________________________________________________________

 Attribute Category: Customer Accuracy

 Category Definition: Giving the correct answer with the correct
 resolution.

 Sample Attribute Used by NQRS Quality Reviewers: Attribute 715,
 Correct/Complete Response/Resolution: Used to identify if the employee
 provided the taxpayer with the correct response or resolution to their
 case or issue and, if appropriate, took the necessary case actions or
 case disposition to provide response or resolution.
 _____________________________________________________________________

 Attribute Category: Regulatory Accuracy

 Category Definition: Adhering to statutory/regulatory requirements
 when making determinations on taxpayer accounts/issues.

 Sample Attribute Used by NQRS Quality Reviewers: Attribute 502,
 Penalty Computation: Used to identify if the employee correctly
 determined/computed the proposed or actual assessment(s) and/or
 abatement(s) of penalty as required.
 _____________________________________________________________________

 Attribute Category: Procedural Accuracy

 Category Definition: Adhering to nonstatutory/nonregulatory internal
 process requirements when making determinations on taxpayer
 accounts/issues.

 Sample Attributes Used by NQRS Quality Reviewers: Attribute 100,
 Complete Research-Account Related Systems: Used to identify if the
 employee properly researched and interpreted account-related systems.
 Attribute 708, Addressed Full Scope of IRS Issues: Used to identify if
 the employee addressed all applicable open IRS issues when considering
 the full scope of the call/case.
 _____________________________________________________________________

 Attribute Category: Professionalism

 Category Definition: Promoting a positive image of the IRS by using
 effective communication.

 Sample Attribute Used by NQRS Quality Reviewers: Attribute 801,
 Clear/Professional Written Communication: Used to identify if all
 correspondence/documentation is professional. This includes the use of
 clear and appropriate language with no jargon to ensure that written
 communication is complete.
 _____________________________________________________________________

 Attribute Category: Timeliness

 Category Definition: Resolving an issue in the most efficient manner
 through proper workload management and time utilization techniques.

 Sample Attribute Used by NQRS Quality Reviewers: Attribute 904,
 Appropriate Timely Actions: Used to determine if appropriate timely
 actions were taken to resolve the case or issue.
 _____________________________________________________________________

 Source: Internal Revenue Manual.

 

* * * * *

 

 

Appendix V

 

 

Results From the Correspondence Audit Fiscal Year 2013 Business
Performance Review (1st Quarter)

 ______________________________________________________________________________

 Campus Reporting Compliance --   FY 2010   FY 2011      June      June        %   FY 2012    % Plan    FY 2012
 Correspondence Exam                                  FY 2011   FY 2012   Change   Plan to   Accomp.  Full Year
                                                                                      Date                 Plan
 ______________________________________________________________________________

 EITC

 STAFFING
 FTE                               394.08    412.94    308.20    324.80     5.4%    322.05    100.9%     412.25

 CLOSURES/PRODUCTIVITY
 EITC Closures                    149,315   145,865   109,602   107,335    -2.1%   104,494    102.7%    148,855
 IMF > $200K                           11        10         9        12    33.3%
 IMF > $1Mil                            3         2         2         4     100%
 Closures per FTE                      379       353       356       330    -7.1%       324    101.9%       361

 TIMELINESS
 Cycle Time                            199       200       204       207     1.7%       193    107.3%       193

 CUSTOMER SATISFACTION
 % Satisfied                       47.00%    47.00%    47.00%    47.00%    0.00%    47.00%   100.00%      47.0%

 QUALITY
 Paper Accuracy                    95.86%    95.63%    96.43%    93.67%   -2.86%     95.0%    98.60%      95.0%

 DISCRETIONARY

 STAFFING
 FTE (Gross)                     1,813.74  1,758.09  1,296.59  1,230.57      -5%  1,234.17     99.7%   1,659.16
 Field Support/Non-Compliance      413.53    361.85    272.79    261.87      -4%    265.86     98.5%     356.94
 True Corr Exam FTE              1,400.21  1,396.24  1,023.81     968.7      -5%    968.31    100.0%   1,302.22
 TEFRA FTE                         290.00    299.27    217.45    241.87      11%    237.70    101.8%     317.62
 Discretionary Non-TEFRA FTE     1,110.20  1,096.98    806.35    726.84     -10%    730.61     99.5%     984.60

 CLOSURES/PRODUCTIVITY
 Non-EITC Closures                412,063   405,542   324,046   321,308      -1%   306,644    104.8%    365,247
 TEFRA                             31,870    22,957    17,599    32,274    83.4%    31,057    103.9%     18,202
 Discretionary Non-TEFRA          380,193   382,585   306,447   289,034    -5.7%   275,587    104.9%    347,045
 Total IMF                        377,743   379,272   301,836   296,620      -2%   288,549              341,056
 IMF > $200K                       65,760    66,938    52,557    63,354      21%    60,741    104.3%     64,990
 IMF > $1Mil                       13,555    12,927     9,742    15,949      64%    15,420    103.4%     12,442
 Total BMF                         34,320    26,270    22,210    24,688      11%    18,095    136.4%     24,191

 TIMELINESS
 Cycle Time                           170       167       170       177       4%       177    100.2%       177
 ______________________________________________________________________________

 Source: Correspondence Audit Fiscal Year 2013 Business Performance Review
 (1st Quarter).

 Note: BMF = Business Master File, EITC = Earned Income Tax Credit, FTE =
 Full-Time Equivalent, FY = Fiscal Year, IMF = Individual Master File, TEFRA =
 Tax Equity and Fiscal Responsibility Act.

 

* * * * *

 

 

Appendix VI

 

 

Glossary of Terms

 

 

Activity Codes -- A code that identifies the type and condition of returns selected for audit.

Audit Information Management System -- A computer system used to control returns, input assessments/adjustments to the Integrated Data Retrieval System, and provide management reports.

Campus -- The data processing arm of the IRS. The campuses process paper and electronic submissions, correct errors, and forward data to the Computing Centers for analysis and posting to taxpayer accounts.

First-Line Manager -- A group manager in the Examination function responsible for supervision of IRS examiners.

Fiscal Year -- A 12-consecutive-month period ending on the last day of any month. The Federal Government's fiscal year begins on October 1 and ends on September 30.

Individual Master File -- The IRS database that maintains transactions or records of individual tax accounts.

Integrated Data Retrieval System -- The IRS computer system capable of retrieving or updating stored information; it works in conjunction with a taxpayer's account records.

National Quality Review System -- Allows national reviewers to evaluate closed audit files to determine whether examiners complied with quality attributes established by the IRS.

Skip Interval -- The number of elements in the population divided by the number of sampling units in the sample.

Tax Gap -- The difference between taxes that are legally owed and taxes that are paid on time.

 

* * * * *

 

 

Appendix VII

 

 

Management's Response to the Draft Report

 

 

August 20, 2013

 

 

MEMORANDUM FOR MICHAEL E. MCKENNEY
               ACTING DEPUTY INSPECTOR GENERAL FOR AUDIT

FROM:          Faris R. Fink
               Commissioner, Small Business/Self-Employed Division

SUBJECT:       Draft Audit Report -- Actions Are Needed to Strengthen the
               National Quality Review System for Correspondence Audits
               (Audit #201130027)

Thank you for the opportunity to review your draft report titled "Actions Are Needed to Strengthen the National Quality Review System for Correspondence Audits." The audit is one of the IRS's primary enforcement tools to address noncompliance with the tax law. Over the last five years, our statistics show we conducted almost 5.7 million correspondence audits. The quality of our work is important, and we thank you for acknowledging that we have established a comprehensive system to measure the quality of correspondence audits.

The auditing standards outline the criteria determined to produce a quality examination and are guidelines to assist examiners in the completion of their cases. They are used in conjunction with applicable program procedures for evaluating case quality. In the campus environment, due in part to the type of examination conducted and in part to the nature of our processes, some auditing standards will be less applicable. Our program procedures and quality measures define how they align with our work. Campus examinations are generally narrowly focused. They are limited to a particular schedule or issue, or focus on stopping erroneous refunds. We will determine if discrepancies exist between the auditing standards and the quality attribute coding and adjust the process or guidance as needed.

Our campus organizations manage a wide variety of programs. Because of the number and variety, we limit our highest level reporting, the Business Performance Review (BPR), to an overall quality score. However, we measure and evaluate each of these programs on a number of attributes.

We regularly share more specific information about the program performance with executives and all levels of management who use it to monitor, evaluate and take action to improve the programs. We will determine if providing additional quality data with more frequency will be helpful.

We do not agree with your third recommendation. Audit cases selected for the NQRS are determined based upon a statistically valid methodology designed by Statistics of Income (SOI), as described in the sample plan. We ensure the randomness of a sample by selecting the "Nth" case using a skip interval based on the number of required reviews and the population of work. Upon closure, paper cases are placed in a designated area, ready for potential review. The clerk pulls the cases by applying the skip interval as determined in the sample plan. Once the sample has been selected, the unused cases are removed and the area is stocked with the next day's cases. While the process is valid, it is not reproducible because the cases are removed daily. We have reviewed our current system to determine alternatives for creating a reproducible process. Given that the volume of open cases is over 300,000, the resources necessary to implement a reproducible manual process would not be cost effective. In the alternative, a systemic report could be created, but this would take funding and years to implement. Based upon labor and cost restrictions, we are unable to implement a systemic method as per your recommendation.

Attached is a detailed response outlining our corrective actions.

If you have any questions, please contact me, or a member of your staff may contact Denice Vaughan, Director, Campus Compliance Services, at (404) 338-9116.

Attachment

 

* * * * *

 

 

Attachment

 

 

RECOMMENDATION 1:

Ensure that the auditing standards are better aligned with the NQRS attributes and quality measures.

CORRECTIVE ACTION:

We will determine if discrepancies exist between the auditing standards and the quality attribute coding and adjust the process as needed.

IMPLEMENTATION DATE:

October 1, 2014

RESPONSIBLE OFFICIAL(S):

Director, Campus Reporting Compliance, Small Business/Self-Employed

CORRECTIVE ACTION MONITORING PLAN:

IRS will monitor this corrective action as part of its internal management system of controls.

RECOMMENDATION 2:

Ensure that a more complete picture of correspondence audit quality is provided to NQRS customers on a regular basis.

CORRECTIVE ACTION:

We will determine if providing our executives and program managers additional quality data with greater frequency would be helpful and feasible, and make any appropriate changes.

IMPLEMENTATION DATE:

October 1, 2013

RESPONSIBLE OFFICIAL(S):

Director, Campus Reporting Compliance, Small Business/Self-Employed

CORRECTIVE ACTION MONITORING PLAN:

The IRS will monitor this corrective action as part of its internal management system of controls.

RECOMMENDATION 3:

Ensure that the audits selected for NQRS review are randomly selected and documentation is maintained that will allow IRS management and third parties to verify that the sampling plan was properly implemented.

CORRECTIVE ACTION TAKEN:

We do not agree with this recommendation. We have explored the feasibility of using a reproducible listing process to select cases for review. We do not have the capability to pull a daily automated listing using our current systems. This automation change would take funding and years to implement. The manual gathering of this information would be extremely labor intensive and subject to error; therefore, it would not be feasible to implement your recommendation at this time.

IMPLEMENTATION DATE:

N/A

RESPONSIBLE OFFICIAL:

N/A

CORRECTIVE ACTION MONITORING PLAN:

N/A
