
SB/SE Audit Controls Require Improvements, GAO Says

DEC. 16, 2015

GAO-16-103

DATED DEC. 16, 2015
DOCUMENT ATTRIBUTES
  • Institutional Authors
    Government Accountability Office
  • Language
    English
  • Tax Analysts Document Number
    Doc 2016-802
  • Tax Analysts Electronic Citation
    2016 TNT 9-15
Citations: GAO-16-103
IRS RETURN SELECTION: Certain Internal Controls for Audits in the Small Business and Self-Employed Division Should Be Strengthened

 

United States Government Accountability Office

December 2015

_____________________________________________________________________

Why GAO Did This Study

IRS audits small businesses and self-employed individuals to ensure compliance with tax laws. Audits can help improve reporting compliance and reduce the tax gap -- the difference between taxes owed and those voluntarily paid on time -- which is estimated at $385 billion annually after late payments and enforcement actions. Therefore, it is important that IRS make informed decisions about how it selects taxpayers for audit.

GAO was asked to review IRS's processes and controls for selecting SB/SE taxpayers for audit. This report (1) describes these processes and (2) determines how well SB/SE's selection processes and controls support its mission to apply the tax law with integrity and fairness to all.

GAO reviewed IRS criteria, processes, and control procedures for selecting taxpayers for audit; assessed whether IRS control procedures followed Standards for Internal Control in the Federal Government; and reviewed nonprobability samples of over 200 audit files. GAO also conducted eight focus groups with SB/SE staff who review or make audit selection decisions and interviewed IRS officials.

What GAO Recommends

GAO recommends that IRS take seven actions to help ensure that the audit selection program meets its mission, such as establishing and communicating program objectives related to audit selection and improving procedures for documenting and monitoring the selection process. In commenting on a draft of this report, IRS agreed with the recommendations.

View GAO-16-103. For more information, contact James R. McTigue, Jr. at (202) 512-9110 or mctiguej@gao.gov.

_____________________________________________________________________

What GAO Found

The Small Business/Self-Employed (SB/SE) division of the Internal Revenue Service (IRS) uses over 30 methods, called workstreams, to identify and review tax returns that may merit an audit. Returns are initially identified through seven sources, which include referrals; computer programs that run filters, rules, or algorithms to identify potentially noncompliant taxpayers; and related returns identified in the course of another audit.

SB/SE's workstreams follow a general, multiphase process for identifying, reviewing (classifying), and selecting returns for audit. Within this general approach, the selection process varies across workstreams. For example, field audits involve more review steps and manual processes than correspondence audits, which generally focus on a single compliance issue and are identified using automated processes. For fiscal year 2013, IRS reported that SB/SE's primary workstream for field audits identified about 1.6 million returns as potentially most noncompliant. About 77,500 of those returns (5 percent) were selected for audit, a much smaller pool than was initially identified.

SB/SE has control procedures for, among other things, safeguarding data and segregating duties across the overall selection process, but it has not implemented other key internal controls. The lack of strong control procedures increases the risk that the audit program's mission of fair and equitable application of the tax laws will not be achieved. Examples of internal control deficiencies include the following:

Program objectives and the key term "fairness" are not clearly defined. Fairness is specified in SB/SE's mission statement and referenced in IRS's procedures for auditors. However, IRS has not defined fairness or program objectives for audit selection that would support its mission of treating taxpayers fairly. GAO heard different interpretations of fairness from focus group participants. Not having a clear definition of fairness can unintentionally lead to inconsistent treatment of taxpayers and create doubts as to how fairly IRS administers the tax law. Further, the lack of clearly articulated objectives undercuts the effectiveness of SB/SE's efforts to assess risks and measure performance toward achieving those objectives.

Procedures for documenting and monitoring selection decisions are not consistent. SB/SE does not always require selection decisions and rationales to be documented. For example, SB/SE requires that some workstreams document survey decisions (when returns are not assigned for audit), rationale, and approval using a form. Other workstreams, such as its primary workstream for field audits, require a group manager stamp but do not require the rationale to be documented. Also, SB/SE does not always require classification decisions (when returns are assessed for audit potential and compliance issues) to be reviewed. Having procedures to ensure that selection decisions and rationale are consistently documented and reviewed can reduce the potential for error and unfairness.

                                Contents

 Letter

      Background

      SB/SE Uses a Multiphase Process and Many Methods to Identify and
      Review Returns for Potential Audit; Most Returns Are Not Selected

      Some SB/SE Procedures for Selecting Returns for Audit Met
      Internal Control Standards, but Objectives Were Unclear and
      Documentation and Monitoring Procedures Were Inconsistent

      Conclusions

      Recommendations for Executive Action

      Agency Comments and Our Evaluation

 Appendix I      Objectives, Scope, and Methodology

 Appendix II     Summary of Small Business/Self-Employed (SB/SE) Audit
                 Results, Fiscal Year 2014

 Appendix III    Description of Small Business/Self-Employed (SB/SE)
                 Selection Methods or Workstreams

 Appendix IV     Small Business/Self-Employed (SB/SE) Selection Methods
                 by Broad Identification Source

 Appendix V      Examples of Similarities and Variations across
                 Selection Methods

 Appendix VI     Small Business/Self-Employed (SB/SE) Field Audit
                 Sources and Audit Information Management System (AIMS)
                 Source Codes

 Appendix VII    Comments from the Internal Revenue Service

 Appendix VIII   GAO Contact and Staff Acknowledgments

 Tables

      Table 1: Number of Returns Identified, Reviewed, and Selected for
               Field Audit for DIF and NRP Workstreams, Fiscal Year 2013

      Table 2: Description of Samples to Review SB/SE Procedures for
               Selecting Returns to Audit

      Table 3: SB/SE Audits by Number, Amount of Recommended Additional
               Tax, Days to Conduct the Audit, and Direct Audit Hours
               Compared to All Internal Revenue Service (IRS) Audits,
               Fiscal Year 2014 (dollars in millions)

      Table 4: SB/SE Field and Campus Audits by Number, Amount of
               Recommended Additional Tax, Days to Conduct the Audit,
               and Direct Audit Hours Compared to All SB/SE Audits,
               Fiscal Year 2014 (dollars in millions)

      Table 5: SB/SE Selection Methods or Workstreams by Broad
               Identification Source

      Table 6: SB/SE Field Audits by Number and Internal Revenue
               Service (IRS) AIMS Source Code, Fiscal Year 2014

 Figures

      Figure 1: Organizational Chart of IRS Operating Divisions and
                SB/SE Audit Offices

      Figure 2: SB/SE's General Process for Selecting Returns for Audit

      Figure 3: Percent of SB/SE Closed Field Audits by Workstream or
                Identification Group, Fiscal Year 2014

      Figure 4: Example of Selection Processes for Internal Revenue
                Service (IRS) SB/SE Field Audits

      Figure 5: Example of Selection Processes for IRS SB/SE Campus
                Audits

                             Abbreviations

 AIMS      Audit Information Management System
 ASFR      Automated Substitute for Return
 AUR       Automated Underreporter
 CIP       Compliance Initiative Project
 CRC       Campus Reporting Compliance
 DDb       Dependent Database
 DEBR      Discretionary Exam Business Rules
 DIF       Discriminant Function
 E&G       Estate and Gift
 EITC      Earned Income Tax Credit
 ERM       Enterprise Risk Management
 GLD       Government Liaison and Disclosure
 IRM       Internal Revenue Manual
 IRS       Internal Revenue Service
 NRP       National Research Program
 SARP      State Audit Referral Program
 SB/SE     Small Business/Self-Employed Division
 SFR       Substitute for Return
 TEFRA     Tax Equity and Fiscal Responsibility Act of 1982
 TIGTA     Treasury Inspector General for Tax Administration
 W&I       Wage and Investment

_____________________________________________________________________

This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.
_____________________________________________________________________

December 16, 2015

The Honorable Kevin Brady
Chairman
Committee on Ways and Means
House of Representatives

The Honorable Peter Roskam
Chairman
Subcommittee on Oversight
Committee on Ways and Means
House of Representatives

The Internal Revenue Service's (IRS) Small Business/Self-Employed Division (SB/SE) -- one of four IRS operating divisions -- oversees about 57 million taxpayers who file income, employment, excise, estate, or gift returns.1 SB/SE's enforcement responsibilities include auditing individual and business tax returns to detect misreporting.2 Audits can help improve reporting compliance and reduce the tax gap -- the difference between taxes owed and those paid voluntarily and on time.3 As we have previously reported, small businesses are a key contributor to the tax gap.4 Nearly 40 percent, or $179 billion, of the tax gap can be attributed to the underreporting of both business income on individual income tax returns and the self-employment tax that is largely assessed on business income for self-employed taxpayers. An additional 4 percent of the tax gap, or $19 billion, can be attributed to underreporting by small corporations, which IRS defines as having less than $10 million in assets.

Audits provide IRS with an important enforcement tool to identify noncompliance in reporting tax obligations as well as to enhance voluntary reporting compliance. If taxpayers perceive the selection of returns for audit as unfair, their confidence in IRS could be undermined and voluntary compliance could be undercut. The mission of IRS, as well as SB/SE, incorporates these concepts of ensuring compliance and fairly applying the tax law.5

Given the concerns expressed about the fairness of selecting taxpayers for review, you asked us to review SB/SE's processes and controls for selecting returns for audit.6 This report (1) describes SB/SE's processes for selecting returns for audit; and (2) assesses how well SB/SE's processes and controls for selecting returns for audit support its mission, including applying the tax law with integrity and fairness to all.

For the first objective, we reviewed IRS documents to understand the processes and procedures that SB/SE uses to prioritize, identify, review, and select returns for audit, as well as how SB/SE documents selection decisions. For the second objective, we reviewed the procedures identified above that SB/SE uses to help achieve its stated mission of applying "the tax law with integrity and fairness to all." We then assessed whether these procedures adhered to relevant federal standards for internal controls.7 To determine the extent to which SB/SE implemented its selection procedures, we conducted a file review consisting of (1) a nongeneralizable sample of 173 SB/SE audit cases opened between March 2014 and February 2015, (2) a nongeneralizable sample of 30 evaluations of how well SB/SE staff reviewed returns, and (3) a nongeneralizable sample of 30 returns that were selected for audit but later removed, or screened out, from the audit inventory (i.e., surveyed).8 While our samples are not representative of their populations, we selected them to ensure coverage across a broad range of key criteria and dimensions, including the extent to which manual processes are involved. We used the results of the file review, in combination with other sources of information, to assess the internal controls that help safeguard the fairness of the return selection process, not to look specifically for cases of inappropriate selection. We also conducted eight focus groups with selected SB/SE staff who are responsible for reviewing or making return selection decisions. For both objectives, we analyzed data from IRS's Audit Information Management System. Based on our testing of the data and review of documentation and interviews, we determined that these data were reliable for the purposes of this report. Finally, we interviewed SB/SE officials about return selection processes and procedures and discussed any potential deficiencies we identified.

We conducted this performance audit from September 2014 to December 2015 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. More detailed information on our scope and methodology appears in appendix I.

Background

IRS's operating divisions develop annual plans to guide audit decisions in terms of the number of returns to be audited. SB/SE audit plans strive to balance the number of audits in any fiscal year across all types of tax returns (e.g., individual income tax returns) and taxpayers (e.g., individual wage earners, small businesses, corporations) given the available number and location of IRS auditors, and knowledge about types of noncompliance to pursue through audits.

SB/SE conducts audits through field offices located in seven regional areas. These audits generally are conducted by meeting with the taxpayer and/or his or her representatives. The field auditors include revenue agents, who tend to audit the most complex returns, and tax compliance officers, who tend to audit simpler returns. SB/SE also conducts audits through its four campus locations; these audits tend to be the simplest and are generally done by tax examiners through correspondence with taxpayers.9 Figure 1 shows an organizational chart of IRS's operating divisions and SB/SE's audit offices.

 

Figure 1: Organizational Chart of IRS Operating Divisions and SB/SE Audit Offices

Source: GAO analysis of IRS information. | GAO-16-103

In fiscal year 2014, SB/SE closed 823,904 audits, more than half of the nearly 1.4 million audits closed across IRS that year. SB/SE audits resulted in over $12 billion of the $33 billion in total recommended additional taxes across all IRS audits.10 For details on results of SB/SE audits, see appendix II.

In addition to audits, IRS conducts nonaudit compliance checks, which may lead to an audit. These checks include the Math Error, Automated Underreporter (AUR), and Automated Substitute for Return (ASFR) programs. The Math Error program electronically reviews tax returns as they are filed for basic computational errors or missing forms/schedules. Several months after returns have been filed, AUR electronically matches information reported by third parties, such as banks or employers, against the information that taxpayers report on their tax returns. This matching helps identify potentially underreported income or unwarranted deductions or tax credits. ASFR also uses information return data to identify persons who did not file returns; constructs substitute tax returns for certain nonfilers; and assesses tax, interest, and penalties based on those substitute returns. Although these and other compliance checks may identify potentially noncompliant tax returns that are subsequently audited, these programs are not the subject of this report.
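
To make the matching mechanics concrete, the sketch below shows the general shape of an AUR-style comparison: total up what third parties reported for a taxpayer and flag amounts the taxpayer reported short. It is a minimal illustration only; the record layouts, field names, and $100 tolerance are assumptions for this example, not IRS's actual (nonpublic) matching rules.

    from typing import NamedTuple

    class TaxReturn(NamedTuple):
        tin: str                  # taxpayer identification number
        reported_wages: float
        reported_interest: float

    class InfoReturn(NamedTuple):  # e.g., a W-2 or 1099 filed by a third party
        tin: str
        wages: float
        interest: float

    TOLERANCE = 100.0  # assumed dollar tolerance before a discrepancy is flagged

    def aur_style_match(ret, info_returns):
        """Flag income that third parties reported but the taxpayer did not."""
        wages = sum(d.wages for d in info_returns if d.tin == ret.tin)
        interest = sum(d.interest for d in info_returns if d.tin == ret.tin)
        flags = []
        if wages - ret.reported_wages > TOLERANCE:
            flags.append("potentially underreported wages")
        if interest - ret.reported_interest > TOLERANCE:
            flags.append("potentially underreported interest")
        return flags  # a nonempty list could prompt a compliance notice, not an audit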

In March 2014, IRS's Chief Risk Officer, who oversees its agency-wide program to identify and assess risks, completed a high-level, risk-based review of the IRS audit selection process.11 The review focused on the potential for bias based on the judgment of the Risk Officer and not on analysis against objective standards, such as comparing steps in the process to the internal control standards. Even so, the Risk Officer concluded that IRS maintained sound internal controls in its audit programs and that the risk of partiality in IRS's audit selection was very low. The risk of partiality appeared lowest in the automated selection programs. It appeared to be slightly higher for manual selection and referral programs because greater employee judgment was involved.

SB/SE Uses a Multiphase Process and Many Methods to Identify and Review Returns for Potential Audit; Most Returns Are Not Selected

SB/SE Uses a Multiphase Process to Select Tax Returns for Audit

SB/SE selects potentially noncompliant tax returns for audit using a multiphase process intended to enable IRS to narrow the large pool of available returns to those that most merit investment of audit resources. As shown in figure 2, in broad terms, this process generally includes (1) identifying an initial inventory of tax returns that have audit potential (e.g., reporting noncompliance), (2) reviewing that audit potential to reduce the number of returns that merit selection for audit (termed "classification"), (3) selecting returns by assigning them to auditors based on a field manager's review of audit potential given available resources and needs, and (4) auditing selected returns.12

 

Figure 2: SB/SE's General Process for Selecting Returns for Audit

Source: GAO analysis of IRS information. | GAO-16-103
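
Viewed schematically, the general process in figure 2 is a series of successively stricter filters. The sketch below is a minimal rendering of that idea; the three predicate functions are placeholders standing in for the identification, classification, and managerial assignment phases, not actual IRS systems.

    def select_for_audit(filed_returns, has_audit_potential,
                         passes_classification, manager_assigns):
        """Winnow filed returns through the phases shown in figure 2."""
        identified = [r for r in filed_returns if has_audit_potential(r)]  # phase 1
        classified = [r for r in identified if passes_classification(r)]   # phase 2
        selected = [r for r in classified if manager_assigns(r)]           # phase 3
        return selected                                                    # phase 4: audit

Each phase passes a smaller pool to the next, which is why the roughly 1.6 million returns identified by SB/SE's primary field workstream for fiscal year 2013 yielded only about 77,500 selected for audit.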

SB/SE Uses More Than 30 Methods to Identify and Review Returns for Potential Audit

SB/SE uses 33 methods, called workstreams, to identify and review tax returns that may merit an audit.13 These workstreams can be categorized into seven groups based on how the return was initially identified (see appendix IV for a table of workstreams by group).14 We have listed these groups in general order of how much discretion is involved in identifying, reviewing, and selecting returns, starting with those that involve more discretion. This ordering does not correspond to the number of audits conducted. For example, although referrals generally involve more discretion in selecting returns for audit, they do not make up the largest percentage of SB/SE field audits (see figure 3).

  • Referrals. IRS employees and units, as well as external sources, such as other agencies and citizens, can refer potentially noncompliant taxpayers to SB/SE. SB/SE may start an audit if the referral indicates significant potential for noncompliance. Referrals can involve, among others, those promoting shelters created to avoid taxation, whistleblowers, and those not filing required tax returns.15

  • Related pickups. After opening an audit, SB/SE may identify the taxpayer's prior or subsequent year returns or returns of related taxpayers to audit.

  • User-developed criteria. These criteria use filters or rules embedded in computer software to identify returns with specific characteristics, often for projects. These characteristics generally involve a specific tax issue known or suspected to have high noncompliance in a particular geographic area, industry, or population. For example, the criteria may be used for projects that explore or test ways to uncover noncompliance or improve compliance.

  • Computer programs. Computer programs use rules or formulas to identify potential noncompliance across a type of tax return, rather than for a specific tax issue. For example, IRS uses a computer algorithm, the discriminant function (DIF), to determine the probability of noncompliance somewhere on the tax return. When a return receives a high enough score, SB/SE may review the return for audit potential. (An illustrative sketch of this kind of threshold-based scoring follows this list.)

  • Data matching. When information on a tax return -- such as wages, interest, and dividends -- does not match information provided to IRS by states, employers, or other third parties, these discrepancies may prompt SB/SE to review returns for audit potential. An example of a workstream that uses data matching is the payment card income pilot, which uses information from credit card transactions to identify income that may be underreported.16

  • Taxpayer-initiated. When taxpayers contact IRS to request an adjustment to their respective tax returns, tax refunds, or tax credits, or request to have a previous audit reconsidered, SB/SE may initiate an audit after reviewing these requests.

  • Random identification. The National Research Program (NRP) studies tax compliance through audits of a randomly-identified sample of tax returns.17 Specifically, NRP measures voluntary compliance in reporting income, deductions, and credits, among other categories, and generalizes those measures to the population being studied.
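
As promised under the computer-programs item above, the sketch below illustrates threshold-based scoring in the spirit of DIF. The actual DIF formula, weights, and cutoff are confidential, so everything here is an invented stand-in: the idea is simply a weighted score computed from return line items and compared against a cutoff, with the highest-scoring returns sent on for classification.

    def dif_style_score(line_items, weights):
        """Weighted sum of return line items, used as a noncompliance score."""
        return sum(weights.get(name, 0.0) * value
                   for name, value in line_items.items())

    def identify_high_scores(returns, weights, cutoff):
        """Return filings whose score exceeds the cutoff, highest scores first."""
        scored = [(dif_style_score(r["line_items"], weights), r) for r in returns]
        return sorted((s for s in scored if s[0] > cutoff),
                      key=lambda s: s[0], reverse=True)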

 

SB/SE Selection Methods Have Similarities but Also Vary

All of SB/SE's selection methods or workstreams follow the general multiphase selection process to identify and review potentially noncompliant returns before selecting and actually auditing them. Workstreams also share some common characteristics. For example, multiple staff are involved in the various phases so that one person cannot control the entire process. About one-third of the workstreams use some form of automation to identify the returns that should enter the workstream. Most workstreams involve some form of manual review to determine which returns have audit potential. For example, IRS auditors review (i.e., classify) tax returns identified as having audit potential to determine which returns have the highest potential and which parts of the return should be audited. Finally, all workstreams screen out returns as part of the review process.18 This winnowing means that the large pool of returns initially identified as having audit potential becomes a much smaller pool of returns that are selected for audit.

However, variations exist among the workstreams, particularly between the field and campus. For example, the field process generally uses more review steps and manual involvement (e.g., classification) than the campus process, which generally focuses on a single compliance issue and relies more on automated filters and rules to identify returns. Among field workstreams, the extent of review varies. For example, a few workstreams use a committee to review proposals and authorize new projects or investigations before returns can enter the workstream. Also, for field audits, group managers generally decide whether to assign, hold, or screen out returns for audit, whereas returns selected for campus audits are generally assigned through automated processes after campus analysts review the returns to ensure that they adhere to the selection rules embedded in the automated processes. Some workstreams, such as taxpayer claims and some referrals, involve more manual processes to identify and review returns; other workstreams involve both manual and automated processes or are almost entirely automated. Finally, the procedures for screening out returns vary across workstreams.19

SB/SE Relied on Different Methods in Its Field and Campus Locations to Select Most Returns for Audit

In fiscal year 2014, related pickups from various identification methods or workstreams accounted for about 50 percent of SB/SE closed field audits.20 Most of these pickups were related to audits involving various ways in which taxpayers attempt to shelter income from taxation and to DIF-sourced returns. The DIF workstream alone (part of the computer program identification group) accounted for over 22 percent of SB/SE closed field audits, and various referral workstreams accounted for nearly 7 percent, as shown in figure 3. For details on the workstreams included in the categories shown in figure 3, see appendix VI.

 

Figure 3: Percent of SB/SE Closed Field Audits by Workstream or Identification Group, Fiscal Year 2014

Source: GAO analysis of IRS information. | GAO-16-103

Notes: Referrals include internal sources such as tax shelter audits and external sources such as state referrals or information reports provided by taxpayers. Claims include requests for refunds or reduction of tax liabilities assessed. All other includes several workstreams that each accounted for less than 3 percent of total SB/SE field audit closed cases in fiscal year 2014.

For campus audits closed in fiscal year 2014, available IRS data showed that 31 percent focused on the Earned Income Tax Credit (EITC).21 SB/SE relies on a computer program known as the Dependent Database (DDb) to identify most of the returns to be audited for EITC issues. DDb is a rules-based system that identifies potential noncompliance related to tax benefits based on the dependency and residency of children. According to IRS, DDb rules are reviewed yearly for changes, and no additional filtering or review is needed on the cases that are selected for audit. In fiscal year 2014, DDb identified more than 77 percent of the closed EITC audits. The remaining roughly 23 percent of closed EITC audits were identified using various other methods, such as referrals from within IRS and pickups related to audits of other tax returns.
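
As a rough illustration of how a rules-based system like DDb differs from score-based identification, the sketch below applies a fixed set of yes/no rules; any rule that fires marks the return, with no further filtering needed. The two rules shown are hypothetical stand-ins, since the actual DDb rules are not public.

    def rule_duplicate_dependent(ret, claimed):
        """Hypothetical rule: a dependent's TIN already appears on another return."""
        return any(dep in claimed for dep in ret["dependent_tins"])

    def rule_residency_conflict(ret, residency):
        """Hypothetical rule: records tie the child to a different claimant."""
        return any(residency.get(dep) not in (None, ret["tin"])
                   for dep in ret["dependent_tins"])

    def ddb_style_flags(ret, claimed, residency):
        flags = []
        if rule_duplicate_dependent(ret, claimed):
            flags.append("duplicate dependent claim")
        if rule_residency_conflict(ret, residency):
            flags.append("residency conflict")
        return flags  # any flag marks the return as a potential EITC audit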

Most Returns SB/SE Identified for Potential Audit Were Not Selected

SB/SE does not have complete data on the number of returns that are initially identified as having audit potential, reviewed, and selected for audit for all 33 workstreams. Using data that are available, table 1 illustrates differences in the extent to which returns are winnowed from identification through selection for two workstreams. For example, about half of the DIF-sourced returns reviewed were selected for audit, and almost all returns reviewed for NRP were selected for audit.22

 Table 1: Number of Returns Identified, Reviewed, and Selected for Field Audit
                 for DIF and NRP Workstreams, Fiscal Year 2013
 ______________________________________________________________________________

                                         Fiscal Year 2013
                   ____________________________________________________________

                   Returns                            Returns sent    Returns
 Selection         identified for      Returns        for             selected
 workstream        potential audit     reviewed       assignment      for audit
 ______________________________________________________________________________

 DIF                    1,552,714a      151,836          138,883         77,446

 NRP                  144,874,882b       14,687           13,870         13,449
 ______________________________________________________________________________

 Source: IRS. | GAO-16-103

 Notes: We used fiscal year 2013 to allow more time for audit return selection,
 which can occur for up to 3 years after returns are filed, and because
 complete data for 2014 were not available at the time of our review. Estimated
 numbers only include field audits and primary returns. NRP returns are
 reported by tax year, not fiscal year.

 a This is the number of individual tax returns filed whose DIF score exceeded
 the threshold for determining which returns were potentially most
 noncompliant.

 b This is the number of all Form 1040 individual income tax returns filed,
 which are eligible for random selection under NRP. Randomly selecting returns
 enables IRS to generalize estimates of reporting compliance to the population
 of all such returns filed.
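
The winnowing rates described in the text can be read directly off table 1, as the short computation below shows (figures taken from the table above):

    # identified, reviewed, selected -- from table 1
    table1 = {"DIF": (1_552_714, 151_836, 77_446),
              "NRP": (144_874_882, 14_687, 13_449)}

    for workstream, (identified, reviewed, selected) in table1.items():
        print(f"{workstream}: {selected / reviewed:.0%} of reviewed returns "
              f"selected ({selected / identified:.2%} of those identified)")
    # DIF: 51% of reviewed returns selected (4.99% of those identified)
    # NRP: 92% of reviewed returns selected (0.01% of those identified)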

 

 

Some SB/SE Procedures for Selecting Returns for Audit Met Internal Control Standards, but Objectives Were Unclear and Documentation and Monitoring Procedures Were Inconsistent

An effective internal control system can help federal agencies achieve their missions and objectives and improve accountability. As set forth in Standards for Internal Control in the Federal Government, also known as the Green Book, internal controls comprise the plans, methods, and procedures used to meet an entity's mission, goals, and objectives, which support performance-based management.23 Internal controls help agency program managers achieve desired results. They also provide reasonable assurance that program objectives are being achieved through, among other things, effective and efficient use of resources. Internal control is not one event, but rather a series of actions and activities that occur throughout an entity's operations and on an ongoing basis. Two examples of internal control standards are the establishment of clearly defined objectives and a commitment to documenting significant events.

SB/SE has some procedures in place that are consistent with internal control standards. However, we identified some internal control weaknesses that leave SB/SE vulnerable to inconsistent return selection for audit or the perception of it.

Some SB/SE Procedures Met Internal Control Standards

Our review of IRS and SB/SE procedures on selecting returns for audit found several procedures that adhered to internal control standards, which provided some assurance of fairness and integrity in the selection process.24 For our review, we relied on documentation demonstrating that the standards were employed and did not independently test whether the standards were systemically applied.25

  • Ethics. SB/SE demonstrated a commitment to promoting ethical behavior among staff, which provides some high-level assurance that it may be able to meet its goal of integrity and fair treatment of taxpayers in general. For example, IRS's ethics training and annual certification process provide some assurance that IRS staff should be aware of the need to act ethically and impartially.

  • Awareness of internal controls by managers. SB/SE has demonstrated a commitment to employ internal control activities to ensure accountability in achieving its mission. All managers are required to do an annual self-assessment of internal control procedures. To the extent that SB/SE managers report deficiencies and SB/SE uses the results, the annual self-assessment can provide assurance that the importance of internal control is understood in SB/SE. Our work was not designed to test how effectively IRS used the self-assessments to identify and address deficiencies.

  • Segregation of duties. All of SB/SE's selection workstreams involve multiple parties so that no individual can control the decision-making process. For example, staff who classify a return cannot later audit the same return. Also, for field audits, IRS coordinators in an area office generally determine which returns will be assigned to the field offices, rather than field offices and auditors generating their own work. SB/SE also has procedures to ensure that managers review about 10 percent of returns classified for the DIF and NRP workstreams (the sketch following this list illustrates one way such a review sample could be drawn). Also, managers must approve auditors' requests to open audits for prior or subsequent year and related returns. Although not every step in the selection process is reviewed, these procedures provide some assurance that the decision to audit a return is not determined unilaterally.

  • Safeguarding data/systems. SB/SE demonstrated that safeguards are in place to restrict system access to authorized users. IRS has procedures on system security and uses a multitiered authentication process to control system access, which we observed.
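
For the roughly 10 percent managerial review of classified DIF and NRP returns noted above, the sketch below shows one simple way such a sample could be drawn. It is a minimal sketch assuming a simple random sample; the report does not specify how IRS actually picks the returns a manager reviews.

    import random

    def manager_review_sample(classified_returns, rate=0.10, seed=None):
        """Draw roughly `rate` of classified returns for managerial quality review."""
        if not classified_returns:
            return []
        rng = random.Random(seed)
        sample_size = max(1, round(rate * len(classified_returns)))
        return rng.sample(classified_returns, sample_size)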

 

SB/SE Has Not Clearly Defined or Communicated "Fairness" in Its Return Selection Process

The mission statements for both IRS and SB/SE declare the strategic goal of administering the "tax law with integrity and fairness to all." SB/SE officials stated that integrity and fairness are core values of IRS. However, they did not define these terms or provide evidence that staff know what is to be achieved by this strategic goal. Without a clear definition of fairness that has been communicated to staff, SB/SE has less assurance that its staff consistently treat all taxpayers fairly.

 

_____________________________________________________________________

Internal Control Standard: Define objectives

Internal control standards call for program objectives to be clearly defined in measurable terms to enable the design of internal control for related risks. Specific terms should be fully defined and clearly set forth so they can be easily understood at all levels of the entity. Consistent information must be reliably communicated throughout the entity if the entity is to achieve its goals.

Source: GAO. | GAO-16-103

_____________________________________________________________________

IRS's procedures manual, the Internal Revenue Manual (IRM), references fairness-related behaviors that auditors are to follow, which may promote taxpayer confidence in IRS.26 For example:

"The purpose of the Internal Revenue Service is to collect the proper amount of tax revenues at the least cost to the public, and in a manner that warrants the highest degree of public confidence in our integrity, efficiency and fairness."

"All [auditors] must perform their professional responsibilities in a way that supports the IRS Mission. This requires auditors to provide top quality service and to apply the law with integrity and fairness to all."

"The obligation to protect taxpayer privacy and to safeguard the information taxpayers entrust to us is a fundamental part of the Service's mission to apply the tax law with integrity and fairness to all."

"Requirements governing the accuracy, reliability, completeness, and timeliness of taxpayer information will be such as to ensure fair treatment of all taxpayers."

These references point to the overall concept of fairness without explaining what it means, particularly when selecting tax returns for audit. Fairness can be difficult to define because everyone may have a different concept of what constitutes fair treatment. We heard different interpretations of fairness and integrity from IRS participants involved in the selection process during the eight focus groups we conducted. Given these different interpretations, not having a clear definition of fairness can unintentionally lead to inconsistent treatment of taxpayers and create doubts as to how fairly IRS administers the tax law. In our focus groups, SB/SE staff stated that they viewed audit selection as fair when they:

  • focus on large, unusual, and questionable items,

  • do not consider the taxpayer's name, location, etc.,

  • avoid auditing taxpayers they know or who may live in their neighborhood,

  • treat issues consistently across returns,

  • apply the same standards,

  • treat all taxpayers the same,

  • account for varying costs across locations (e.g., housing costs), and

  • avoid being influenced by personal preferences.

 

Each comment represents someone's concept of fairness. According to SB/SE officials, IRS relies on the judgment of its staff to determine what is fair. Although many concepts sound similar, they can be different, or even incompatible. For example, some participants said that not considering a taxpayer's name or geographic location was fair treatment. However, other participants said that considering geographic location was necessary to avoid auditing taxpayers they knew or to determine whether expenses were reasonable for that location (e.g., larger expenses may be reasonable for high-cost locations). Also, some audit projects focus on indications of certain types of noncompliance in specific locations, such as an IRS area or a state. SB/SE officials stated that both views of fairness regarding location may be appropriate for classification.

We reviewed training materials used to instruct revenue agents in the decision-making process when selecting returns to audit, as well as the orientation briefing provided to staff assigned to classification details.27 Our review of the documentation, as well as discussions with focus group participants involved in classification, indicate that the training materials and the briefing have not defined fairness or how to apply it consistently when selecting returns for audit.28

Another challenge to treating all taxpayers consistently or under the same standard arises when the group manager in the field has to manage resource constraints. Some group managers talked about not having the right type and grade of auditor in a location to select a particular return that was deemed worth auditing. Others talked about not having enough travel money for auditors to justify selecting some tax returns. Group managers in other locations may be able to select a similar return because they have fewer of these constraints.

In addition, SB/SE officials said that what is fair may vary depending on the role of the IRS staff involved. They said IRS staff members may have different perspectives of what is "fair" depending on their responsibilities and position, such as IRS staff who are analysts or managers in headquarters versus analysts, auditors, and their managers in the field.

SB/SE Has Not Established Objectives for Fair Selection of Returns for Audit, Which Challenges Performance Measurement and Risk Management

 

_____________________________________________________________________

Internal Control Standard: Assess risks and performance to objectives

Internal control standards call for management to set program objectives that align with an entity's mission, strategic plan, goals, and applicable laws and regulations. Clearly defined objectives can enhance the effectiveness and efficiency of a program's operations and are necessary to assess risks. Objectives should clearly define what is to be achieved, who is to achieve it, and how and when it will be achieved. Documenting objectives promotes consistent understanding.

Source: GAO. | GAO-16-103

_____________________________________________________________________

SB/SE has not established objectives on the fair selection of returns. Without a definition of fairness, SB/SE cannot be assured that an objective for fair selection clearly indicates what is to be achieved. For example, objectives could be based on definitions of fairness that we heard in our focus groups, such as the extent to which selection occurs because of large, unusual, and questionable items on a return or because SB/SE is applying the same standards to similar tax returns.

SB/SE develops audit objectives in its annual work plan. For fiscal year 2014, audit objectives included (1) reviewing workload identification and selection models, collaborating with other IRS units to revise processes and guidelines, and developing guidance and monitoring tools to ensure consistent application; and (2) using more research data to develop alternative workload identification streams and delivery. These objectives address the process of selecting returns but not whether returns are selected fairly. For example, applying selection models and processes consistently does not ensure that the models and processes were designed to achieve fairness. Further, IRS has not identified a level of consistency that would indicate that fairness has been achieved.29

Without clearly defined objectives aligned to its mission and a clear understanding across SB/SE of how fairness is defined, SB/SE has less assurance that it is measuring progress toward or achieving its strategic goal of treating taxpayers fairly.

IRS Has Established Performance Measures, but None Directly Assesses Fair Selection of Returns for Audit

Given that SB/SE does not have clearly defined objectives on fair selection, it also does not have performance measures aligned with these objectives and explicitly tied to integrity or fairness. For example, if IRS defined fairness as focusing on large, unusual, and questionable items and developed an objective based on this definition, performance measures could assess the quality and extent to which auditors focused on these items. SB/SE officials pointed to a variety of existing performance measures that they believe assess whether selection processes were impartial and consistent. Examples of these performance measures include the following:

  • IRS's Customer Satisfaction survey asks taxpayers to rate their satisfaction with the auditor's explanation for how the return was selected for audit.30 However, SB/SE did not show how answers were used to assess whether the selection process was fair or modify the process to make it fair. Further, taxpayer dissatisfaction is subjective, and taxpayers would not have context to know why their returns were selected compared to others.

  • SB/SE conducts business reviews to assess how well its selection process is performing. However, concerns raised in these reviews focused on selection process steps, such as ordering returns and conducting research projects, instead of the underlying fairness of selecting a return.

  • All employees are to be evaluated on how well they provide fair and equitable treatment to taxpayers as required by the Internal Revenue Service Restructuring and Reform Act of 1998; the IRM provides examples of behaviors that would meet this requirement.31 These behaviors may be consistent with IRS's mission, but they focus on how taxpayers were treated after the audit started rather than how auditors reviewed returns for potential audit selection.

 

Without performance measures that align with objectives to achieve fair selection, SB/SE lacks assurance that it can measure progress toward fair return selection.

IRS Has Taken Steps to Identify Risks, but Linkage to Audit Selection Objectives Is Needed

IRS's efforts to identify risks and assess whether and how to manage them operate under two complementary approaches.

  • Internal controls framework. The procedures in IRM 1.4.2 govern IRS's processes for monitoring and improving internal controls, which include the identification and mitigation of risks. Managers are expected to understand the risks associated with their operations and ensure that controls are in place and operating properly to mitigate those risks.

  • Enterprise Risk Management (ERM). ERM is broader in scope than internal controls, focusing on agency-wide risks. ERM is intended to help organizations consider risk when setting strategy and determine how much risk the organization is willing to accept. IRS implemented ERM in February 2014 to increase awareness by IRS management of IRS-wide risks and to serve as an early-warning system that identifies emerging challenges so they can be addressed before they affect operations.32

 

Both approaches to risk management require clear objectives, defined in measurable terms, to identify and analyze risks that could challenge achieving desired outcomes. Risks to achieving those objectives can then be identified and analyzed, and risk tolerances determined.33 Understanding the significance of the risks to achieving objectives provides the basis for responding to them.

Without clear audit selection objectives on fairness, SB/SE lacks assurance that it can identify and assess risks to the fair selection of returns to audit. Absent risk identification and assessments linked to program objectives, vulnerabilities may go unaddressed, which could lead to unfair return selection.

SB/SE Has Not Consistently Documented Audit Selection Procedures and Decisions

 

_____________________________________________________________________

Internal Control Standard: Document transactions

Internal control and all transactions and other significant events need to be clearly documented, and the documentation should be readily available for review.

Source: GAO. | GAO-16-103

_____________________________________________________________________

  • Audit plan changes. Changes to the field audit plan are documented during the annual planning process, but SB/SE did not document its process for modifying the field audit plan during the year. According to SB/SE officials, they modify the plan during the year as additional budget and staffing information from IRS's finance unit becomes available. Officials stated that changes to this audit plan are documented by the budget information received and by the recalculated plan. However, SB/SE did not document how it translated the budget and staffing information into changes in the inventory targets or staffing, nor why some targets were changed but not others.

  • Selection decisions and rationale. SB/SE did not consistently document decisions for selecting certain tax returns over others for audit and the rationale behind the decisions. SB/SE does not require all of these decisions and rationales to be documented. Returns that are stored electronically and are deemed to be excess inventory can be screened out without documentation such as a form, stamp, or signature. For discriminant function (DIF)-sourced returns, SB/SE's primary workstream for field audits, and some referrals, only a group manager stamp is required to screen out the returns, rather than also documenting the rationale for screening them out.34 Documentation requirements also vary within a workstream. For example, for returns involving a tax shelter fostered by a promoter, audit screen-out rationales are required to be documented at the group level in the field but not at the area office level. Officials said that, aside from the Form 1900 for certain returns, they generally do not document why a return was not selected. To illustrate, we found nine files without documentation of the screen-out decision or rationale in our file review of 30 screened-out returns.35 Regardless of whether a form is required, the screen-out decision should be documented.

  • Files not located. IRS could not locate 18 of the 233 files we requested in time for our review.36 For example, for non-DIF pickup returns, 5 out of 24 files requested were not located in time. For all types of referrals we reviewed, we were unable to review 8 out of 56 files requested because they were not located in time. According to officials, IRS could not locate these files because files for one audit may be stored with files for any number of related audits, files for open or recently closed audits may not yet be available, and files may have been stored in the wrong location.

 

In addition to internal control standards, the IRM requires all records to be efficiently managed until final disposition.37

Having procedures to ensure that selection decisions and rationale are clearly and consistently documented helps provide assurance that management directives are consistently followed and return selection decisions are made fairly. Further, being able to find files efficiently can aid congressional and other oversight, and prevent unnecessary taxpayer burden if IRS later needs to contact the taxpayer regarding material that would have been in the file.

SB/SE Has Not Regularly Monitored Decisions Made and Coding Used for Audit Selection

 

_____________________________________________________________________

Internal Control Standard: Monitor controls

Program managers should have a strategy and procedures to continually monitor and assure the effectiveness of their control activities. Key duties and responsibilities should be divided among different people to reduce the risk of error and to achieve organizational goals. Program managers need operational data to determine whether they are meeting their strategic and annual performance plans and their goals for effective and efficient use of resources.

Source: GAO. | GAO-16-103

_____________________________________________________________________

As discussed earlier in this report, SB/SE has procedures that, if implemented, help provide some assurance that its return selection process is generally monitored. However, we found that SB/SE did not have requirements to monitor certain steps in the selection process.
  • Dollar threshold for campus audits. We found that the dollar threshold for selecting some returns for campus audits has remained constant or has been adjusted informally based on inventory needs.38 SB/SE has not evaluated whether the threshold should change or be changed more formally. According to officials, the dollar threshold is the break-even point for collecting enough tax to justify the audit. However, the threshold is only a guide; sometimes the threshold can be higher depending on how many returns need to be audited to meet the audit plan. According to one official, the threshold amount has been in place at least 4 years and possibly as long as 10 years.

  • Classification review. We also found that classification decisions are not always required to be reviewed. For DIF and NRP returns, about 10 percent of classified returns are required to be reviewed for accuracy and adherence to classification guidelines. However, other field audit selection methods, including some referrals, do not include a formal classification quality review. Likewise, campus audit selections by analysts are not formally reviewed.

  • Review of group manager decisions. SB/SE does not always require that group manager return selection decisions (i.e., screen-out) be reviewed. Even though multiple people are involved, in some cases, the group manager can independently make the final selection or screen-out decision. For state and agency referrals, and others to varying degrees, screen-out decisions by group managers are not reviewed. For example, in our file review of 30 screened-out returns, 8 were screened out by group managers. We did not see documentation of the approval for screening out these returns because such documentation was not required. According to SB/SE officials, group managers are the most knowledgeable about the resources available to meet audit goals. The managers also consult with territory and area managers to determine which returns should be screened out. For campus audits, approvals are not required to screen out returns from audit. Officials said that workload selection analysts communicate about the status of current and upcoming work to determine which returns are excess inventory and not needed to meet the annual audit plan or unable to be worked because of resource limitations.39

  • Source codes. We found that some codes for identifying the return to be audited, called source codes, were mislabeled, not used, or not well defined, even though the IRM states that all data elements in IRS databases should be defined and documented.40 In our review of 215 files, six returns were coded as non-Tax Equity and Fiscal Responsibility Act of 1982 (TEFRA) related pickups.41 SB/SE officials later explained that these returns were mislabeled and should be moved to the source code used for TEFRA-related work. We also found two files that were coded as information referrals that should have been coded as related pickup audits, one file that was coded as a DIF-sourced return that should have been coded as a claim by a taxpayer to adjust a return he or she had filed, and three files that were coded as compliance initiative projects that should have been coded as returns selected to train auditors. For campus audits, source codes are assigned to each return audited but are not used to identify, select, or monitor campus inventory and do not serve any other purpose in campus audits. As a result, a source code may not represent the actual source of the inventory. Further, we found two source codes that were not well defined. One source code associated with about 35 percent of campus audits completed in fiscal year 2014 included references to DIF that were generally not applicable, since these returns were not related to or identified using DIF scoring. Another source code associated with about 18 percent of campus audits completed in fiscal year 2014 was labeled as two different items and did not accurately describe many of the returns using this code.42

 

Spreading responsibility for reviewing selection and screen-out decisions can reduce the potential for error and unfairness. In addition, adequate controls can help ensure that audits are appropriately coded so that IRS has accurate information to better ensure the efficient and effective use of resources. For example, having better controls on how returns are coded decreases the risk that data elements are misleading, which can hinder the decision-making process, such as prioritizing returns to select for audit and analyzing whether goals are met.

Conclusions

SB/SE relies on a variety of sources and processes to select returns for audit. This complexity underscores the importance of having a robust internal control system to support the selection process and achieve SB/SE's mission of administering the "tax law with integrity and fairness to all." SB/SE has some procedures in place that are consistent with internal control standards. However, we identified some internal control weaknesses that leave its audit program vulnerable to inconsistent return selection or the perception of it. Without effective internal controls, including defining fairness in selecting returns, SB/SE cannot know if it is achieving its mission and whether its return selection policies and procedures are aligned with its mission. Further, IRS will not be able to manage risk or monitor performance as well as it otherwise could. Finally, IRS risks the appearance that its return selection process is unfair to taxpayers because it is unable to communicate key pieces of information, such as its definition of fairness, to the public.

Recommendations for Executive Action

To help ensure SB/SE's audit selection program meets its mission and selects returns fairly, we recommend that the Commissioner of Internal Revenue take the following actions:

  • Clearly define and document the key term "fairness" for return selection activities.

  • Clearly communicate examples of fair selections to staff to better assure consistent understanding.

  • Develop, document, and implement program-level objective(s) to evaluate whether the return selection process is meeting its mission of applying the tax law with integrity and fairness to all.

 

To help ensure that SB/SE's audit selection objective(s) on fairness are used and met, we recommend that the Commissioner of Internal Revenue take the following actions:
  • Develop, document, and implement related performance measures that would allow SB/SE to determine how well the selection of returns for audit meets the new objective(s).

  • Incorporate the new objective(s) for fair return selection into the SB/SE risk management system to help identify and analyze potential risks to fair selections.

 

In addition, we recommend that the Commissioner of Internal Revenue take the following actions:
  • Develop and implement consistent documentation requirements to clarify the reasons for selecting a return for audit and who reviewed and approved the selection decision.

  • Develop, document, and implement monitoring procedures to ensure that decisions made and coding used to select returns for audit are appropriate.

 

Agency Comments and Our Evaluation

We provided a draft of this report to the Commissioner of Internal Revenue for review and comment. The Deputy Commissioner for Services and Enforcement provided written comments on November 23, 2015, which are reprinted in appendix VII. IRS stated that it agrees with the importance of sound internal controls and is committed to their improvement, especially in the areas we recommended. IRS stated that it agreed with our seven recommendations. Accordingly, the enclosure to the letter listed specific IRS actions planned to implement the recommendations. IRS also provided technical comments, which we incorporated where appropriate.

As IRS's letter mentioned, its audit program includes various features that are intended to promote fair return selection, such as documents that convey the importance of "fairness," existing objectives and measures, and types of monitoring. However, as our report discusses, these features do not clarify what fair selection of returns for audit entails and how IRS would know whether fair selections are occurring, except for when someone such as a taxpayer questions the fairness of return selection.

For our recommendations on defining and documenting "fairness" for return selection activities and communicating examples of fair selections to staff, IRS stated that the concept of fairness has both collective and individual attributes. IRS noted that fairness for return selection encompasses three components -- pursuing those who fail to comply, objectively selecting noncompliant returns across all areas of noncompliance, and respecting and adhering to taxpayers' rights. As such, IRS has taken the first step to implement our recommendation. However, to fully implement our recommendation, IRS will need to clarify how each component relates to return selection. For example, the first and third components also cover what happens after return selection, such as pursuing noncompliance and interacting with taxpayers during the audit.

In regard to our recommendations on developing one or more program objectives and related measures on return selection related to fairness, as our report discusses, IRS's current program objectives and measures do not address fair selection of returns. We believe that IRS should develop at least one objective and related measure that tie to its definition of fairness. Doing so would allow IRS to more conclusively demonstrate and assess whether its selection decisions were fair.

We also recommended that IRS improve the documentation and monitoring of selection decisions. Our report acknowledges that documentation and monitoring do occur in many areas but provides examples of the need for more in other areas. As such, IRS needs additional documentation and monitoring, as opposed to merely a plan to evaluate the need to take these actions.

We note three other clarifications based on statements in IRS's letter.

  • First, IRS's letter correctly stated that our report did not identify any instances where the selection was considered inappropriate or unfair. We did not design our study to look for inappropriate and unfair selections, but rather to assess the internal controls that help ensure a fair selection process. Further, even if we had designed our study to look for unfair selections, our design would have been hampered by the lack of a definition of fairness and related objective(s) and measure(s) to evaluate whether selections were fair.

  • Second, IRS's letter stated that the seven groupings in our report do not reflect how IRS views its workstreams for identifying returns for potential audit selection. As discussed in the report, our groupings are based on how a return was initially identified rather than on IRS's workstreams. For example, related pickups, including DIF-related pickups, are identified by auditors, whereas DIF-selected returns are identified by a computer algorithm. Therefore, we grouped DIF-related pickups separately from DIF-selected returns. Furthermore, IRS could not provide complete data on the number of returns audited from each of its workstreams but could provide data on audits selected from other sources, such as related pickups. While some of these sources could be associated with a workstream, this was not possible for all of them. As a result, we used the available IRS data to show how all SB/SE audits were distributed across these audit identification workstreams and sources (shown in the report as figure 3).

  • Third, DIF return selections do not involve the least amount of discretion, as IRS's letter stated. As discussed in our report, many returns that were initially identified through DIF automation as having audit potential were not audited. The actual audit selections do not occur until multiple IRS staff review those returns, requiring some human discretion. Our report discusses other groupings with less staff discretion than DIF, such as when taxpayers request that IRS review their returns or when IRS randomly selects returns for a research program.

 

As agreed with your offices, unless you publicly announce the contents of this report earlier, we plan no further distribution until 30 days from the report date. At that time, we will send copies to the Chairmen and Ranking Members of other Senate and House committees and subcommittees that have appropriation, authorization, and oversight responsibilities for IRS. We will also send copies of the report to the Secretary of the Treasury, Commissioner of Internal Revenue, and other interested parties. In addition, this report is available at no charge on the GAO website at http://www.gao.gov.

If you or your staff have any questions or wish to discuss the material in this report further, please contact me at (202) 512-9110 or mctiguej@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report are listed in appendix VIII.

James R. McTigue, Jr.
Director, Tax Issues
Strategic Issues

 

FOOTNOTES

 

 

1 This population of taxpayers includes 41 million self-employed individuals, 9 million small corporations, and 7 million other taxpayers.

2 An IRS audit (also called "examination") is a review of a taxpayer's books and records to determine whether information such as income, expenses, and credits are being reported accurately. Internal Revenue Code section 7602 gives the Secretary of the Department of the Treasury, IRS's parent agency, the authority to conduct audits.

3 In January 2012, IRS estimated that the gross tax gap was $450 billion in tax year 2006 (the most current estimate available). IRS estimated that it would eventually recover about $65 billion of the gross tax gap through late payments and enforcement actions, leaving an annual estimated net tax gap of about $385 billion.

4 GAO, Small Businesses: IRS Considers Taxpayer Burden in Tax Administration, but Needs a Plan to Evaluate the Use of Payment Card Information for Compliance Efforts, GAO-15-513 (Washington, D.C.: June 30, 2015).

5 Internal Revenue Manual (IRM) Part 1, Chapter 1, Section 16.1 (http://www.irs.gov/irm/) states SB/SE's mission.

6 This report is part of a larger body of our work on audit and collection case selection across IRS. See GAO, IRS Case Selection: Automated Collection System Lacks Key Internal Controls Needed to Ensure the Program Fulfills Its Mission, GAO-15-744 (Washington, D.C.: September 13, 2015); IRS Case Selection: Collection Process Is Largely Automated, but Lacks Adequate Internal Controls, GAO-15-647 (Washington, D.C.: July 29, 2015); and IRS Examination Selection: Internal Controls for Exempt Organization Selection Should Be Strengthened, GAO-15-514 (Washington, D.C.: July 13, 2015). Concerns about fairness were raised in the report, Treasury Inspector General for Tax Administration, Inappropriate Criteria Were Used to Identify Tax-Exempt Applications for Review, 2013-10-053 (Washington, D.C.: May 14, 2013).

7 See GAO, Standards for Internal Control in the Federal Government, GAO/AIMD-00-21.3.1 (Washington, D.C.: November 1999). A newer version of the standards (see GAO, Standards for Internal Control in the Federal Government, GAO-14-704G (Washington, D.C.: September 2014)) took effect in fiscal year 2016. Because we conducted our review mainly in fiscal year 2015, we used the new standards only as context in preparing for an effective internal control system in the future, not as criteria to evaluate the current controls.

8 IRS uses the term "survey" to refer to the process of screening out returns after additional reviews reveal a reason not to conduct the audit. See IRM Part 4, Chapter 1, Section 3.6.

9 Campuses, formerly called service centers, are facilities where IRS performs various operations, such as processing tax returns, handling taxpayer calls, and conducting correspondence audits.

10 These data for all IRS audits exclude excise tax returns filed with the U.S. Customs and Border Protection and the Alcohol and Tobacco Tax and Trade Bureau; returns of tax-exempt organizations, government entities, employee retirement benefit plans, and tax-exempt bonds; and information returns.

11 IRS: Risk-Based Review of IRS Audit Selection Processes and Criteria, Mar. 4, 2014.

12 For campus audits, returns generally are not further reviewed by managers before starting the audit. After returns are selected for audit, taxpayers are notified, and auditors are assigned the return after taxpayers respond.

13 According to IRS officials, a workstream is a category or major type of work. Appendix III describes the workstreams.

14 Auditors identify related pickups when auditing another return, called the primary return. The primary return is identified through various methods or workstreams, including automated sources.

15 Referrals of nonfilers can come from IRS staff, such as those involved with collecting unpaid tax debts, and from automated IRS systems.

16 For information on IRS's plan for evaluating the payment card income pilot, see GAO-15-513.

17 IRS's NRP is an effort to measure taxpayer compliance for strategic planning and budget purposes. IRS uses NRP data to estimate the tax gap and update formulas used to identify tax returns for potential audits.

18 Some workstreams, such as NRP, allow few returns to be screened out. Returns identified for NRP cannot be screened out unless the taxpayer meets exclusionary criteria (e.g., death or natural disaster).

19 We did not attempt to describe every variation across workstreams. Appendix V includes high-level flowcharts of the selection process and provides examples of some of these similarities and differences.

20 We have shown certain workstreams or identification groups to better illustrate SB/SE's workload for field audits. IRS could not provide the number of closed audits by identification groups we developed. Therefore, to estimate the number of SB/SE closed audits by selection workstream or identification group, we used IRS Audit Information Management System source codes. We grouped these codes into the workstreams or identification groups shown in figure 3 and further described them in appendix VI.

21 Congress established EITC in 1975. It is used to (1) offset the impact of Social Security taxes on low-income families and (2) encourage low-income families to seek employment rather than public assistance. Generally, credit amounts depend on the number of qualifying children who meet age, relationship, and residency tests. 26 U.S.C. § 32. As we have reported, root causes of EITC noncompliance are that taxpayers or their tax return preparers determine eligibility and that IRS has limited ability to verify eligibility before issuing refunds. See GAO, Fiscal Outlook: Addressing Improper Payments and the Tax Gap Would Improve the Government's Fiscal Position, GAO-16-92T (Washington, D.C.: Oct. 1, 2015). IRS focuses on EITC because of the extent of noncompliance in claiming the credit. For fiscal year 2014, IRS estimated a 27 percent noncompliance rate and $17.7 billion in noncompliance.

22 IRS reviews more returns than are selected for audit to ensure that it has enough returns that are ready to be audited when auditors are available. To determine how many returns need to be reviewed, IRS uses criteria to determine which returns have the most audit potential. We did not assess how IRS developed these criteria, whether IRS adhered to them, or their effectiveness because the processes were out of the scope of this review.

23 See GAO/AIMD-00-21.3.1.

24 We excluded areas where our other audit teams have reviewed internal controls. For example, a recent report highlighted weaknesses with controls in IRS's information security program. See GAO, Information Security: IRS Needs to Continue Improving Controls over Financial and Taxpayer Data, GAO-15-337 (Washington, D.C.: Mar. 19, 2015).

25 This discussion excludes our results from assessing certain procedures -- delegating responsibilities, applying outside evaluations, and responding to recommendations -- that appeared to adhere to internal control standards. However, the documentation on their adherence to the standards was not as clear as for the four procedures cited above in the report.

26 See IRM Part 4, Chapter 1, Section 5.1.20(1), and Chapter 10, Section 1.4(1), Section 1.6.10(1), and Section 1.6.10(2)(i).

27 Classification is the process of reviewing a return to determine if it may merit an audit.

28 The briefing references IRS's purpose to collect the proper amount of tax at the least cost to the public so as to foster confidence in IRS's integrity and fairness (IRM, Part 4, Chapter 1, Section 5.1.20(1)). The briefing also reviews IRM 4.1.5 on classification, national and local classification instructions, and expectations for the classification detail.

29 According to officials, SB/SE is striving to achieve a balance between ensuring a minimum rate of audit coverage across all the return types and doing more audits of some return types to address the areas of highest taxpayer noncompliance. This approach may address concerns about fairness across taxpayer groups, which SB/SE could incorporate into its definition of fairness. However, officials also said that this balance can shift based on changes in various factors, such as emerging issues and resource availability. These changes in the factors illustrate the need to clarify what fair return selection means to maintain this "balance."

30 A random sample of closed field audits is selected for this survey. Surveys ask taxpayers/representatives questions, most of which focus on how satisfied they are with actions during the audit. For January 2015 to March 2015, IRS had an overall response rate of 18 percent. For the explanation the auditor provided on the reason for the audit, 55 percent of respondents were satisfied; for the explanation of an expanded audit scope, 49 percent were satisfied.

31 Pub. L. No. 105-206, § 1204(b), 112 Stat. 685, 722 (1998). Examples of behaviors include timeliness in responding to taxpayers; discussing specific taxpayers with other staff only on a "need-to-know" basis; responding to taxpayers with an appropriate tone, courtesy, and respect; and stating facts accurately. See IRM Part 1, Chapter 5, Section 3.7.3.

32 As part of the ERM process, SB/SE has established a risk committee and developed a preliminary risk register. We found four risks in SB/SE's risk register that directly or indirectly involved return selection.

33 Risk tolerance is the acceptable level of variation in performance related to achieving objectives.

34 For mandatory work, such as nationally coordinated research projects and some employee audits, group managers are required to explain the screen-out decision.

35 File review results are based on a nongeneralizable sample of SB/SE audits and are not representative of the population of SB/SE audits. Although nongeneralizable, the sample was taken to ensure coverage over a wide variety of audit types. For more information on our file review methodology, see appendix I.

36 We requested documentation for our file review from April 2015 to June 2015 and reviewed the majority of the files from June 2015 to August 2015. We accepted files for review up to early October 2015.

37 IRM Part 1, Chapter 15, Section 1.4.

38 We are not reporting a specific dollar threshold because doing so could impair IRS's efforts to enforce the tax laws.

39 Workload selection analysts order, review, select, and route the inventory of returns for campus audits.

40 IRM Part 2, Chapter 5, Section 13.1.

41 Pub. L. No. 97-248, §§ 401-407, 96 Stat. 324, 648-671 (1982). TEFRA established unified audit procedures for covered partnerships. A partnership was covered under these procedures if at any time during the year it had (1) more than 10 partners or (2) certain types of partners, such as another partnership, a limited liability company that files as a partnership, any type of trust, a nominee, or a nonresident alien individual, among others. In November 2015, these provisions were repealed and replaced with new partnership audit procedures applicable to tax years starting after December 31, 2017. Pub. L. No. 114-74, § 1101, 129 Stat. 584 (2015).

42 IRS describes source code 06 as DIF Correspondence, which covers returns converted from other programs related to high DIF-scored returns or requested as related to a DIF return being audited at the campus. Depending on the documentation, IRS describes source code 20 as Erroneous Refunds or as Regular Classification.

 

END OF FOOTNOTES

 

 

* * * * *

 

 

Appendix I: Objectives, Scope, and Methodology

 

 

This report (1) describes the processes for selecting Small Business/Self-Employed (SB/SE) returns for audit, and (2) assesses how well the processes and controls for selecting those returns support SB/SE's mission of "applying the tax law with integrity and fairness to all."

For the first objective, we reviewed Internal Revenue Service (IRS) documents that describe the processes and criteria for selecting SB/SE returns for audit. These documents included sections of the Internal Revenue Manual (IRM), procedures documents, process flowcharts, and summaries of selection processes prepared by SB/SE officials. We also interviewed IRS officials responsible for overseeing audit selection. To provide information on closed IRS and SB/SE audits, we analyzed data for 2011 through 2014 from the Compliance Data Warehouse Audit Information Management System (AIMS) closed table. We compared the results of our analyses of data in AIMS to the IRS data book to assess consistency of results. We determined that these data were sufficiently reliable for the purposes for which they were used in this engagement.
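As a rough illustration of the consistency check described above, the sketch below aggregates closed-audit counts from a hypothetical AIMS extract and compares the total to a published benchmark. Only the fiscal year 2014 totals for SB/SE (823,904) and all IRS audits (1,384,365) come from this report; the other division figures and column names are invented for illustration.

    import pandas as pd

    # Hypothetical AIMS closed-table extract. The SB/SE figure and the
    # all-IRS benchmark match appendix II of this report; the other
    # division figures are invented so the example sums to the benchmark.
    aims = pd.DataFrame({
        "division": ["SB/SE", "W&I", "LB&I"],
        "audits_closed": [823_904, 484_839, 75_622],
    })

    benchmark = 1_384_365  # published total for all IRS audits, FY 2014

    extract_total = aims["audits_closed"].sum()
    pct_diff = abs(extract_total - benchmark) / benchmark * 100
    print(f"extract={extract_total:,} benchmark={benchmark:,} diff={pct_diff:.2f}%")

A reliability assessment of this kind would treat a small percentage difference as evidence that the extract is consistent with the published source.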

For the second objective, we reviewed SB/SE's procedures for selecting returns for audit and related internal controls intended to help SB/SE achieve its stated mission of "applying the tax law with integrity and fairness to all." We then assessed whether these procedures followed standards from Standards for Internal Control in the Federal Government that were relevant to return selection.1 To determine which standards were most relevant, we used our Internal Control Management and Evaluation Tool, in conjunction with observations from our preliminary audit work.2 We selected the most relevant internal control standards as criteria in consultation with SB/SE officials and our financial management and assurance and information technology teams.

We also conducted eight focus groups with selected SB/SE staff who are responsible for reviewing or selecting SB/SE returns for audit. We held two groups with field office staff who review returns for audit potential, two groups with area office staff who coordinate the review process, two groups with field office group managers who select returns for audit, one group with campus staff who review and select returns for audit, and one group with specialty tax group managers who select returns for audit. Within these five populations, we randomly selected participants who met our criteria of having more than 2 years of IRS work experience, working in different IRS offices nationwide, and covering a range of compliance issue areas. In total, our groups involved 58 participants with an average of about 9 years of IRS experience, with a range from 3 to 32 years of experience. The focus groups were held by telephone. We asked questions on internal control related topics, such as the clarity of SB/SE procedures and the adequacy of guidance to apply these procedures.

To assess the extent to which SB/SE implemented its procedures, we conducted a file review, using IRM sections and SB/SE procedures documents as criteria. We obtained the population of SB/SE audits opened from March 2014 to February 2015, as shown in the open AIMS database, and selected a nonprobability sample of 173 returns to review. Although the results of our file review cannot be projected to the population of SB/SE audits, they represent a variety of types of returns, sources, and selection processes. We focused on processes that required more manual review or affected a large number of taxpayers. As reflected in table 2, we reviewed more files for referrals and compliance initiative projects than for some other categories because they involve more human discretion in deciding whether to include the return in the selection inventory and in reviewing the returns for audit potential. We also reviewed more files for discriminant function (DIF) returns than for some other categories because DIF returns are the largest portion of SB/SE's field audit workload by selection method or workstream. We reviewed the files to determine whether decisions were documented and whether staff followed procedures, such as documenting the rationale and approval for selecting or screening out returns. In sum, table 2 reflects the different types of returns we sampled, the types of files we reviewed, and the population and sample size for each category.

As shown in the last two rows of table 2, we also reviewed nongeneralizable, random samples of 30 returns that had been surveyed (i.e., screened out) and 30 classification quality review records for the same general time period as the audit files we reviewed. We created a separate sample of screened-out returns because audits were not opened on these returns. The database we used to create the audit file sample only contained returns that had been audited. We obtained the population of screened-out returns from SB/SE officials and randomly selected our sample from this population. We created a separate sample for classification quality review records because SB/SE reviews classification decisions per auditor rather than per return. We obtained the population of auditors that were reviewed during the same general time period as the files for the other samples. We identified subpopulations by region and selected a stratified random sample of these subpopulations.
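The sketch below shows one way such a stratified draw could work. The region names and stratum sizes are illustrative assumptions; only the population size (664 classification quality review records) and the sample size (30) come from this report.

    import random

    # Hypothetical sketch of a stratified random sample: partition the
    # reviewed-auditor population by region, then draw a proportional
    # simple random sample from each stratum.
    random.seed(42)  # reproducible draws

    population = (
        [("Northeast", i) for i in range(200)]
        + [("South", i) for i in range(250)]
        + [("West", i) for i in range(214)]
    )  # 664 classification quality review records in total (assumed split)

    def stratified_sample(units, strata_key, total_n):
        strata = {}
        for u in units:
            strata.setdefault(strata_key(u), []).append(u)
        sample = []
        for name, members in strata.items():
            n = round(total_n * len(members) / len(units))  # proportional allocation
            sample.extend(random.sample(members, n))
        return sample

    sample = stratified_sample(population, lambda u: u[0], total_n=30)
    print(len(sample), "records sampled across", len({s[0] for s in sample}), "regions")

Proportional allocation keeps each region's share of the sample close to its share of the population, which is the usual motivation for stratifying by region.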

           Table 2: Description of Samples to Review SB/SE Procedures
                         for Selecting Returns to Audit

 ______________________________________________________________________________

                                                          Population
                                                          (audits opened
 Sample              Files                  Unit of       March 2014 to   Sample
 category            reviewed               analysis      February 2015)  size
 ______________________________________________________________________________

 DIF and DIF-        Paper audit case       Tax return        32,176        38
 related pickups     files

 Non-DIF related     Paper audit case       Tax return        22,407        24
 pickups             files

 Compliance          Electronic CIP         Tax return         1,638        30
 initiative          approval form and
 projects (CIP)      white paper

 Referrals           Paper audit case       Tax return        20,474        56
                     files and approval
                     forms; electronic
                     selection files

 Nonfiler            Paper audit case       Tax return        70,623        25
                     files; electronic
                     selection files

 Classification      Electronic Form        Classifier           664        30
 review              5126 Classification    (auditor)
                     Quality Review
                     Record

 Survey              Paper case files       Tax return        27,563        30
 (screened-out)
 ______________________________________________________________________________

 Source: GAO analysis of IRS data. | GAO-16-103

 

 

Finally, we interviewed SB/SE officials about the procedures and discussed deficiencies we identified. We designed uniform data collection instruments for our file review to consistently capture information on the completeness of required documentation and approvals related to return selection. IRS reviewed the instruments and the data we captured. To ensure accuracy, two of our analysts reviewed each file we assessed and reconciled any differences in responses. We then analyzed the results of these data collection efforts to identify main themes and develop summary findings.
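A minimal sketch of this two-analyst reconciliation step appears below. The file identifiers, question names, and coded values are hypothetical; the sketch simply lists every item on which the two data collection instruments disagree so the analysts can resolve it.

    # Hypothetical sketch: compare each analyst's coded responses per file
    # and list disagreements for joint resolution. Field names and coded
    # values are illustrative assumptions.

    def find_disagreements(coding_a, coding_b):
        """Yield (file_id, question, answer_a, answer_b) wherever the two
        analysts' data collection instruments differ."""
        for file_id, answers_a in coding_a.items():
            answers_b = coding_b.get(file_id, {})
            for question, value_a in answers_a.items():
                value_b = answers_b.get(question)
                if value_a != value_b:
                    yield file_id, question, value_a, value_b

    analyst_1 = {"file-001": {"rationale_documented": "yes", "approval_documented": "yes"}}
    analyst_2 = {"file-001": {"rationale_documented": "yes", "approval_documented": "no"}}

    for item in find_disagreements(analyst_1, analyst_2):
        print("reconcile:", item)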

We conducted this performance audit from September 2014 to December 2015 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

 

FOOTNOTES TO APPENDIX I

 

 

1 GAO, Internal Control: Standards for Internal Control in the Federal Government, GAO/AIMD-00-21.3.1 (Washington, D.C.: November 1999).

2 GAO, Internal Control Management and Evaluation Tool, GAO-01-1008G (Washington, D.C.: August 2001).

 

END OF FOOTNOTES TO APPENDIX I

 

 

* * * * *

 

 

Appendix II: Summary of Small Business/Self-Employed (SB/SE) Audit Results, Fiscal Year 2014

 

 

      Table 3: SB/SE Audits by Number, Amount of Recommended Additional Tax,
    Days to Conduct the Audit, and Direct Audit Hours Compared to All Internal
       Revenue Service (IRS) Audits, Fiscal Year 2014 (dollars in millions)

 ______________________________________________________________________________

                                           Fiscal year 2014
                        _______________________________________________________

                                                              Percent of SB/SE
 Characteristic         All IRS audits     All SB/SE audits   to IRS audits
 ______________________________________________________________________________

 Number of audits          1,384,365            823,904            59.5%
 closed

 Recommended                 $33,149            $12,062            36.4%
 additional tax

 Number of days to
 conduct an audit
    Mean                         268                283
    Median                       226                224

 Number of direct
 audit hours
    Mean                          10                 10
    Median                         1                  2
 ______________________________________________________________________________

 Source: GAO analysis of IRS data. | GAO-16-103

 Note: Mean is the arithmetic average of the observed values of a continuous
 variable. Mean is calculated by adding all the values together and dividing
 the sum by the number of values. Median is the midpoint of a set of observed
 values arranged in either ascending or descending order.

 

 

     Table 4: SB/SE Field and Campus Audits by Number, Amount of Recommended
        Additional Tax, Days to Conduct the Audit, and Direct Audit Hours
       Compared to All SB/SE Audits, Fiscal Year 2014 (dollars in millions)

 ______________________________________________________________________________

                                            Fiscal year 2014
                        _______________________________________________________

                                      SB/SE closed           SB/SE closed
                                      field audits           campus audits
                                  ____________________   ____________________

                                             Percent of             Percent of
                        All SB/SE            field to               campus to
 Characteristic         audits     Results   all SB/SE    Results   all SB/SE
 ______________________________________________________________________________

 Number of audits        823,904   356,995     43.3%      466,909     56.7%
 closed

 Recommended             $12,062    $7,597     63.0%       $4,465     37.0%
 additional tax

 Number of days to
 conduct an audit
    Mean                     283       310                    262
    Median                   224       256                    212

 Number of direct
 audit hours
    Mean                      10        22                      1
    Median                     2        10                      1
 ______________________________________________________________________________

 Source: GAO analysis of IRS data. | GAO-16-103

 Note: Mean is the arithmetic average of the observed values of a continuous
 variable. Mean is calculated by adding all the values together and dividing
 the sum by the number of values. Median is the midpoint of a set of observed
 values arranged in either ascending or descending order.

 

* * * * *

 

 

Appendix III: Description of Small Business/Self-Employed (SB/SE) Selection Methods or Workstreams

 

 

1. Area Office Referral -- Area office field personnel refer potential leads with correspondence audit issues to Campus Reporting Compliance (CRC).

2. Audit Information Management Systems (AIMS)/AIMS Computer Information System (A-CIS)/Previously Adjusted Exam Issues on Subsequent-year Filings -- Quarterly A-CIS reports are run to identify every campus case closed agreed or default in each of the discretionary audit programs. The subsequent year returns are classified for the same issues that are on the closed audit cases.

3. Audit Reconsideration -- Reevaluates the results of a prior audit where additional tax was assessed and remains unpaid, or a tax credit was reversed. IRS also uses the process when the taxpayer contests a Substitute for Return determination by filing an original delinquent return.

4. Campus Reporting Compliance (CRC) Compliance Initiative Project (CIP) Usage -- CRC uses CIP Authorization (Form 13498) to document approval for testing potential new inventory in correspondence audits.

5. Category A Claims for Refund -- Accounts Management staff refer claims for refunds that meet criteria indicating audit potential directly to Classification and Claim Teams within the campuses.

6. Criminal Investigation Referral -- CRC uses IRS's databases to determine if the issues Criminal Investigation identified exist on the referred returns.

7. Claim -- A request for refund or an adjustment of tax paid or credit not previously reported or allowed.

8. Collection Referral -- CRC receives two kinds of referrals from collection: three referrals of potential nonfiler leads from the collection queue each year and occasional Form 3949 Information Item referrals.

9. Compliance Data Environment Release 3 -- Identifies potential audits through user-defined filters and queries, and forwards those selected to the correct treatment stream.

10. Compliance Data Warehouse/Potential Unreported Heavy Use Tax -- Identifies Form 2290 returns (Heavy Highway Vehicle Use Tax Return) with potential unreported heavy use tax.

11. Compliance Initiative Project (CIP) -- When IRS identifies potential noncompliance in specific groups of taxpayers, CIPs are used to contact or audit taxpayers or collect taxpayer data within that group when another method to identify such workload is not already in place.

12. Discriminant Function (DIF) -- A mathematical technique to estimate or "score" the potential merit of auditing a particular tax return based on its characteristics (see the illustrative sketch following this list).

13. Discretionary Exam Business Rules (DEBR) -- DEBR rules were developed to identify non-Earned Income Tax Credit returns with the highest audit potential for additional tax assessment for certain return conditions.

14. Employee Audit -- Any employee selected for audit under any method of inventory identification (e.g., DIF (see definition above) or referrals). It also includes inventory that is specifically identified based on the individual's position within IRS. Inventory identification is designed to ascertain compliance among IRS employees while maintaining their right to privacy.

15. Employment Tax Referral -- Specialty tax personnel refer potential audit leads relating to possible unfiled payroll tax returns to CRC (see definition above).

16. Estate & Gift Tax Form 1041 -- Filters identify Form 1041 returns reporting charitable contributions, fiduciary fees, and other miscellaneous deductions.

17. Estate & Gift (E&G) Referrals -- E&G tax personnel refer potential audit leads relating to possible unreported executor fees to CRC.

18. Government Liaison and Disclosure (GLD) Referrals -- GLD personnel refer information to CRC from sources outside IRS, such as states and the Puerto Rican Tax Authority (see definition below), that are potential audit leads.

19. High Income Nonfiler -- Strategy designed to address the filing compliance of taxpayers with known sources of income exceeding $200,000.

20. Information Reports -- Reports and referrals that may include information on substantial civil tax potential and significant potential for fraud, or are related to returns for tax years not yet required to be filed.

21. National Research Program (NRP) -- A comprehensive effort by IRS to measure compliance for different types of taxes and various sets of taxpayers. It provides a statistically valid representation of the compliance characteristics of taxpayers.

22. Offers-In-Compromise/Doubt as to Liability -- An offer in compromise is an agreement between the taxpayer and IRS that settles a tax debt for less than the full amount owed. Doubt as to liability exists where there is a genuine dispute as to the existence or amount of the correct tax debt under the law.

23. Payment Card Income Pilot -- Potential underreporters are flagged when Form 1099-K receipts, as a portion of gross receipts, are significantly greater than for similar taxpayers, suggesting cash underreporting.

24. Promoter Investigations and Client Returns -- SB/SE auditors, as well as other IRS or external sources, refer potentially abusive transaction promoters/preparers for audit. Client returns are audited to determine whether penalties and/or an injunction are warranted.

25. Puerto Rican Tax Authority Nonfiler -- The Puerto Rican Tax Authority provides information to IRS through the Government Liaison Office about residents in Puerto Rico who fail to file their federal tax return.

26. Research Referral -- Research personnel refer potential audit leads relating to NRP, possible nonfilers, and problem preparers to CRC.

27. Return Preparer Program Action Cases and Client Returns -- Clients of questionable preparers are audited to determine whether preparer penalties and/or injunctive actions are warranted. These are limited to preparer misconduct or incompetence that is pervasive and widespread.

28. Submissions Processing -- Submission Processing staff refer potential audit leads relating to the Alternative Minimum Tax program, math error, and unallowables to CRC or campus classifiers.

29. State Audit Referral Program (SARP) -- SARP utilizes the audit report information submitted to IRS by various taxing agencies to address areas of noncompliance.

30. State/Other Agency Referral -- Federal, state, and local governmental agencies share relationships and data with IRS through the Governmental Liaison staff to increase compliance levels, reduce the tax gap, reduce taxpayer burden, and optimize use of resources.

31. Treasury Inspector General for Tax Administration (TIGTA) Referral -- TIGTA personnel refer potential audit leads relating to TIGTA investigations to CRC.

32. Tip Program Referral -- Employees who do not report at or above the tip rate as agreed upon by the employer under various agreements with IRS may be referred for audit.

33. Whistleblower Claim -- Allegations of violation of federal tax laws made by a person who requests a reward.

Source: GAO analysis of IRS information. | GAO-16-103
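To make the DIF concept in item 12 concrete, the sketch below scores returns with a weighted formula and queues the highest-scoring ones for classification. The features, weights, and threshold are entirely invented for illustration; IRS's actual DIF formulas are not public.

    # Hypothetical sketch of a DIF-style scoring workstream: a weighted
    # formula turns return characteristics into a score, and returns above
    # a threshold are queued for classification. All values are invented.

    WEIGHTS = {
        "expense_to_income_ratio": 40.0,
        "cash_intensive_business": 15.0,
        "schedule_c_loss_years": 10.0,
    }

    def dif_like_score(features):
        """Weighted sum over the features the formula knows about."""
        return sum(WEIGHTS[k] * v for k, v in features.items() if k in WEIGHTS)

    returns = [
        {"id": "A", "expense_to_income_ratio": 0.9, "cash_intensive_business": 1, "schedule_c_loss_years": 3},
        {"id": "B", "expense_to_income_ratio": 0.4, "cash_intensive_business": 0, "schedule_c_loss_years": 0},
    ]

    THRESHOLD = 60.0  # only returns scoring above this move on to classification
    queue = [r["id"] for r in returns
             if dif_like_score({k: v for k, v in r.items() if k != "id"}) > THRESHOLD]
    print("queued for classification:", queue)  # ['A']

As the report notes, a high score only identifies a return as having audit potential; classifiers and managers still review queued returns before any audit is opened.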

 

* * * * *

 

 

Appendix IV: Small Business/Self-Employed (SB/SE) Selection Methods by Broad Identification Source

 

 

Table 5 shows the selection methods or workstreams by how the returns were identified.

 

Table 5: SB/SE Selection Methods or Workstreams by Broad Identification Source

______________________________________________________________________

Identification source: Referrals to Internal Revenue Service (IRS) audit offices

Description: IRS employees and units, as well as external sources, such as other agencies and citizens, can refer potentially noncompliant taxpayers to SB/SE. SB/SE may start an audit if the referral indicates significant potential for noncompliance.

Field methods or workstreams:
  Internal and external to IRS -- Employee audits; Information reports; Promoter investigations and client returns; Return preparer program action cases and client returns
  External to IRS -- State/other agency; Whistleblower claim

Campus methods or workstreams:
  Internal to IRS -- Area office (nonfiler); Collection (nonfiler); Criminal Investigation; Employment tax; Estate tax (executor fee); Government Liaison and Disclosure (Puerto Rico claims and international); Research; Submissions processing (Alternative Minimum Tax); Tip Program (a)
  External to IRS -- Treasury Inspector General for Tax Administration; State audit referral program (a); Puerto Rican tax agency nonfiler
______________________________________________________________________

Identification source: Related pickups

Description: After opening an audit, SB/SE may identify the taxpayer's prior or subsequent year returns or returns of related taxpayers to audit.

Field methods or workstreams: Various workstreams

Campus methods or workstreams: Various workstreams
______________________________________________________________________

Identification source: User-defined criteria

Description: These criteria use filters or rules embedded in computer software to identify returns with specific characteristics, often for projects. These characteristics generally involve a specific tax issue known or suspected to have high noncompliance in a particular geographic area, industry, or population.

Field methods or workstreams: Compliance initiative project; High income nonfiler; Compliance Data Environment Release 3

Campus methods or workstreams: Compliance initiative project; Potential unreported heavy use tax; Estate and gift tax Form 1041 (certain expenses)
______________________________________________________________________

Identification source: Computer program

Description: Computer programs use rules or formulas to identify potential noncompliance across a type of tax return, rather than for a specific tax issue.

Field methods or workstreams: Discriminant function

Campus methods or workstreams: Audit Information Management Systems Computer Information System -- previously adjusted exam issues on subsequent year filings; Discretionary exam business rules (a)
______________________________________________________________________

Identification source: Data matching

Description: When information on a tax return -- such as wages, interest, and dividends -- does not match information reported to IRS by states, employers, or other third parties, these discrepancies may prompt SB/SE to review returns for audit potential.

Field methods or workstreams: Payment card income pilot

Campus methods or workstreams: Payment card income pilot
______________________________________________________________________

Identification source: Taxpayer initiated

Description: When taxpayers contact IRS to request an adjustment to their tax returns, tax refunds, or tax credits, or ask to have a previous audit reconsidered, SB/SE may initiate an audit after reviewing these requests.

Field methods or workstreams: Audit reconsideration; Claims; Offer-in-Compromise/Doubt as to Liability

Campus methods or workstreams: Category A Claims
______________________________________________________________________

Identification source: Random identification

Description: The National Research Program (NRP) studies tax compliance through audits of a randomly identified sample of tax returns. Specifically, NRP measures voluntary compliance in reporting income, deductions, and credits, among other categories, and generalizes those measures to the population being studied.

Field methods or workstreams: NRP

Campus methods or workstreams: NRP
______________________________________________________________________

Source: GAO analysis of IRS information. | GAO-16-103

Notes: Certain workload can come from different sources. For example, employee audits can be initiated through internal and external referrals, as well as other sources. We categorized based on how IRS characterized the workload identification method or the most frequent source. Referrals can originate from automated sources, as well as individuals. SB/SE's specialty tax (employment, excise, and estate and gift tax) returns are also identified for audit from these seven identification sources. In addition to audits, SB/SE conducts compliance reviews, which we have not included in this table or the scope of this review. We also omitted the frivolous return program because it moved from SB/SE to the Wage and Investment (W&I) division as of November 2014.

FOOTNOTE TO TABLE 5

a SB/SE manages this workload after realignment from W&I in November 2014.

END OF FOOTNOTE TO TABLE 5

______________________________________________________________________

 

 

* * * * *

 

 

Appendix V: Examples of Similarities and Variations across Selection Methods

 

 

Figures 4 and 5 represent general similarities and variations in the Small Business/Self-Employed (SB/SE) return selection process at its field and campus locations, respectively. They do not include every process that occurs in the various methods or workstreams. In addition, the phases and processes in the figures are not necessarily discrete events but may overlap and involve other processes and staff.

 

Figure 4: Example of Selection Processes for Internal Revenue Service (IRS) SB/SE Field Audits

Source: GAO analysis of IRS information. | GAO-16-103

Figure 5: Example of Selection Processes for IRS SB/SE Campus Audits

Source: GAO analysis of IRS information. | GAO-16-103

 

* * * * *

 

 

Appendix VI: Small Business/Self-Employed (SB/SE) Field Audit Sources and Audit Information Management System (AIMS) Source Codes

 

 

The AIMS source code indicates how the return was initially identified for audit. Table 6 shows the number of field audits closed by source code and by grouping of source codes into categories for fiscal year 2014.
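A minimal sketch of this grouping step is shown below, assuming a simple code-to-category mapping drawn from table 6. The counts in the example come from the table; the mapping structure itself is our illustration, not IRS's actual processing.

    # Hypothetical sketch: map individual AIMS source codes to the report's
    # categories, then subtotal closed audits per category. The mapping
    # mirrors table 6; any code not mapped falls into "All other".

    CATEGORY_BY_CODE = {
        "02": "DIF", "20/0158": "DIF",
        "05": "Pickups", "10": "Pickups", "12": "Pickups", "39": "Pickups",
        "40": "Pickups", "44": "Pickups", "50": "Pickups", "64": "Pickups",
        "72": "Pickups", "88": "Pickups",
        "17": "Referrals", "49": "Referrals", "60": "Referrals",
        "65": "Referrals", "70": "Referrals", "71": "Referrals", "77": "Referrals",
        "80": "NRP", "91": "NRP",
        "30": "Claims", "31": "Claims", "32": "Claims", "73": "Claims",
    }

    def subtotal_by_category(closed_counts):
        totals = {}
        for code, count in closed_counts.items():
            category = CATEGORY_BY_CODE.get(code, "All other")
            totals[category] = totals.get(category, 0) + count
        return totals

    closed = {"02": 69_711, "20/0158": 10_382, "05": 16_327, "80": 12_564, "91": 10_967}
    print(subtotal_by_category(closed))  # {'DIF': 80093, 'Pickups': 16327, 'NRP': 23531}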

     Table 6: SB/SE Field Audits by Number and Internal Revenue Service (IRS)
                        AIMS Source Code, Fiscal Year 2014

 ______________________________________________________________________________

 Source                                                        Number of closed
 code      Description                                        SB/SE field audits
 ______________________________________________________________________________

 02        Discriminant Function (DIF) returns                           69,711
 20 (project code 0158)
           Regular (manual) classification -- DIF return                 10,382
           Subtotal, DIF category: 80,093 (22.4% of all closed field audits)

 05        DIF -- filed return related to primary DIF return             16,327
 10        DIF -- filed prior and/or subsequent return
           related to a primary DIF return                               44,724
 12        DIF -- nonfiled delinquent return or Substitute
           for Return (SFR) related to a primary DIF return               7,134
 39        Non-DIF -- Tax shelter related pickup                          2,852
 40        Non-DIF -- Tax shelter related, filed prior
           and/or subsequent return pickup                               51,950
 44        Non-DIF -- Tax shelter related, nonfiled
           delinquent return or SFR                                      36,309
 50        Non-DIF -- Filed return with different taxpayer
           identification number or master file transaction              17,241
 64        Non-DIF -- Non-Tax Equity and Fiscal
           Responsibility Act (TEFRA) pickup related to
           Forms 1065, 1041, and 1120S other than tax shelter                96
 72        Related to a specialist referral                               2,706
 88        Fraud -- Special enforcement                                     240
           Subtotal, Pickups category: 179,579 (50.3% of all closed field audits)

 17        Tax shelter program                                           13,105
 49        Miscellaneous non-DIF -- return preparers                      6,601
 60        Information reports                                            2,642
 65        Collection                                                        54
 70        Other agency requests                                          1,340
 71        Specialist                                                       422
 77        State information                                                278
           Subtotal, Referrals category: 24,442 (6.8% of all closed field audits)

 80        Research and Reference -- National Research
           Program (NRP) -- current                                      12,564
 91        Research and Reference -- NRP-related returns                 10,967
           Subtotal, NRP category: 23,531 (6.6% of all closed field audits)

 30        Claims -- Claim for refund/abatement                          10,018
 31        Claims -- Paid claims for refund                                 444
 32        Claims -- Carryback refund                                       619
 73        Miscellaneous non-DIF -- Taxpayer requests                     8,211
           Subtotal, Claims category: 19,292 (5.4% of all closed field audits)

 01        Tax shelter returns -- computer identified                       580
 03        Unallowable items                                                 60
 06        Correspondence audit -- DIF                                      396
 08        Self-employment tax                                               14
 11        Research and Reference -- Studies, Tests and
           Research                                                       1,415
 20 (non-project code 0158)
           Regular (manual) classification -- other than DIF              9,165
 23        TEFRA related                                                     25
 24        Nonfiler -- local sourced work                                 4,675
 25        Nonfiler -- Strategic initiative                                1,355
 26        Minimum tax program                                               13
 35        Other -- administrative adjustment request                        12
 46        Miscellaneous non-DIF -- Employee returns                        185
 62        Miscellaneous non-DIF -- Compliance Initiative
           Project                                                        9,938
 85        Information Return Program information document
           match                                                             11
 89        Fraud -- Special enforcement-related                             746
 90        Fraud -- regular                                               1,457
 04, 14    Multiple filers; High underreporter (combined)                 11 (a)
           Subtotal, All other category: 30,058 (8.4% of all closed field audits)

           Total -- all field audits: 356,995 (100.0%)
 ______________________________________________________________________________

 Source: GAO analysis of IRS data. | GAO-16-103

                               FOOTNOTE TO TABLE 6

 a Values for individual source codes are suppressed to avoid identification
 of taxpayers.

                            END OF FOOTNOTE TO TABLE 6

 

* * * * *

 

 

Appendix VII: Comments from the Internal Revenue Service

 

 

November 23, 2015

 

 

James R. McTigue, Jr.
Director, Tax Policy and Administration
Strategic Issues Team
U.S. Government Accountability Office
441 G Street NW
Washington, DC 20548

 

 

Dear Mr. McTigue:

Thank you for the opportunity to comment on the draft report titled, IRS RETURN SELECTION: Certain Internal Controls for Audits in the Small Business and Self-Employed Division Should Be Strengthened (GAO-16-103). We appreciate that GAO recognized that IRS has procedures that adhere to internal control standards and provide a level of assurance of fairness and integrity in the return selection process. The report notes that the Small Business/Self Employed Division (SB/SE) demonstrated a commitment to promoting ethical behavior among staff and that there is an awareness of internal control procedures by managers which is important to achieving our mission. The report also found that segregation of duties in the return selection process means that no individual can control the decision-making process and that there are safeguards in place to restrict access to our systems to only authorized users.

The mission of the SB/SE Division is to help small business and self-employed taxpayers understand and meet their tax obligations while applying tax law with integrity and fairness to all. The terms integrity and fairness originate from the IRS Mission statement and all IRS employees, not just those involved in return selection, are expected to carry out their duties with integrity and fairness to all.

Your report states that IRS has not defined fairness or program objectives for audit selection and corresponding performance measures, which leaves its audit program vulnerable to inconsistent return selection or the perception of it. In the SB/SE Examination sphere, the concept of fairness has both a collective and individual component. The IRS takes into account the responsibilities and obligations that all taxpayers share. We pursue those individuals and businesses who fail to comply with their tax obligations to ensure fairness to those who do and to promote public confidence in our tax system, and we discharge these important responsibilities with a focus on taxpayer rights, as embodied in the Taxpayer Bill of Rights (TBOR) and formally adopted by IRS.

Indeed, as reflected in our Policy Statement 4-21, the objective in selecting returns for examination is to promote the highest degree of voluntary compliance on the part of taxpayers through equitably selecting returns which indicate the probability of substantial error. As your report indicates, SB/SE has procedures in place that meet several internal control standards and provide assurance of fairness and integrity in the return selection process. These critical assurances are ensured by design of the overall process -- they are embedded into the foundation of our selection processes, which is designed to objectively select returns with the highest probability of noncompliance by relying on a combination of automated processes, historical data, scoring mechanisms, and data-driven algorithms; and the entire process operates under a comprehensive set of checks and balances and safeguards, all aimed at delivering a process that is fair and equitable by design.

In addition, SB/SE has multiple program level objectives with corresponding performance measures that ensure taxpayers are treated fairly. All employees are evaluated on how well they provide fair and equitable treatment to taxpayers as required by the Internal Revenue Service Restructuring and Reform Act of 1998. This includes employees involved with return selection. Moreover, fairness is embodied in the IRS's approach to taxpayer rights, a responsibility that we take seriously and that is a priority for all IRS employees in their work every day. Many of the rights in the TBOR, a cornerstone document that provides the nation's taxpayers with a better understanding of their rights, are aimed at ensuring that we discharge our mission with fairness and integrity (including, for example, the right to be informed, the right to finality, privacy and confidentiality).

We note that, while your report posits a hypothetical risk to fair case selection from the lack of documented objectives and internal control deficiencies, your report did not identify any instances where the selection of a case was considered inappropriate or unfair. Your report cites employee focus group comments as demonstrating that employees have an inconsistent understanding of fairness in return selection. Ignoring geographic location in return selection is not incompatible with considering geography to avoid examinations of taxpayers that one may know. Both are indications that we consider fairness in selection. The first attempts to balance coverage across the country, and the second attempts to avoid any appearance of bias. Also, we note that fairness of the return selection process is not an individual employee responsibility -- it is the responsibility of upper management to design a process that ensures a fair and objective selection of noncompliant cases. The employees' role with respect to fairness is to perform their duties in a manner that respects and protects taxpayers' rights. We will continue, as we already do, to monitor any claims or situations where there is a question of fairness in return selection; and we will refer to the Treasury Inspector General for Tax Administration any taxpayer complaints that relate to fairness in return selection.

SB/SE has conducted both formal and informal risk assessments of its audit selection procedures as part of longstanding, routine internal controls procedures. For example, all levels of management conduct business performance reviews, annual assurance reviews, and operational reviews. Reviews at the group, territory, area, and headquarters levels are all used as opportunities to formally identify risks, not only in the case selection process but in all aspects of our operations. In addition, the Chief Risk Officer completed a risk based review of the IRS Audit Selection Process in March 2014 with participation from exam functions within SB/SE. To facilitate the review, a standardized template for each examination workstream was prepared that provided the specific steps and individual roles utilized to approve, operationalize and maintain the workstream. The results of this review substantiated that we consistently maintain sound internal controls throughout our examination programs.

Your report cites the lack of documentation for some return selection decisions as an area of internal control weakness, specifically mentioning the decision to "screen out" a previously selected return as an example. Because this activity occurs after the return has been identified and selected, we do not consider it part of the return selection process. Cases are not screened out because they lack a possible compliance issue; they are screened out due to other factors, such as resource constraints or the priority of the workload.

Your report also outlines improvements for monitoring the decisions that are made and the coding used for return selections. SB/SE Field Examination regularly monitors the application of coding and periodically performs a 100 percent inventory validation, which requires that all returns be physically reviewed to ensure they are accurately coded and that corrections be made if errors are found. However, we agree improvements can be made, specifically in the coding of SB/SE Campus Examination returns, and appropriate actions will be taken.
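
As a purely illustrative sketch of what a 100 percent inventory validation pass might look like in code, the snippet below checks every record in an inventory against a set of valid source codes and flags miscoded records for correction. The field name and code values are hypothetical assumptions, not actual IRS source codes.

    # Hypothetical sketch of a full inventory validation pass. The
    # "source_code" field and the code values are illustrative only,
    # not actual IRS data elements.
    VALID_SOURCE_CODES = {"01", "02", "20", "73"}

    def validate_inventory(inventory):
        """Review every record and return those whose source code is
        missing or not in the valid set, so corrections can be made."""
        return [record for record in inventory
                if record.get("source_code") not in VALID_SOURCE_CODES]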

As described in the draft report, the SB/SE Examination return selection process relies on a variety of sources and processes to select returns for audit. SB/SE has 33 different workstreams (or types of work), all of which follow a general, multiphase process for identifying, classifying, and selecting returns for examination. Within this process, there are some variations across the workstreams. For example, campus selections rely on more automated processes, while some field selections, where the casework is more complex, involve more manual processes.

We believe the seven groupings used in your report to describe our workstreams do not provide an accurate picture of our sources of work. For example, related pickups is not a category of work from a workstream perspective, and the use of this grouping is misleading. The least amount of discretion is exercised in Discriminant Index Function (DIF) return selections. Including DIF pickups in this grouping implies that a majority of our work is selected based on non-automated factors, which is simply not accurate. A DIF-sourced initial return is the source of the related pickup; without the primary DIF indicator, the related, prior-year, or subsequent-year returns would have been introduced into the audit workstream only if they were independently identified by DIF.
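
To make that relationship concrete, the sketch below (with hypothetical record fields and matching logic) shows why a related pickup traces back to its DIF-selected primary: related prior- and subsequent-year returns enter the workstream only because a DIF-scored primary return was selected.

    # Illustrative only: the record layout and the matching rule are
    # hypothetical assumptions, not IRS data structures.
    def find_related(primary, all_returns):
        """Hypothetical matcher: same taxpayer ID, different tax year."""
        return [r for r in all_returns
                if r["tin"] == primary["tin"] and r["year"] != primary["year"]]

    def build_workstream(dif_primaries, all_returns):
        """Selected DIF primaries pull their related returns into the
        audit workstream; absent a primary, those related returns would
        enter only if DIF selected them independently."""
        workstream = list(dif_primaries)
        for primary in dif_primaries:
            workstream.extend(find_related(primary, all_returns))
        return workstream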

We agree with the importance of sound internal controls and are committed to their improvement, especially in the areas your report recommends we strengthen. We agree with your recommendations, and the enclosure provides additional detail on the specific IRS actions planned to implement them. In closing, we appreciate and value your continued support and insight as we strive to further strengthen our processes and programs throughout the Service. If you have any questions, please contact me, or a member of your staff may contact Karen Schiller, Commissioner, Small Business/Self-Employed Division, at (202) 317-0600.

Sincerely,

 

 

John M. Dalrymple

 

Deputy Commissioner for Services and Enforcement

 

Enclosure

 

* * * * *

 

 

Enclosure

 

 

IRS RETURN SELECTION: Controls for Audits in the Small Business and Self-Employed Division Should Be Strengthened (GAO-16-103)

 

 

Recommendations for Executive Action

To help ensure SB/SE's audit selection program meets its mission and selects returns fairly, we recommend that the Commissioner of Internal Revenue take the following actions:

Recommendation 1: Clearly define and document the key term "fairness" for return selection activities.

Comment: Fairness for return selection activities has three components: fairness to the taxpaying public, by pursuing those who fail to comply with or otherwise meet their tax obligations; a fair process that objectively selects noncompliant returns for examination across all areas of noncompliance; and fairness to the individual taxpayers being examined, by respecting and adhering to taxpayers' rights. These definitions of fairness are separately documented in our Policy Statement 4-21 and reflected in the TBOR. We agree with your recommendation as to the value of documenting these definitions in a single document, and we will take appropriate action to do so.

Recommendation 2: Clearly communicate examples of fair selections to staff to better assure consistent understanding.

Comment: SB/SE Examination will communicate examples of fairness in the return selection process to the managers and examiners involved in return selection and will incorporate these examples into our training materials, as needed, to further ensure a consistent understanding of fairness.

Recommendation 3: Develop, document, and implement program-level objective(s) to evaluate whether the return selection process is meeting its mission of applying the tax law with integrity and fairness to all.

Comment: SB/SE will review the current objectives for Examination program priorities and will identify and implement any additional objectives needed to address the return selection process.

Recommendation 4: Develop, document, and implement related performance measures that would allow SB/SE to determine how well the selection of returns for audit meets the new objective(s).

Comment: Many of our existing performance measures provide key indicators of, and insight into, our program performance with respect to fairness, including, for example, the rate of examinations resulting in changes proposed to the reported tax, the cycle time for the examination, and the yield from the examination. During SB/SE's review of current objectives for recommendation #3, if new objectives are identified and implemented, SB/SE will develop, document, and implement any additional performance measures needed to assess achievement of the new objective(s).
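
For illustration only, measures of this kind reduce to simple arithmetic over closed cases. The sketch below computes the three cited measures from a set of closed-examination records; the field names are hypothetical assumptions, not SB/SE data elements.

    # Hypothetical sketch: computing the cited performance measures from
    # closed-examination records. Field names are illustrative assumptions.
    def performance_measures(closed_exams):
        """Change rate, average cycle time (days), and average yield
        (dollars) over a nonempty set of closed examinations."""
        n = len(closed_exams)
        return {
            "change_rate": sum(1 for e in closed_exams
                               if e["change_proposed"]) / n,
            "avg_cycle_days": sum(e["cycle_days"] for e in closed_exams) / n,
            "avg_yield": sum(e["assessed_dollars"] for e in closed_exams) / n,
        }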

Recommendation 5: Incorporate the new objective(s) for fair return selection into the SB/SE risk management system to help identify and analyze potential risks to fair selections.

Comment: The current SB/SE risk process requires identifying and assessing risks to business objectives. Any new objectives that are developed in response to recommendation #3 will be considered within the current SB/SE risk management process framework.

Recommendation 6: Develop and implement consistent documentation requirements to clarify the reasons for selecting a return for audit and who reviewed and approved the selection decision.

Comment: SB/SE Examination will evaluate the need to improve the documentation of return selection decisions and of the review and approval process for those decisions.

Recommendation 7: Develop, document, and implement monitoring procedures to ensure that decisions made and coding used to select returns for audit are appropriate.

Comment: SB/SE Examination will review current procedures for monitoring the decisions made and the coding used to select returns for audit and will identify and implement any new procedures needed to ensure that the coding of returns selected for audit is appropriate.

 

* * * * *

 

 

Appendix VIII: GAO Contact and Staff Acknowledgments

 

 

GAO Contact

 

James R. McTigue, Jr. (202) 512-9110, mctiguej@gao.gov

 

Staff Acknowledgments

In addition to the contact named above, Tom Short (Assistant Director), Sara Daleski, Hannah Dodd, David Dornisch, Elizabeth Fan, Ted Hu, Ada Nwadugbo, Robert Robinson, Ellen Rominger, Stewart Small, Andrew J. Stephens, and Elwood White contributed to this report.

GAO's Mission

The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's website (http://www.gao.gov). Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to http://www.gao.gov and select "E-mail Updates."

Order by Phone

The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's website, http://www.gao.gov/ordering.htm.

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537.

Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information.

Connect with GAO

 

Connect with GAO on Facebook, Flickr, Twitter, and YouTube.

Subscribe to our RSS Feeds or E-mail Updates.

Listen to our Podcasts and read The Watchblog.

Visit GAO on the web at www.gao.gov.

 

To Report Fraud, Waste, and Abuse in Federal Programs

 

Contact:

Website: http://www.gao.gov/fraudnet/fraudnet.htm

E-mail: fraudnet@gao.gov

Automated answering system: (800) 424-5454 or (202) 512-7470

 

Congressional Relations

 

Katherine Siggerud, Managing Director, siggerudk@gao.gov, (202) 512-4400, U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, DC 20548

 

Public Affairs

 

Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800, U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, DC 20548