2051 Using system-generated reports
Sep-2022

Requirement to evaluate reliability of reports

CAS Requirement

When using information produced by the entity, the auditor shall evaluate whether the information is sufficiently reliable for the auditor’s purposes, including, as necessary in the circumstances (CAS 500.9):

(a) Obtaining audit evidence about the accuracy and completeness of the information; and

(b) Evaluating whether the information is sufficiently precise and detailed for the auditor’s purposes.

CAS Guidance

In order for the auditor to obtain reliable audit evidence, information produced by the entity that is used for performing audit procedures needs to be sufficiently complete and accurate. For example, the effectiveness of auditing revenue by applying standard prices to records of sales volume is affected by the accuracy of the price information and the completeness and accuracy of the sales volume data. Similarly, if the auditor intends to test a population (for example, payments) for a certain characteristic (for example, authorization), the results of the test will be less reliable if the population from which items are selected for testing is not complete (CAS 500.A60).

Obtaining audit evidence about the accuracy and completeness of such information may be performed concurrently with the actual audit procedure applied to the information when obtaining such audit evidence is an integral part of the audit procedure itself. In other situations, the auditor may have obtained audit evidence of the accuracy and completeness of such information by testing controls over the preparation and maintenance of the information. In some situations, however, the auditor may determine that additional audit procedures are needed (CAS 500.A61).

In some cases, the auditor may intend to use information produced by the entity for other audit purposes. For example, the auditor may intend to make use of the entity’s performance measures for the purpose of analytical procedures, or to make use of the entity’s information produced for monitoring activities, such as reports of the internal audit function. In such cases, the appropriateness of the audit evidence obtained is affected by whether the information is sufficiently precise or detailed for the auditor’s purposes. For example, performance measures used by management may not be precise enough to detect material misstatements (CAS 500.A62).

OAG Guidance

OAG Audit 1051 Sufficient appropriate audit evidence details the standards with respect to the sources of audit evidence and the relevance and reliability of the evidence. The reliability of reports depends on the reliability of the data and information they contain. This section provides more practical guidance on evaluating the reliability of reports that management uses in the course of its control activities and that we use as audit evidence.

The content of this section should be considered in conjunction with OAG Audit 1051.

Understanding report reliability

OAG Guidance

Report reliability, based on the reliability of data, can have an impact for both the entity and the auditor. For the entity, it affects the quality of record keeping, reporting, and business decisions regarding entity operations. As a result, understanding the reliability of the reports and the information the entity uses can be important when evaluating the strength of the entity’s internal control framework.

For the auditor, it can affect the audit risk assessment and the degree to which reports can be relied upon to plan and perform the audit. When using entity-produced information, an evaluation should be undertaken to ascertain whether the information is sufficiently reliable for the auditor’s purpose, including, as required by the circumstances, obtaining audit evidence concerning the accuracy and completeness of the information and evaluating the sufficiency of the precision and detail of the information for the auditor’s purpose.

The auditor will seek audit evidence that the information included in the report is accurate. There are two components of accuracy to be considered. The first is mathematical accuracy: for example, that a report has been totalled correctly or that calculations are mathematically accurate. The second is accuracy of the underlying data: for example, that data such as amounts or dates in the report are consistent with the original inputs entered at the source. Assessing the completeness of the report includes verifying that all relevant data and information are included to achieve the intended purpose of the report. This requires the auditor to consider the degree of correctness and precision needed for the report to be used as audit evidence for the auditor’s purpose.

Auditors need to consider the reliability of reports in all phases of the audit, from planning the audit to developing the audit strategy, performing the audit procedures, determining audit conclusions, and making recommendations to the entity. All reports used to support engagement findings, conclusions, or recommendations should be assessed for reliability. The assessment needs to be appropriate for the specific engagement purposes, and the results of the assessment need to be evaluated and documented. The auditor needs to assess if there is an audit risk associated with the possibility that the report is insufficiently reliable.

Identifying reports relevant to the audit

OAG Guidance

Auditors will likely be using different types of reports (e.g., financial summaries, sub-ledger reports, Excel reports, data extracts, etc.) as forms of audit evidence. It is important for auditors to identify which reports they need to consider and assess in the course of their audit. Financially significant reports, including those produced by End-User Computing (EUC) tools (e.g., ad hoc reporting tools or queries run from a data warehouse), are those that generate financial data that is used in controls in the control activities component or relied upon for other audit procedures. As a starting point, we understand the significant processes and accounts that are in scope for our evaluation and identify the relevant reports or other EUC tools that are used to support these accounts or processes.

For all significant business processes and financial statement line items (FSLIs), auditors must identify reports that may be relevant from an audit perspective. Consider the following examples of how reports can be used by the entity or the auditor:

  • in the performance of a control activity,
  • as the basis for information in developing expectations for substantive analytics,
  • as the basis for information in the creation or verification of financial estimates,
  • to summarize data for the purposes of sample selections, and
  • to summarize data for financial reporting purposes.

Understanding the nature and source of the report

OAG Guidance

Once we have identified relevant reports, we need to plan the audit approach for assessing the reliability of the report where the report will be part of our audit evidence. To do so, we need to understand the nature and source of the report; this includes considering the source of the report and the source of the data. Reports can be from an internal or external source, and data can be integrated from a variety of sources. The source and nature of the report can affect its reliability, which also depends on the individual circumstances under which it is obtained.

Source of reports (external)

While recognizing that exceptions may exist, the reliability of audit evidence is increased when it is obtained from independent sources outside the entity. Evidence from credible third parties is typically more reliable than evidence generated within the audited organization.

Examples of external report sources might include

  • industry or competitor reports;

  • government department or agency reports;

  • publicly reported entity information (e.g., management’s discussion and analysis of results of operations in annual budget analysis, webcasts, and conference calls).

Even when information to be used as audit evidence is obtained from sources external to the entity, circumstances may exist that could affect its reliability. For example, information obtained from an independent external source may not be reliable if the source is not knowledgeable, or if biases exist. The report may also be less reliable if the source of the underlying data in the external report was originally obtained from the audited entity (e.g., shared databases or circular reporting).

An auditor may use reports obtained from independent sources for audit evidence in two ways. The reports may be used as direct audit evidence or may be used indirectly to corroborate internal information.

Source of reports (internal)

Internal reports may be produced from different types of systems, and the information can be obtained from a variety of sources, such as

  • off-the-shelf packages for standard reports;

  • reports routinely generated by programmed or configured financial accounting systems using live transaction and standing data files;

  • reports generated through end-user computing (EUC) tools, such as spreadsheets or ad hoc queries/scripts run by end users;

  • reports generated from other sources:

    • operational information generated from non-financial systems,
    • audit trail logs generated from the IT systems,
    • special request reports,
    • scripts run on IT security settings to support monitoring of restricted access,
    • reports on the number of employees.

The source of the information (e.g., the system that generates the report and the database of underlying data) can affect the risk that the information is incomplete or inaccurate. As a result, the auditor considers this risk when developing the audit strategy. Some considerations are listed below.

Off-the-shelf Packages for Standard Reports

An off-the-shelf package is software that is purchased from a third-party vendor, installed, and used with limited IT support. If the entity is able to modify the package (e.g., add, change, or delete underlying code), the system may not fall under the definition of “off-the-shelf.” We also consider whether it is a reputable package by determining who the supplier is, whether it is a well-established package, and whether it is widely used. Generally, these off-the-shelf packages have standard reports that have been programmed as part of the software functionality. These types of reports may be referred to as “canned” reports (vs. “modified,” if they are modifiable) or as being created by a “report writer.” Examples may include trial balance reports, accounts payable sub-ledger reports, bank reconciliation reports, and monthly budget reports.

Lower risk is often associated with off-the-shelf package reports, as they have been programmed by an independent source, restrictions to the source code prevent entity changes, reports tend to be less complex, user input/actions are limited, and the reports would have been subject to independent quality control tests.

There could be an elevated risk for off-the-shelf package reports that allow modifications to the canned reports. The auditor can determine whether this is the case through inquiry and the performance of a walk-through.

Information Routinely Generated by Programmed or Configured Financial Accounting Systems Using Live Transaction and Standing Data Files

Similar to the off-the-shelf standard reports, an entity may have programmed or configured reports to be generated. The difference is that these reports would have needed to be programmed internally by the entity, but similarly, once developed, the reports require minimal user input/action in order to be generated. Examples may include aging reports, exception reports, rejected transaction reports, and listings of transactions meeting pre-defined attributes.

Internally programmed or configured reports have a higher risk associated with them than the off-the-shelf package reports. Risk considerations would include the qualifications of those programming the reports, proper segregation of duties, quality control over testing of the reports, and ability to change the programming.

Reports Generated Through End-user Computing (EUC) Tools

  • Spreadsheets—Spreadsheets can range in complexity, but in general have open access to code and formulas and require manual input of data. Examples may include an amortization schedule or a calculation of accrued interest payable developed in an Excel or Lotus spreadsheet using manual input of source data by treasury personnel. Other examples may include system-generated reports that are exported to an EUC tool. Spreadsheets are usually informally developed by users and are hosted on LAN or local drives rather than a controlled IT environment; the resulting risks are discussed under “Nature of spreadsheets” below.

  • Ad hoc queries/scripts run by end users—Ad hoc queries/scripts may be run from a desktop management reporting system. These queries are based on manually input criteria and developed on an ad hoc basis as needed by management. The resulting output is often a basic data extract that can then be imported into a spreadsheet for additional manipulation and analysis. Examples may include an extract of cash transactions meeting specified criteria or an extract of data regarding key performance indicators, which are used to perform a relevant control.

Nature of spreadsheets

Due to the nature of spreadsheets (i.e., open access, manual input of data, susceptibility to error) and the nature of the environment in which they are typically developed and implemented (e.g., informally developed by users, hosted on LAN or local drives rather than a controlled IT environment), there is a heightened risk that controls over spreadsheets will not be effective, or will not be feasible or practical to audit. Consider the design and, where relevant, testing of controls within the spreadsheets, and also consider the ITGC-like controls which support the continued integrity of these embedded controls.

Avoid the tendency to seek ITGC-like controls over non-complex spreadsheets, as such controls often do not exist or are not apt to be as effective as the more typical user or supervisory controls over data input and output and the accuracy of calculations. More complex spreadsheets may warrant more complex, ITGC-like controls. The same considerations may be applicable to other end-user computing applications important to the audit (e.g., management queries of information in a data warehouse to support an accounting entry).

Risks associated with spreadsheets and/or other EUCs

In determining which spreadsheets or EUC tools we need to understand, we identify:

  • where information from these tools underlies a control we intend to rely upon or other audit procedure, and
  • the risk of material misstatement associated with the information.

The risk of material misstatement may be impacted by the:

  • Nature and significance of the account or process that the spreadsheet or EUC tool supports
  • Complexity of the spreadsheet or EUC tool

For example, a highly complex workbook used to perform consolidation accounting would have a greater risk of material misstatement than a simple spreadsheet that totals information for purposes of verifying an entry on a reconciliation. As the risk of material misstatement increases, the extent of the controls understood or tested around the spreadsheets increases.

Below are some factors to consider when evaluating the complexity of a spreadsheet or database:

  • susceptibility to error (e.g., input error or logic error);
  • method of spreadsheet creation (e.g., automatic extraction/manually created);
  • use of macros and linked spreadsheets/databases;
  • use of complicated referencing, calculations, and pivot tables;
  • number of queries, reports, and joins used;
  • number of indices, fields, data records, columns, rows, or workbooks used;
  • uses of the output;
  • number of users of the spreadsheet or database; and
  • frequency and extent of changes and modifications to the spreadsheet or database.

The complexity of spreadsheets or EUC tools may be categorized as low, moderate, or high. For example:

  • LOW—An electronic logging and information tracking application such as a listing of open claims, unpaid invoices, and other information that previously would have been retained in physical file folders.

  • MODERATE—Spreadsheets or EUC tools which perform simplistic calculations, such as using formulas to total certain fields or calculate new values by multiplying two cells. These spreadsheets or EUC tools can be used as methods to translate or reformat information, often for analytical review and analysis, for recording journal entries, or for making a financial statement disclosure.

  • HIGH—Spreadsheets or EUC tools that support complex calculations, valuations, and modelling tools. These spreadsheets or EUC tools are characterized by the use of macros and multiple supporting spreadsheets where cells, values, and individual spreadsheets are linked. These spreadsheets or EUC tools might be considered “applications” (i.e., software programs). They are often used to determine transaction amounts or as the basis for journal entries into the general ledger or financial statement disclosures.

The importance of the integrity and reliability of the information generated by spreadsheets or EUC tools increases as the complexity progresses from low to high, and as usage increases.

Reports generated from other sources

Internal reports produced from systems and records that are separate and distinct from the accounting records or that are not subject to manipulation by persons in a position to influence accounting activities are generally considered more reliable than internal accounting data.

Examples may include reports generated from inventory systems that are separate and distinct from the financial reporting systems or personnel information that is maintained in a distinct human resource (HR) system. Audit trail logs generated from the operating system or scripts run based on IT security settings to support monitoring of restricted access are also examples of non-financial reports.

There is an inherent risk associated with these types of reports, as they are still internally generated by the entity and therefore a less reliable source than an external report. However, typically the data within those systems is managed and maintained by individuals who are segregated from the finance department, thus increasing the reliability of the information for audit purposes. Consideration still needs to be given to how those reports are generated, as the risk factors identified above with respect to EUC tools could still apply.

Source of data

The source of the data is as important as the source of the report and can influence the reliability of the report. As part of the risk assessment process described in OAG Audit 5030 Understand the entity’s system of internal control, including IT environment, auditors are required to understand and evaluate the design and implementation of relevant control activities. This includes the relevant control activities surrounding the flow of transactions in which the data is input into the system, for the purpose of identifying and assessing the risk factors over the input of that data. Subsequently, the input data may be maintained within the system and can be extracted directly, based on the configuration of the specific report.

In today’s complex IT environment, the data is not always extracted directly from the system in which it was input. Data may be stored in a separate database or another system. The auditor must then understand the relevant control activities surrounding the transfer or extraction of data. Consider the following factors (a brief reconciliation sketch follows this list):

  • automated interfaces between systems;

  • controls that are effectively designed and implemented to ensure that data transmitted between systems is complete and accurate;

  • the nature of those controls: whether they are fully automated or include a manual component (e.g., an individual who reconciles the information or reviews error reports).
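
Where data is transferred between systems, one practical way to picture the completeness and accuracy checks described above is a reconciliation of record counts, control totals, and record-level identifiers between the source extract and the target load. The following is a minimal sketch only; the file names and the amount and txn_id columns are illustrative assumptions, not a prescribed OAG tool.

    # Minimal sketch of an interface reconciliation: compare record counts,
    # control totals, and record-level identifiers between a source-system
    # extract and the data loaded into the target system.
    # File names and column names are illustrative assumptions.
    import pandas as pd

    source = pd.read_csv("source_extract.csv")   # e.g., billing system extract
    target = pd.read_csv("target_load.csv")      # e.g., data warehouse load

    # Completeness: the same number of records should arrive as were sent.
    counts_agree = len(source) == len(target)

    # Accuracy (control total): amounts should agree in aggregate.
    totals_agree = round(source["amount"].sum(), 2) == round(target["amount"].sum(), 2)

    # Completeness at the record level: no transaction IDs dropped or added.
    ids_agree = set(source["txn_id"]) == set(target["txn_id"])

    print(f"Record counts agree: {counts_agree}")
    print(f"Control totals agree: {totals_agree}")
    print(f"Transaction IDs agree: {ids_agree}")

Where the reconciliation is performed by the entity as a control, the same logic helps us evaluate whether that control, manual or automated, addresses both completeness and accuracy.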

The source of the underlying data may not always be evident to the auditor. The auditor should ask the individual who is responsible and accountable for the report and its underlying data (i.e., the report owner/data custodian) to determine the source of the underlying data. That individual may not be the end user of the report, and in the case of reports that have long been established, it may be unclear who is ultimately responsible. Unclear assignment of accountability and responsibility can indicate a weak internal control environment, which has an impact on data integrity and can increase the risk that the report is not complete and/or not accurate.

Understanding how the entity uses the report

OAG Guidance

Purpose of the report

The relevance of the report, and the degree of correctness and precision required, will be influenced by the manner in which the entity uses the report.

It is important to understand how management uses a report, as the importance of the integrity and reliability of the report increases as the usage increases and as the significance of the account or process increases. In many entities, much of the information used for carrying out monitoring of controls and control activities, such as business performance reviews, will be produced by the entity’s information system. If management uses its business performance review as a relevant control to detect material misstatements, then any reports that are used to develop the business performance review as well as the final report become relevant to the auditor. If management places a significant amount of reliance on that control, then the report integrity is more important, as a significant error or misstatement in the report can have a material impact on the financial reporting process. If management performs that same control but places less reliance on that control and relies on other relevant controls in the business process, then the required precision level for the report decreases.

Throughout the course of the audit, the entity may also provide us with audit evidence that it has prepared, either in the ordinary course of business or specifically at our request. Reports could include analytical data that may be useful in our efforts to perform risk assessment, substantive, and/or overall conclusion analytical procedures. Reports may be used to look at the composition of significant financial balance(s), either on an aggregate or disaggregated basis (e.g., based on type of transaction, period of time, source, etc.).

Entity’s assessment of report reliability

As part of OAG Audit 5030 Understand the entity’s system of internal control, including IT environment, we need to evaluate how management obtains evidence that the report is sufficiently reliable for its purposes. It is management’s responsibility to ensure that relevant controls are designed and implemented to provide reasonable assurance that operational and control objectives will be met. Report integrity can affect management’s ability to achieve its objectives and thus needs to be considered as part of the business process. Often, management will establish an internal control framework for report reliability, particularly for those reports that are considered more valuable or relevant to the entity and its objectives.

Factors to consider when assessing management’s internal control framework for report reliability include

  • nature of the reports (e.g., external vs. internal);
  • source of the reports (e.g., system and/or data);
  • purpose of the report (e.g., why, when, and where the reports are needed);
  • management commitment (e.g., strength of the control environment as it relates to report reliability);
  • how the report is created, maintained, or modified;
  • frequency and manner in which report reliability is evaluated;
  • resources available to evaluate and monitor the effectiveness of the controls over report reliability.

The audit procedures that we eventually design to gather sufficient audit evidence regarding report reliability will depend on how management obtains assurance that the report and the underlying information is complete and accurate.

In practice, entities may not develop sufficient control activities to assess the integrity of the report. Management could assume that the report is accurate and complete without having a basis for that assumption. Consequently, errors or omissions may exist in the report, rendering the business performance review or manual control ineffective, or any financial estimates or balances based on that data inaccurate. Often, management may rely on its own judgment and its trust in the system to assess whether the reports are accurate and complete. This informal approach should not be disregarded, but rather should be the indicator that additional questions need to be posed.

Consider the following inquiries when trying to establish the strength of management’s assessments:

  • What level of expertise and knowledge of the report and its underlying data does management have?

  • Why does management feel comfortable with it?

  • Is it based on previous experience?

  • What is management’s previous experience?

  • Is management able to substantiate the report by using knowledge obtained from other sources (e.g., operational managers, other activities that it performs)?

  • Does management inherently look at key relationships or at data analytics?

  • Does management randomly verify formulas in a spreadsheet?

  • Does management assess whether the data reflects its expectations?

  • If management identifies an error in the report, does it reassess the rest of the information, or let it go as a one-time error?

  • Does management look for key transactions to ensure they are included properly in the report?

  • How would management identify missing information?

  • Does management consider the source of the report and the qualifications of the person/integrity of the system that created it?

Often, a significant amount of knowledge resides with the individual. Although it is not always a documented process, there may be potential for the auditor to obtain some level of assurance. Auditors should use their professional judgment in assessing report integrity and can consider the following factors during their assessment.

Comfort factors

  • System or process was designed by people with a good understanding of data and report integrity
  • Effective procedures and controls over report integrity are important to entity management
  • Report integrity is very important to the users of the reports
  • There is evidence that the entity would change processes or procedures if problems had been identified
  • Report integrity was evaluated using sound sampling principles
  • There is a consistent, good approach to data management over time (years)
  • Report comes from an established vendor product
  • Entity personnel are knowledgeable (including how the data is used and by whom) and experienced
  • There is evidence of good procedures and controls
  • Report is subject to review and approval

Discomfort factors

  • System is not documented
  • Reports are generally accepted without question
  • Report and data come from one person
  • There is a lack of adequate review and approval
  • Older data from a prior period is used
  • Reports are non-standard and undocumented, or come from an end-user computing (EUC) tool
  • Reports are highly complex and/or require manipulation of data
  • There is an absence of adequate validity checks
  • Formulas may be subject to error
  • There is an inability to rely on trend or pattern analysis

Developing an audit strategy

OAG Guidance

The approach to testing reports is similar to the approach used by the auditor to develop the overall audit strategy. Refer to OAG Audit 4024 Develop the Testing Strategy.

The auditor shall design and perform further audit procedures whose nature, timing, and extent are based on and are responsive to the assessed risks of material misstatement. In designing the further audit procedures to be performed, the auditor should consider the likelihood of material misstatement due to the particular characteristics of the report and underlying data (that is, the inherent risk) and whether there are relevant controls (that is, the control risk), thereby requiring the auditor to obtain audit evidence to determine whether the controls are operating effectively.

Regardless of the rigour with which the entity prepares its report or assesses the report integrity, we cannot accept the entity’s report as a source of audit evidence without independently assessing the integrity. Our audit conclusions are based on applying our professional judgment to determine the quantity and quality of audit evidence required to support that conclusion. In testing reports, we must keep this in mind and consider that the greater the audit risk, the greater the quantity and quality of evidence required to address that risk.

Nature and extent of testing

The auditor’s assessed risks may affect the types of audit procedures. For example, when an assessed risk is high (refer to the comfort and discomfort factors listed above), the auditor will require a higher quality of evidence; therefore, confirming the report with an independent source or performing targeted testing may be required. If an assessed risk is lower because of the particular nature and source of the report, without consideration of the related controls, then the auditor may determine that substantive analytical procedures alone provide sufficient appropriate audit evidence.

Further, certain audit procedures may be more appropriate for some assertions than others. For example, tests of controls may be most responsive to the assessed risk of misstatement of the completeness assertion, whereas substantive procedures may be most responsive to the assessed risk of misstatement of the accuracy assertion. The extent of an audit procedure that is judged to be necessary is determined after considering the materiality, the assessed risk, and the degree of assurance the auditor plans to obtain. When a greater degree of assurance is required, or if there is a higher risk of material misstatement, the auditor may increase the quantity of the evidence or obtain evidence that is more relevant or reliable.

Degree of assurance

The degree of assurance the auditor needs to obtain will depend on the purpose of the report and its importance as a source of audit evidence. For example, if the report is the sole source of evidence for a highly complex and financially significant estimate, a higher degree of assurance will be required.

Deciding whether we have sufficient appropriate audit evidence is a question of the engagement team’s judgment, weighing the materiality and risk factors against the quality of the evidence provided, the rigour of our testing, and the results of all our work. When making these decisions, we consider that certain audit procedures may be more appropriate for some assertions than for others. The degree of assurance required will be influenced by the purpose for which management or the auditor is using the report.

Consider the following actions when developing the audit strategy:

  • Summarize reports that may be used in the financial reporting process for significant financial statement line items (FSLIs) or disclosures;

  • Review the assessment of the entity’s internal control framework as it relates to the summarized reports;

  • Review the identified and documented risks of material misstatement;

  • Evaluate expectation to place any reliance on the operating effectiveness of controls and consequently the evidence expected to be gathered from substantive testing (which collectively can be described as our testing strategy);

  • Determine the impact of IT on the audit procedures, including the availability of data and the expected use of computer assisted audit techniques (CAATs);

  • Determine whether we expect to place reliance on the work of internal audit, service organizations, or audit evidence obtained in previous audits.

Developing audit procedures

A variety of audit techniques can be used to obtain audit evidence. Based on the information gathered, use your professional judgment to determine the most efficient approach to obtain the desired level of evidence.

  • Control testing

    • Manual controls (which may include controls over end-user computing (EUC) tools or management controls)
    • Automated controls or IT dependent manual controls (ITDMC) (which includes assessing application controls and Information Technology General Controls (ITGCs))
  • Substantive testing

    • Substantively test the report to reliable source documents
    • Use technology to perform testing
    • Perform data or trend analysis
    • Perform accept-reject testing

Develop audit procedures using control testing

The controls tested as part of the audit, and the extent of testing, will need to be considered in each case based on the nature of the controls over the underlying information. Control testing can be an excellent source of audit evidence. When controls are effective, the auditor generally has greater confidence in the reliability of the report. The following guidance is focused on the integrity of the report itself; however, the auditor still needs to consider separately the integrity of the underlying data and the control activities over the actual data input.

Testing the Entity’s Manual Controls

An auditor can rely on procedures the entity performed to assess the integrity of the report if the auditor is satisfied with the design and operating effectiveness of the related controls. Obtain an understanding of how the entity assesses the reliability of the report, whether the entity is using a sufficiently rigorous process, and whether the design is considered effective; then obtain evidence on how the entity

  • compares the report to reliable sources external to the company;

  • compares the report to internal information that is reliable and verifiable; and

  • relies on corroboration by individuals with the requisite knowledge, experience, and analytical skills who are not involved with the process and systems that generate the underlying information.

Examples:

  • The manager of Accounts Payable verifies the accuracy of the payables listing by selecting a small sample of planned payments (randomly or based on risk) and vouching the vendor and amount to source documentation. She assesses the completeness of the report by examining it for significant transactions for which she authorized payment (e.g., signed the invoice) earlier in the week. She signs the payables listing as authorization to proceed with payment and considers this to be evidence that she has evaluated the completeness and accuracy of the report.

  • The sales director in a consumer packaged goods company may perform a business performance review (BPR) over recorded sales by using reports of sales volumes by major customer, corroborated by information and knowledge he has gained through sales staff meetings and customer visits, as well as his general knowledge of sales prices.

  • The chief operating officer for a shipping company may perform a weekly BPR control over recorded fuel costs by using external fuel prices indices and a weekly fuel consumption report submitted by each vessel that she can corroborate with her own knowledge about the operating performance of the individual ships.

Document the understanding of the control and management’s procedures for corroborating the underlying information as well as the nature, timing, and extent of the audit procedure performed.

The operating effectiveness of controls over reports and non-financial information may often be tested in conjunction with other tests of controls. For example, in establishing controls over the processing of sales invoices, an entity may include controls over the recording of unit sales. In these circumstances, the auditor may test the operating effectiveness of controls over the recording of unit sales in conjunction with tests of the operating effectiveness of controls over the processing of sales invoices. Similarly, if the control performer assesses the report’s integrity as part of the control activity, that assessment is part of the control design, and the two can be tested together. For example, if a manager reviews the BPR and, in doing so, looks for key relationships and material transactions and tests the mathematical accuracy (or is able to identify mathematical errors based on the key relationships), the report can be considered reliable if the control is considered effective.

Test the Relevant Application or End-user Computing (EUC) Tool Controls and Information Technology General Controls (ITGCs)

Understanding the nature and source of the report will be important in determining whether ITGC or application controls are relevant in aiding the auditor in gathering audit evidence over the report. The design of controls for off-the-shelf packages and system-developed reports will differ from the design of controls over spreadsheets or EUC tools.

Avoid the tendency to seek ITGC-like controls over non-complex spreadsheets, as such controls often do not exist or are not apt to be as effective as the more typical user or supervisory controls over data input and output and the accuracy of calculations. More complex spreadsheets may warrant more complex, ITGC-like controls. The same considerations may be applicable to other end-user computing tools important to the audit (e.g., management queries of information in a data warehouse to support an accounting entry).

Information routinely generated by off-the-shelf packages or programmed financial accounting systems

  • When we intend to rely on ITGCs and application controls, perform all of the following:
    • Assess the reliability of the source data. This will most often involve testing the effectiveness of relevant manual and automated controls designed to ensure the completeness, accuracy, and validity of the source data.

    • Determine that the program or routine in the system functions as intended and draws from appropriate, reliable data sources.

  • Example procedures include
    • Test the effectiveness of relevant ITGCs designed to support the ongoing integrity and reliability of both the system(s) and source data, and revalidate the automated program logic anytime it is changed.

    • For widely distributed off-the-shelf packages, standard reports may be deemed accurate if we obtain sufficient appropriate audit evidence of the client’s inability to access or change the applicable vendor source code (e.g., through a combination of specific inquiries with appropriate client IT personnel, system-based evidence such as modification date stamps, or discussion with an IT Audit specialist who has knowledge and experience of the package). Our rationale for reliance on the off-the-shelf package and the evidence obtained would be documented.

End-user computing (EUC) spreadsheet—Control considerations

  • Spreadsheets can be easily changed and may lack certain control activities, which results in an increased inherent risk of errors such as:

    • Input errors: Errors that arise from flawed data entry, inaccurate referencing, or other simple cut-and-paste operations;

    • Logic errors: Errors in which inappropriate formulas are created and generate improper results;

    • Interface errors: Errors arising from the import or export of data with other systems;

    • Other errors: Errors such as inappropriate definition of cell ranges, inappropriately referenced cells, or improperly linked spreadsheets.

  • Spreadsheet controls may include one or more of the following:

    • ITGC-like controls over the spreadsheet,

    • controls embedded within the spreadsheet (similar to an automated application control),

    • manual controls around the data input and output of the spreadsheet.

  • Similar to other applications, preserving the integrity of controls and calculations embedded within spreadsheets is dependent on the continued effective operation of ITGC-like controls over the spreadsheet. If securing controls embedded within the spreadsheet is important, the ITGC-like controls we expect to see over spreadsheets need to be similar to those over other applications used by the entity. Refer to Appendix A for detailed examples of appropriate spreadsheet controls.

  • The level of controls implemented is considered relative to the spreadsheet’s use, complexity, and required reliability of the information. For more significant amounts and/or spreadsheets with higher complexity, it may be very difficult to achieve an adequate level of control without migrating these functions to an application system with a more formalized information technology control environment.

  • It is likely that the entity will not have implemented many, or any, “industrial strength” ITGC-like controls over their spreadsheets that can be tested. In most cases, the entity essentially treats spreadsheets like a manual control. As a result, focus on those manual controls over the data input into the spreadsheet and the output calculated by the spreadsheet.

  • Where the spreadsheets are relevant to the audit, we evaluate the design and implementation (and, if seeking to rely on controls, test operating effectiveness) of controls over the data input and output from the spreadsheet to achieve information processing objectives.

  • Many of the controls around spreadsheets can be similar to typical ITGCs; consider the involvement of an IT Audit specialist in understanding and testing such controls.

Develop audit procedures using substantive testing

We may determine that it is appropriate to assess the reliability of underlying information by testing it substantively. Such a determination needs to consider the results of procedures we performed to obtain an understanding of internal control to plan the audit. In obtaining this understanding, consider an entity’s use of information technology to initiate, record, process, and report transactions or other financial data. If we become aware of evidence that calls into question the expected reliability of the underlying information, substantive testing alone may not be an effective approach, as the nature, timing, and extent of testing needed of the underlying information would likely be prohibitive.

Substantive tests may include procedures such as tracing the report details to source documents or reconciling the report to independent, reliable sources (e.g., testing the accuracy of an aging report by tracing the details back to sales invoices).
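
As an illustration of the tracing and reconciliation procedures described above, the sketch below matches each line of an entity-produced report to an independent file of source documents and flags both completeness exceptions (unmatched items) and accuracy exceptions (date or amount differences). This is a minimal sketch under assumed inputs; the file names and the invoice_no, amount, and date columns are illustrative.

    # Minimal sketch of tracing report details to source documents: match each
    # report line to the underlying sales invoice file on invoice number and
    # flag any date or amount differences. File and column names are illustrative.
    import pandas as pd

    report = pd.read_csv("aging_report.csv")       # entity-produced report
    invoices = pd.read_csv("sales_invoices.csv")   # independent source documents

    merged = report.merge(invoices, on="invoice_no", how="outer",
                          suffixes=("_report", "_source"), indicator=True)

    # Completeness exceptions: items in one file but not the other.
    unmatched = merged[merged["_merge"] != "both"]

    # Accuracy exceptions: matched items whose amounts or dates disagree.
    matched = merged[merged["_merge"] == "both"]
    mismatched = matched[
        (matched["amount_report"].round(2) != matched["amount_source"].round(2)) |
        (matched["date_report"] != matched["date_source"])
    ]

    print(f"Unmatched items: {len(unmatched)}; amount/date mismatches: {len(mismatched)}")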

Substantive analytical procedures

Using substantive analytical procedures may be an efficient and effective way to test the reliability of reports and data. Substantive analytics consist of evaluating financial information through analysis of plausible relationships among both financial and non-financial data. Analytical procedures also encompass a necessary investigation of identified fluctuations or relationships that are inconsistent with other relevant information or that differ from expected values by a significant amount.

When using substantive analytical procedures to test the reliability of reports and data, we should consider using information that has already been audited or using third-party information. When we must use unaudited reports or data to build our expectation or to analyze a plausible relationship, we must also evaluate the reliability of that report or data before placing reliance on it.

Analytical procedures to test the reliability of reports and data should be conducted in accordance with OAG Audit 7030.
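
To make the approach concrete, the sketch below develops an independent expectation (units shipped multiplied by standard price, both taken from sources already assessed as reliable) and flags periods where reported revenue departs from the expectation by more than a set threshold. The figures and the 5 percent threshold are illustrative assumptions; in practice, the expectation and threshold would be developed following OAG Audit 7030.

    # Minimal sketch of a substantive analytic over a report: develop an
    # expectation and investigate months where the reported amount differs
    # from the expectation by more than a set threshold.
    # All figures and the threshold are illustrative assumptions.
    monthly = [
        # (month, units_shipped, standard_price, reported_revenue)
        ("Jan", 1_000, 50.0, 50_200.0),
        ("Feb", 1_100, 50.0, 55_100.0),
        ("Mar",   900, 52.0, 61_800.0),   # deliberately off, for illustration
    ]

    THRESHOLD = 0.05  # investigate differences above 5% of expectation

    for month, units, price, reported in monthly:
        expected = units * price
        diff_pct = abs(reported - expected) / expected
        flag = "INVESTIGATE" if diff_pct > THRESHOLD else "ok"
        print(f"{month}: expected {expected:,.0f}, reported {reported:,.0f}, "
              f"difference {diff_pct:.1%} -> {flag}")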

Targeted testing

Targeted testing involves selecting items to be tested based on a particular characteristic. This is our preferred approach for tests of details, as it provides the opportunity to exercise judgment over which items to test. Targeted testing can be applied to either a specific part of the data or the whole of the data. The results from targeted testing are not projected to the untested items in a population. See OAG Audit 7042. The auditor must understand the purpose of the report and how management and the auditor will use it to identify an appropriate characteristic for testing.

Use technology to perform substantive tests

Using technology to verify the completeness and accuracy of complex or large-volume reports can be an efficient audit approach. The audit team may require assistance from an IT Audit specialist, depending on the nature of the test. The involvement of the IT Audit specialist should be documented in the audit file in the IT Audit Planning Memo. If the audit team is testing the report in conjunction with an IT Audit specialist, roles and responsibilities should be clearly documented in the IT Audit Planning Memo.

The following are examples of substantive tests that can be done through the use of technology:

  • Replicating the report by running our own independent queries or programs on the actual source data (i.e., reperformance of the report);

  • Evaluating the logic of the programmed report or ad hoc query, for example by

    • inspecting application system configurations,

    • inspecting vendor system documentation,

    • interviewing program developers (usually not enough by itself);

  • Running sample transactions (test decks) through the program or query and comparing the output to expectations;

  • Retotaling the report electronically: the most efficient way to test the mathematical accuracy of large reports, such as detailed accounts receivable aged listings, is to obtain the report electronically and use Excel, IDEA, Access, or other software to total the subtotals and/or grand total of the report (see the sketch below). When requesting an electronic report from the client, it is important to know the approximate number of records (e.g., invoices) contained in the report.
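
For instance, where the aged listing is received as a data file, the retotaling can be done in a few lines of code rather than in Excel. The sketch below recomputes customer subtotals and the grand total from the detail lines and compares them to the totals shown on the report. It is a minimal sketch only; the file names and column names are illustrative assumptions.

    # Minimal sketch of retotaling a large accounts receivable aged listing
    # obtained electronically: recompute customer subtotals and the grand total
    # from the detail lines and compare them to the report's printed subtotals.
    # File and column names are illustrative assumptions.
    import pandas as pd

    detail = pd.read_csv("ar_aging_detail.csv")      # one row per invoice
    stated = pd.read_csv("ar_aging_subtotals.csv")   # report's own subtotals

    recomputed = (detail.groupby("customer_id")["balance"]
                  .sum().round(2).rename("recomputed"))

    check = stated.set_index("customer_id").join(recomputed)
    check["difference"] = (check["subtotal"] - check["recomputed"]).round(2)

    print(check[check["difference"] != 0])           # subtotal exceptions, if any
    print(f"Grand total per detail lines: {detail['balance'].sum():,.2f}")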

Accept-reject testing

Before performing accept-reject testing, determine that it is the appropriate test in the circumstances. It is expected that controls testing, substantive analytical procedures, and targeted testing are considered before using accept-reject testing.

The objective of accept-reject testing, also referred to as attribute testing, is to gather sufficient evidence to either accept or reject a characteristic of interest. It does not involve the projection of a monetary misstatement in an account or population; therefore, we use accept-reject testing only when we are interested in a particular attribute or characteristic and not a monetary balance. When testing the underlying data of a report, we can apply accept-reject testing specifically to test the accuracy and completeness of the data included in the report. If more than the tolerable amount of exceptions is identified, the test is rejected as not providing the desired evidence. OAG Audit 7043 provides additional guidance and examples of when accept-reject testing may or may not be appropriate. OAG Audit 7043.1 explains the five-step approach to performing accept-reject testing.

Where it is not feasible to test the mathematical accuracy of the report electronically, we would select page sub-totals or customer balance subtotals and apply accept-reject testing (see examples of procedures in the section “Extent of testing”), following the guidance in OAG Audit 7043. As the objective of the test is to ensure that the report is accurate, we would not tolerate any exceptions. If no errors are found, we may accept the test and conclude that the listing is accurately totalled. On the other hand, if a mathematical error is found, we normally reject the accuracy of the listing, request that the client retotal the report, and then reperform the test.
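
A minimal sketch of this procedure follows: randomly select a sample of page subtotals, refoot the detail lines behind each selected subtotal, and accept the report’s mathematical accuracy only if no exceptions are found. The data structure, sample size, and random seed are illustrative assumptions; the actual sample size follows the five-step approach in OAG Audit 7043.1.

    # Minimal sketch of accept-reject testing of mathematical accuracy where
    # the full report cannot be retotaled electronically: sample page subtotals,
    # refoot the detail lines behind each one, and accept only if no exceptions
    # are found. Data and sample size are illustrative assumptions.
    import random

    # pages: page number -> (detail line amounts, subtotal printed on the page)
    pages = {
        1: ([120.00, 75.50, 300.00], 495.50),
        2: ([88.25, 14.75], 103.00),
        3: ([930.10, 69.90, 500.00], 1500.00),
        # ... remaining pages of the report
    }

    SAMPLE_SIZE = 2   # per the accept-reject plan (OAG Audit 7043.1)
    random.seed(42)   # recorded so the selection can be reperformed

    exceptions = 0
    for page in random.sample(sorted(pages), SAMPLE_SIZE):
        lines, printed_subtotal = pages[page]
        if round(sum(lines), 2) != round(printed_subtotal, 2):
            exceptions += 1
            print(f"Page {page}: refooted {sum(lines):.2f} != printed {printed_subtotal:.2f}")

    # Zero tolerable exceptions: any error means we reject the report's accuracy.
    print("ACCEPT" if exceptions == 0 else "REJECT - ask the client to retotal, then retest")

The zero-exception threshold reflects the guidance above: a single footing error is enough to reject the accuracy of the listing.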

Extent of testing

Substantive Procedures

Careful consideration must be given to the extent of testing when relying on substantive procedures to test the reliability of reports and data. Since substantively testing reports and data provides assurance only on the specific instance of the report or data tested, each instance must be tested. For example, if report XYZ is used in testing a control and that control is tested five times, each instance of that report needs to be tested. Then, for each instance of the report tested, we need to test a sufficient number of items based on the type of substantive procedures selected. To determine the extent of testing for each instance, the auditor should follow the appropriate sections in OAG Audit 7000 that relate to the selected substantive procedures.

Reliance on Controls

When placing reliance on Control Testing to test the reliability of reports and data, the auditor will follow the appropriate sections in OAG Audit 6000 that relate to the type of control test performed to determine the extent of testing. Then, once the report or data has been tested and no errors have been identified in the control test, the auditor may place reliance on that report or data and use it for all instances where it is being used in the audit.

Rotational Testing

When relying on controls and Information Technology General Controls (ITGCs) to test the reliability of reports and data, it may be possible to perform rotational testing of reports and data. Rotational testing is permitted only when the report and data are subject to a normal risk, the control testing did not produce errors in previous years, and there has been no change in the process or in how the reports and data are generated. Rotational testing permits testing a sample of reports and data yearly, but requires that all are tested at least every three years.
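
The three-year cycle can be administered with a simple schedule, as sketched below: record the year each report was last tested and select for testing any report that would otherwise go more than three years without being tested. The report names and years are illustrative assumptions.

    # Minimal sketch of planning rotational testing: given the year each report
    # was last tested, select those that must be tested this year so that every
    # report is covered at least once every three years.
    # Report names and years are illustrative assumptions.
    CURRENT_YEAR = 2022
    MAX_CYCLE_YEARS = 3   # every report tested at least every three years

    last_tested = {
        "AR aging report": 2021,
        "Inventory valuation report": 2019,
        "Payroll register": 2020,
        "Fuel consumption report": 2022,
    }

    due = [name for name, year in last_tested.items()
           if CURRENT_YEAR - year >= MAX_CYCLE_YEARS]
    print("Reports that must be tested this year:", due)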

Source data used by experts

OAG Guidance

Where the work of an expert is relevant to the context of an audit, the reliability of source data provided to the expert should be evaluated. This evaluation should include procedures designed to ensure that all reports relevant to the audit are identified. Accordingly, where an auditor’s expert has been engaged, the auditor should be directly involved in and knowledgeable about the source data and other information provided to the expert. Where the entity engages an expert directly, the source data and other information necessary for the expert to perform his or her work will often be provided without the auditor’s direct involvement or knowledge. As such, when using the work of management’s expert, the auditor should perform additional procedures to identify all relevant reports and sources of data.

For example, where the auditor plans to use the work of an actuary engaged by management, the auditor should ask management and the actuary to identify all relevant sources of data and information used by the expert and, where appropriate, test the underlying completeness and accuracy of that data and information. Census data files used by an actuary, for instance, are typically generated from the entity’s payroll and/or personnel information systems; therefore, testing the completeness and accuracy of the report is necessary to evaluate the reliability of the source data.

Related Guidance

For additional guidance on evaluating source data used by the auditor’s expert and management’s expert, refer to:

OAG Audit 3096 Evaluating the Adequacy of the Auditor’s Expert’s Work

OAG Audit 3111 Management’s Experts

Appendix A: End-user computing spreadsheet control examples

The auditor can consider a combination of the following controls over end-user computing (EUC) tools or spreadsheets:

  • Change control (e.g. password protection)—Maintaining a controlled process for requesting changes to a spreadsheet, making changes, and then testing the spreadsheet and obtaining formal sign-off from an independent individual that the change is functioning as intended.

  • Version control—Creating naming conventions and directory structures for current and approved versions of spreadsheets or databases.

  • Access control (e.g., Create, Read, Update, Delete)—Limiting access at the file level to spreadsheets or databases on a central server, and assigning appropriate access privileges. Spreadsheets or databases can also be password protected to restrict access.

  • Security and integrity of data—Implementing a process to ensure that data embedded in spreadsheets or databases is current and secure. This may be done by “locking” or protecting cells to prevent inadvertent or intentional changes to standing data. Also, the spreadsheets or databases themselves are to be stored in protected directories.

  • Segregation of duties—Defining and implementing roles, authorities, and responsibilities, and defining and implementing procedures for issues such as ownership, sign-off, segregation of duties, and usage.

  • Development lifecycle—Applying a standard Software Development Life Cycle to the development process of the more critical and complex spreadsheets or databases. The process typically includes Requirements Specification, Design, Build, Testing, and Maintenance. Testing is a critical control to ensure that the spreadsheet is producing accurate and complete results.

  • Backups—Implementing a process to ensure that spreadsheets or databases are backed up on a regular basis so that complete and accurate information is available for financial reporting.

  • Archiving—Maintaining historical spreadsheets or databases (and linked data sources) that are no longer available for update in a segregated drive, and are locked as “read only.”

  • Documentation—Maintaining and keeping up to date the appropriate level of spreadsheet or database documentation, and outlining the business objective and specific functions of the spreadsheet or database.

  • Logic inspection—Having someone other than the user or creator of the spreadsheet or database inspect the logic in critical spreadsheets or databases. This review needs to be formally documented.

  • Input control—Performing reconciliations to make sure that data is input completely and accurately. Data may be input into spreadsheets manually or systematically through downloads.

  • Overall analytics—Implementing analytics as a detective control may be useful in finding errors in spreadsheets or databases used for calculations. Analytics alone would not be a sufficient control to completely address the inherent risk of financial amounts generated using spreadsheets or databases.

  • Detailed analytics—Recalculating key spreadsheet matrices manually and comparing them to recorded values. Key database reports might be recalculated by using computer assisted audit techniques (CAAT) tools such as IDEA.

Examples of spreadsheet controls

A complex spreadsheet is used to calculate the monthly liability for warranty claims. The spreadsheet contains multiple linked spreadsheets, macros, and formula-driven entries. Example controls might include the following:

  • Spreadsheet is maintained in a segregated LAN drive that is accessible only to personnel responsible for calculating and reviewing warranty claims liability.

  • Controls exist over standing data, such as protecting cells against inadvertent and unauthorized changes (i.e., passwords and formula locking).

  • Access to change passwords or formulas is restricted to an independent developer.

  • Manual data input and reconciliation of input to the source data is done by the claims accountant.

  • Pre-set analytics are embedded into the spreadsheet and measure outcomes against expectations.

  • Output is reviewed by the controller, who investigates any unusual results from analytics.

  • Documentation of spreadsheet uses and key inputs is maintained.

  • Standard naming conventions ensure the use of current spreadsheet versions.

  • Backup copies of spreadsheets are maintained for recovery of data.

  • All changes to formulas are made following an approved program change control methodology.

Manual input of data into the spreadsheet exposes the formulas and codes to direct access and to the risk that the formulas may be altered by intentional or inadvertent manipulation. Cell-locking controls can easily be turned off unless access is restricted to users outside of the business function (i.e., the IT group).

The controls described above may not be enough to mitigate the risks involved. We may suggest that the entity migrate this calculation to a formal production system.

If the following controls were implemented for the warranty claims liability spreadsheet, the Information Technology General Controls (ITGC)-type controls would allow us to place greater reliance on the spreadsheet controls and test the automated controls or calculations within the spreadsheet:

  • Data was manually input into an input facility; the spreadsheet extracted the information from this input facility.

  • Access to the spreadsheet was restricted to the automated data routine extracting information from the input facility and the IT group.

  • An audit logging software tool was implemented to log every formulaic change made to the spreadsheet and was used to restrict access to the spreadsheet.

We could rely on the controls’ continued effective operation if the ITGCs operated effectively throughout the period.

If the complexity and risk of the spreadsheet were less, such as with a spreadsheet that has simple formulas to total invoices and assign them to vendors based on vendor ID number, the example controls above would likely suffice to mitigate the risk of material misstatement.

Appendix B: Report-testing strategy examples

Accuracy and completeness: The guidance below lists procedures over the more common reports (debtor report and inventory report) that an engagement team is expected to rely on, as well as procedures we may perform to ensure they are accurate and complete. The entity and the auditors may use and rely on other reports, and similar types of testing would be considered for those. Note that this list of procedures is not exhaustive. We need to determine whether additional procedures are needed to rely on the entity’s reports.

Accounts receivable aging report

  • Apply accept-reject testing on the date and amount of individual transactions, agreeing them to supporting documents (sales invoices and, if appropriate, shipping documents). [To ensure accuracy of the accounts receivable report and accuracy of the aging.]

  • Agree the total balance in the report to the amount in the general ledger to ensure that there are no missing items. If there are differences between the report and the account ledger, perform testing on the reconciliation between the report and the account ledger. [To ensure completeness of the accounts receivable report.]

The procedures stated allow us to use the report for some of our detailed audit procedures in relation to aging. However, if an additional report is produced from the Accounts Receivable ledger for calculation of the provision for doubtful debts, appropriate testing on that report would also be performed.
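
Beyond refooting, the accuracy of the aging itself can be checked by recomputing each invoice’s aging bucket from its invoice date and comparing the result to the bucket shown on the report, as sketched below. The as-at date, bucket boundaries, and sample data are illustrative assumptions.

    # Minimal sketch of checking the accuracy of the aging itself: recompute
    # each invoice's aging bucket from its invoice date and compare it to the
    # bucket shown on the entity's report. Dates, bucket boundaries, and the
    # sample data are illustrative assumptions.
    from datetime import date

    AS_AT = date(2022, 9, 30)   # reporting date of the aged listing

    def bucket(invoice_date: date) -> str:
        days = (AS_AT - invoice_date).days
        if days <= 30:
            return "current"
        if days <= 60:
            return "31-60"
        if days <= 90:
            return "61-90"
        return "over 90"

    # (invoice number, invoice date, bucket per the entity's report)
    report_lines = [
        ("INV-001", date(2022, 9, 15), "current"),
        ("INV-002", date(2022, 7, 10), "61-90"),
        ("INV-003", date(2022, 5, 1),  "61-90"),   # deliberately wrong, for illustration
    ]

    for invoice_no, inv_date, reported_bucket in report_lines:
        expected = bucket(inv_date)
        if expected != reported_bucket:
            print(f"{invoice_no}: report shows {reported_bucket}, recomputed {expected}")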

Inventory report

  • Agree the quantity to physical inventory for a sample of items. [To ensure accuracy of the inventory report.]

  • Test calculation for the inventory amount (quantity multiplied by unit cost) of a sample of items (accept-reject testing) in the report. [To ensure accuracy of inventory report.]

  • Trace the report balance to the account ledger to ensure that there are no missing items. If there are any missing items between the report and the account ledger, test the reconciliation between the report and the account ledger. [To ensure completeness of inventory report.]