3072 Documentation of Work Performed Using Technology Solutions as Part of an Audit
Jun-2020

Overview

This topic explains:

  • Documenting whether a technology was used and the procedures performed and/or logic applied by the technology
  • Retention of client data used with technology solutions
  • Practical examples of documentation when using technology solutions

OAG Guidance

Work performed using technology solutions is documented and retained in accordance with the requirements in OAG Audit 1100. Examples of technology solutions include:

  • Computer Assisted Audit Techniques (CAATs);

  • Data auditing techniques, automations and visualizations using technologies such as Excel or IDEA;

  • Solutions built using IDEA, Python, R, SAS or other technologies, including solutions used to test IT system configurations or security settings.

The documentation requirements apply regardless of whether the solution was developed and/or the work was performed by an engagement team member or by specialists such as members of the Data Analytics and Research Methods Team, or other specialists in accounting or auditing.

When performing procedures using technology solutions, the following is documented in the audit file:

  • The scope and purpose of the work to be performed, including the audit risks to be addressed. If an auditor’s expert or specialist in accounting or auditing is performing the work, the engagement team members need to discuss the scope and purpose with the expert or specialist so that they can understand the work and either document it or review the expert’s or specialist’s documentation. Documentation includes:

    • The scope of testing (e.g., consolidation, specific set of accounting records, entities or company codes);

    • The system(s) or data source(s) included in the audit procedure. Examples include:

      • Level at which the testing is performed (e.g., consolidation, general ledger, subledger) when using a technology solution;

      • The specific functionalities/attributes of controls to be tested (system configurations, application security settings, etc.) when testing the entity’s internal controls (information processing controls and/or ITGCs).

    • The specific procedures to be performed and coverage dates.

  • A description of the work completed, including the following:
    • The procedures performed to determine the completeness and accuracy of the information used in the procedure(s);

    • The procedures performed, including the nature, timing, extent and results (including exceptions) of tests performed, who performed and reviewed the work and the date such work was performed and reviewed;

    • Procedures performed and/or logic applied to generate the results, as applicable;

    • Description of original entity file(s) used and the source(s) from which the data was obtained (e.g., person, system and date received);

    • A summary of analysis and findings;

    • Any follow-up actions required by the audit team (if the work was performed by a specialist in accounting or auditing or auditor’s expert); and

    • Changes or additions to the original scope, if applicable.

Documenting whether a technology was used and the procedures performed and/or logic applied by the technology

Specific to documenting the procedures performed and/or logic applied to generate results from a technology solution, there are various options available, such as:

  • Including the output from the technology that shows the formulae or other criteria (e.g., sorting, filtering) used in executing the procedure (e.g., include the Excel worksheet).

  • Attaching the actual script/programming (e.g., SAS, SQL, IDEA Script, IDEA Project Overview (Table and Graph) or IDEA History) or an export of the analytical software’s workflow, the visualization output, or other system logic in the audit file.

  • When doing so, if the formulae, criteria, or logic are complex, consider how the reviewer or “an experienced auditor having no previous connection with the audit” would be able to understand what procedures were performed. In some cases, “plain English” explanations may need to be incorporated along with the workflow, visualization, or script logic to facilitate review. For example, this might be accomplished by using explanation boxes above each function executed within the workflow to explain in simple terms what the function is doing and including the workflow in the workpapers.

  • Including “plain English” documentation of the procedures performed by the technology (see the sketch after this list). For example, when using a script to perform certain procedures, we may provide the following documentation: “IDEA was programmed to extract [for example, a list of loan terms] from the underlying contract, populate the spreadsheet with each term and corresponding amount in a different column and calculate the difference between the balance per management’s system in column C with the amount populated by IDEA in column D.”

  • Including tickmarks in the workpapers explaining where and how the amounts were obtained/calculated.

  • When using a visualization to perform an audit procedure, there are two documentation options: (1) export screenshots of the visualizations and insert them into the audit file or (2) retain the visualization file itself (e.g., IDEA .idash file). Consider the complexity of the visualization when choosing an option. The more filters, formulae or other criteria are used to reconfigure the underlying data sets, the more likely it is that retaining the visualization file will be necessary to meet the documentation requirements.
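To illustrate the “plain English” commenting approach referenced above, the following is a minimal, hypothetical sketch in Python (one of the technologies named in this section), mirroring the loan terms example. The file names, column names and logic are illustrative assumptions, not a prescribed solution; the same commenting style applies equally to IDEA scripts, SAS or other technologies.

    # Hypothetical sketch: "plain English" comments alongside the logic.
    # All file and column names are illustrative assumptions only.
    import pandas as pd

    # Load the loan terms extracted from the underlying contracts and the
    # balances recorded in management's system (assumed client files).
    extracted = pd.read_csv("extracted_loan_terms.csv")    # loan_id, amount
    per_system = pd.read_csv("management_balances.csv")    # loan_id, balance

    # Match each extracted term to management's balance by loan ID.
    merged = extracted.merge(per_system, on="loan_id", how="outer")

    # Calculate the difference between the balance per management's system
    # and the amount extracted from the contract; non-zero differences
    # (including unmatched records) are flagged for follow-up.
    merged["difference"] = merged["balance"] - merged["amount"]
    exceptions = merged[merged["difference"] != 0]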

If the procedures performed using the technology are complex (e.g., the procedures may involve multiple inputs, multiple relationships between the inputs or multiple iterations of a calculation), we typically run test transactions through the workflow or bot to test both positive and negative conditions, as well as various permutations, as necessary, to verify that it functions as intended. However, such testing to verify the technology functions as intended is typically not necessary if the solution is an approved tool. Document the testing performed and/or retain the documentation of the test transactions and test results in the audit file.
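For example, a minimal sketch of running test transactions through a solution to verify positive and negative conditions might look as follows. The routine under test and the test data are illustrative assumptions, not an actual Office tool.

    # Hypothetical sketch of verifying a solution with known test inputs.
    def flag_terminated_with_access(terminated_ids, active_ids):
        """Return user IDs that were terminated but still appear active."""
        return sorted(set(terminated_ids) & set(active_ids))

    # Positive condition: a terminated user who still has access is flagged.
    assert flag_terminated_with_access(["U1", "U2"], ["U2", "U3"]) == ["U2"]

    # Negative condition: no overlap means no exceptions are flagged.
    assert flag_terminated_with_access(["U1"], ["U3"]) == []

    # Permutations: duplicate and empty inputs behave as intended.
    assert flag_terminated_with_access(["U1", "U1"], ["U1"]) == ["U1"]
    assert flag_terminated_with_access([], ["U3"]) == []

The test transactions and their expected results would themselves be retained in the audit file as documentation of the verification performed.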

The nature and extent of our procedures and related documentation needs to be commensurate with the assessed risk of material misstatement and how important that audit procedure is to our audit strategy and audit plan. The higher the assessed risk and/or the complexity and the more important the audit procedure, the greater the extent of documentation.

These documentation principles apply equally when a spreadsheet, such as Microsoft Excel, is used as a technology solution in an audit. For example, when using Microsoft Excel to examine, sort, filter or analyze data, we consider using the comment functionality to include “plain English” documentation of the nature of the Excel functions or formulae being used. Likewise, if Visual Basic coding is used, we consider using comments in the Visual Basic Editor (“VBE”) to prepare “plain English” documentation of the nature of the Visual Basic program behavior and indicating in the Excel worksheet that this documentation has been included in the VBE.

Documentation and supervision/review requirements are interrelated and are considered together when determining whether the technology used is appropriate in the circumstances and what, and how much, to document in the working papers. Refer to OAG Audit 3073 for supervision and review considerations when using technology solutions.

Documentation considerations when using an approved tool

In cases when we are using a technology solution approved for use by the Office, we document how we used the tool and its related outputs, but do not need to document the tool’s logic. For example, when using IDEA Script to analyze general ledgers, where this tool is approved for use on engagements, documentation in the engagement database regarding the software code and logic is not necessary. The reliability of approved tools is the responsibility of the Office; documentation of the tool’s logic is expected to be maintained centrally and does not need to be duplicated by the engagement team.

Retention of client data used with technology solutions

Generally, data obtained from the entity that is not required to be retained pursuant to CAS 230 and the OAG Audit Manual (refer to OAG Audit 1110 and OAG Audit 1121) would not be kept after completion of the engagement. However, there may be reasons to use data previously provided by the entity in a subsequent year’s audit, such as:

  • Performance of trend analysis covering multiple periods of data
  • Facilitating the entity's assembly of subsequent year data
  • Training new team members on the engagement to understand data nuances

As per current practice, it is important to ensure audit documentation is properly assembled during the final file assembly period and deleted from temporary storage devices within the 60-day period. All documentation that supports the audit conclusion must be retained in the audit working paper software. A best practice is to advise the client that data received in the past will be deleted as per our policy and that we may ask for it again in the future. When data obtained from the entity will not be retained, our documentation of the procedure performed using the client data includes sufficient detail of the data used (e.g., the name, date and any other relevant parameters of the client report or query used to generate the data used in the procedure). Whether or not the data is retained, we document the procedures performed over the reliability of the data in line with OAG Audit 4028.4.

Refer also to OAG Audit 3101 for retention considerations when work is performed by specialists in accounting or auditing.

Practical examples of documentation when using technology solutions

The following examples focus on some relevant considerations when making judgments about:

  • How to document the procedures performed/logic applied by technology
  • The documentation needed to support supervision and review

These examples do not cover other documentation typically included in the workpapers, such as documentation in support of the completeness and accuracy of the source documents.

Example 1

A team member uses IDEA to test the mathematical accuracy of the accounts receivable subledger, identify items the engagement team considers unusual (such as credit balances), identify high-value items to target test, and sum the remaining balance that will be subject to non-statistical sampling.
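For illustration only, the substance of these procedures could be sketched as follows in Python (pandas). The file and column names and the $1 million threshold are assumptions drawn from the documentation discussion below; the actual work in this example is performed with IDEA tasks rather than code.

    # Illustrative analogue of the IDEA procedures in this example.
    import pandas as pd

    ar = pd.read_csv("ar_subledger.csv")  # assumed client extract

    # 1. Test mathematical accuracy: recompute the subledger total for
    #    comparison to the general ledger control account.
    recomputed_total = ar["amount"].sum()

    # 2. Identify items considered unusual, such as credit balances.
    credit_balances = ar[ar["amount"] < 0]

    # 3. Identify high-value items (e.g., invoices > $1 million) to target test.
    targeted = ar[ar["amount"] > 1_000_000]

    # 4. Sum the remaining balance subject to non-statistical sampling.
    remaining = ar.drop(targeted.index.union(credit_balances.index))
    remaining_total = remaining["amount"].sum()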

Documentation

Documentation would specify the types of items the engagement team considered unusual, the actual items identified, a description of the items targeted (e.g., all invoices greater than $1 million), the value of the accounts receivable amounts targeted and the value of the aggregate remaining accounts receivable amount subject to non-statistical sampling. Documentation would also include which functions in IDEA were used to perform these activities (note—functions are referred to as “tasks” within IDEA). For example, “invoices greater than $1 million were selected via the IDEA direct extraction tool”. Another way the preparer may document which functions were used in this example would be to include the IDEA Project Overview Graphic and Table or the complete IDEA History, and any Equations or Import Definitions used in the audit file. Since this analysis may be used in the subsequent year’s audit, maintaining the IDEA Equations or Import Definitions in the audit file may provide efficiency in the future.

Supervision and Review

The reviewer would need to understand the procedures performed by each IDEA Task and Equation used in the analysis, including any filters applied to achieve the results, and would review the analysis to consider whether it was designed in a way that achieves the intended objective.

Example 2

A custom bot is designed and developed by an engagement team to automate the comparison of two Excel spreadsheets (users terminated from the general ledger system during the period and a current general ledger user listing) to identify potential exceptions where a terminated user still has access. The bot is programmed to access both Excel files, insert a standard Excel formula (VLOOKUP) to match user IDs and names between the two spreadsheets, and flag the users appearing on both reports, indicating that user access should have been removed but was not.
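For illustration only, a minimal sketch of the comparison logic the bot automates, in Python with pandas. The file and column names are assumptions, and the actual bot inserts VLOOKUP formulae into Excel rather than running code like this.

    # Illustrative sketch of the bot's comparison, not the bot itself.
    import pandas as pd

    terminated = pd.read_excel("terminated_users.xlsx")  # user_id, name
    active = pd.read_excel("current_gl_users.xlsx")      # user_id, name

    # Flag users appearing on both reports: terminated during the period
    # but still present on the current general ledger user listing.
    exceptions = terminated.merge(active, on=["user_id", "name"], how="inner")
    exceptions.to_excel("potential_exceptions.xlsx", index=False)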

Documentation

The procedures the bot performed would be documented by the preparer in the working papers via one of the documentation options described in this OAG Audit section (e.g., an export of the bot logic or a plain English description of the bot’s actions). This documentation differs from what would be needed if only VLOOKUP in Excel were used to perform the test (i.e., without the bot accessing the files to insert VLOOKUP), because in that case the VLOOKUP formula would be readily reviewable directly in the Excel worksheet included in the working papers.

Supervision and Review

The reviewer would need to review the output from the bot in the context of the audit procedures being performed and review any inputs into the bot, including checking that the correct source data and parameters (e.g., time period) were used and checking the completeness and accuracy procedures performed over the source data. The reviewer is also responsible for reviewing the procedures performed to understand and assess the functionality of the bot, including whether the procedures it performs are appropriate to achieve the intended objective.

Example 3

A visualization created by an engagement team using IDEA is used to examine the period before and after period end to test sales/accounts receivable cut-off. The visualization graphically presents daily sales volumes for each in-scope revenue stream during the last month of the year under audit and the first month of the subsequent year.
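For illustration only, the data preparation behind such a visualization might resemble the following Python (pandas) sketch. The file and column names and the year-end date are assumptions; in this example the work is performed in IDEA rather than in code.

    # Illustrative sketch of preparing daily cut-off data for visualization.
    import pandas as pd

    sales = pd.read_csv("sales_register.csv", parse_dates=["invoice_date"])
    year_end = pd.Timestamp("2019-12-31")  # assumed period end

    # Keep the 30 days before and the 30 days after period end.
    window = sales[(sales["invoice_date"] > year_end - pd.Timedelta(days=30)) &
                   (sales["invoice_date"] <= year_end + pd.Timedelta(days=30))]

    # Daily sales volumes for each in-scope revenue stream, which the
    # visualization presents graphically for cut-off analysis.
    daily = (window.groupby(["revenue_stream", "invoice_date"])["amount"]
                   .sum()
                   .reset_index())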

Documentation

The working papers need to include a brief description of the visualization, including any calculations the preparer applied directly in IDEA. This would be supplemented by either an export of the visualization (e.g., a screenshot or PDF) or the actual IDEA file (.idash) in the audit file. As the formula being applied is relatively simple (i.e., 30 days prior to year end and 30 days subsequent to year end), inclusion of an exported screenshot documenting this formula would likely be sufficient to meet the documentation requirements.

Supervision and Review

The reviewer would need to review the output from the visualization in the context of the procedures being performed and determine whether the filters, measures and calculations applied are appropriate to achieve the intended objective. The reviewer would also need to understand the filters applied, assessing their purpose, whether all relevant data has been included and whether any of these considerations could change the conclusions reached.

Frequently Asked Questions

Q1. What documentation would be expected in the audit file when using the Fuzzy Match function in IDEA?

A1. The IDEA Fuzzy Duplicate task can be used to identify inexact matches or non-identical duplicates in a dataset by specifying a match key and a similarity degree percentage threshold. Matches are returned only when their similarity degree falls within the user-specified or default threshold established in the configuration properties. Higher thresholds result in higher similarity degrees (i.e., less variability between the input and the “match groups” identified), resulting in a lower number of “match groups.” Conversely, lower thresholds result in lower similarity degrees (i.e., more variability between the input and the “match groups” identified), resulting in a higher number of “match groups.”

The IDEA Fuzzy Duplicate task works similarly to the Excel Fuzzy Lookup and allows a user to identify inexact matches in their data. By specifying a similarity degree percentage threshold (the default value in IDEA is 80%), utilizing varying matching algorithms and specifying other configuration options, the user can customize the task to best fit their dataset and data match objective. Note that a unique identifier for each data record is necessary for the IDEA Fuzzy Duplicate task to work.

When using the IDEA Fuzzy Duplicate task, engagement teams need to set the matching threshold at a level that is consistent with their data match objective, and the rationale for the selected threshold needs to be documented. For example, in trying to identify related party transactions, an engagement team may plan to use the IDEA Fuzzy Duplicate task to scan the disbursement register for payments made to related parties. If the engagement team entered “Camouflage Company” as the term to match and set the threshold at 95%, variations such as “Camoflage Company” or “Camoflague Company” may be returned, but variations such as “comoflage co” or “camoflague comp” may not be. If the engagement team’s data match objective includes identifying these other company name variations, they would need to set the threshold at a lower percentage and document their rationale for selecting it.
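For illustration only, the effect of the threshold can be seen in a small Python sketch using the standard library’s difflib (whose similarity measure differs from IDEA’s matching algorithms; the payee names follow the example above and are hypothetical):

    # Illustrative sketch of how a similarity threshold drives matches.
    from difflib import SequenceMatcher

    def similarity(a, b):
        """Similarity degree between two strings, as a percentage."""
        return 100 * SequenceMatcher(None, a.lower(), b.lower()).ratio()

    target = "Camouflage Company"
    payees = ["Camoflage Company", "Camoflague Company",
              "comoflage co", "camoflague comp", "Unrelated Supplier Ltd"]

    # A higher threshold returns fewer, closer variations; a lower
    # threshold returns more, looser variations.
    for threshold in (95, 80, 70):
        matches = [p for p in payees if similarity(target, p) >= threshold]
        print(threshold, matches)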

Q2. If the engagement team wants to include IDEA Equations/Import Definitions/Macros/Visual Script Macros/Visual Templates in the audit file (e.g., for use in the subsequent year), where and how can they be incorporated?

A2. Any supporting files, including IDEA Equations (.EQX), IDEA Import Definitions (.jpm, .rdf and .xrdf), IDEA Macros (.iss), IDEA Visual Script Macros (.vscript) and IDEA Visual Templates (.idash), can be imported into the audit file. Do not embed files and documents as attachments within working papers as this can corrupt them. Keep in mind that when any of these files are included in the audit file, a user needs to have the related software installed in order to open the file.

Only final versions of these supporting files are uploaded to the audit file (as edits within the audit file are not possible).