Every audit conducted under AU-C Section 240 classifies management override of controls as a significant risk. The standard says this explicitly: paragraph .31 states that "the auditor should treat the risks of management override of controls as a significant risk." Treating it as a significant risk means it requires specific audit procedures, not just a checkbox in the risk assessment section. Yet the procedures most firms use to address this risk are materially inadequate for how override actually manifests in practice.
This is not a theoretical observation. The SEC's enforcement division and the PCAOB's inspection program have both documented cases where management override was occurring during the audit period and was not detected until it appeared in a subsequent restatement or enforcement action. In most of those cases, journal entry testing procedures were performed — they were just designed to find a different kind of anomaly than the one that was present.
How Management Override Actually Works
Management override of controls means that someone with authority over the accounting function uses that authority to cause entries to be made that circumvent established controls. The definition is important because it clarifies what you're looking for: not fraud in the abstract, but the specific mechanism of using legitimate system access to make illegitimate entries.
In practice, management override takes several distinct forms, and the audit procedures that detect one form often don't detect the others.
Direct entry bypass: The CFO, controller, or other senior finance personnel posts entries directly to the general ledger using a user ID that has approval authority. These entries bypass the normal preparer/approver workflow because the user is both preparer and approver. In most ERP systems, this is technically permitted — the system enforces segregation of duties for standard users but grants override capability to defined superuser roles. The entry looks authorized because it was posted by an authorized user.
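The bypass described above is detectable in a JE extract once you stop asking "was this entry approved?" and start asking "could the preparer have approved it themselves?" The sketch below is a minimal illustration, assuming a journal entry export with hypothetical field names (`user_id`, `approver_id`) and a role mapping; the role names and field names are assumptions, not any specific ERP's schema.

```python
# Hypothetical sketch: flag entries where the preparer/approver split
# never actually happened, either because the same user filled both
# roles or because the preparer's role carries bypass authority.
# Role names below are illustrative.
SUPERUSER_ROLES = {"CFO", "CONTROLLER", "IT_SUPERUSER"}

def flag_self_approved(entries, user_roles):
    """Return entries posted by users who could approve their own work."""
    flagged = []
    for e in entries:
        same_user = e["user_id"] == e.get("approver_id")
        has_bypass = user_roles.get(e["user_id"]) in SUPERUSER_ROLES
        if same_user or has_bypass:
            flagged.append(e)
    return flagged
```

Note that `has_bypass` flags the entry even when a different approver appears in the system, which is the point: an entry posted by a superuser "looks authorized" regardless of what the approval field shows.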
Approval chain circumvention: Entries that would normally require secondary approval are posted through a mechanism that avoids that approval — through a system parameter override, through a journal entry type that's exempt from the normal workflow, or through an approval that was obtained under false pretenses. The entry has an approver in the system, but the approver didn't exercise independent judgment about whether to approve it.
Accumulating reserve manipulation: This is the most common form seen in mid-market clients and the hardest to detect without longitudinal analysis. Management establishes reserves that are "conservative" (i.e., larger than the best estimate) and draws them down in periods when reported results would otherwise miss earnings targets. Each individual entry to establish or release a reserve looks reasonable in isolation. The pattern only becomes visible when you look at reserve creation and release timing across multiple periods relative to reported earnings.
Why Standard JE Testing Procedures Miss This
The standard journal entry testing procedures — filter for entries above materiality, filter for after-hours postings, filter for unusual preparers — are designed to find entries that are statistical outliers relative to the population. Management override by senior finance personnel is often not a statistical outlier. The CFO posting a $500,000 year-end reserve reversal may have done exactly that in the prior four years. The reversal is consistent with historical patterns. It clears all the standard filters. It gets sampled at the same rate as everything else in the population.
The problem is that the standard filters are looking for entries that look different from normal, but management override at the level we're discussing is specifically designed to look normal. The override is embedded in the accounting judgment — the decision to record a $500,000 reversal rather than a $200,000 reversal — not in a technical violation of system controls.
Detecting this kind of override requires longitudinal analysis that compares the current period's estimates, reserves, and discretionary entries to prior-period patterns and to the financial statement outcomes those entries helped produce. An entry made on December 30 to reverse a reserve that was established on October 1, resulting in a revenue line that exactly meets the quarterly target, is suspicious in a way that neither the October entry nor the December entry is suspicious in isolation.
What the Detection Procedures Actually Need to Include
AU-C 240's three required JE testing procedures are: reviewing the types of entries made (not just the amounts), testing entries made at the end of the period, and testing entries that relate to unusual or infrequent transactions or items. The third category is where override typically hides, and it's the category most often under-addressed.
Effective management override detection in journal entry testing requires these elements in addition to standard filter procedures:
Preparer authority analysis. Identify all journal entries posted by users whose system role grants them bypass authority — CFO, controller, accounting manager, IT superusers. Treat this sub-population as a distinct testing population with higher coverage requirements. The fact that an entry was posted by an authorized user does not mean it was authorized in substance.
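The sub-population split described above can be sketched as a two-step partition-and-coverage calculation. This is an illustrative sketch only: the field name `user_id` and the sampling rates are assumptions, and actual coverage levels are a matter of professional judgment, not the placeholder percentages below.

```python
import math

def split_by_authority(entries, bypass_users):
    """Partition the JE population: postings by bypass-authority users
    versus everyone else. `bypass_users` is the set of user IDs whose
    roles grant override capability (CFO, controller, IT superusers)."""
    bypass = [e for e in entries if e["user_id"] in bypass_users]
    standard = [e for e in entries if e["user_id"] not in bypass_users]
    return bypass, standard

def sample_sizes(bypass, standard, bypass_rate=0.5, standard_rate=0.05):
    """Apply a much higher sampling rate to the bypass sub-population.
    The rates here are placeholders, not guidance."""
    return (math.ceil(len(bypass) * bypass_rate),
            math.ceil(len(standard) * standard_rate))
```

The design point is that the bypass sub-population is tested as its own stratum with its own coverage target, rather than being sampled at the same rate as the rest of the population.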
Reversal timing analysis. Identify all accruals established during the period and trace their reversal timing. Reserves reversed in the same period they were established, or reversed in periods where reported results came in below expectations, warrant investigation. Performing this analysis meaningfully requires the prior-year JE population or, at minimum, the prior-year trial balance.
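A minimal version of the same-period check can be sketched as follows. The matching key (account plus amount) and the field names are simplifying assumptions; real reversals rarely match the originating accrual to the penny, so a production implementation would need fuzzier pairing logic.

```python
from datetime import date

def flag_same_quarter_reversals(accruals, reversals):
    """Pair each reversal to its originating accrual by account and
    amount (a simplification) and flag pairs that land in the same
    calendar quarter."""
    by_key = {(a["account"], a["amount"]): a for a in accruals}
    flagged = []
    for r in reversals:
        a = by_key.get((r["account"], r["amount"]))
        if a is None:
            continue
        same_quarter = (a["date"].year == r["date"].year and
                        (a["date"].month - 1) // 3 == (r["date"].month - 1) // 3)
        if same_quarter:
            flagged.append((a, r))
    return flagged
```

The October 1 / December 30 pattern from earlier in this piece is exactly what this check surfaces: two entries that each look routine on their own, linked by timing.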
Unusual account combination analysis. Entries that debit an operating expense and credit a reserve account, or debit an asset and credit revenue, warrant more scrutiny than entries between accounts with an obvious logical relationship. AuditPulsar's detection engine specifically flags account combinations that deviate from the client's own historical distribution of account pairings, which is a more precise filter than generic materiality thresholds.
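The historical-distribution comparison can be sketched simply: count how often each debit/credit account pairing appears in the prior population, then flag current-period entries whose pairing is rare or unseen. This is an illustrative sketch, not AuditPulsar's actual algorithm; the field names and threshold are assumptions.

```python
from collections import Counter

def rare_account_pairs(current_entries, historical_entries, min_count=5):
    """Flag entries whose debit/credit account pairing appears fewer
    than `min_count` times in the client's own historical population.
    The threshold is illustrative."""
    hist = Counter((e["debit_acct"], e["credit_acct"])
                   for e in historical_entries)
    return [e for e in current_entries
            if hist[(e["debit_acct"], e["credit_acct"])] < min_count]
```

Because the baseline is the client's own history rather than a generic rule set, a pairing that is routine for one client (and therefore ignored) can still be flagged at another where it never occurs.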
Soft close to hard close comparison. For clients that use both, compare entries posted in the soft close period to those posted after the soft close adjustment. Entries that appear only in the hard close period — particularly adjustments to previously settled soft close entries — are a management override indicator.
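The soft-to-hard-close comparison reduces to a set difference over entry identifiers. The sketch below assumes each entry carries a stable document identifier (`doc_id` here is a hypothetical field name); if the ERP reassigns identifiers between close runs, matching would need to fall back on account, amount, and date.

```python
def hard_close_only(soft_entries, hard_entries):
    """Return entries that appear only in the hard close population,
    keyed on an assumed stable document identifier."""
    soft_ids = {e["doc_id"] for e in soft_entries}
    return [e for e in hard_entries if e["doc_id"] not in soft_ids]
```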
The Documentation Gap This Creates
Under AS 2201 for integrated audits, management override of controls is a significant risk that must be explicitly addressed in the audit response. The documentation must show not just that JE testing was performed, but that the specific procedures performed were designed to address the override risk identified in the risk assessment.
A workpaper that documents "we tested entries above materiality and after-hours entries" is not documenting a response to management override risk. It's documenting a response to entry-level anomaly risk, which is a different thing. The documentation gap is not just a compliance issue — it reflects a genuine gap in the procedures being performed.
The workpaper response to management override risk needs to specifically address: how senior-authority postings were identified, what additional scrutiny was applied to that sub-population, and how reserve creation and release timing was analyzed for evidence of earnings management. Each of these is a distinct procedure from the standard anomaly tests.
Using Automated Scoring for Override Detection
AuditPulsar's anomaly scoring specifically includes a superuser-posting flag and a reserve-timing analysis. The superuser flag is derived from the preparer's historical posting patterns — users who post across a wide range of accounts, across many document types, and without a consistent secondary approver are scored higher for override risk regardless of whether their individual entries look unusual on standard statistical tests.
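A simplified version of posting-breadth scoring might look like the sketch below. This is not AuditPulsar's actual scoring model; the inputs (distinct accounts, distinct document types, approver consistency) come from the description above, but the weights and field names are assumptions for illustration.

```python
def override_risk_score(entries_by_user):
    """Score each preparer by posting breadth: more distinct accounts,
    more document types, and less approver consistency all raise the
    score. Weights are illustrative, not calibrated."""
    scores = {}
    for user, entries in entries_by_user.items():
        accounts = {e["account"] for e in entries}
        doc_types = {e["doc_type"] for e in entries}
        approvers = [e.get("approver_id") for e in entries]
        # Share of entries NOT signed off by the user's most common approver.
        modal = max(set(approvers), key=approvers.count) if approvers else None
        inconsistency = (sum(1 for a in approvers if a != modal)
                         / max(len(approvers), 1))
        scores[user] = len(accounts) + len(doc_types) + 10 * inconsistency
    return scores
```

The property worth noting: a user scores high on this measure even if every individual entry they posted clears the standard statistical filters, which is the whole point of scoring the preparer rather than the entry.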
The reserve-timing analysis requires two years of data to run meaningfully. When a second year's journal entry population is loaded into the platform, the system automatically identifies recurring reversal patterns and flags reversals where the timing correlates with earnings reporting periods. That correlation doesn't prove override — it identifies entries that warrant manual investigation of the business rationale.
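One component of timing correlation is simply proximity to a reporting period end. A minimal sketch, assuming calendar quarters (a fiscal-year client would need its own period calendar) and an illustrative window:

```python
from datetime import date, timedelta

def near_quarter_end(entry_date, window_days=5):
    """True if the entry falls within `window_days` of the end of its
    calendar quarter. Calendar quarters are an assumption; the window
    is illustrative."""
    q_end_month = ((entry_date.month - 1) // 3) * 3 + 3
    if q_end_month == 12:
        q_end = date(entry_date.year, 12, 31)
    else:
        q_end = date(entry_date.year, q_end_month + 1, 1) - timedelta(days=1)
    return (q_end - entry_date).days <= window_days
```

As the text notes, a reversal landing in this window proves nothing on its own; it is a selection criterion for entries whose business rationale should be manually examined.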
The combination of automated flagging for these patterns and auditor review of the flagged items addresses the oversight gap that standard JE testing procedures create. The key is ensuring that the workpaper documentation describes the management override-specific procedures separately from the general anomaly testing, so the risk response is clearly traceable to the identified risk.
Starting Point for Firms That Don't Currently Do This
If your current JE testing program doesn't include preparer authority analysis or reserve timing review, the shortest path to adding them is a retrospective exercise on a completed engagement. Pull the prior-year JE population. Filter to entries by the CFO and controller user IDs. Review the timing and account combinations of those entries relative to the earnings periods they fall in. Compare the results to what your testing documentation says you found.
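The retrospective exercise above can be scripted in a few lines. The sketch assumes a prior-year JE extract with hypothetical field names (`user_id`, `date`, `debit_acct`, `credit_acct`) and summarizes the senior-finance postings by month and by account pairing, which is the shape of output you then compare against the prior-year workpapers.

```python
from collections import Counter

def retrospective_review(prior_year_entries, exec_user_ids):
    """Filter the prior-year population to senior-finance user IDs and
    summarize posting timing and account pairings for manual review."""
    exec_entries = [e for e in prior_year_entries
                    if e["user_id"] in exec_user_ids]
    by_month = Counter(e["date"].month for e in exec_entries)
    by_pair = Counter((e["debit_acct"], e["credit_acct"])
                      for e in exec_entries)
    return exec_entries, by_month, by_pair
```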
That exercise will tell you whether your current coverage is addressing the risk or simply performing procedures that satisfy the checklist while leaving the actual override exposure unexamined. The answer is usually that the gap is real and the procedures need to change.