This is a living index. It links every technique article on this site and is updated as new pieces are published. Bookmark it as a reference.
Most audit functions can produce analytics output. Fewer can explain the methodology behind it, defend the approach to a skeptical regulator, or prove the analytics drove better scope decisions rather than just generating numbers. The gap is almost never technical. It is methodological and structural.
This guide collects practical frameworks for the full analytics cycle in internal audit: from planning through fieldwork to quality review. Each linked article goes deep on one technique — what data you need, how to execute it, where it goes wrong, and what good results actually look like.
By Audit Phase#
Planning Analytics#
The most underused analytic opportunity in audit functions is the planning phase. Data that already exists — issue registers, prior audit results, entity risk scores — can fundamentally change how scope decisions are made before fieldwork begins.
- Planning Analytics Part 1: Issue Population Analysis — How to use the full history of audit issues, their severity, aging, and root causes to sharpen planning scope and improve management conversations from day one.
Fieldwork and reporting analytics are coming as the series develops.
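To make the idea concrete, here is a minimal sketch of an issue population aggregation. The field names (`entity`, `severity`, `age_days`, `root_cause`) are hypothetical — real issue registers exported from a GRC tool will use their own schema — but the shape of the analysis is the same: group issues by auditable entity and summarize count, severity mix, and aging.

```python
from collections import defaultdict

# Hypothetical issue-register records; field names will vary by GRC tool.
issues = [
    {"entity": "Payments", "severity": "High",   "age_days": 420, "root_cause": "Access mgmt"},
    {"entity": "Payments", "severity": "Medium", "age_days": 200, "root_cause": "Monitoring"},
    {"entity": "Treasury", "severity": "Low",    "age_days": 90,  "root_cause": "Documentation"},
    {"entity": "Payments", "severity": "High",   "age_days": 610, "root_cause": "Access mgmt"},
]

def issue_profile(issues):
    """Aggregate the issue population by auditable entity:
    issue count, share of high-severity findings, average open age."""
    by_entity = defaultdict(list)
    for issue in issues:
        by_entity[issue["entity"]].append(issue)
    profile = {}
    for entity, items in by_entity.items():
        high = sum(1 for i in items if i["severity"] == "High")
        profile[entity] = {
            "issue_count": len(items),
            "high_severity_share": high / len(items),
            "avg_age_days": sum(i["age_days"] for i in items) / len(items),
        }
    return profile

for entity, stats in sorted(issue_profile(issues).items()):
    print(entity, stats)
```

An entity with a high count, a high severity share, and long-aged issues is a different planning conversation than one with a few quickly closed low-severity findings — which is exactly the signal Part 1 builds on.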
Fieldwork Analytics#
(Forthcoming — covering transaction testing, population stratification, control attribute analysis, and exception identification.)
Continuous Monitoring#
(Forthcoming — covering threshold-based alerting, trend monitoring, and audit-owned dashboards.)
Technique Reference#
| Technique | Phase | Core Question Answered | Article |
|---|---|---|---|
| Issue Population Analysis | Planning | Where has audit found problems before, and what pattern does that create? | Part 1 → |
What Makes Audit Analytics Credible#
Analytics in internal audit operate under constraints that business analytics do not. Results may be reviewed by management, audit committees, regulators, or external auditors. That changes the standard.
A credible audit analytic:
- Has a documented methodology that a peer auditor could replicate
- Has known limitations that are disclosed, not hidden
- Produces results that are proportionate to the evidence (no overreaching)
- Has been tested against false positives before conclusions are drawn
- Can withstand the question: “How do you know this tells you what you think it tells you?”
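The false-positive point deserves a sketch, because it is the one most often skipped. One simple approach — an illustration, not a prescribed method — is to run your flagging rule against a benign baseline population and measure how often it fires when nothing is wrong. The rule and thresholds below are invented for the example.

```python
import random

random.seed(7)

# Hypothetical flagging rule: amounts just under an approval limit are
# flagged as potential limit-splitting.
APPROVAL_LIMIT = 10_000

def flags_split_risk(amount):
    return 0.9 * APPROVAL_LIMIT <= amount < APPROVAL_LIMIT

# Benign baseline: simulated amounts from a population with no known
# splitting behavior, drawn uniformly across a plausible range.
baseline = [random.uniform(100, 20_000) for _ in range(10_000)]

false_positives = sum(flags_split_risk(a) for a in baseline)
fp_rate = false_positives / len(baseline)
print(f"Baseline flag rate: {fp_rate:.1%}")
```

If roughly 5% of a benign population triggers the rule, then a 5% hit rate in the audited population is noise, not evidence. Knowing the baseline rate before fieldwork is what lets a conclusion survive the "how do you know" question.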
Every article in this series applies that standard. Technique breakdowns include common pitfalls specifically so you know where to stress-test your own work.
Quality Standard for Technique Articles#
Each article in this series covers:
- The problem being solved — what planning or testing gap it addresses
- Data required — what you need and where to get it
- Step-by-step execution — how to actually run the analysis
- Interpreting results — what findings look like and what they mean
- Common pitfalls and false positives — where this goes wrong
- Audit quality impact — how this changes the quality of audit work, not just its efficiency
New articles are added to this index when published. Follow via RSS or LinkedIn.