The Analytics Audit Checklist: 50 Points We Check on Every Account
A comprehensive analytics audit checklist — tracking, attribution, dashboards, KPIs, governance. The 50-point list we run on every measurement engagement.

An analytics audit checklist is only useful when applied with discipline. The accounts we have turned around all had the same pattern — the data wasn't trustworthy, the KPIs were misaligned, the dashboards weren't read, and nobody knew the system well enough to fix it.
This is the 50-point analytics audit checklist we run on every new measurement engagement. It covers tracking, attribution, dashboards, KPIs, governance, and the operational details that separate functional analytics from theatre.
The checklist is grouped into seven sections — tracking foundations, GA4 configuration, Google Tag Manager, attribution, dashboards, KPIs, and governance. Work through them in order. Out-of-order audits leave gaps that compound.
Section 1 — Tracking foundations (8 points)
If the foundation is broken, nothing built on top works.
1. Is GA4 installed on every page?
Test by walking through every section of the site. Confirm GA4 fires in DebugView. Missing pages create data gaps that distort everything downstream.
2. Is GTM installed on every page?
Same test for GTM. The container should load on every page where you want to track anything.
3. Are conversion events firing correctly?
For each defined conversion (purchase, lead, signup, etc.), test the flow. Verify the event fires once with correct parameters.
4. Are events firing without duplicates?
Refresh thank-you pages three times. Verify only one conversion is recorded. Duplicates inflate volume and break Smart Bidding.
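The usual fix for refresh-driven duplicates is deduplication on a unique transaction ID. A minimal sketch (the event structure here is hypothetical, standing in for a real event stream):

```python
def dedupe_conversions(events):
    """Keep only the first conversion per transaction_id.

    `events` is a list of dicts with a 'transaction_id' key —
    a simplified stand-in for a real conversion event stream.
    """
    seen = set()
    unique = []
    for event in events:
        tid = event["transaction_id"]
        if tid in seen:
            continue  # a page refresh re-fired the tag; drop the repeat
        seen.add(tid)
        unique.append(event)
    return unique

# Three refreshes of the same thank-you page -> one recorded conversion
events = [{"transaction_id": "T1001"}] * 3 + [{"transaction_id": "T1002"}]
print(len(dedupe_conversions(events)))  # 2
```

In practice the same idea is applied by passing a transaction ID parameter on the conversion tag so the platform deduplicates server-side.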
5. Is conversion data within 5 percent of source-of-truth systems?
Compare GA4 purchase count to backend orders. Compare GA4 form submissions to CRM lead count. Discrepancies above 5 percent indicate tracking issues.
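The 5 percent check is a one-line calculation; a small sketch (the counts are made-up example numbers):

```python
def discrepancy_pct(ga4_count, source_of_truth_count):
    """Percent difference between GA4 and the backend system,
    measured against the source of truth."""
    return abs(ga4_count - source_of_truth_count) / source_of_truth_count * 100

# Example: 940 GA4 purchases vs 1,000 backend orders -> 6% gap, fails the 5% bar
gap = discrepancy_pct(940, 1000)
print(f"{gap:.1f}% {'FAIL' if gap > 5 else 'PASS'}")  # 6.0% FAIL
```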
6. Are Enhanced Conversions enabled?
In GA4 and Google Ads, Enhanced Conversions with hashed first-party data should be active. We covered the full setup in our Google Ads conversion tracking setup guide.
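Enhanced Conversions sends hashed first-party identifiers. For email, Google documents normalizing (trim whitespace, lowercase) before SHA-256 hashing; a sketch of that normalization step:

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """SHA-256 hex digest of a whitespace-trimmed, lowercased email —
    the normalization Google documents for Enhanced Conversions."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Both inputs normalize to the same string, so both produce the same hash
print(normalize_and_hash(" Jane.Doe@Example.com ") ==
      normalize_and_hash("jane.doe@example.com"))  # True
```

Tag platforms usually do this hashing for you; the sketch just shows why inconsistent capitalization or stray whitespace in form data doesn't break matching.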
7. Are offline conversions imported for lead-gen?
For B2B and long-cycle businesses, closed deals must flow back to Google Ads (and Meta) via offline conversion import.
8. Is HTTPS enforced and tracking-friendly?
Mixed-content errors break tracking silently. Site should be HTTPS-only.
Section 2 — GA4 configuration (10 points)
GA4 needs deliberate setup beyond just installing the snippet.
9. Is data retention set to 14 months?
Default is 2 months. Change to 14 months for usable year-over-year reports. We covered the full setup in our GA4 setup guide.
10. Are referral exclusions configured?
Payment processors (Stripe, PayPal), your own subdomains, and partner platforms should be excluded. Otherwise they pollute "referral" traffic numbers.
11. Is cross-domain tracking configured (if applicable)?
For sites spanning multiple domains, cross-domain tracking unifies user journeys. Without it, users look like separate visitors per domain.
12. Is Google Signals enabled (with consent)?
Cross-device tracking and demographic reports require Google Signals. Enable with appropriate consent management.
13. Are recommended event names used (not custom)?
Use purchase, not order_complete; generate_lead, not form_submit. Recommended names unlock standard reports and AI insights.
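Migrating to recommended names is usually a mapping exercise. A sketch (the left-hand custom names are examples of what audits turn up, not a standard):

```python
# Illustrative mapping from custom event names often found in audits
# to GA4's recommended names
RENAME_MAP = {
    "order_complete": "purchase",
    "form_submit": "generate_lead",
    "signup_done": "sign_up",
    "item_added": "add_to_cart",
}

def recommended_name(event_name: str) -> str:
    """Return the GA4 recommended name, or the original if no mapping exists."""
    return RENAME_MAP.get(event_name, event_name)

print(recommended_name("order_complete"))  # purchase
```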
14. Are conversions correctly defined?
Only true business outcomes should be marked as conversions. Marking everything as a conversion makes the data unhelpful.
15. Is Google Ads linked?
In Admin → Google Ads Links, GA4 conversions should be importable to Google Ads. Without this, Smart Bidding misses the right signal.
16. Is BigQuery export enabled?
For long-term data retention and custom analysis, BigQuery export is free for most SMEs and unlocks unlimited history.
17. Are internal traffic and developer traffic filtered?
Office IPs, dev environments, and bot traffic should be excluded from production data.
18. Are key audiences built?
Purchasers, leads, cart abandoners, high-value users, site visitors. Each audience powers a use case.
Section 3 — Google Tag Manager (8 points)
GTM is where most analytics implementations either scale or collapse over time.
19. Does GTM follow a naming convention?
[Type] - [Vendor] - [Purpose] or similar. Without convention, the container becomes unsearchable. We covered the patterns in our Google Tag Manager best practices guide.
20. Are tags organised into folders?
Group tags into logical folders (Analytics, Marketing, Pixels, Chat, etc.). Unfiled items create chaos.
21. Are all tags documented (Notes field filled)?
What does it do? Who added it? When? Without documentation, original purpose is forgotten within months.
22. Are dormant tags removed?
Tags that haven't fired in 90+ days are candidates for removal. Use tools like Tag Inspector to identify.
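The 90-day filter is simple once you have last-fired dates (GTM itself doesn't expose these, which is why external tools help). A sketch with a hypothetical export format:

```python
from datetime import date, timedelta

def dormant_tags(tags, today, threshold_days=90):
    """Return tag names with no recorded fire in `threshold_days`.

    `tags` maps tag name -> last-fired date (hypothetical export format
    from a tag-monitoring tool).
    """
    cutoff = today - timedelta(days=threshold_days)
    return [name for name, last_fired in tags.items() if last_fired < cutoff]

tags = {
    "GA4 - Config": date(2024, 6, 1),
    "Pixel - OldVendor": date(2023, 11, 15),
}
print(dormant_tags(tags, today=date(2024, 6, 10)))  # ['Pixel - OldVendor']
```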
23. Is publish version description filled in?
"Added Meta CAPI server-side event" beats "v23". Versions without descriptions make rollback guesswork.
24. Is Consent Mode v2 configured?
For EU/UK traffic (and increasingly elsewhere), consent gates tag firing. Misconfiguration is a compliance violation.
25. Are duplicate tags consolidated?
Multiple GA4 tags doing the same thing, multiple Meta Pixels, multiple analytics tools. Consolidate.
26. Is server-side GTM deployed (if applicable)?
For sites with 5+ pixels and meaningful traffic, SS-GTM significantly improves performance and data quality. We covered when to use SS-GTM in our server-side tracking guide.
Section 4 — Attribution (6 points)
How conversion credit is distributed across channels.
27. Is data-driven attribution enabled in GA4?
For accounts with 300+ monthly conversions, DDA produces better insights than last-click. We covered the framework in our multi-touch attribution explained guide.
28. Is data-driven attribution enabled in Google Ads?
Same for Google Ads attribution. Smart Bidding optimises on the chosen attribution model.
29. Is attribution window appropriate for sales cycle?
Default 30-day attribution windows are too short for B2B. Extend to match actual cycle length (60 to 90 days for many B2B).
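One way to check whether a window fits your cycle: measure what share of conversions land inside it, using click-to-convert lags from your CRM. A sketch with made-up sample lags:

```python
def window_coverage(lag_days, window):
    """Share of conversions whose click-to-convert lag fits inside the window.

    `lag_days` is a list of observed lags in days (hypothetical sample data
    pulled from a CRM export).
    """
    return sum(1 for d in lag_days if d <= window) / len(lag_days)

# B2B-style lags: most conversions land after day 30
lags = [3, 12, 25, 41, 55, 62, 78, 88]
print(f"30-day window captures {window_coverage(lags, 30):.0%}")
print(f"90-day window captures {window_coverage(lags, 90):.0%}")
```

If a 30-day window captures well under 90 percent of observed conversions, it is truncating credit for the channels that start long journeys.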
30. Is Meta CAPI implemented (for paid Meta accounts)?
Server-side Meta tracking recovers 15 to 30 percent of conversions lost to iOS privacy restrictions.
31. Are offline conversion imports running?
For B2B, closed deals should flow back to ad platforms. We covered the offline import setup in our Google Ads conversion tracking setup guide.
32. Is incrementality testing in cadence (for larger accounts)?
For accounts spending €50K+/month, periodic incrementality tests (pausing a channel and measuring) calibrate attribution models.
Section 5 — Dashboards (8 points)
The surface where data becomes decisions.
33. Is there an executive overview dashboard?
5 to 8 top-level KPIs that capture business health. Reviewed monthly by leadership.
34. Is there a channel-specific dashboard per major channel?
Google Ads, organic, social — each needs its own deep-dive dashboard for the channel owner. We covered the dashboard archetypes in our marketing dashboard design guide.
35. Do dashboards include period comparisons?
Numbers without "vs last period" or "vs target" are meaningless. Every metric should carry a comparison.
36. Do dashboards include target lines?
Visual indicators of whether the business is on track. Traffic-light status indicators for executive views.
37. Are dashboards mobile-friendly?
Executive dashboards are viewed on phones between meetings. If they don't work on mobile, they don't get used.
38. Is each dashboard owned by a named person?
Without owners, broken dashboards stay broken. Every dashboard needs an accountable individual.
39. Is the dashboard review cadence documented?
Daily for campaigns, weekly for channels, monthly for executive. Document who reviews what when.
40. Are unused dashboards archived?
Quarterly audit: remove dashboards nobody opens. Most teams have 30 to 60 percent dashboard waste.
Section 6 — KPIs (6 points)
The metrics that define what success means.
41. Is there a defined North Star metric?
The single number the business optimises toward. Without it, every decision becomes a debate about priorities. We covered the framework in our marketing KPIs selection guide.
42. Are 5 to 10 Strategic KPIs defined that drive the North Star?
The business-level metrics that determine whether the North Star moves up or down.
43. Are Tactical KPIs aligned to Strategic KPIs?
Channel-level metrics roll up to Strategic KPIs. Without alignment, channel owners optimise for vanity.
44. Are KPI targets set against historical baseline?
Targets pulled from thin air don't motivate. Use historical performance + ambition for targets.
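"Historical performance + ambition" reduces to a one-line formula; a sketch with example numbers:

```python
def set_target(baseline, ambition_pct):
    """Target = historical baseline lifted by an ambition percentage.

    A sketch of 'baseline + ambition' target-setting, not a forecast —
    the baseline should come from real historical data.
    """
    return baseline * (1 + ambition_pct / 100)

# Last quarter averaged 120 leads/month; aim 25% higher
print(set_target(120, 25))  # 150.0
```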
45. Are vanity metrics minimised?
Impressions, page views, followers without business impact context are noise. Cut them from primary dashboards.
46. Is the KPI framework reviewed quarterly?
Markets and businesses evolve. KPIs that fit Q1 may not fit Q4. Quarterly review keeps the framework relevant.
Section 7 — Governance (4 points)
The operational discipline that keeps analytics functional over time.
47. Is there a named analytics owner?
Without an owner, problems compound silently. Someone must be accountable for data quality and dashboard maintenance.
48. Is there a quarterly audit cadence?
Tracking degrades. Tags accumulate. Dashboards become stale. Quarterly audits catch issues before they compound.
49. Is there documentation for the analytics stack?
How GA4 events are structured, where the dashboards live, who owns what. Without documentation, knowledge walks out the door with team turnover.
50. Is there a privacy and consent compliance review?
GDPR, CCPA, and similar laws require ongoing attention. Privacy practices that were compliant 18 months ago may not be today.
How to score the audit
Count how many of the 50 points you pass. Most accounts we audit pass 18 to 28 points. Top-tier accounts pass 42+.
Each missed point creates compounding problems:
- Missed tracking points: data is untrustworthy
- Missed GA4 points: reports tell incorrect stories
- Missed GTM points: container becomes unmaintainable
- Missed attribution points: budget is misallocated
- Missed dashboard points: insights don't reach decision-makers
- Missed KPI points: team optimises wrong things
- Missed governance points: problems compound over time
Prioritise based on which gaps cause the most pain.
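Scoring itself can be kept as simple as a pass/fail tally against the benchmarks above. A sketch (the pass/fail input format is hypothetical):

```python
def score_audit(results):
    """Tally a 50-point audit and bucket it against the article's benchmarks
    (18-28 typical, 42+ top-tier).

    `results` maps point number -> True/False (pass/fail).
    """
    passed = sum(results.values())
    if passed >= 42:
        tier = "top-tier"
    elif passed >= 18:
        tier = "typical"
    else:
        tier = "below typical"
    return passed, tier

# Hypothetical account where every other point passes
results = {n: n % 2 == 0 for n in range(1, 51)}
print(score_audit(results))  # (25, 'typical')
```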
A 30-day plan to fix the worst leaks
Run the audit. Then attack the gaps in this order.
Days 1 to 7 — Tracking foundations. Conversion events, deduplication, Enhanced Conversions, offline imports.
Days 8 to 14 — GA4 configuration. Retention, exclusions, audiences, Google Ads link, BigQuery export.
Days 15 to 20 — GTM cleanup. Naming convention, folders, documentation, dormant tag removal.
Days 21 to 25 — Attribution. Switch to DDA where eligible. Configure attribution windows. Implement server-side tracking.
Days 26 to 28 — Dashboards. Build executive overview if missing. Add comparisons and targets.
Days 29 to 30 — KPI framework. Define North Star. Set Strategic KPIs. Document.
Most accounts move from 22 of 50 to 38 of 50 in this window. The data becomes trustworthy. The team starts making better decisions.
Common audit findings on SME accounts
These are the patterns we see most often.
- No North Star metric (70 percent of accounts)
- Default 2-month GA4 retention (60 percent)
- No referral exclusions for payment processors (50 percent)
- Duplicate conversion tracking causing inflated numbers (45 percent)
- No GTM naming convention (75 percent)
- No data-driven attribution despite eligibility (55 percent)
- No offline conversion imports for B2B (80 percent of B2B accounts)
- Dashboards built once and abandoned (65 percent)
- No analytics owner (50 percent)
If any of these match your account, fix those first.
A real example — Marseille cosmetics analytics audit
A Marseille cosmetics e-commerce client passed 19 of 50 points on initial audit. Major issues: 2-month GA4 retention, no DDA, no Meta CAPI, no SS-GTM, dashboards in 6 unrelated tools, no defined KPI framework.
After 45 days of the framework above:
- Overall score: 41 of 50 points passed
- Retention extended to 14 months + BigQuery export
- DDA enabled, Meta CAPI implemented
- SS-GTM consolidated 14 tags
- Single executive dashboard replaced 6 fragmented ones
- KPI framework with North Star (gross profit) defined
Result: marketing decisions started using consistent data. Smart Bidding improved with cleaner conversion signal. Total marketing ROI lifted 26 percent over 90 days. The full story is in our Marseille cosmetics case study.
Frequently asked questions
How long does an analytics audit take?
A thorough audit takes 4 to 8 hours per major area, though the smaller sections go faster. Budget 20 to 40 hours for the full 50-point audit.
How often should I run a full analytics audit?
Full audit annually. Spot-checks (top 10 to 15 points) quarterly. After any major site change or tool migration.
Can I audit my own setup without expert help?
Yes for the first audit. This 50-point checklist gives you a self-audit framework. Implementation requires varying levels of technical work.
What is the single most important analytics fix?
Usually one of three: conversion deduplication, switch to data-driven attribution, or defining a North Star KPI. The right answer depends on which is weakest.
Should I trust an analytics audit from the agency that set up my analytics?
A self-audit from the agency that built your setup carries inherent bias. Independent audits surface issues a setup agency might miss or downplay.
How fast can a properly-audited analytics setup show value?
Tracking fixes show within 1 to 2 weeks. Attribution fixes show within 3 to 6 weeks as Smart Bidding adjusts. Dashboard and KPI fixes show within 1 month as the team starts using better data.
Get a free 50-point analytics audit
We run this exact audit on accounts free of charge. Within 48 hours, you get a scored breakdown of all 50 points and a prioritised action plan ranked by expected impact on marketing decisions.
Book a free 30-minute audit. We screen-share, walk through your analytics stack, and you leave the call with a clear list of what to fix first.
Or explore our Google Ads service for the full system we run on accounts that need integrated paid media and analytics.
Want these strategies applied to your business?
A free 30-minute audit call with concrete recommendations tailored to your business.
Read next
Cohort Analysis for SaaS and E-commerce: The Retention Truth
A practical cohort analysis guide — what cohorts reveal, retention curves, LTV calculation, tools, and how cohorts surface insights aggregate metrics hide.
Customer Journey Analysis: From First Touch to Repeat Purchase
A practical customer journey analysis guide — mapping touchpoints, identifying friction, tools (GA4, Hotjar, session replay), and patterns that lift conversion.
Marketing Mix Modeling for SMEs: When MMM Pays Back at Smaller Scales
A practical guide to marketing mix modeling for SMEs — what MMM is, when it pays back, lightweight approaches, and how to combine MMM with attribution.