Foundation — Book II: Accessibility Lies
WCAG is the canon, axe is the engine—yet the green check is not the truth.
TL;DR: WCAG defines the standard for accessible experiences; axe automates a large, accurate subset of those checks. But no tool alone can prove accessibility—governance, design discipline, and human review are required. With new rules in force (EU's EAA, U.S. DOJ Title II), treating accessibility as a product capability—not a compliance project—protects revenue, reduces risk, and compounds velocity.

Introduction
Every empire of software learns the same lesson: checklists are comforting, reality is unforgiving. Accessibility is no exception. The Web Content Accessibility Guidelines (WCAG) are the consensus canon for digital access, and axe is the most widely used open-source engine for testing against them. Yet leaders who equate a green dashboard with "accessible" end up paying twice: first in missed users and conversions, then in remediation and regulatory pain.
This chapter separates doctrine from myth and offers a practical operating model. The aim is simple: ship experiences that more people can use, while keeping the organization on the right side of WCAG 2.2, the EU's European Accessibility Act (EAA), and the U.S. DOJ's Title II rule.
The Canon: What WCAG Really Says
WCAG is a stable, technology-agnostic standard organized into A/AA/AAA conformance levels. Most policies target Level AA; WCAG 2.2 (published Oct 5, 2023) added success criteria such as Focus Not Obscured, Target Size (Minimum), Dragging Movements, Consistent Help, and Accessible Authentication. These reflect pragmatic hurdles users face every day.
- Example: Target Size (Minimum) requires at least a 24×24 CSS pixel hit area or equivalent spacing—critical on touch UIs. Focus Not Obscured keeps keyboard focus visible despite sticky footers and overlays.
Leaders: this is not pedantry; these are friction taxes your users pay. Remove them and conversion rises, support tickets fall.
The Three Lies of Accessibility
Lie #1: "If we pass automated checks, we're accessible."
Automated tools are essential, but they cannot check every aspect of accessibility. The W3C states plainly that human judgment is required and that tools can produce false or misleading results. In other words, automation accelerates; it does not certify.
A reality check: the 2025 WebAIM Million scanned one million homepages and still found WCAG failures on ~95% of them. Automated checks get you part of the way; disciplined teams go the rest with design reviews and manual testing.
Lie #2: "Accessibility is a project we finish."
W3C guidance emphasizes evaluating early and throughout development—shift-left, not once-and-done. Treat accessibility like security or performance: a lifecycle with instrumentation, guardrails, and regression control.
Regulators are moving in the same direction:
- U.S. DOJ (Title II) adopted WCAG 2.1 AA for state and local government web and mobile—clear evidence of standards-based enforcement.
- The EU's EAA requires a wide range of products and services to meet accessibility requirements from 28 June 2025 onward, harmonizing obligations across member states.
Lie #3: "ARIA will fix it."
W3C's first rule of ARIA: use native HTML first. The Authoring Practices add the blunt corollary: "No ARIA is better than bad ARIA." Over-specifying roles and states can break assistive technology semantics. Start with well-structured HTML; add ARIA only where HTML cannot express the needed behavior.
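A minimal TypeScript sketch of the difference, assuming a hypothetical `save` action: the native element comes with semantics, focusability, and keyboard activation; the ARIA version has to rebuild all of that by hand.

```typescript
// Native HTML: semantics, focus, and keyboard activation come for free.
const nativeButton = document.createElement('button');
nativeButton.textContent = 'Save';
nativeButton.addEventListener('click', save);

// "DIY" ARIA: every behavior the browser provides must be re-implemented,
// and anything forgotten (keydown handling, focus management) breaks silently for AT users.
const divButton = document.createElement('div');
divButton.setAttribute('role', 'button');
divButton.setAttribute('tabindex', '0');
divButton.textContent = 'Save';
divButton.addEventListener('click', save);
divButton.addEventListener('keydown', (event) => {
  if (event.key === 'Enter' || event.key === ' ') save(); // the native <button> needs none of this
});

function save(): void {
  /* hypothetical save action */
}
```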
WCAG + axe: A Pragmatic Stack (that Actually Scales)
Think of this as psychohistory for accessibility—small, consistent actions that predictably shape outcomes.
1) Design for access, not for aesthetics alone
- Bake contrast ratios, focus states, and target sizes into design tokens and component specs (a token sketch follows this list).
- Reference WCAG's Quick Reference and the "What's New in 2.2" criteria when updating design systems.
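As one illustration, those minima can live directly in code. A minimal TypeScript sketch, with illustrative token names and values chosen to meet the WCAG 2.2 AA thresholds discussed above:

```typescript
// tokens/a11y.ts: hypothetical token module; names and values are illustrative.
export const a11yTokens = {
  // Target Size (Minimum): at least a 24x24 CSS px hit area or equivalent spacing.
  minTargetSize: '24px',
  // Contrast minimums: 4.5:1 for body text, 3:1 for large text and UI components.
  minContrastRatio: { bodyText: 4.5, largeText: 3, uiComponent: 3 },
  // A visible, offset focus ring; keeping it unobscured by sticky UI remains a layout concern.
  focusRing: { width: '2px', style: 'solid', color: '#1a73e8', offset: '2px' },
} as const;
```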
2) Automate the checkable surface area with axe-core
- axe-core implements rules for WCAG 2.0/2.1/2.2 (A/AA/AAA) and best practices. It tags each rule to relevant WCAG criteria and can run in browsers, CI, and test frameworks.
- Enable the WCAG 2.2 ruleset where applicable; keep AAA and experimental rules deliberate and documented.
Operating heuristic: Block on "critical" and "serious" violations in CI; log "moderate" and "minor" findings with SLAs tied to release risk. (axe exposes impact levels and rule metadata to support this.)
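For teams on Playwright, a minimal sketch of that gate using the @axe-core/playwright integration; the route and the exact tag list are assumptions to adapt to your stack.

```typescript
// e2e/a11y.spec.ts: gate critical/serious axe findings on a key route (route is hypothetical).
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('checkout has no critical or serious accessibility violations', async ({ page }) => {
  await page.goto('https://example.com/checkout');

  const results = await new AxeBuilder({ page })
    // Limit the scan to WCAG A/AA rules, including the 2.2 tag set.
    .withTags(['wcag2a', 'wcag2aa', 'wcag21aa', 'wcag22aa'])
    .analyze();

  // Fail the build only on critical/serious; log the rest for triage.
  const blocking = results.violations.filter(
    (v) => v.impact === 'critical' || v.impact === 'serious'
  );
  const deferred = results.violations.filter((v) => !blocking.includes(v));
  deferred.forEach((v) => console.warn(`[a11y triage] ${v.id} (${v.impact}): ${v.help}`));

  expect(blocking).toEqual([]);
});
```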
3) Keep humans in the loop—efficiently
- Run manual "easy checks" (headings, titles, alt text, keyboard focus) during PR review or story "definition of done." They're fast, useful, and catch what automation cannot.
- Add assistive-tech smoke tests on critical journeys (e.g., sign-in, checkout) with common screen readers (JAWS, NVDA, VoiceOver). WebAIM's survey data underscores their real-world prevalence.
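Real screen-reader passes stay manual, but the keyboard-only portion of these checks can be scripted. A minimal Playwright sketch of a keyboard pass on a hypothetical sign-in journey (the URL and element ids are assumptions):

```typescript
// e2e/keyboard-pass.spec.ts: a scripted keyboard-only pass on a critical journey.
import { test, expect } from '@playwright/test';

test('sign-in is reachable and operable with the keyboard alone', async ({ page }) => {
  await page.goto('https://example.com/sign-in');

  // Tab through the page and record which elements receive focus, in order.
  const focusOrder: string[] = [];
  for (let i = 0; i < 6; i++) {
    await page.keyboard.press('Tab');
    focusOrder.push(
      await page.evaluate(() => {
        const el = document.activeElement as HTMLElement | null;
        return el ? `${el.tagName.toLowerCase()}#${el.id || 'no-id'}` : 'none';
      })
    );
  }

  // The form fields and the submit button should appear in the focus order.
  const order = focusOrder.join(' -> ');
  expect(order).toContain('input#email');
  expect(order).toContain('input#password');
  expect(order).toContain('button#submit');
});
```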
4) Map rules to risk and accountability
- Use ACT mappings and rule tags to align axe findings with WCAG criteria and policy commitments. This strengthens audit trails and makes triage less subjective.
- Track coverage (where axe runs) and closure rate (how quickly violations are fixed), not just counts. Both are leading indicators.
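A minimal TypeScript sketch of both ideas, assuming axe findings are available as objects carrying `id` and `tags` fields; the issue-tracking shape used for closure rate is hypothetical.

```typescript
// triage/a11y-metrics.ts: map axe findings to WCAG criteria and track closure, not just counts.

// Shape mirrors the fields of an axe-core violation used here.
interface AxeViolation {
  id: string;       // axe rule id, e.g. "color-contrast"
  impact?: string | null;
  tags: string[];   // includes WCAG tags such as "wcag2aa" or "wcag143" (SC 1.4.3)
}

// Group violations by WCAG tag so triage and audit trails reference success criteria.
export function groupByWcagTag(violations: AxeViolation[]): Record<string, string[]> {
  const byTag: Record<string, string[]> = {};
  for (const v of violations) {
    for (const tag of v.tags.filter((t) => t.startsWith('wcag'))) {
      (byTag[tag] ??= []).push(v.id);
    }
  }
  return byTag;
}

// Closure rate: the share of violations opened in a period that were also fixed in it.
export function closureRate(issues: { openedAt: Date; closedAt?: Date }[]): number {
  if (issues.length === 0) return 1;
  return issues.filter((i) => i.closedAt !== undefined).length / issues.length;
}
```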
5) Govern like a capability
- Treat accessibility debt like any other form of tech debt—visible, prioritized, and paid down continuously.
- Set quarterly targets:
  - 0 critical violations on top-10 user flows
  - 95% of new components meeting WCAG 2.2 AA on merge
  - <48h MTTR for newly introduced critical issues
Executive FAQ
Do we aim for AA or AAA?
AA is the common policy baseline; AAA is valuable but rarely practical across an entire product. Adopt WCAG 2.2 AA as the standard, and selectively pursue AAA where it meaningfully improves core journeys (e.g., media captions, higher contrast in data-dense UIs).
Is 2.2 mandatory?
Mandates vary by jurisdiction and sector. In the U.S., DOJ Title II specifies 2.1 AA for public sector; private-sector expectations are trending similarly via settlements. In the EU, the EAA harmonizes requirements from June 28, 2025; national laws and referenced standards operationalize them. Future-proof by designing to 2.2 AA now.
Why axe?
It is the most widely adopted open-source accessibility engine, with strong WCAG coverage, stable APIs, and ACT reporting. Crucially, Deque publishes rule metadata and mappings, which helps both compliance narratives and engineering triage. Pair it with manual review to close the last mile.
Implementation Sketch (for your teams)
- Design system: encode WCAG 2.2 AA rules (focus visibility, target size, contrast) as tokens and lintable CSS/JS vars (see the contrast-check sketch after this list).
- Developer workflow: run axe locally and in CI (per-PR and nightly on key routes); fail builds on critical/serious.
- QA: add short, scriptable keyboard-only and screen-reader passes on "money paths". Document with rule IDs and WCAG SC mappings.
- Governance: publish a one-page policy: "We conform to WCAG 2.2 AA, measured by axe-core in CI + documented manual checks." Refresh quarterly with EAA/DOJ updates.
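As a companion to the design-system bullet above, a minimal sketch of a lintable contrast check using the WCAG relative-luminance and contrast-ratio formulas; the token pairs and file layout are hypothetical.

```typescript
// lint/contrast-check.ts: a build-time gate that token color pairs meet WCAG AA contrast.

function srgbChannel(value: number): number {
  const c = value / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function relativeLuminance(hex: string): number {
  const n = parseInt(hex.replace('#', ''), 16);
  const [r, g, b] = [(n >> 16) & 255, (n >> 8) & 255, n & 255];
  return 0.2126 * srgbChannel(r) + 0.7152 * srgbChannel(g) + 0.0722 * srgbChannel(b);
}

export function contrastRatio(foreground: string, background: string): number {
  const [lighter, darker] = [relativeLuminance(foreground), relativeLuminance(background)].sort(
    (a, b) => b - a
  );
  return (lighter + 0.05) / (darker + 0.05);
}

// Fail the build if any body-text pair falls below the 4.5:1 AA threshold.
const tokenPairs = [{ name: 'body-on-surface', fg: '#1f2933', bg: '#ffffff' }];
for (const pair of tokenPairs) {
  const ratio = contrastRatio(pair.fg, pair.bg);
  if (ratio < 4.5) {
    console.error(`${pair.name}: ${ratio.toFixed(2)}:1 is below the WCAG AA 4.5:1 minimum`);
    process.exitCode = 1;
  }
}
```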
Conclusion
In the vaults of Foundation, the data are clear: standards without practice are ceremony; tools without judgment are theater. Use WCAG as the canon, axe as the engine, and human review as the compass. The reward is not just legal cover—it's a broader market, higher trust, fewer rollbacks, and a faster path from idea to impact.
References
- W3C WAI — "WCAG 2 Overview." Editor: Shawn Lawton Henry. https://www.w3.org/WAI/standards-guidelines/wcag/
- W3C WAI — "What's New in WCAG 2.2." Published Oct 5, 2023. https://www.w3.org/WAI/standards-guidelines/wcag/new-in-22/
- U.S. Department of Justice — "Nondiscrimination on the Basis of Disability; Accessibility of Web Information and Services of State and Local Government Entities (Final Rule)." Federal Register, Apr 24, 2024. https://www.federalregister.gov/documents/2024/04/24/2024-07758/
- European Union — "Directive (EU) 2019/882 on the accessibility requirements for products and services (European Accessibility Act)." EUR-Lex. https://eur-lex.europa.eu/eli/dir/2019/882/oj/eng
- W3C WAI — "Selecting Web Accessibility Evaluation Tools: What Tools Can and Can Not Do." https://www.w3.org/WAI/test-evaluate/tools/selecting/
- Deque Labs — "axe-core: Accessibility engine for automated Web UI testing (README)." GitHub. https://github.com/dequelabs/axe-core
- W3C WAI — "axe-core ACT Implementation." https://www.w3.org/WAI/standards-guidelines/act/implementations/axe-core/
- Deque University — "Axe Rules and Remediation Advice." https://dequeuniversity.com/rules/axe/
- W3C — "Using ARIA in HTML: First Rule of ARIA." https://www.w3.org/TR/using-aria/
- W3C WAI — "ARIA Authoring Practices (Read Me First): No ARIA is better than Bad ARIA." https://www.w3.org/WAI/ARIA/apg/practices/read-me-first/
- WebAIM — "The WebAIM Million (2025)." https://webaim.org/projects/million/
Related Posts
Foundation — Book III: QA Directors
"Should I fire my QA Director?" Learn when to replace QA leadership vs. fix systemic quality issues. Transform QA from reactive cost center to proactive powerhouse.
Foundation — Book I: The Three-A Problem
AAA structures tests; BDD defines acceptance. Use AAA inside code and BDD at the specification boundary to scale quality without slowing delivery.
Test Wars – Episode VII: Test Coverage Rebels
Test-coverage numbers feel comforting, but they can hide mission-critical gaps. Shift focus to end-to-end, revenue-driving scenarios.