Vibe‑coding our way into WebAIM's 1 Million

Reviewing the WebAIM Million report

The annual report on the accessibility of the top million home pages was just published by WebAIM, and it’s as bleak as I expected. Detectable errors are getting worse, whilst the issues we can’t automatically detect are likely just as bad if not worse than before. Only manual testing and real user testing can show the full picture of how many barriers people still face on the web.

We’re not even getting the basics right for a more accessible web

Detected errors increased a whopping 10.1% since last year's analysis. These aren’t minor usability quirks; they’re high‑impact failures that break WCAG and, more importantly, block people from completing essential tasks. And those tasks aren’t trivial: using a service, buying groceries, paying a bill, booking transport. Real‑world stuff people rely on.

The fact that 95.9% of home pages had detectable WCAG 2 failures, a 1.1% rise on last year, shows how far we still have to go to meet even the most basic accessibility needs. The errors are often simple to fix, but only if fixing them is built into everyday practice rather than treated as a one‑off tick‑box exercise.

The code we’re producing is overly complex

The average number of page elements increased 22.5% since last year's analysis. Whether that’s down to sloppy hand‑coded markup, third‑party frameworks and libraries, or AI‑generated and AI‑assisted code, the lesson is the same: we need to get back to the basics of understanding what elements do, when to use them, and how to use them properly.

Poor coding standards have real user impact: they affect how well assistive tech like screen readers performs, and they affect the devices we use too, from battery life to data usage.

People need text they can actually see or hear read aloud

As in previous years, the top detected failures are still low‑contrast text, missing alt text, and missing labels. These all have real user impact and can stop someone completing a journey altogether.

I talk a lot about vision impairment in my work training sessions, including something as common as needing reading glasses, alongside colour deficiency and blindness, because these affect huge numbers of people. Yet we still do far too little to accommodate them. That’s a lot of broken journeys when we can’t even get the basics right: colour contrast, readable text, and alt text and labels so people can also hear what’s on screen.
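Low contrast isn’t a matter of taste: WCAG 2 defines it precisely as the ratio between the relative luminances of text and background, which must reach at least 4.5:1 for body text (3:1 for large text). A minimal sketch of that formula in Python:

```python
def _linearize(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light, per the WCAG 2 definition."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """Relative luminance of an sRGB colour (0.0 = black, 1.0 = white)."""
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """WCAG contrast ratio between two colours, from 1:1 up to 21:1."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))  # 21.0
# Light grey on white falls below the 4.5:1 AA threshold for body text.
print(contrast_ratio((150, 150, 150), (255, 255, 255)) >= 4.5)  # False
```

This is exactly the check behind the “low‑contrast text” failures the report counts, which is what makes them so detectable and so fixable.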

Either learn Accessible Rich Internet Applications (ARIA) or leave it out

Humans, AI, code vibers – they all seem to be using ARIA more each year: over 133 ARIA attributes per home page on average this year! And with that rise, we're seeing more detected errors. ARIA is frequently misused, and in many cases it makes the experience worse for users, creating barriers where it was intended to remove them. Better to leave it out entirely than have it used by uninformed coders, be they human or otherwise.
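Per‑page figures like that ARIA count come from tallying attributes across every element in the markup. As a rough sketch of the idea (the report’s exact counting methodology is an assumption here), Python’s standard‑library parser can do the same tally:

```python
from html.parser import HTMLParser

class AriaCounter(HTMLParser):
    """Tallies elements and ARIA-related attributes (aria-* and role) in markup."""

    def __init__(self) -> None:
        super().__init__()
        self.elements = 0
        self.aria_attributes = 0

    def handle_starttag(self, tag: str, attrs: list) -> None:
        self.elements += 1
        # HTMLParser lowercases attribute names, so a simple prefix check works.
        self.aria_attributes += sum(
            1 for name, _value in attrs if name.startswith("aria-") or name == "role"
        )

# A hypothetical snippet of home-page markup for illustration.
markup = '<main role="main"><button aria-expanded="false" aria-controls="menu">Menu</button></main>'
counter = AriaCounter()
counter.feed(markup)
print(counter.elements, counter.aria_attributes)  # 2 3
```

Two elements carrying three ARIA‑related attributes – scale that density up to thousands of elements per page and the average of 133 stops sounding surprising.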

What’s driving the increase: frameworks, libraries or AI?

The report does go on to hint that automated or AI-assisted coding practices (“vibe coding”) are likely having an impact on the alarming rate of ever larger and more technologically complex web pages. Meanwhile, JavaScript frameworks and libraries overall haven't done well in reducing errors, except Astro, which improved by 84%.

It would be interesting to see studies on what AI coding tools like Claude Code actually produce when asked to generate accessible code across a range of use cases, especially without informed users guiding the prompts.

As ever, it feels like the latest tech wave, AI, has bolted accessibility on as an afterthought. But we’re still early enough in the cycle to shape how these tools work and improve what they produce. Manual testing, and testing with real humans, will always be essential.

Those are my takeaways from the report, but you may have different conclusions. Do take some time to read the full WebAIM 1 Million report and share it with anyone you can think of who needs to read it.