Automatica11y: Automated web accessibility testing

By Job van Achterberg (‎jkva‎)
Date: Saturday, 3 December 2016 11:30
Duration: 50 minutes
Target audience: Any
Language: English
Tags: mechanize selenium usability webdesign


A large variety of accessibility tools exists, from small offline single-purpose scripts and
"checkbox accessibility" browser plugins to full-stack auditing suites that integrate with the entire
development process.

Manually auditing content accessibility is a tedious process. While tools like The Paciello Group's
Color Contrast Analyser[1] and Access For All's PDF Accessibility Checker[2] can be used to
validate individual elements, they do not significantly reduce the amount of time spent testing
manually.
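
As an illustration of the arithmetic such tools automate, the short Perl sketch below computes the
WCAG 2.0 contrast ratio for a foreground/background colour pair; the colour values are arbitrary
examples, not taken from the talk.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Relative luminance of an sRGB colour, per the WCAG 2.0 definition.
    sub relative_luminance {
        my @rgb = @_;    # red, green, blue in the 0-255 range
        my @lin = map {
            my $c = $_ / 255;
            $c <= 0.03928 ? $c / 12.92 : (($c + 0.055) / 1.055) ** 2.4;
        } @rgb;
        return 0.2126 * $lin[0] + 0.7152 * $lin[1] + 0.0722 * $lin[2];
    }

    # Contrast ratio (L1 + 0.05) / (L2 + 0.05), with L1 the lighter luminance.
    sub contrast_ratio {
        my ($fg, $bg) = @_;
        my ($l1, $l2) = sort { $b <=> $a }
            relative_luminance(@$fg), relative_luminance(@$bg);
        return ($l1 + 0.05) / ($l2 + 0.05);
    }

    # Example: dark grey text (#555555) on a white background.
    printf "Contrast ratio: %.2f:1 (WCAG AA asks 4.5:1 for normal text)\n",
        contrast_ratio([85, 85, 85], [255, 255, 255]);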

To increase the efficiency and scope of manual testing, an auditor can use tools such as Deque's
aXe[3] and Google's Chrome Accessibility Developer Tools[4] to quickly run a wide array of tests
on a webpage inside the browser and be presented with a list of failures, along with suggestions
for correcting them.

However, while faster and more effective, these browser extensions still rely on manual auditor
intervention and have to be re-applied after every change to the application under test in order to
prevent accessibility regressions.

This limitation, combined with the current trend of accessibility becoming ever more intrinsic to
the software development process, is forcing testing tools to adapt, as QA auditors and software
developers have different requirements and expectations when testing for compliance.

In response to this trend, tools such as Tenon[5] now offer integration with the development and
QA process across the entire stack, running automated tests as pull requests are pushed for
review and integrating with existing unit test frameworks. Deque's aXe[3] recently added
integration with popular continuous integration tooling so that it can be used more easily by
developers.
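
What such integration can look like from a Perl test suite is sketched below. This is a minimal,
hypothetical example: it assumes a local Selenium server, a previously downloaded copy of
axe.min.js, and an invented staging URL. It drives a browser with Selenium::Remote::Driver,
injects axe-core into the page, and fails the test when violations are reported.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Test::More;
    use Selenium::Remote::Driver;

    # Hypothetical URL of the application under test.
    my $url = 'http://staging.example.com/';

    my $driver = Selenium::Remote::Driver->new(browser_name => 'chrome');
    $driver->get($url);

    # Inject the axe-core library (downloaded beforehand) into the page.
    open my $fh, '<', 'axe.min.js' or die "axe.min.js: $!";
    my $axe_src = do { local $/; <$fh> };
    $driver->execute_script($axe_src);

    # Run the audit asynchronously and hand the results back to Perl.
    my $results = $driver->execute_async_script(q{
        var done = arguments[arguments.length - 1];
        axe.run(document).then(function (res) { done(res); });
    });

    my @violations = @{ $results->{violations} || [] };
    ok(!@violations, "no accessibility violations on $url")
        or diag(map { "$_->{id}: $_->{help}\n" } @violations);

    $driver->quit;
    done_testing;

Run as part of the normal test suite, a failing result here blocks a merge just like any other failing
unit test.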

As this trend continues, we will see more tightly coupled, more useful integration with
development tooling, such as automated GitHub/Jira issue creation, as well as more sophisticated
automated tests: throttled network page-load simulations, image visibility analysis, and more
powerful heuristics for predicting which content is likely to be problematic.

During this talk, various accessibility testing tools will be compared in terms of intended use and
offered functionality. The audience will learn how these tools can be applied to make auditing
easier. The limitations of automation will be discussed and contrasted with the role and value of
the human auditor in accessibility testing.

Visible trends in tooling will be discussed, as accessibility testing is slowly becoming an integral
part of the web development process.

We will see how developers' increasing use of WAI-ARIA attributes makes accessibility more a
matter of bolt-on semantics than of native semantics. This presents a new challenge for
automated tooling, as the algorithms analysing WAI-ARIA usage will have to learn to correctly
identify both valid and invalid use of the spec.
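
As a sketch of the kind of heuristic this requires, the fragment below uses Mojo::DOM to flag
elements that claim role="button" but are neither native buttons nor keyboard-focusable; the
markup is an invented example, and a real checker would apply many more such rules.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Mojo::DOM;

    # Invented example markup: one native button, one bolt-on ARIA button
    # that keyboard users cannot reach.
    my $html = '<button>Save</button>'
             . '<div role="button" onclick="save()">Save</div>';
    my $dom = Mojo::DOM->new($html);

    # Heuristic: an element with role="button" should be a native control
    # or carry a tabindex so it is focusable from the keyboard.
    for my $el ($dom->find('[role="button"]')->each) {
        next if $el->tag =~ /^(?:button|input)$/;
        next if defined $el->attr('tabindex');
        printf qq{Suspect ARIA usage: <%s role="button"> without tabindex\n}, $el->tag;
    }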

Finally, an analysis of real-world auditing data will show patterns and predictions in web
accessibility. As JavaScript single-page applications continue to grow in popularity, we will see this
reflected in accessibility patterns and anti-patterns across the web.


Attended by: Tony Dunlop, Roland Schmitz (‎roli‎), Lee Johnson, Helen Schuilenburg, David Dorward, Anthony Lucas (‎hor|zon‎), Richard van Lochem (‎rvlochem‎), Chris Jack, Peter Mottram (‎SysPete‎), Dagfinn Ilmari Mannsåker (‎ilmari‎), Jody Belka (‎knewt‎)
