After 15 months of development (and plenty of love, sweat, and tears), I am excited to announce Crest, an open source tool that helps developers conduct better automated software testing, enabling compliance with ADA guidelines and improving accessibility for people with disabilities. The timing of this announcement is no accident. Today is International Day of Persons with Disabilities (IDPD). It has special significance for me, because I struggled with congenital orthopedic issues for many years. I have dedicated my career to this cause, building corporate accessibility programs designed to drive inclusion and outreach for people with disabilities.
What is accessibility?
Even though most of us have heard the term, “accessibility” may be a bit vague to those who have never struggled with a disability. Being locked out of sporting events, parties, video games, websites, software, classes, and so on can provoke a terrible feeling of isolation and alienation. Making places and activities accessible extends an invitation to participate and to feel included and valued. Online, the Web Content Accessibility Guidelines (WCAG) 2.1 Level AA define an international standard of 50 success criteria that help ensure that someone with a disability, or a combination of disabilities (think Stephen Hawking), can use websites, native apps, software, or documentation without assistance.
Lack of accessibility is a shockingly widespread problem. When Web Accessibility in Mind (WebAIM), a Utah State University accessibility nonprofit, used its software to evaluate 1 million websites, it found that less than two percent met international WCAG 2.1 standards. One reason is the difficulty involved with testing. While reviewing VMware products against the WCAG 2.1 standard, I experienced this phenomenon firsthand. Whether testing audio versions of text for the visually impaired, keyboard compatibility for the physically impaired, or other assistive technology, I was forced to run numerous manual checks. I found that only 30% of the required testing could be conducted with standard automated open source tools and the remaining 70% required manual testing. This is what I refer to as the 30/70 gap. Some examples include making sure that all headings are relevant to the body content and all videos have captioning. Activities that involve mouse buttons, clicking links, and using menus should be accessible via keyboard and provide a keyboard focus indicator that displays a box around a component to highlight it. Even the box outlines have requirements for contrast and thickness. I was unable to find any tools that could do this in an automated fashion.
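Those contrast requirements are at least mechanically checkable: WCAG 2.1 defines relative luminance and contrast ratio with exact formulas. The sketch below implements them in Python; the formulas come from the spec, while the sample colors are just illustrations (this is not Crest's code).

```python
def relative_luminance(rgb):
    """Relative luminance of an sRGB color, per the WCAG 2.1 definition."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(rgb1, rgb2):
    """WCAG contrast ratio between two colors: (L1 + 0.05) / (L2 + 0.05)."""
    l1, l2 = sorted((relative_luminance(rgb1), relative_luminance(rgb2)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background yields the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Measuring the ratio is the easy part; deciding *which* pixels form the focus indicator against *which* background is where automated tooling has historically fallen short.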
Manual testing is incredibly time-intensive. It requires tabbing and then shift-tabbing (to go backward) through every component of every page, taking detailed contrast and size measurements of the focus indicators. It could take me 25 minutes to test four things on a single complex page. I needed to scale as quickly as possible while evaluating WCAG 2.1 Level AA compliance for VMware products.
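The tab-and-measure loop described above is exactly the kind of thing a browser-automation harness can drive. The sketch below uses Selenium as an assumption (Crest's actual harness may differ), and factors the pass/fail rule into a pure function; the 2 px thickness and 3:1 contrast thresholds are illustrative placeholders, not quotations from the standard.

```python
def focus_indicator_ok(outline_width_px, contrast_ratio,
                       min_width_px=2.0, min_contrast=3.0):
    """Decide whether a focus indicator meets the assumed thresholds."""
    return outline_width_px >= min_width_px and contrast_ratio >= min_contrast

def audit_page(driver, max_tabs=200):
    """Tab through a page, recording the outline width of each focused element.

    `driver` is a Selenium WebDriver already pointed at the page under test.
    """
    from selenium.webdriver.common.keys import Keys
    results = []
    for _ in range(max_tabs):
        # Send TAB to whatever currently holds focus, then inspect the
        # newly focused element's computed outline style.
        focused = driver.switch_to.active_element
        focused.send_keys(Keys.TAB)
        focused = driver.switch_to.active_element
        width = focused.value_of_css_property("outline-width")  # e.g. "2px"
        results.append((focused.tag_name, width))
    return results
```

Automating the traversal shrinks a 25-minute manual pass to seconds, and the decision function can be tuned as the relevant success criteria evolve.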
The cost of tools and manual checks for an existing product can be prohibitive, especially for small businesses. But without an accessible website, companies are at risk of incurring penalties. In 2020, consumers and employees filed roughly 4,000 web-accessibility lawsuits against businesses that failed to comply with the WCAG 2.1 guidelines.
A new model for accessibility testing
In trying to determine a more effective solution, I reviewed some well-known accessibility-testing engines, such as WAVE and axe. All of them analyzed code but were unable to test the contextual product behavior or the content itself, preventing them from accurately evaluating compliance with many of the WCAG guidelines. I decided to study a machine-learning-based approach that we believed could close the 30/70 gap by recognizing images, text, and patterns, achieving a more accurate evaluation of the software’s behavior and content in context. This approach, we theorized, could find defects earlier in the software development cycle, even in organizations new to accessible development.
To speed up development, I decided to integrate WAVE with our internal tool, VMware Test as a Service, or vTaaS (pronounced “vee-tahs”). vTaaS is a comprehensive testing harness that can transfer the results of a test failure into a trouble ticket with a single click. The tool let me focus on the new test behavior, as opposed to the test execution, dashboarding, or bug tracking. vTaaS handled it all.
From idea to execution: the innovation pathway
In October of 2019, I participated in VMware’s first internal Pitch-a-thon, a program that helps innovators refine and articulate their ideas effectively to get buy-in from leadership. Once I had crystallized my pitch (and won the competition!), I was encouraged to submit my idea to VMware’s xLabs innovation program. xLabs is the equivalent of an internal startup accelerator within the VMware Office of the CTO (OCTO). It welcomes innovative ideas from any part of the VMware organization, encouraging invention and creativity across the company. I was thrilled to learn that xLabs accepted my idea as a “lite” three-month project (as opposed to a full-scale year-long project), funding an intern to assist me with project development starting in the summer of 2020.
Over the next three months, our small team began to build new automated accessibility tests using a combination of natural language processing, image processing, and pattern recognition. Starting with gov.uk — widely regarded by accessibility experts as the most accessible global website, based on the WCAG 2.1 standard — we tested Crest against more than 300 of the world’s most heavily trafficked websites. We reviewed the results, looking for false negatives and false positives. The final step involved either adjusting the code or feeding the correct results back into the machine-learning model. By the end of the three-month incubation, we had successfully developed five automated test segments, each of which was formerly completely manual:
- Header comparison with surrounding text
- Closed caption presence in videos
- Transcript presence for podcasts
- Keyboard access for activatable components
- Keyboard focus indicator
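To give a feel for the first test segment, here is a deliberately simplified stand-in for the heading-relevance check: a bag-of-words overlap score between a heading and the text beneath it. Crest uses natural language processing for this; the lexical version below, and its 0.5 flag threshold, are assumptions meant only to show the shape of the test.

```python
import re

def tokens(text):
    """Lowercase word tokens of a string."""
    return set(re.findall(r"[a-z']+", text.lower()))

def heading_relevance(heading, body):
    """Fraction of the heading's words that also appear in the body text."""
    h = tokens(heading)
    if not h:
        return 0.0
    return len(h & tokens(body)) / len(h)

def flag_heading(heading, body, threshold=0.5):
    """Flag headings that share too little vocabulary with their section.

    The 0.5 threshold is illustrative; a real model would weigh synonyms,
    stems, and context rather than exact word overlap.
    """
    return heading_relevance(heading, body) < threshold
```

A score near 1.0 suggests the heading describes its section; a score near 0.0 flags a candidate defect for human review, which is how an ML-assisted test can narrow (rather than fully replace) manual checking.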
To see a demo, check out the VMworld 2020 session here (at 31:00).
Due to the successful completion of our three-month innovation project, in October of 2020, Crest was promoted to the xLabs full track, which enabled me to add a full-time staff member to the project for one year. The expertise of the VMware Accessibility and vTaaS teams, along with our newest team member’s perspective as a blind screen-reader user, allowed us to effectively decrease manual testing from 70% to 66%.
While a four-percent decrease may sound like a trivial improvement, every one percent of accessibility testing that we can automate can save VMware tens of thousands of dollars a year in accessibility-testing costs. With future versions of Crest and new tests already planned for release in 2021, our goal is to decrease the manual-testing gap to a 50/50 split. This progress moves the team closer to our goal of allowing VMware engineers, including those with vision loss, to test and thoroughly debug their own code to ensure compliance with WCAG 2.1 Level AA guidelines.
Accessibility innovation as a tech for good
We decided to launch Crest as an open source project to make online accessibility a viable option for more businesses, even those with minimal budgets. Crest source code and documentation will be available on GitHub by the end of the year, so stay tuned for updates by following us on Twitter at @vmwOCTO and @vmwOpensource. We’re excited that major private- and public-sector groups are already expressing interest in contributing to the Crest data model and testing repository.
This endeavor serves not only our customers, but their customers, as well. It is also an example of the type of innovation we strive for at VMware, as we work towards the goal of becoming a significant force for good in our local and global communities.
We invite the development community to participate in Crest development and improvement. The flywheel of innovation spins much faster with widespread participation. If you’re interested, please email me at firstname.lastname@example.org. We look forward to your contributions. You can learn more about VMware’s many open source activities and investments by visiting our blog at https://blogs.vmware.com/opensource.