7 Minute Read

Redesigning Accessibility Insights

Duration
2023 (Aug-Dec)
Role
Product Designer
Summary
As part of our Research Methods course, we partnered with Microsoft’s Accessibility Insights team to redesign the tool’s overview page and navigation, resulting in a 55% increase in developer efficiency.
#The Introduction
A novice developer is asked to evaluate a website for accessibility, for the very first time.
He launches Microsoft's Accessibility Insights, a tool designed to find and fix accessibility issues on webpages by performing both automated and manual tests (90+ tests in total!).
#The Problem

But the developer finds the tool difficult to navigate and overwhelming to interpret

-> Summarizing progress is difficult
-> Key actions are unclear
-> Navigating to tests is complex
#The Challenge

How might we improve the first-time accessibility testing experience for novice developers? 🤔

#The Solution

We reimagined Accessibility Insights through the eyes of a first-time user — to make it feel approachable, purposeful, and clear.

Smarter Test Organization
📌 Finding
The original layout overwhelmed users with dense, unordered categories.
Tests are now logically grouped by disability type, sortable, and displayed alphabetically by default, making information easier to find and act on.
"I appreciate how you guys filtered tests by modality and criticality" - Laura Waits
Simplified Navigation
📌 Finding
Users took time to locate specific tests.
We removed the redundant sidebar from the overview page and made test category cards directly clickable.

The sidebar now appears only within test pages, reducing both cognitive load and task time!
"The layout looks clean, I can access the required tests quickly" - Participant
Clear Progress Indicators
📌 Finding
Users couldn’t easily track overall assessment progress or test results.
We introduced graphs and percentages, along with a one-line summary ("Your website is 50% accessible"), to make overall progress instantly visible and understandable.
“They’d be able to check off websites based on different accessibility categories — which is a big step.” - Microsoft Product Designer
Improved Language and Discoverability
📌 Finding
Users misunderstood terms like “Save” vs. “Export,” and struggled to locate key actions like Export.
We renamed unclear labels and repositioned key actions like Export near progress summaries.

The results? We improved efficiency by 55%, empowering novice developers to test with confidence and reframing accessibility as doable, not daunting.

☆ Click here to skip to evaluation ☆

So how did we get there?

#Measuring Success
We set out to improve the learnability of the tool for Novice Accessibility Developers (NADs)
Why novice accessibility developers?
-> Improving their experience can reduce friction for all users.
-> It also reduces the perceived complexity surrounding accessibility.
Why learnability?
It's a measure of how easily users can complete tasks on first use and how quickly they improve.
What success metrics?
-> Reduce Time on Task 🕐 on first-time use
-> Achieve Improvement in Learning Curve 📈
#Understanding the Problem
I conducted secondary research and a tool walkthrough to identify tasks that challenge novice accessibility developers.
I explored online reviews of the tool from the Chrome Web Store, Accessibility Insights’ GitHub community, and Stack Overflow.

I performed the tests provided in the tool and annotated issues using Figma and Notion.
We hypothesized that developers might want quick access to failed or incomplete tests to revisit later.
#Mixed Methods Research
We ran a 3-trial learnability study collecting quantitative and qualitative insights across repeated accessibility testing tasks.
Each trial included the same 3 tasks:
1. Start a Quick Accessibility Scan
2. Keyboard navigation test
3. Locate and report failed tests from a prior run
We collected:
-> Time on Task
-> Number of Errors
-> Think-aloud insights
-> Post-task interviews
-> SUS scores
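The SUS responses feed the standard scoring formula: each odd-numbered item contributes its score minus 1, each even-numbered item contributes 5 minus its score, and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten
    1-5 Likert responses, given in questionnaire order."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses):
        # Items 1, 3, 5, ... (index 0, 2, 4, ...) are positively worded.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

# A neutral "3" on every item yields the scale midpoint.
print(sus_score([3] * 10))  # 50.0
```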
5 participants (all developers)
-> 3 new to accessibility
-> 2 with prior experience
#What we learned
What do these graphs mean?  🤔
Users completed Task 1 (Quick Accessibility Scan) and Task 2 (Keyboard Navigation Test) faster over successive trials, and the error rate for all tasks dropped to zero from Trial 2 onward. For Task 3 (Locate and Report Failed Tests), however, the average Time on Task (ToT) increased.
Why? Users were exploring new ways to report failures once they gained confidence.
We concluded that users became proficient after two trials, so we shifted our focus to issues during their first-time use of the tool.
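To make the learnability metric concrete, here is a minimal sketch of how mean ToT per trial and the first-to-last improvement could be computed. The timings are hypothetical, illustrative values, not the study's actual data:

```python
from statistics import mean

# Hypothetical time-on-task values in seconds for one task
# (five participants per trial) -- illustrative only.
trials = {
    1: [210, 185, 240, 200, 195],
    2: [120, 110, 140, 115, 125],
    3: [95, 90, 105, 100, 92],
}

for trial, times in trials.items():
    print(f"Trial {trial}: mean ToT = {mean(times):.1f}s")

# Learning improvement: drop in mean ToT from first to last trial.
improvement = (mean(trials[1]) - mean(trials[3])) / mean(trials[1]) * 100
print(f"Improvement across trials: {improvement:.0f}%")
```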
#Prioritizing Findings
We focused on the usability issues in the overview page and side navbar.
Using the insights gathered across the tool, we prioritized them by impact and cost.
#Key Insights
| Theme | Problem | Implication |
| --- | --- | --- |
| Navigation | Users couldn't promptly reach tests | Enable intuitive movement between pages and tests |
| Visibility of Status | Assessment summary was unclear | Use clear visual indicators for progress |
| UX Copy | Action terms were confusing | Use clear, contextual language |
| Discoverability | Export was hard to find | Make key actions more prominent |
#Chaotic Iterations
Ideation. Iteration. Feedback. Repeat.
#Iteration 1
In this iteration, we explored various ways to visualize assessment progress and failed test results, incorporating colorful indicators into the navigation bar for better visibility.
#Iteration 2
In the second iteration, we simplified visuals, moved Export Results below assessment-related buttons, and replaced the onboarding navbar with clickable categories that reveal the sidebar on the test page.
After multiple rounds of iteration and feedback, our final design aimed to reduce cognitive load, guide user flow, and improve visibility.
Prototype
Play around with our prototype!
#Evaluation Testing
To validate the redesign, we reran Task 3: “Locate and Report Failed Tests.”
The average Time on Task (ToT) for Task 3 in the original design (Trial 1) was higher than in the redesigned version.
We collected:
-> Time on Task
-> Think-aloud insights
Participants
4 developers with no experience with the tool
We observed a 55% decrease in time taken to report failed test results.
Overall, we received positive feedback with some criticisms.
“While the navigation does not give complete status of the assessment, it is still a big improvement from before” - Microsoft Product Designer
| Problem | Implication |
| --- | --- |
| Users were confused by the phrase “50% accessible.” | Clarify whether the percentage refers to passed tests or all tests. |
| “New Assessment” was mistaken for a static button, not a dropdown. | Add a dropdown arrow/icon to indicate interactivity. |
| Test categories still felt overwhelming. | Use progressive disclosure. |
#Accessibility Testing
We conducted a cognitive walkthrough with 2 SMEs with visual impairments to evaluate the interface for screen reader compatibility.
Tasks Performed
-> Launch Quick Assessment
-> Filter and Sort tests on Overview
-> Report Failures
"Just like in maps you first see an overview of how the path is and then go into step by step directions. I first want to know how the website is structured before going into the main sections”​​ – Accessibility Expert 2
Improving semantic structure, ARIA labeling, and alt text can significantly enhance accessibility for screen reader users.
Feedback
-> Suggested semantic labeling of regions (e.g., “Main Section”, “Navigation Section”)
-> ARIA roles and tab order need improvement for test categories and filters
-> ALT text needed for all images and visual elements (e.g., charts)
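Some of this feedback can be caught automatically. As an illustrative sketch (not part of the study), a small Python check built on the standard library's HTML parser flags `<img>` tags without alt text; note that purely decorative images may legitimately carry an empty `alt` attribute, so a real audit would review each flag by hand:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect the src of every <img> tag lacking a non-empty alt."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs.
        if tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):
                self.missing.append(attr_map.get("src", "<no src>"))

checker = AltTextChecker()
checker.feed('<img src="chart.png"><img src="logo.png" alt="Company logo">')
print(checker.missing)  # ['chart.png']
```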
Reflections
What I Learned
The learnability study took way more time than I expected—time we probably could have spent iterating on designs. Looking back, the tool was actually pretty learnable by the second trial, so some of my early assumptions about the difficulty were a bit off. It was a good reminder to keep testing assumptions early and often!
The Challenges
Accessibility testing was definitely tougher than I anticipated. Especially testing for screen reader compatibility—everyone uses different tools and settings, which made it tricky to prepare fully. It really opened my eyes to how complex inclusive design and testing can be, and how important it is to build flexible, user-centered tools.
Sharing the Journey
I had the chance to share this work at Knowbility’s AccessU 2024 conference, talking about Leveraging User-Centric Practices to Create Accessible and Learnable Tools & Experiences for Accessibility Testing. It was an amazing experience to connect with others passionate about accessibility and continue learning.
Click here to check it out ->