#Measuring Success
We set out to improve the learnability of the tool for Novice Accessibility Developers (NADs).
Why novice accessibility developers?
-> Improving their experience can reduce friction for all users.
-> It also reduces the perceived complexity surrounding accessibility.
Why learnability?
Learnability measures how easily users can complete tasks on first use and how quickly they improve with repetition.
What success metrics?
-> Reduce Time on Task 🕐 on first-time use
-> Achieve Improvement in Learning Curve 📈
#Understanding the Problem
I conducted secondary research and a tool walkthrough to identify tasks that challenge novice accessibility developers.
Explored online reviews of the tool from the Chrome Web Store, Accessibility Insights' GitHub community, and Stack Overflow.
Performed the tests provided in the tool and used Figma and Notion to annotate issues.

We hypothesized that developers might want quick access to failed or incomplete tests so they could revisit them later.
#Mixed Methods Research
We ran a 3-trial learnability study collecting quantitative and qualitative insights across repeated accessibility testing tasks.

Each trial included the same 3 tasks:
1. Start a Quick Accessibility Scan
2. Run the Keyboard Navigation Test
3. Locate and report failed tests from a prior run
We collected:
-> Time on Task
-> Number of Errors
-> Think-aloud insights
-> Post-task interviews
-> SUS scores
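Of these measures, SUS has a standard scoring rule: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5 to a 0–100 score. A minimal Python sketch of that standard formula (the function name is ours):

```python
def sus_score(responses):
    """Score one participant's SUS questionnaire.

    responses: ten Likert ratings (1-5), in standard SUS item order.
    Odd items (1st, 3rd, ...) contribute r - 1; even items contribute
    5 - r; the total is multiplied by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5
```

A participant who answers 5 on every odd item and 1 on every even item scores 100; all-3 responses score 50.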
5 participants (all developers)
-> 3 new to accessibility
-> 2 with prior experience
#What we learned
What do these graphs mean? 🤔
The average Time on Task (ToT) for Task 1 and Task 2 decreased over successive trials. For Task 3, there was an increase in ToT.
The error rate for all tasks dropped to zero from Trial 2 onward.
Users completed Task 1 (Quick Accessibility Scan) and Task 2 (Keyboard Navigation Test) faster and with zero errors over trials. But for Task 3 (Locate and Report Failed Tests), time on task increased.
Users were exploring new ways to report failures once they gained confidence.
We concluded that users became proficient after two trials, so we shifted our focus to issues during their first-time use of the tool.
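The per-trial averages above can be derived mechanically from raw session logs. A minimal sketch, assuming per-participant timings are stored in seconds (the structure and numbers below are hypothetical, not the study's data):

```python
def trial_means(task_times):
    """Average Time on Task per trial across participants.

    task_times maps trial number -> list of per-participant
    times in seconds for one task.
    """
    return {trial: sum(vals) / len(vals)
            for trial, vals in task_times.items()}

# Hypothetical timings for one task across three trials (not real study data)
task1 = {1: [95, 110, 80], 2: [70, 85, 60], 3: [55, 60, 50]}
averages = trial_means(task1)  # {1: 95.0, 2: ~71.7, 3: 55.0}
```

The same per-trial aggregation applies to error counts, which is how the curves in the charts above were produced.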
#Prioritizing Findings
Using the insights we gathered across the tool, we prioritized them by impact and cost, focusing on usability issues in the overview and side navbar.

#Key Insights

Users couldn't promptly reach tests
-> Enable intuitive movement between pages and tests
Assessment summary was unclear
-> Use clear visual indicators for progress
Action terms were confusing
-> Use clear, contextual language
-> Make key actions more prominent