Addressing Accessibility Challenges
At Stack Overflow, accessibility has not always been a priority. Our teams often dealt with issues reactively, waiting for users to report problems rather than addressing them up front. Unsurprisingly, this approach left us without clear accessibility targets. Leadership hesitated to invest fully in accessibility without visible progress, making it difficult to prioritize alongside other initiatives. The problem was compounded when some of our communities and customers, including those with legal accessibility requirements, expressed concerns about our commitment.
Recognizing the need for improvement, we set out to establish trust with our communities and clients by demonstrating a strong commitment to accessibility. Our goal was to proactively enhance the accessibility of our products, making it a foundational aspect of our development process.
Establishing Clear Accessibility Targets
The first step in our journey was to set clear and documented accessibility targets based on the Web Content Accessibility Guidelines (WCAG). These guidelines provided us with concrete benchmarks to measure our products against, offering clarity to our teams on what accessibility meant in practice.
One of the challenges we faced was with Stack Overflow's signature orange color, which did not meet the WCAG contrast requirements. We adopted the Advanced Perceptual Contrast Algorithm (APCA) to address this, allowing us to maintain our brand color while ensuring accessibility. We formalized these targets through an Architectural Decision Record (ADR), making accessibility a non-negotiable part of our development process.
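To make the contrast problem concrete, here is the standard WCAG 2.x contrast ratio calculation, which is the benchmark our orange struggled against. The formula comes straight from the WCAG definition of relative luminance; the hex value is illustrative and may not match the exact brand orange.

```javascript
// WCAG 2.x contrast ratio, per the relative-luminance formula in the spec.
// The orange hex below is illustrative; the exact brand value may differ.

function relativeLuminance(hex) {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    // Linearize each sRGB channel.
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(fgHex, bgHex) {
  const [lighter, darker] = [relativeLuminance(fgHex), relativeLuminance(bgHex)]
    .sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// Orange text on a white background lands well under the 4.5:1
// AA threshold for body text.
const ratio = contrastRatio('#F48024', '#FFFFFF');
console.log(ratio.toFixed(2)); // roughly 2.6
```

APCA scores the same pair differently because it models perceived lightness rather than a simple luminance ratio, which is what let us keep the brand color in contexts where APCA deems it readable.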
Measuring Accessibility Progress
With targets in place, we needed to measure our progress consistently. We created a synthetic signal to track our adherence to accessibility targets, giving us a clear indication of our direction. We used axe-core, an automated accessibility testing engine, to calculate accessibility scores for our product pages, though automated tooling of this kind only catches about 50-57% of issues.
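Our exact scoring formula is internal, but the idea can be sketched as follows: take the violations axe-core reports for a page and fold them into a single 0-100 number, weighting by the impact level axe-core assigns. The weights here are hypothetical.

```javascript
// Hypothetical scoring sketch: collapse an axe-core results object into a
// 0-100 score by weighting violations by their reported impact.
const IMPACT_WEIGHTS = { critical: 10, serious: 5, moderate: 2, minor: 1 };

function accessibilityScore(axeResults) {
  const penalty = axeResults.violations.reduce(
    (sum, v) => sum + (IMPACT_WEIGHTS[v.impact] ?? 1) * v.nodes.length,
    0
  );
  // Clamp so a page with many violations bottoms out at 0.
  return Math.max(0, 100 - penalty);
}

// axe-core violations each carry an `impact` and the DOM `nodes` that failed.
const results = {
  violations: [
    { id: 'color-contrast', impact: 'serious', nodes: [{}, {}] },
    { id: 'image-alt', impact: 'critical', nodes: [{}] },
  ],
};
console.log(accessibilityScore(results)); // 100 - (5*2 + 10*1) = 80
```

A single number like this is reductive on its own, which is exactly why we paired it with the manual scores described next.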
To cover the gaps, we established manual scores from various sources, including third-party auditors, community feedback, and client-reported issues. This comprehensive view of our product's accessibility was crucial for identifying areas needing improvement.
Building an Accessibility Dashboard
We developed an accessibility dashboard to make all data accessible and actionable. This hub allowed teams, the community, and clients to track progress and hold us accountable. Our dashboard displayed score trends, highlighted accessibility scores by page, and detailed issues identified through automated checks.
We employed GitHub workflows to automate checks using the axe-core engine and Playwright. The results were transformed into metrics and uploaded to our accessibility platform. Manual issues were logged on a centralized board, triaged by accessibility champions, and incorporated into our manual accessibility score.
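The transform step between the axe-core run and the dashboard can be sketched like this. The metric names and payload shape are hypothetical; the point is flattening per-page axe-core output into time-stamped points a dashboard can chart.

```javascript
// Hypothetical transform sketch: flatten axe-core output for one page into
// per-impact metric points for upload to a metrics platform.
function toMetrics(pageUrl, axeResults, timestamp = Date.now()) {
  const byImpact = { critical: 0, serious: 0, moderate: 0, minor: 0 };
  for (const v of axeResults.violations) {
    byImpact[v.impact] = (byImpact[v.impact] ?? 0) + v.nodes.length;
  }
  return Object.entries(byImpact).map(([impact, count]) => ({
    metric: 'a11y.violations', // hypothetical metric name
    page: pageUrl,
    impact,
    count,
    timestamp,
  }));
}

const metrics = toMetrics('/questions', {
  violations: [{ id: 'color-contrast', impact: 'serious', nodes: [{}, {}] }],
});
console.log(metrics.find((m) => m.impact === 'serious').count); // 2
```

Emitting a point for every impact level, including zero counts, keeps the dashboard's trend lines continuous even on clean runs.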
Improving the Design System
We prioritized enhancing our design system, as it serves as the foundation for user interfaces across our products. By improving focus styles and implementing skip links, we ensured better navigation, particularly for keyboard-only users. A new color palette developed using APCA was also rolled out, addressing previous contrast issues.
Our efforts included contributing back to open source by creating an APCA compliance rule for the Axe core engine, enabling automated color contrast testing across our pages.
Embedding Accessibility Early in Development
We aimed to integrate accessibility considerations early in the development process. This involved establishing accessibility champions within each team, responsible for spreading knowledge and meeting regularly with other champions. We worked with leadership to prioritize accessibility, creating a lean checklist that appeared as a comment when UI changes were detected in pull requests.
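The checklist trigger can be sketched with a simple heuristic. Our real detection lives in CI, and the path patterns and checklist items below are illustrative, but the shape is: decide from the PR's changed file paths whether UI code was touched, and if so, emit the checklist as a comment body.

```javascript
// Hypothetical sketch of the PR checklist bot's core decision: did this
// change set touch UI code? Patterns and checklist items are illustrative.
const UI_PATTERNS = [/\.(css|scss|less)$/, /\.(jsx|tsx)$/, /\.html$/, /components\//];

function touchesUi(changedFiles) {
  return changedFiles.some((path) => UI_PATTERNS.some((re) => re.test(path)));
}

function checklistComment(changedFiles) {
  if (!touchesUi(changedFiles)) return null;
  return [
    '### Accessibility checklist',
    '- [ ] Interactive elements are keyboard reachable and operable',
    '- [ ] Text and UI colors meet our contrast targets',
    '- [ ] Images and icons have appropriate alternative text',
    '- [ ] Focus order and visible focus styles are preserved',
  ].join('\n');
}

console.log(touchesUi(['src/components/Button.tsx'])); // true
console.log(touchesUi(['README.md', 'server/schema.sql'])); // false
```

Keeping the checklist lean was deliberate: a five-item comment gets read on every UI pull request, while a fifty-item one gets dismissed.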
We developed short internal video trainings, dubbed accessibility bites, offering practical tips for engineers and designers. We also set accessibility service level objectives, using automated scores as service level indicators. A drop in score would trigger a priority incident, prompting swift action to address regressions.
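The SLO mechanism above amounts to a threshold check on the automated score. Our actual objective values and incident tooling are internal; this sketch just shows the shape of using the score as an SLI and opening an incident when it drops below the objective.

```javascript
// Hypothetical SLO sketch: the automated accessibility score is the SLI;
// falling below the objective produces an incident payload. The objective
// value and priority label are illustrative.
const SLO = { objective: 90, priority: 'P2' };

function evaluateSli(score, slo = SLO) {
  if (score >= slo.objective) {
    return { breach: false };
  }
  return {
    breach: true,
    priority: slo.priority,
    message: `Accessibility score ${score} fell below objective ${slo.objective}`,
  };
}

console.log(evaluateSli(95).breach); // false
console.log(evaluateSli(82).breach); // true, with a priority incident payload
```

Treating regressions as incidents rather than backlog items is what made the score a forcing function instead of a vanity metric.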
Key Learnings and Ongoing Commitment
Through our journey, we learned the importance of setting clear accessibility targets and securing organizational commitment. Tracking progress motivated teams and demonstrated our dedication to accessibility to both leadership and customers. Integrating accessibility early in the development process and creating supportive tools and processes helped prevent regressions.
We recognize that accessibility work is never complete; it requires continuous refinement and user testing to meet evolving needs. Our approach at Stack Overflow serves as a model for other companies looking to prioritize accessibility, emphasizing that it's a continuous journey of improvement.