Listening to the Web, Part One: Thinking in Accessibility

"We have some concerns around the way some of the form controls are designed. They'll need to be completely customized in order to make them keyboard accessible and match the design. This means more time will be spent coding and making sure they're usable. It also means a slightly longer loading time."

"Yes, okay that's fine. Just make it work."

"We also wanted to bring up issues we found with the colour contrast of the primary text. The way things are now it will be difficult to read for some people who may have low vision…"

"Listen, we don't care about accessibility. We just want the site to look as it was designed. Let's move on."


This was an actual, slightly paraphrased, conversation I overheard between my agency at the time and a design firm. I was floored.

We don't care?

Why would you not care about making sure something was usable by people? Why design anything at all? Aren't people the reason why we do what we do?

Before that project launched, we did all that we could. The site was mostly usable, but many issues remained, not the least of which was that the approach didn't demonstrate inclusive thinking.

I realized in that moment that no amount of technical know-how could convince people of the need to ensure an accessible user experience. Like most things, creating accessible websites is a process, one which begins with a mindset shift.

Overview

I wrote this post series as a way of introducing readers, and particularly developers, to the basics of working in accessibility. It's a recollection of my personal experience over the last few years, learning and growing as a developer with accessibility at the forefront of my mind.

We'll start at the very beginning: thinking in accessibility. Developers often think about screen readers when they think about accessibility, so let's explore that area. We'll take a look at who uses screen readers, why and how. Then we'll roll up our sleeves and get to work.

We'll take a look at semantic HTML, the aspects of different elements, and how to choose accessible elements that provide proper semantics. We'll dive into how people use screen readers, commands every developer should know, and we'll seal the deal with a look at issues and fixes on a demo site.

Most importantly, I hope after reading this that you'll see how the technical skills are not the end but the means to serving a larger vision: one of awareness, inclusivity, and empathy.

Mindfulness is key

Consider this: When you're driving a car, it's crucial to pay attention to your surroundings. To be a successful driver, you must take into account other drivers, cyclists, road signs, and pedestrians. Without this consideration for others, you can get into trouble quite quickly.

Along the same lines, as designers and developers, you help ensure a positive user experience for all when you are mindful of people and the various ways they use and interact with your website.

As you design, if you consider someone in a remote area of the world with a poor connection speed, you'll make sure the site and all its assets will load quickly and perform optimally. If you consider someone doing research and learning about a topic using a handheld device, you'll be sure to design a user interface that supports a wide range of device screen sizes and orientations. If you consider someone who is blind, hard of hearing, or who has difficulty understanding written text on a page, then you'll code and structure content in a way that will help convey meaning and purpose for people who use and depend on assistive technology to experience the web.

What really happens is that as you start thinking of and developing for people with disabilities, you end up improving the usability of the site for everyone (for instance, people using outdated browsers or older handheld devices).

Being mindful of people's abilities and showing empathy and compassion helps drive your development workflow and best practices in a way that is beneficial for everyone.

Who's listening?

One afternoon while riding on a public bus, I noticed a gentleman across the aisle, holding his phone with both hands in a peculiar way. With his left hand, the phone was held up with the speaker end pressed tightly against his ear. His right hand was feverishly swiping and gesturing commands into the device, moving so quickly I couldn't understand how he was able to make sense of it all. Shortly after, I realized that he was using the built-in screen reader software on his phone in order to find and listen to the content on the website or app. Witnessing this was a revelation to me. The experience showed me how some people consume content, and how important it is that front-end interfaces are well-equipped to handle various types of user interactions.

In a 2014 report, the National Health Interview Survey (NHIS) estimated that 22.5 million adult Americans either "have trouble" seeing, even when wearing glasses, or are unable to see at all. That's nearly 10% of the adult population, and a significant number of those people may rely on assistive devices and technologies, some using screen readers, to read and navigate the web.

People who use a screen reader can hear what's on the screen by having a digital voice read aloud the content and interactive elements on the page. There are several kinds of people who rely on screen reader technology, including those who have low vision or who are legally blind. Some people have difficulty reading text, while others need to have text read aloud as they do something else. Some people just want to listen to a lengthy blog post while cooking dinner! In other words, screen readers are not solely used by people with sight limitations.

When you consider that nearly 10% of the adult population could benefit from screen reader technology, the time it takes to learn and work with a screen reader is not a waste at all. This knowledge will make you a great asset to any development team!

Bake accessibility in from the start

When I began planning how I'd mark up a new design with accessibility in mind, my whole outlook on development changed. I made sure to use native browser controls, logically ordered headings, colours that would pass contrast tests, and unobtrusive JavaScript to help manage keyboard focus for complex interactions.

HTML

For the first few days on a new project, after studying the design files, making mental notes of what made sense to be repeatable modules, and talking with the designers, I would write HTML. Just simple, plain HTML without any bells or whistles. This provided me with a place where I could start testing with a screen reader for basic, low-level issues, things I could watch for and fix right away. I made sure everything had the correct semantics and context for what the elements were supposed to be.
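
As a rough sketch of what that first pass might look like (the headings, links, and labels here are just placeholders, not from any real project), landmarks, headings, and native controls do most of the heavy lifting:

    <header>
      <h1>Site title</h1>
      <nav aria-label="Main">
        <ul>
          <li><a href="/">Home</a></li>
          <li><a href="/blog">Blog</a></li>
        </ul>
      </nav>
    </header>

    <main>
      <article>
        <h2>Post title</h2>
        <p>Post content goes here.</p>
        <!-- A native button, rather than a clickable div, is focusable
             with the keyboard and announced as a button by screen readers. -->
        <button type="button">Read more</button>
      </article>
    </main>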

CSS

With the HTML structure in place, written as reusable modules of a greater design system, I would go in and start adding classes and writing the CSS in order to match the intended design. When the styles were added to a module, I'd again test for accessibility issues. I would keep an eye out for any elements whose display property changed, making sure that anything hidden (or shown) visually was also hidden from (or exposed to) screen readers. I'd also watch for element positioning. Things can get out of hand pretty quickly, so I needed to be certain that the visual order of content matched the order a screen reader would follow. This way, everyone would get the same reading order for the content, avoiding any possibility of confusion.
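
To give a sense of the display issue, here's a small sketch of two common ways of hiding content (the class names are invented for this example): display: none removes content from screen readers entirely, while a "visually hidden" utility class hides it on screen but keeps it available to assistive technology:

    <style>
      /* Hidden from everyone: not rendered and not announced by screen readers. */
      .is-hidden {
        display: none;
      }

      /* Hidden visually, but still read aloud by screen readers. */
      .visually-hidden {
        position: absolute;
        width: 1px;
        height: 1px;
        margin: -1px;
        padding: 0;
        overflow: hidden;
        clip: rect(0 0 0 0);
        white-space: nowrap;
        border: 0;
      }
    </style>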

JavaScript

It may surprise some readers, as it did me when I first started learning about accessibility development, that JavaScript plays a big part in writing accessible interfaces. This is especially true when creating dynamic widgets such as accordions or modal windows, or perhaps implementing a hidden navigation menu for small-screen devices. By using JavaScript, we can dynamically create and inject extra elements and add the required attributes to further convey the meaning and purpose of our more complex user interfaces. JavaScript is also used to manage keyboard focus for any custom widgets or complex interactions.
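
As a small sketch of what that can look like (the ids and markup are made up for illustration), a button that toggles a hidden navigation menu might keep its aria-expanded state in sync with the menu's visibility and move keyboard focus when the menu opens:

    <button type="button" id="menu-toggle" aria-expanded="false" aria-controls="site-nav">
      Menu
    </button>
    <nav id="site-nav" hidden>
      <!-- navigation links would go here -->
    </nav>

    <script>
      var toggle = document.getElementById('menu-toggle');
      var nav = document.getElementById('site-nav');

      toggle.addEventListener('click', function () {
        var expanded = toggle.getAttribute('aria-expanded') === 'true';

        // Keep the ARIA state and the visible state in sync.
        toggle.setAttribute('aria-expanded', String(!expanded));
        nav.hidden = expanded;

        // When the menu opens, move keyboard focus to the first link inside it.
        if (!expanded) {
          var firstLink = nav.querySelector('a');
          if (firstLink) {
            firstLink.focus();
          }
        }
      });
    </script>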


We'll dive a little more deeply into these concepts in the next part of the series, but for now, the main point to take away is that we should keep people in mind and test for accessibility issues at each step of the development process. Let's remember that what we do is for people, not for other designers or developers. In most cases, not even for ourselves.

Bake accessibility into each development iteration. Test early and often. Be mindful of your audience. By adding these small extra steps to your workflow, you'll be making sure accessibility is ready and available right from the get-go.
