When everyday tech tools aren’t designed with accessibility in mind, they can do more harm than good — even if the harm is unintentional.
Systems like AI-based resume scanners and productivity trackers can unwittingly discriminate against people with disabilities, according to Ariana Aboulafia, a project lead at the Center for Democracy and Technology.
WORKPLACE AI & DISABILITY FAQ
Which tools are likely to discriminate?
Resume screening software, productivity monitoring systems that track keystrokes or computer activity, and surveillance tools that monitor breaks, according to experts and a University of Washington study.
How do AI systems create this bias?
Algorithms learn from datasets that may exclude people with disabilities, leading to biased outputs. The systems also look for typical patterns, but people with disabilities often don’t fit standard patterns (retina scanners might not work for low-vision people, for example).
What can employers do to prevent discrimination?
Companies should inform employees about AI tools being used, audit systems to check if they work for everyone and make accommodations when needed.
What is inclusive design?
A principle that considers people’s differences from the beginning of the design process rather than as an afterthought. It helps ensure technology works for the widest range of users.
Why should companies care about accessibility?
Beyond legal compliance, accessible technology expands the potential customer base. If a tool isn’t accessible to people with disabilities, they can’t use it or buy it.
AI tools were used to help create this summary. Like everything Technical.ly publishes, it was edited and reviewed by a human.
Companies and other stakeholders need to implement methods to identify how the tech could be causing harm, and to mitigate those harms so everyone can use it, Aboulafia said last week at the Pennsylvania Human Relations Commission’s 2025 DisABILITY Conference in Harrisburg.
“[AI] tools are essentially here to stay,” she said. “They’re incorporated in every system that we are encountering, even if we don’t know it. And so it’s important to think about these, how they impact us, but also how we can build better.”
People with disabilities interact with systems like healthcare, education, employment and government benefits. As those sectors adopt new technology, the tools can actually make it harder for people with disabilities to participate, Aboulafia said.
“They can also actually make them better, but that has to be done intentionally and with community engagement in mind,” she said.
When accessibility is an afterthought, people with disabilities can miss out on opportunities and face extra challenges completing everyday tasks.
AI tools aren’t always the most efficient
Companies are increasingly using AI tools to streamline hiring and employee management, but they may not be aware that these tools disproportionately impact people with disabilities. People who listed disability-related credentials on their resumes, for example, were ranked lower by a ChatGPT-based screening tool, according to a June 2024 University of Washington study.
Surveillance tools that monitor when an employee is away from their computer, or the number of keystrokes they make, can also be problematic for people who need longer or more frequent breaks because of a disability, Aboulafia said.
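To make that failure mode concrete, here is a minimal sketch of how a fixed-threshold activity monitor might work. The 15-minute cutoff, the function names and the sample timestamps are all hypothetical, not drawn from any specific product:

```python
from datetime import datetime, timedelta

# Hypothetical policy: flag any gap between activity events longer than 15 minutes.
# A single fixed threshold treats every employee identically, so someone who needs
# longer or more frequent breaks because of a disability gets flagged even if
# their work output is the same.
IDLE_THRESHOLD = timedelta(minutes=15)

def flag_idle_gaps(events):
    """Return (start_time, gap) for each gap between consecutive events over the threshold."""
    flagged = []
    for earlier, later in zip(events, events[1:]):
        gap = later - earlier
        if gap > IDLE_THRESHOLD:
            flagged.append((earlier, gap))
    return flagged

# Invented sample day: only the 25-minute break starting at 9:10 gets flagged.
events = [datetime(2025, 1, 6, 9, 0), datetime(2025, 1, 6, 9, 10),
          datetime(2025, 1, 6, 9, 35), datetime(2025, 1, 6, 9, 40)]
print(flag_idle_gaps(events))
```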

Employers can help mitigate these harms by telling employees what kinds of technologies they’re using, auditing the tools to see whether they work for everyone and making changes when they don’t, she said. Companies can also decide not to use these tools at all, or make accommodations for people with disabilities.
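One concrete way to run such an audit is the “four-fifths rule” from US employment-selection guidance: if one group’s selection rate falls below 80% of the highest group’s rate, the tool may be having a disparate impact. Below is a minimal sketch of that check; the group labels and counts are invented for illustration:

```python
def selection_rates(outcomes):
    """outcomes maps group -> (selected, total); returns each group's selection rate."""
    return {group: selected / total for group, (selected, total) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Flag groups whose selection rate is below `threshold` of the best group's rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {group: rate / best < threshold for group, rate in rates.items()}

# Hypothetical audit data: how many applicants the screener advanced per group.
outcomes = {
    "disclosed_disability": (12, 100),  # 12% advanced
    "no_disclosure": (30, 100),         # 30% advanced
}
print(four_fifths_check(outcomes))
# {'disclosed_disability': True, 'no_disclosure': False} -> possible disparate impact
```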
Considering disabilities in the design process
Algorithmic tools like AI produce outputs based on the datasets they’re trained on. If that data doesn’t include people with disabilities, the tool can generate biased outputs.
These algorithms also look for patterns in the data and function based on the most likely outcome. But people with disabilities often don’t fit the typical pattern, Aboulafia said.
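A toy sketch of that failure mode, with invented numbers: a verifier that has learned a narrow “typical” range of measurements rejects a valid but atypical one as a non-match.

```python
import statistics

# Invented "typical" training measurements; the model learns their distribution.
typical = [0.98, 1.02, 1.01, 0.99, 1.00, 1.03, 0.97]
mean = statistics.mean(typical)
stdev = statistics.stdev(typical)

def matches_typical_pattern(measurement, z_cutoff=3.0):
    """Accept only measurements within z_cutoff standard deviations of the training mean."""
    return abs(measurement - mean) / stdev <= z_cutoff

print(matches_typical_pattern(1.01))  # True: fits the learned pattern
print(matches_typical_pattern(1.60))  # False: a valid but atypical user is rejected
```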
She offered the example of retina scanners, which are often used for security or identification. They measure light reflected off the blood vessels inside the eye, turn those measurements into a code and compare it to existing data. But for people with certain eye conditions, those blood vessels can change, and there isn’t consistent data to identify them.
One way to fix this is to apply inclusive design principles, a set of guidelines that consider and accommodate people’s differences from the beginning of the design process. In the case of the retina scanner, that means building in accommodations for someone whose retinas don’t function typically.
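A minimal sketch of what that accommodation could look like in code, assuming a hypothetical verification API with stubbed checks: instead of requiring the retina scan, the system accepts any one of several equivalent methods.

```python
# Stubbed checks standing in for real biometric or credential comparisons.
def retina_matches(user, scan):
    return scan is not None and scan == user.get("retina_template")

def key_matches(user, key):
    return key is not None and key == user.get("security_key")

def verify_identity(user, retina_scan=None, security_key=None):
    """Accept whichever equivalent verification method the user can actually use."""
    return retina_matches(user, retina_scan) or key_matches(user, security_key)

user = {"retina_template": "tmpl-abc", "security_key": "key-42"}
# A user who can't use the retina scanner still has an equal path in.
print(verify_identity(user, security_key="key-42"))  # True
```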
“There are arguments to make to companies about ensuring that the most people possible can buy their product,” Aboulafia said. “If you make a tool that’s not accessible to people with disabilities, they can’t use it, and so they’re not going to.”