As a member of the Deaf community himself, Deaf-Hearing Communication Centre (DHCC) Executive Director Neil McDevitt leads his Swarthmore-based organization in finding ways for Deaf individuals to better participate in their communities through work, fellowship and community activism. DHCC also offers American Sign Language classes and ASL interpreting services.
Coronavirus has drastically affected the way DHCC team members, who include both Deaf and hearing people, communicate with each other as they manage programs, budgets and other obligations remotely. Before COVID-19, McDevitt said, conversations were usually pretty easy even if folks didn’t know or weren’t totally comfortable using ASL. Technology has helped them stay in touch during the pandemic — but it doesn’t come without its challenges.
Technical.ly spoke to McDevitt about the perils of automatic captioning, things hearing people should understand about how Deaf people communicate, and more. This interview has been edited for length and clarity.
How has working remotely because of coronavirus changed the way you communicate with your colleagues? If you’re using tools such as Zoom or Google Meet to keep in touch with coworkers, what communication challenges are you experiencing, and how is your organization overcoming them?
We already had the biggest tool in our organization, Slack, active for about two years but I don’t think it was used to its full potential until COVID hit. We really got to see folks communicating seamlessly through Slack in the requisite channels after we began working remotely.
I think the biggest issue has been fatigue, not the technology itself. For sure, the technology has been helpful, from the beta version of Zoom that we have, which has auto-captioning, to the caption features of Google Meet. But for hearing or Deaf folks, the challenge is the same — you’re kind of scanning an entire computer screen to find the social cues that help you navigate the interpersonal interactions you have constantly. We know that computer screens are a very poor representation of those cues. Someone who is uncomfortable may be shifting their feet. It’s easy to pick up when you’re seeing them in person, but far more difficult to do that remotely. For Deaf folks, it feels like we enter a period of hypersensitivity whenever we do a Zoom session. It’s exhausting.
What we’ve been trying to do is minimize the actual number of calls and ask, “Can this be resolved in Slack or email?” On average, we may have one Zoom call a week for a large group of colleagues. But everything else is handled in Slack or in small Zoom sessions.
How have communication-focused applications helped to streamline the way people with impaired hearing are able to manage remote communication?
For anything that does automatic captioning, it’s a “good in some ways, bad in other ways” situation.
It’s good because it brings a level of access to the fore that didn’t exist before, particularly with the new experience of everyone wearing masks. With the investments that companies have made into fixing the inherent shortcomings of voice-to-text, they’re improving.
The bad is still very significant. The platforms are not perfect. Yesterday, when I was on a call with a colleague of mine, the automatic captions came up, “I’m not that person has specifically said, Obama that interpreter will using those seven people experiment figure.” In the entire sentence, the only correct word was “interpreter.” Nothing else was right and it was very frustrating trying to decipher the meaning behind that.
Ava [which offers a threaded speech-to-text tool] and similar platforms are designed to supplement the ugliness of auto-captioning with a human who corrects those. [They are] great platforms but they do come with a price tag. And for small nonprofits like DHCC that have had massive revenue impacts from COVID-19, we’re not in a position to use the tool regularly.
At the same time, there’s a very real problem of over-reliance on these tools. For many people with disabilities, the accommodations they need are a civil right outlined in the Americans with Disabilities Act. Those accommodations often cost money and DHCC spends many hours every week fighting for those rights on behalf of the Deaf people who are often being denied that access.
What we often see is someone reads that this app is available and they see “free,” “cheap” or “one-time cost.” Then they jump on that and try to force the Deaf or disabled person to accept that tool, even when the person has made it clear that the tool is inferior to their preferred methods. It becomes a battle of the pocketbook of a larger organization against an individual who needs service.
What are some things you want hearing people to know about the way you communicate?
Change your mindset. For a lot of hearing people, the first question they ask a Deaf person is, “Do you read lips?” They mean well, but [that] is a loaded question. People don’t realize that only 30% of the English language is visible on the lips, which means a simple sentence like, “Technical.ly is an amazing hyperlocal media company serving Philadelphia” may only be understood as “tech … ly … local … med … Phila.”
Realize that as a result, the Deaf person is doing all of the heavy lifting to make the conversation work. So ask the person, “What’s the best way for us to communicate?” before you start the conversation and be open to adjusting on the fly as the conversation progresses.
Michael Butler is a 2020-2022 corps member for Report for America, an initiative of The GroundTruth Project that pairs young journalists with local newsrooms. This position is supported by the Lenfest Institute for Journalism.