
Why it’s time for Deaf Assistive Tech that doesn’t rely on the written word


  • Mark Applin FRSA
    Fascinated by tech/access/inclusion/design/assistive and elective tech
  • Ross Smith FRSA
    Director, Skype for Good at Microsoft
  • Fellowship
  • Fellowship in Action

Whilst much headway has been made in developing tech for individuals with visual impairment, for those with less apparent needs, such as hearing loss, advancement has been lagging. What is the current situation for hearing impaired members of society? And what ‘tech for good’ is in the pipeline?

 


Technologists have made huge strides innovating assistive tools to support those with mobility, visual and hidden impairments. Day to day living has been phenomenally transformed through the adoption of accessories and appliances which augment abilities to communicate, understand and evaluate.  

Screen readers for the blind and visually impaired are transformational and almost ubiquitous. Just take a look at Microsoft’s Seeing AI, which uses AI to describe people, objects and even text in an effort to "narrate the world around you". Or Be My Eyes, a free app that connects blind and low-vision people with sighted volunteers and company representatives for visual assistance through a live video call. The advancements are breath-taking.

However, it’s apparent that an opportunity remains to support Deaf sign language users, who often have lower literacy levels and can struggle to interpret written English. We think the time has come to turn the might of tech onto this issue.

 

HIDDEN IN PLAIN SIGHT


According to UK charity Action on Hearing Loss, there are 11 million people with hearing loss across the UK - that's a staggering one in six of the population.  900,000 of these have severe or profound hearing loss. For 87,000 members of this group, British Sign Language (BSL) is their preferred language and English may be a second or third language. 

You may be surprised that Deaf sign language users consider sign to be their first language and English a second language. But those who are prelingually deaf (that is, who experience hearing loss before they learn to speak) are less likely to develop good speech and speech-recognition skills, and may not be taught to sign either. Studies have also shown that deaf children’s reading develops at a slower rate: the average deaf school leaver has a reading age equivalent to that of an 8- or 9-year-old hearing child. Lip reading is harder than it seems – try it!

When acquiring English is made more difficult, the consequences are isolating.

From opening a bank account to accessing medical support, so much communication relies on the written or spoken word. Face-to-face communication with hearing people can range from hard to impossible. Essential information often isn’t available in print or online, and even when it is, many Deaf sign language users have lower literacy levels (a reading age that can be as low as 8 or 9).

To put things into context, for big service providers like Lloyds Banking Group it’s likely that a sizeable chunk of their customer base uses BSL – maybe up to 29,000 people. That’s equivalent to a town the size of Chichester trying to interact with the organisation on a frequent basis about potentially high-value transactions. How many conversations take place each year? Of those, how many are free-flowing and how many convoluted? What’s the ‘value’, not just financially but in terms of quality of life?
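A rough back-of-the-envelope sketch of where a figure like that could come from (the market-share assumption here is ours, purely for illustration): if a bank on Lloyds’ scale serves roughly a third of the UK population, then a third of the 87,000 BSL users mentioned above works out at 87,000 × 1/3 ≈ 29,000.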

On the other hand, health sector decisions are potentially ‘life or death’. Since 31 July 2016, all NHS and adult social care services in England have had to follow a new set of rules called the Accessible Information Standard. The Standard tells health professionals what they MUST do to improve communication and accessibility for people with a disability or sensory loss. The principle is fantastic, but implementation is much harder. Why?

The UK, like most of Europe, has a shortage of sign language interpreters. The report A Comprehensive Guide to Sign Language Interpreting in Europe estimates one interpreter for every 70 deaf sign language users. If you’ve tried to book an interpreter lately, you’ll have felt the effects of this.

Time for Tech!

Organisations everywhere are viewing the world through different eyes – acknowledging that we’re all vulnerable at different times of our lives and that, by building better, more inclusive systems, EVERYONE should stand a chance of completing their journey independently, whether as a customer, colleague or citizen.

These same organisations grasp the issues sign language users may experience with the written word and face-to-face communication. They have a desire to do more. They get the ‘why’ but lack some of the tools to readily implement the ‘what and how’. 

So, what is the current situation for hearing impaired members of society? Let’s take a look at the pace of change in our abbreviated (not comprehensive) timeline of hearing loss technology.

150 BC: Polybius, a Greek historian, devises a system of alphabetical smoke signals.

1876: Alexander Graham Bell wins the first U.S. patent for the telephone.

1898: The “modern” ear trumpet is invented. It doesn’t amplify sound, but works by collecting it and “funnelling” it through a narrow tube into the ear.

1964: Robert Weitbrecht creates the telephone typewriter, or TTY.

1964: AT&T’s Picturephone videophone system transmits only one image every two seconds.

1977: Hearing technologists begin their obsession with sign language avatars when a research project successfully matches English letters typed on a keyboard to ASL manual alphabet letters simulated on a robotic hand.

1980: The first closed-captioned programs are broadcast on TV stations across the U.S.

1981: The BBC launches See Hear, a TV magazine for Deaf and hard of hearing people.

1984: The U.S. Food and Drug Administration approves the 3M/House single-channel cochlear implant.

1992: The first text message to a mobile phone is sent.

1996: The Telecommunications Act mandates closed captioning, which is now widely available for the deaf and hard of hearing community.

1999: Chris Wendt starts research on machine translation.

2000s: Early AI applications of speech to text appear.

2003: Skype is founded by Niklas Zennström, from Sweden, and Janus Friis, from Denmark.

2004: Significan't (UK) Ltd, a deaf and sign language led social enterprise, is the first to establish an IP Video Relay Service in London.

2017: Signly augments printed English with pre-recorded sign language content and tests the concept with Lloyds Banking Group.

2018: Lero, the SFI Irish Software Research Centre, and University College Dublin (UCD) announce a research initiative with Microsoft Skype designed to enable future communications between deaf and non-deaf people.

2018: Skype adds real-time closed captioning.

     


 

While recognising the good work above, it’s apparent that an opportunity remains to support Deaf sign language users, many of whom have lower literacy levels and find written English hard to interpret. Many of the advances cited above rely on a good grasp of written English. That’s why we think the time has come to turn the might of tech onto this issue.

In our next blog, we’ll look at potential future advancements and at how organisations can sponsor innovation and make immediate strides to support the Deaf community.
