In the 1960s, when I was diagnosed as a six-year-old with macular degeneration, technology was non-existent apart from some very thick-lens glasses. Over the intervening 50 years, assistive technology has gone from dedicated, extremely expensive CCTVs and early adapted computers to a point where visually impaired people can rely on mainstream technology complemented by specialist vision-impairment support. This transformation represents the convergence of many strands of formerly very separate disciplines.
Increasingly miniaturised yet ever more powerful devices, combined with cameras, microphones and communications hosted in the Cloud, can be used to replace or enhance most aspects of the human body. In the spirit of the Six Million Dollar Man, we can almost ‘rebuild’ missing elements (e.g. 3D printing of bionic hands). This new computing environment also allows application developers to open up every possible channel for interaction between customers and sellers of products and services.
The telecoms industry refers to this as omnichannel. If one channel is cut off for an individual, permanently or temporarily, the wealth of other channels can compensate. Think of how you increasingly chat with bots when dealing with companies. Voice is still there as an option and may be best for someone with a vision impairment, whereas chat is ideal for those with a hearing impairment, and so on.
As a blind person, I now use a regular iPhone with its built-in VoiceOver accessibility feature switched on. I keep a Bluetooth earpiece permanently inserted and use a mini Bluetooth keyboard in one hand, with a white cane in the other. Through this combination I have access to all the mainstream apps that people use: social media, messaging and audiobooks, as well as good old-fashioned email and telephony.
In addition, I use some specialist apps like Microsoft’s Seeing AI to digitally recognise ‘things’, cooking instructions and even people. Be My Eyes is an app that gives me access to over a million volunteers who can see what I’m looking at via a video link and describe it to me – be it an air conditioning control in my Chinese hotel bedroom, a bottle of wine on a supermarket shelf or the identity of the building in front of me.
Bringing people with disabilities into the digital world opens up the estimated $4tr of spending power that they represent. The diversity of conditions and degrees of impairment used to tie each person to specific, dedicated technology. Accessibility is now an option on most mainstream apps, over and above the specialist apps and technology. For example, I also use an OrCam device which clips to a pair of regular glasses. This self-contained optical character recognition system is far more powerful than the Kurzweil reading machine that I used in the University of Manchester library during my degree in the 1980s. That machine cost over $100,000 and wasn’t very good.
So, rather than putting disabled individuals in a pigeonhole, the options that technology gives them to interact on different levels and through different channels represent a major market opportunity and bring the erstwhile excluded disabled billion into the mainstream digital market. Perhaps the best example of this is the use of voice. Using your Amazon Echo, Google Home or Siri to order a taxi, manage your home, play music or test your general knowledge was unthinkable even five years ago. Now mainstream, this is a massive advantage to the vision-impaired community. For the first time in my life I can control the lights, temperature and front door intercom without seeing what I’m doing.
The beauty of technology is not technology for its own sake. As the Be My Eyes example shows, the human element is still vital. For the first time I’m now considering using a guide dog. Even there, technology will inevitably play a role, as my dog will doubtless be armed with a camera and tracking device to enhance its performance in helping me navigate my way around the world.
Should we see disability as an issue in its own right, with a billion people being classified by form and degree? Should we think of disability as part of the healthcare journey that sees us enter the system, get analysed, diagnosed and helped before emerging into the social care system? Or, given what technology is doing for everyone in our daily lives, should we see it as an opportunity to digitally include everyone?