
Cognitive computing as a wearable prosthetic

Big Data Evangelist, IBM

It's a little embarrassing when Facebook's inline facial recognition service is faster than I am at identifying a friend in a newly posted photo.

OK, I've never been good with faces and names. I am quite nearsighted. And I am getting older. But this is ridiculous. Facebook is getting so good that I'd like to embed its smart facial recognition software in my otherwise dumb, ordinary eyeglasses.

Wearable facial analytics? That sounds like a perfect application for Google Glass and the rival smart eyeglasses that will almost certainly follow it to market. The ideal scenario would be smart spectacles that sport computer-vision capabilities, embed the facial recognition feature and whisper the mystery face's name into the wearer's ear via a Bluetooth connection. If the wearable discreetly revealed the approaching party's name a split second before they offered the wearer their hand, it would save many an awkward social situation.
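How hard would that loop be to prototype? Here's a back-of-the-napkin sketch in Python. The open-source face_recognition and pyttsx3 libraries, the two-photo gallery and the 0.6 matching threshold are all my own stand-ins for whatever model, gallery and earpiece a production wearable would actually embed:

```python
# Sketch of the "whispered name" loop: match camera frames against known
# faces and speak the best match aloud. Library choices are illustrative.
import cv2
import face_recognition
import pyttsx3

# Hypothetical gallery: one reference photo per friend.
KNOWN = {
    "Alice": "alice.jpg",
    "Bob": "bob.jpg",
}

names, encodings = [], []
for name, path in KNOWN.items():
    image = face_recognition.load_image_file(path)
    faces = face_recognition.face_encodings(image)
    if faces:  # keep only photos where a face was actually found
        names.append(name)
        encodings.append(faces[0])

engine = pyttsx3.init()        # stands in for the Bluetooth earpiece
camera = cv2.VideoCapture(0)   # stands in for the eyeglass camera

while True:
    ok, frame = camera.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV is BGR; the model wants RGB
    for probe in face_recognition.face_encodings(rgb):
        distances = face_recognition.face_distance(encodings, probe)
        if len(distances) and distances.min() < 0.6:  # common matching threshold
            engine.say(names[int(distances.argmin())])
            engine.runAndWait()
```

A real prosthetic would tune that threshold carefully; whispering the wrong name is arguably worse than whispering none at all.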

Let's not trivialize the potential impact of such a feature. For people afflicted with various forms of dementia, facial recognition often becomes difficult if not downright impossible. Many Alzheimer's sufferers would still be able to live productive, dignified social lives if they could rely on high-tech memory prosthetics to fill in the situational blanks. Taking it to the logical extreme, embedding this capability in the sufferer's brain would mitigate the risk that they, in their cognitively impaired state, forget to actually wear the wearable (or forget exactly where they left it).


These thoughts came to me recently when I read an article about how facial recognition algorithms are starting to outperform people of average memory capability at matching faces.

Let's dream. From a healthcare analytics standpoint, image-analytics wearables could help many people who suffer from diverse memory, perception and learning impairments. People with dyslexia could wear glasses that subtly alter how lines of visible text are rendered for them, or that automatically feed text-to-voice renderings into their ears, to compensate for their brains' tendency to scramble some alphabetic orderings. People with Asperger's syndrome could wear glasses that feed key situational-context cues into their ears to compensate for their brains' tendency to overlook or misinterpret those cues in social interactions.
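That text-to-voice idea is simpler than it sounds: point the eyeglass camera at the text, OCR the frame, read it aloud. Here's a back-of-the-napkin sketch, with the pytesseract and pyttsx3 libraries standing in for whatever embedded OCR model and bone-conduction earpiece a real device would use:

```python
# Sketch of the text-to-voice loop: OCR whatever text the camera sees
# and speak it. All library choices here are illustrative stand-ins.
import cv2
import pytesseract
import pyttsx3

engine = pyttsx3.init()
camera = cv2.VideoCapture(0)

ok, frame = camera.read()
if ok:
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV is BGR
    text = pytesseract.image_to_string(rgb).strip()
    if text:
        engine.say(text)
        engine.runAndWait()
```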

Clearly, there is a huge opportunity for cognition-assist wearables that leverage embedded image analytics. And the opportunity needn't be limited to prosthetic devices that serve people who are otherwise impaired.

Cognition-assist presentation devices (eyeglasses and earpieces, for example) might tap into the sensors on those devices and on the other devices (smartphones and smartwatches, say) that people wear throughout the day. What if cognition-assist prosthetics could alert us in real time to the salient points of the full sensory situation we're in? In addition to identifying faces, sensor-fed prosthetics might identify difficult-to-discern events taking place in other sensory modes (a toy alert dispatcher for such events is sketched after this list), such as:

  • Auditory (voice of a recent acquaintance across the room at a noisy party)
  • Olfactory (boss's cologne lurking outside the office door)
  • Gustatory (unhealthy sodium content in the food you've just loaded onto your plate)
  • Tactile (subtle floorboard vibrations indicating the approaching party is human, not animal)
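
To make the fusion idea concrete, here's a hypothetical sketch of the hub that might sit behind those channels. Every name in it is invented for illustration: each sensory pipeline emits labeled events with a salience score, and the hub whispers only the ones that clear a threshold:

```python
# Hypothetical sketch of a cognition-assist hub fusing events from several
# sensory channels into one stream of whispered alerts.
from dataclasses import dataclass

@dataclass
class SensoryEvent:
    mode: str        # "auditory", "olfactory", "gustatory", "tactile"
    label: str       # what was recognized
    salience: float  # 0.0-1.0: how urgently the wearer should be told

def alert(event: SensoryEvent, threshold: float = 0.7) -> None:
    # A real device would route this to the earpiece; here we just print.
    if event.salience >= threshold:
        print(f"[{event.mode}] {event.label}")

# Example events mirroring the list above.
for e in (
    SensoryEvent("auditory", "recent acquaintance's voice across the room", 0.8),
    SensoryEvent("olfactory", "boss's cologne outside the door", 0.9),
    SensoryEvent("gustatory", "high sodium content on your plate", 0.75),
    SensoryEvent("tactile", "footsteps: human, not animal", 0.6),
):
    alert(e)
```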

Note that some of these events, as revealed through wearable sensory analytics, might have direct healthcare consequences. Others might be more lifestyle-relevant, but have indirect healthcare consequences, insofar as they might flag events that tend to cause you stress or anxiety—or, on the upside, rock your world.