
Evolving the binocular vision and opposable thumbs of cognitive computing

May 29, 2014

If I'm about to offend your religious sensibilities, I apologize in advance. Please avert your eyes from this post.

Humans didn't evolve smarts because we're divinely anointed. There are many theories for why we became such supremely brainy organisms. But few scientists doubt that primate features such as opposable thumbs, binocular vision and arms with enough leverage to hurl spears at mammoths played a huge role in our development of superior cognition. Furthermore, our peripheral nervous systems evolved into efficient intermediaries, handling a closed loop of signals between our brains, senses, muscles and the other bodily systems that interface us to the environment.

[Image: Evolution.png, courtesy of Openclipart and used with permission]

In the cognitive fabric of the new society, humanity is deploying the Internet of Things (IoT) as our sensory and muscular interface to the world around us. Several months ago, I blogged on the concept of an "autonomic planet" that is a self-healing intelligent ecosystem. Within that global infrastructure, the IoT is the pervasive fabric of what I referred to as "sensory computing." Alex Philp calls out this IoT role nicely in his recent IBM Data Magazine post, which focuses on the cognitive capabilities provided by the IBM Watson platform: "Consider mobile devices. Smartphones contain an average of 15 sensors, which are the future eyes and ears and sensing media for Watson. Moreover, the array of sensors and devices that abound today (such as computers, cameras, satellites and drones) are all collecting and transmitting data and combined with mobile devices open the floodgates for torrents of potentially valuable data slipping past us."

But that's not all. IoT has a larger role: as part of the muscle sinew of the cognitive computing fabric. Many "things" (such as embeddable devices) combine sensors with "actuators" (features that take actions based on sensor readings and other inputs). In that light, you could also consider IoT as a foundation for what I referred to in that same post as "volitional computing." This refers to the ability of automated systems to handle the thought processes that translate cognition, affect and sensory impressions into willed, purposive, effective action. Next best action (NBA) technologies are another key enabler for cognition-coordinated volitional computing; I discussed NBA's interface to IoT in this IBM Big Data & Analytics Hub post over a year ago.

In order for IoT to intelligently drive sensation and locomotion, cognitive computing fabrics need to consider the full geospatial and temporal context for all events—past, present, future. I found Philp's discussion of Watson's capabilities in this area fascinating: "Watson continues to learn by recognizing shapes and spatial relationships—proximity, distance, boundaries and patterns—that are the geometry of geography. In other words, the ‘where’ is being added to Watson’s cognitive capabilities. By combining spatial and temporal components of data, Watson will derive meaningful cause-and-effect relationships from incredibly diverse data sets comprising networked sensors and human inputs numbering in the billions. And it will accomplish these relationships in real time. Imagine when Watson becomes directly connected to these networks of sensors and is able to parse the spatial-temporal components from the trillions of observations it makes by the second."

Human inputs and human actions are just as fundamental to this vision as any automated componentry. Those same smartphones that are collecting sensor readings are also feeding real-time information, context and guidance to the people who clutch them in their hot little hands. This is the point where the IoT becomes an organic extension of people's own biological organs of sensation and locomotion. And it's the point where IoT, cognitive computing and next best action converge with social networks.

Decision support is the point of it all. People's cognition must always adapt to the social, spatial and temporal coordinates of the world we inhabit. Things happen around us, and the new world of intelligent things will help us tune and adapt to it all.