Enterprise IT Watch Blog

Nov 10 2014   11:46AM GMT

Behavioral analytics in the era of wearables

Profile: Michael Tidmarsh

Tags:
Analytics
Wearable devices

Wearable technology image via Shutterstock

By James Kobielus (@jameskobielus)

Behavior is something we usually measure and analyze at the personal level. By that, I mean we tend to measure whether such-and-such a person went here, said this, or did that. In doing so, we almost always abstract away the finer-grained intrapersonal behaviors involved in all of it. We rarely measure the specific behaviors of the person’s legs, arms, hands, torso, face, tongue, eyes, brain, and other organs that made it all possible.

We abstract away these lower-level details because they are usually irrelevant to the behavioral analyses we are performing. For example, the specific sequence of movements of your customers’ hands across their smartphones’ touchscreen applications is immaterial if you’re simply trying to determine the circumstances under which they’ll click your “buy” button. By the same token, the specific accent that inflects how they speak the word “buy” into your voice-recognition application has no bearing on their decision to do so.

However, as the Internet of Things (IoT) pushes more deeply into our lives, we’ll start to rethink these assumptions. IoT-enabled wearable devices will incorporate interfaces that respond to inputs that are primarily tactile, gestural, ocular, muscular, motion-sensitive, voice-activated, and brainwave-triggered in nature. To keep pace with innovations in wearable devices, IoT behavioral analytics will need to enable the user experience to predictively morph in keeping with people’s changing circumstances and intentions.

In that regard, I recommend this recent article on innovations in cognitive-computing technology that can predict how people pose, move, and gesture in various activities. As author Derrick Harris notes, these analytics have the potential to improve gesture-recognition capabilities built into wearable devices, and also to enable better simulation of real human behavior in computer animations. The deep-learning algorithms that have been developed can accurately predict the positions of people’s arms, legs, joints, and general body alignment in various activities. These advances, according to Harris, “could lead to better gesture-based controls for interactive displays, more-accurate markerless (i.e., no sensors stuck to people’s bodies) motion-capture systems, and robots (or other computers) that can infer actions as well as identify objects.”
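To make the shape of that prediction problem concrete, here is a minimal, hypothetical sketch: given a short window of past joint positions, predict where those joints will land in the next frame. The deep-learning models Harris describes are far richer; this stand-in uses simple ridge regression over synthetic motion data purely to illustrate the inputs and outputs involved.

```python
# Hypothetical sketch of next-frame pose prediction: the history of
# joint coordinates goes in, the next frame's joint positions come out.
# Ridge regression stands in for the deep networks in the article.
import numpy as np

N_JOINTS = 15   # e.g., head, shoulders, elbows, wrists, hips, knees, ankles
WINDOW = 4      # number of past frames used as input

def make_training_data(frames):
    """frames: (T, N_JOINTS, 2) array of 2-D joint coordinates over time."""
    X, y = [], []
    for t in range(WINDOW, len(frames)):
        X.append(frames[t - WINDOW:t].ravel())  # flattened history window
        y.append(frames[t].ravel())             # next-frame joint positions
    return np.array(X), np.array(y)

def fit_ridge(X, y, lam=1e-2):
    """Closed-form ridge regression: W = (X'X + lam*I)^-1 X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Synthetic demo: a noisy circular swing stands in for motion-capture data.
rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 400)
frames = np.stack([np.stack([np.cos(t + j), np.sin(t + j)], axis=-1)
                   for j in range(N_JOINTS)], axis=1)
frames += rng.normal(scale=0.01, size=frames.shape)

X, y = make_training_data(frames)
W = fit_ridge(X[:300], y[:300])
pred = X[300:] @ W
print("mean joint error:", np.abs(pred - y[300:]).mean())
```

A real motion-capture or gesture pipeline would swap the linear model for a deep network and the synthetic swing for live sensor streams, but the framing of history-in, next-pose-out stays the same.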

Conceivably, predictive behavioral analytics of this sort might be used in IoT wearables to drive more fine-tuned gestural interfaces. Wearables, through embedded or cloud-based algorithms, could anticipate what the wearer will do or intend next. Behavioral predictions could then seamlessly guide the wearer toward those ends by, for example, adjusting the gestural, tactile, visual, or auditory device interfaces in real time.
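As a toy illustration of that kind of predictive morphing, the hypothetical sketch below learns which gesture tends to follow which with a first-order Markov model, then calls an invented adapt_interface hook to prepare the UI for the most likely next gesture. The gesture names and the hook are assumptions for illustration, not any real wearable SDK.

```python
# Hypothetical sketch: predict the wearer's next gesture from simple
# transition counts, then pre-adapt the interface for it.
from collections import defaultdict

class GesturePredictor:
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))
        self.last = None

    def observe(self, gesture):
        """Record an observed gesture and update transition counts."""
        if self.last is not None:
            self.counts[self.last][gesture] += 1
        self.last = gesture

    def predict_next(self):
        """Return the most frequent successor of the last gesture, if any."""
        if self.last is None or not self.counts[self.last]:
            return None
        successors = self.counts[self.last]
        return max(successors, key=successors.get)

def adapt_interface(predicted):
    # Placeholder for a real device hook: enlarge the predicted control's
    # touch target, pre-render its screen, prime the voice model, etc.
    print(f"pre-loading UI for likely next gesture: {predicted}")

predictor = GesturePredictor()
for g in ["wrist_flick", "tap", "wrist_flick", "tap", "wrist_flick"]:
    predictor.observe(g)
    nxt = predictor.predict_next()
    if nxt:
        adapt_interface(nxt)
```

On a real device the transition counts would be replaced by the kind of learned behavioral model discussed above, but the control loop of observe, predict, and adapt the interface is the core idea.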

Clearly, immersive wearable user experiences are just around the corner, and cognitive-computing algorithms will tailor them to our individual physiologies like a virtual, dynamic epidermis.
