AlEye: Nonverbal Gestures as Haptic Feedback


When we communicate with each other, it's not just words that are being expressed. Tone, body language, and facial expressions are some of the nuances we observe to understand what the other person is trying to convey. Unfortunately, not everyone can read those nonverbal cues. A person with visual impairment, for instance, may not be able to see another person's facial expression clearly, or at all. To bridge this gap, HapWare developed AlEye, a device that captures those silent gestures and translates them into haptic feedback.

Haptic vision

AlEye consists of a pair of glasses and a wristband. The eyewear runs computer vision algorithms to detect facial expressions, gestures, and body language in real time. As cues are detected, the wristband translates them into unique, dynamic haptic interactions. HapWare's CEO and cofounder, Jack Walters, shared that "Each non-verbal communication has its own unique and corresponding haptic or touch sensation. It's intuitive; users pick it up quickly — and with the mobile app, you have full control over what cues are communicated to you, whether that's emotions, gestures like a handshake, or body language."

In other words, AlEye delivers the cues as real-time haptic feedback. When a person smiles, nods their head, or lifts their hand for a handshake, the device communicates that cue to the user within 0.2 seconds. The design lets users carry on a conversation while receiving cues, interpreting the haptic patterns without distraction.
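The pipeline described above — detect a cue, look up its distinct haptic pattern, and filter by the user's app preferences — can be sketched roughly as follows. Note this is purely illustrative: the cue names, pattern encodings, and `CueTranslator` class are assumptions for the sketch, not HapWare's actual software.

```python
# Illustrative sketch only; not HapWare's actual API or pattern set.
from dataclasses import dataclass


@dataclass(frozen=True)
class HapticPattern:
    """A vibration pattern as a sequence of (duration_ms, intensity 0-1) pulses."""
    pulses: tuple


# Each nonverbal cue maps to its own distinct, recognizable pattern.
CUE_PATTERNS = {
    "smile":     HapticPattern(((80, 0.4), (80, 0.4))),             # two soft taps
    "nod":       HapticPattern(((120, 0.6),)),                      # one firm pulse
    "handshake": HapticPattern(((60, 0.8), (60, 0.8), (60, 0.8))),  # triple buzz
}


class CueTranslator:
    """Maps detected cues to haptic patterns, honoring user preferences
    (analogous to the cue filtering described for the mobile app)."""

    def __init__(self, enabled_cues):
        self.enabled = set(enabled_cues)

    def translate(self, cue):
        if cue not in self.enabled:
            return None  # user has opted out of this cue category
        return CUE_PATTERNS.get(cue)


translator = CueTranslator(enabled_cues={"smile", "handshake"})
print(translator.translate("smile"))  # a HapticPattern: two soft taps
print(translator.translate("nod"))    # None: filtered out by user preference
```

The key design idea the article attributes to AlEye is that each cue has a one-to-one, learnable haptic signature, which a simple lookup table like this captures.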

Image: HapWare

Clearing the communication hurdles

To understand AlEye better, I had a quick chat with Walters. He shared that the device began as a federal research project in which the team was tasked with creating something using haptic technology. Mentoring them was the company's CTO and cofounder, Bryan Duarte, who lost his eyesight in an accident; his vision to "restore access to the world" drove the project further. Walters shared that "after the project, we spoke with hundreds of people in the blind, low vision, and autistic communities…we heard the same pain point: nonverbal communication cues are invisible, yet they're essential in work, school, and social life." Those inputs shaped the next steps and eventually pushed the team to build the device.

The testing stage involved more than 50 people using AlEye, and the responses the team received were powerful. One tester told them that once he began using the vibration sets and could interpret their meaning, "it began to restore a once long-lost depth of everyday communication."

Initially, the device could convey 12 foundational cues such as happiness, waves, and thumbs up. The team is working to improve the algorithm to read more complex expressions, such as eyebrow movement, eye gaze, or advanced body language. Currently, the project is at the private pilot MVP stage, working with some of the largest nonprofits in the US. Aiming to release AlEye by 2026, the company envisions millions of people with vision impairment being able to experience nonverbal communication.


YouTube: HapWare AlEye Brief Demo

Photo credit: The images used are owned by HapWare and have been provided for press usage.

Pheba Mathai
Tech Journalist