Xiaomi Goes Virtually Edgeless By Using Ultrasound

October 26, 2016


Regular readers will know I’ve been looking out for this to happen for a while: the use of sound, or rather ultrasound, as a form of interface. Here’s a Reuters piece I did on it a year ago, “From pixels to pixies: the future of touch is sound”:

Ultrasound – inaudible sound waves normally associated with cancer treatments and monitoring the unborn – may change the way we interact with our mobile devices.

But the proof will be in the pudding, I reckoned:

Perhaps the biggest obstacle to commercialising mid-air interfaces is making a pitch that appeals not just to consumers’ fantasies but to the customer’s bottom line.

Norwegian start-up Elliptic Labs, for example, says the world’s biggest smartphone and appliance manufacturers are interested in its mid-air gesture interface because it requires no special chip and removes the need for a phone’s optical sensor.

Elliptic CEO Laila Danielsen says her ultrasound technology uses existing microphones and speakers, allowing users to take a selfie, say, by waving at the screen.

Gesture interfaces, she concedes, are nothing new. Samsung Electronics had infra-red gesture sensors in its phones, but, she says, “people didn’t use it”.

Danielsen says her technology is better because it’s cheaper and broadens the field in which users can control their devices.

That day has arrived. Xiaomi’s new MIX phone, Elliptic Labs says, is the first smartphone to use its Ultrasound Proximity Software:

INNER BEAUTY replaces the phone’s hardware proximity sensor with ultrasound software and allows the speaker to be completely invisible, extending the functional area of the screen all the way to the top edge of the phone.

Until now, all smartphones required an optical infrared hardware proximity sensor to turn off the screen and disable the touch functionality when users held the device up to their ear.

Without the proximity sensor, a user’s ear or cheek could accidentally trigger actions during a call, such as hanging up the call or dialing numbers while the call is ongoing.

However, INNER BEAUTY — built on Elliptic Labs’ BEAUTY ultrasound proximity software — uses patented algorithms not only to remove the proximity sensor, but also to hide the speaker behind the phone’s glass screen.

Besides eliminating the unsightly holes on a phone’s screen, Elliptic Labs’ technology eliminates common issues with hardware proximity sensors, such as their unreliability in certain weather conditions or in response to various skin colors as well as dark hair.
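
To get an intuition for what “ultrasound proximity” means in practice, here’s a toy sketch of the underlying idea: the earpiece speaker emits a short, inaudible chirp, the microphone picks up the echo, and the round-trip delay gives a rough distance to whatever is in front of the screen. This is purely illustrative, not Elliptic Labs’ actual algorithm (which is proprietary); the sample rate, chirp band and thresholds are my own assumptions, and the “recording” is simulated so the snippet runs without a phone.

```python
# Toy illustration of ultrasound proximity sensing (not Elliptic Labs' code):
# emit a short ultrasonic chirp, listen for its echo, and estimate distance
# from the round-trip delay. The "recording" is simulated so this runs
# without any audio hardware.
import numpy as np

FS = 48_000           # sample rate (Hz), typical of phone audio hardware
SPEED_OF_SOUND = 343  # metres per second at room temperature

def make_chirp(f0=20_000, f1=22_000, duration=0.002):
    """A 2 ms chirp swept across an inaudible band."""
    t = np.arange(int(FS * duration)) / FS
    return np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * duration)))

def estimate_distance(chirp, recording):
    """Cross-correlate the recording with the emitted chirp and convert
    the echo's round-trip delay into a one-way distance in metres."""
    corr = np.correlate(recording, chirp, mode="valid")
    delay_samples = int(np.argmax(np.abs(corr)))
    round_trip_s = delay_samples / FS
    return round_trip_s * SPEED_OF_SOUND / 2

# Simulate an echo from an ear about 3 cm away.
chirp = make_chirp()
delay = int(2 * 0.03 / SPEED_OF_SOUND * FS)
recording = np.zeros(len(chirp) + 200)
recording[delay:delay + len(chirp)] += 0.4 * chirp      # attenuated echo
recording += 0.01 * np.random.randn(len(recording))     # microphone noise

distance = estimate_distance(chirp, recording)
print(f"estimated distance: {distance * 100:.1f} cm")
print("screen off" if distance < 0.05 else "screen on")
```

The appeal for manufacturers is plain: the speaker and microphone are already in the phone, so proximity detection becomes a signal-processing problem rather than another hole in the glass.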

This is a good first step. The point, of course, for the company is that it can push the display right to the top, which definitely looks nice (the front-facing camera, if you’re wondering, is now at the bottom). But the use of ultrasound has lots of interesting implications, not least for how we interact with our phones. If gestures actually work, rather than merely being claimed to work, they could make interacting with our devices as interesting as voice, maybe more so.
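
As for those gestures, the trick Elliptic describes, using nothing but the existing speaker and microphone, is usually explained as a Doppler effect: the phone plays a steady inaudible tone, and a hand moving towards or away from it shifts the frequency of the reflection. Here’s a rough sketch of that idea; again, this is not Elliptic’s code, and the pilot tone, frame length and threshold are assumptions of mine, with the microphone signal simulated.

```python
# Rough sketch of the Doppler idea behind ultrasound gesture sensing
# (illustrative only): the speaker plays a steady inaudible tone, and a
# moving hand shifts the frequency of the reflection at the microphone.
import numpy as np

FS = 48_000        # sample rate (Hz)
TONE = 20_000      # emitted pilot tone (Hz), above most adults' hearing
SPEED_OF_SOUND = 343

def classify_motion(mic_samples, threshold_hz=5.0):
    """FFT the microphone frame, find the strongest peak near the pilot
    tone, and label the hand motion from the sign of the Doppler shift."""
    window = np.hanning(len(mic_samples))
    spectrum = np.abs(np.fft.rfft(mic_samples * window))
    freqs = np.fft.rfftfreq(len(mic_samples), d=1 / FS)
    band = (freqs > TONE - 500) & (freqs < TONE + 500)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    shift = peak_hz - TONE
    if shift > threshold_hz:
        return "hand approaching"
    if shift < -threshold_hz:
        return "hand receding"
    return "no gesture"

# Simulate a reflection off a hand moving toward the phone at 0.5 m/s,
# which shifts the 20 kHz tone up by roughly 2 * v / c * f, about 58 Hz.
t = np.arange(FS // 10) / FS                      # 100 ms frame
v = 0.5
doppler = TONE * (1 + 2 * v / SPEED_OF_SOUND)
mic = 0.3 * np.sin(2 * np.pi * doppler * t) + 0.01 * np.random.randn(len(t))
print(classify_motion(mic))                       # -> "hand approaching"
```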
