Apple, Again, Creates a Market Out of Nothing. And It’s Massive

White AirPods

Having recently (finally) bought a pair of big chunky Bluetooth headphones, thinking they were so commonplace I wouldn’t get any weird looks, I now realise that once again I’m at the wrong end of a trend curve. People are staring at me — and not for my rugged visage. I’m the oddity: everyone else is sporting wireless earphones, the Apple AirPods variety (although I suspect quite a few of them are cheap knockoffs, indistinguishable in look and a tenth of the price).

Man wearing white AirPods.

Reality bites: what once looked a bit weird — massive headphones — looks weird again, and what looked even weirder — wireless earphones with little sticks dangling out of them — looks cool, and increasingly normal.

Man wearing a Bluetooth headset.

The data is surprising. Canalys reports that what it calls “smart personal audio devices” — lumping together all the various wireless or semi-wireless buds, earphones and headphones — are set for their strongest year in history, with true wireless earphones (true wireless stereo, or TWS) “the largest and fastest growing category.”

Indeed, it’s not only the largest and fastest-growing category; it has also leapfrogged the other two in the space of a year.

Canalys hearables press release chart (slide 2)

That’s particularly interesting because the original AirPods were launched three years ago. It’s taken that long for them to conquer the market, and this is a product that costs anywhere between $140 and $250. Yes, I know people spend silly money on headphones, but that’s a lot of dough for something so small you’re likely to lose it down the back of the couch or while running to catch the bus. And yet it has become, in quite short order, a massive market when you consider how many smartphones there are: in terms of units, it’s a quarter the size of the smartphone market (see below), which, according to IDC, was about 360 million units in Q3 2019. And that market is virtually static, while the ‘smart personal audio devices’ market has nearly tripled.

Canalys hearables press release chart (slide 3)
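To put that ‘quarter the size’ claim in unit terms, here is a rough back-of-envelope sketch using only the IDC figure quoted above (the exact Canalys shipment number isn’t reproduced here):

```python
# Back-of-envelope: if smart personal audio devices ship at about a quarter
# of the smartphone rate, what does that imply in units? Uses only the IDC
# figure quoted above (~360 million smartphones shipped in Q3 2019).
smartphones_q3_2019 = 360_000_000

implied_audio_units = smartphones_q3_2019 / 4
print(f"Implied smart personal audio shipments: ~{implied_audio_units / 1e6:.0f} million units in the quarter")
```

In other words, something in the region of 90 million units in a single quarter, for a category that barely existed three years earlier.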

This is all of Apple’s doing. They created the wireless earphone market singlehandedly. They were slow on headphones, they never went for wireless earpieces connected by a cord, and their ordinary earphones have never really, in my view, stacked up. But with the second-generation AirPods and the AirPods Pro, it seems they’ve taken the market they created and dominated it:

Canalys hearables press release chart (slide 1)

You could argue that since they only work with Apple devices the data is skewed, but you could also look at it the other way: the Samsungs, Huaweis and Xiaomis of this world have not risen to the challenge on the Android side, and are lagging woefully. Given that Samsung shipped 78 million smartphones in Q3 and Huawei 67 million, against Apple’s 47 million (IDC numbers again), it’s clear just how much of a market opportunity they’ve missed. Canalys’ numbers, meanwhile, suggest that Apple shipped 18.5 million AirPods that quarter, meaning that for roughly four in every ten iPhones sold, a set of AirPods was sold alongside, or nearby. That’s impressive stuff.
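That four-in-ten ratio is easy to check against the shipment figures just quoted; here is a quick sketch using only the IDC and Canalys numbers above:

```python
# The AirPods-to-iPhone attach rate: Canalys' AirPods estimate set against
# IDC's iPhone shipments for the same quarter (Q3 2019).
iphones_q3_2019 = 47_000_000    # IDC: Apple smartphone shipments, Q3 2019
airpods_q3_2019 = 18_500_000    # Canalys: AirPods shipments, Q3 2019

attach_rate = airpods_q3_2019 / iphones_q3_2019
print(f"AirPods units shipped per iPhone shipped: {attach_rate:.0%}")  # -> 39%
```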

While Canalys focus on the ‘smartness’ of these devices — the control they allow, the possibility of sensors capturing health data or serving as payment devices — I think that’s not the point. The likes of Jabra have been trying to sell wireless earphones to swimmers, runners and the like for years, and it has remained a niche market. Apple have instead done what they do best — mastering the technology to make the experience of listening to stuff easy, seamless and, at least now, so cool it’s become de rigueur. The problem was always a simple one: wires. They got rid of the wires, and they made devices that sound good, fit snugly and well (at least with the Pros) and connect relatively painlessly.

That was the problem to solve, and hence the market unleashed.

Don’t overcomplicate it.

Xiaomi Goes Virtually Edgeless By Using Ultrasound


Regular readers will know I’ve been looking out for this to happen for a while: the use of sound, or rather ultrasound, as a form of interface. Here’s a Reuters piece I did on it a year ago, “From pixels to pixies: the future of touch is sound”:

Ultrasound – inaudible sound waves normally associated with cancer treatments and monitoring the unborn – may change the way we interact with our mobile devices.

But the proof will be in the pudding, I reckoned:

Perhaps the biggest obstacle to commercialising mid-air interfaces is making a pitch that appeals not just to consumers’ fantasies but to the customer’s bottom line.

Norwegian start-up Elliptic Labs, for example, says the world’s biggest smartphone and appliance manufacturers are interested in its mid-air gesture interface because it requires no special chip and removes the need for a phone’s optical sensor.

Elliptic CEO Laila Danielsen says her ultrasound technology uses existing microphones and speakers, allowing users to take a selfie, say, by waving at the screen.

Gesture interfaces, she concedes, are nothing new. Samsung Electronics had infra-red gesture sensors in its phones, but, she says, “people didn’t use it”.

Danielsen says her technology is better because it’s cheaper and broadens the field in which users can control their devices.

That day has arrived. Xiaomi’s new MIX phone, Elliptic Labs says, is the first smartphone to use their Ultrasound Proximity Software:

INNER BEAUTY replaces the phone’s hardware proximity sensor with ultrasound software and allows the speaker to be completely invisible, extending the functional area of the screen all the way to the top edge of the phone.

Until now, all smartphones required an optical infrared hardware proximity sensor to turn off the screen and disable the touch functionality when users held the device up to their ear.

Without the proximity sensor, a user’s ear or cheek could accidentally trigger actions during a call, such as hanging up the call or dialing numbers while the call is ongoing.

However, INNER BEAUTY — built on Elliptic Labs’ BEAUTY ultrasound proximity software — uses patented algorithms not only to remove the proximity sensor, but also to hide the speaker behind the phone’s glass screen.

Besides eliminating the unsightly holes on a phone’s screen, Elliptic Labs’ technology eliminates common issues with hardware proximity sensors, such as their unreliability in certain weather conditions or in response to various skin colors as well as dark hair.
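Elliptic Labs hasn’t published how its detection actually works, but the basic idea of a software proximity sensor is easy to sketch: play an inaudible tone through the earpiece speaker, listen on the microphone, and treat a jump in reflected ultrasound energy as “something is close”, at which point the screen switches off and touches are ignored. The sketch below is purely illustrative (NumPy, with made-up thresholds and parameters), not Elliptic Labs’ INNER BEAUTY algorithm:

```python
import numpy as np

SAMPLE_RATE = 48_000   # Hz; a common phone audio sample rate
TONE_FREQ = 21_000     # Hz; above most people's hearing range

def probe_tone(duration_s: float = 0.01) -> np.ndarray:
    """A short ultrasonic burst to play through the earpiece speaker."""
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    return np.sin(2 * np.pi * TONE_FREQ * t)

def band_energy(mic_samples: np.ndarray) -> float:
    """Energy the microphone picks up around the probe frequency."""
    spectrum = np.abs(np.fft.rfft(mic_samples))
    freqs = np.fft.rfftfreq(len(mic_samples), d=1.0 / SAMPLE_RATE)
    band = (freqs > TONE_FREQ - 500) & (freqs < TONE_FREQ + 500)
    return float(np.sum(spectrum[band] ** 2))

def is_ear_near(mic_samples: np.ndarray, free_air_baseline: float) -> bool:
    """An ear or cheek close to the glass reflects the tone back into the
    microphone, so the received ultrasound energy jumps well above the level
    measured when nothing is in front of the phone. The 2x factor is illustrative."""
    return band_energy(mic_samples) > 2.0 * free_air_baseline
```

In a real phone that decision would drive the same behaviour the infrared sensor used to: screen off and touch ignored until the echo falls back towards the free-air baseline.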

This is a good first step. The point here, of course, for the company, is that they can push the display right to the top, which definitely looks nice (the front-facing camera, if you’re wondering, is now at the bottom). But the use of ultrasound has lots of interesting implications — not least for how we interact with our phones. If gestures actually work, rather than just being said to work, they will make interacting with our devices as interesting as voice, maybe more so.