A Tale of Three Asias


Source: GfK data

I’ve just been playing around with some smartphone data from GfK, which collects its data by point-of-sale (POS) tracking in 90+ markets and estimates values based on unsubsidized retail pricing — meaning, I guess, that these are not exactly the prices folk may be paying for their phones, but close enough. The chart above is me calculating the Average Selling Price by dividing sales value by unit sales.
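The calculation itself is trivial, but for the record, here’s a sketch of it in Python — using made-up figures, since the underlying GfK numbers aren’t public:

```python
# ASP = total sales value / units sold, per region.
# The figures below are hypothetical, for illustration only.

def average_selling_price(sales_value_usd, units_sold):
    """Average selling price in USD per unit."""
    return sales_value_usd / units_sold

regions = {
    "Emerging APAC":  {"value": 7_900_000_000, "units": 50_000_000},
    "Developed Asia": {"value": 9_300_000_000, "units": 30_000_000},
}

for name, r in regions.items():
    print(f"{name}: ${average_selling_price(r['value'], r['units']):.2f}")
```

With those invented numbers the gap comes out at roughly the $150 the data shows.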

Raw conclusions: Emerging APAC — India, Indonesia, Cambodia, the Philippines, Malaysia, Thailand and Vietnam — has the cheapest smartphones in the world, and they’re getting cheaper. Two years ago the average was above $200; now it’s less than $160.

Then there’s Developed Asia: Australia, Hong Kong, Japan, New Zealand, Singapore, South Korea and Taiwan. There smartphones are the most expensive in the world, by a yard or two. Although prices have fallen there too, by 8% over the same two years, folk in this part of the world still pay about $150 more for their smartphones.

And then there’s China. China started below the Middle East and Africa, Central/Eastern Europe and Latin America but ended up above all three, with the ASP rising by 16%. Interestingly, the rise occurred in one spurt, making me worry there’s a problem with the data, though it might be down to the launch of the iPhone 6 in China in the last quarter of 2014. ASPs there have held steady since.

Bottom line: Anyone selling phones in Asia — indeed, anything that involves mobile — needs to think in terms of at least three distinct markets, in terms of purchasing power, in terms of computing power, in terms of screen size and connectivity. 

iPad Pro Thoughts

Jean-Louis Gassée again hits the right note in his piece on the iPad Pro: Wrong Questions | Monday Note. Tim Cook shouldn’t go around saying it will replace the laptop. It might for him, but the laptop/PC has evolved to be used in myriad ways, not all of which are best suited to a big screen and unwieldy, optional keyboard. 

“Why not say that the iPad Pro will helpfully replace a laptop for 60%, or 25% of conventional personal computer users? In keeping with Steve Jobs’ Far Better At Some Key Things formula, why not say that the iPad Pro is a great laptop replacement for graphic designers, architects, mechanical engineers, musicians, videographers…and that the audience will grow even larger as new and updated apps take advantage of the iPad Pro’s screen size, speed, and very likable Pencil?”

And it’s not just that. Taking up his and others’ theme that at each stage of hardware evolution we’ve lacked the imagination to realise what these devices might best be used for, I imagine the big screen and power of the iPad Pro will yield uses that we so far have not considered. 

As with wearables, these devices are as much about creating new markets (this is something I’ve never been able to do before) or extending existing ones (I could do this before, but it wasn’t much fun) as anything else. I’m not about to replace my laptop with an iPad Pro, but I could see a lot of things I would love to do with it — music editing, photo editing and organising, and maybe a bit of doodling. As Horace Dediu’s video The new iPad is like nothing we’ve ever seen before shows, there are lots of great visualization possibilities too.

Is it a work tool? Could be, for some industries. It’s not a very mobile beast. 

The question is: will developers see enough reward in supporting it with apps?

From pixels to pixies: the future of touch is sound

My piece on using sound and lasers to create three-dimensional interfaces. It’s still some way off, but it’s funky.


Screenshot from Ultrahaptics video demo

From pixels to pixies: the future of touch is sound | Reuters:


(The video version: The next touchscreen is sound you can feel | Reuters.com)

Ultrasound – inaudible sound waves normally associated with cancer treatments and monitoring the unborn – may change the way we interact with our mobile devices.

Couple that with a different kind of wave – light, in the form of lasers – and we’re edging towards a world of 3D, holographic displays hovering in the air that we can touch, feel and control.

UK start-up Ultrahaptics, for example, is working with premium car maker Jaguar Land Rover [TAMOJL.UL] to create invisible air-based controls that drivers can feel and tweak. Instead of fumbling for the dashboard radio volume or temperature slider, and taking your eyes off the road, ultrasound waves would form the controls around your hand.

‘You don’t have to actually make it all the way to a surface, the controls find you in the middle of the air and let you operate them,’ says Tom Carter, co-founder and chief technology officer of Ultrahaptics.

Such technologies, proponents argue, are an advance on devices we can control via gesture – like Nintendo’s Wii or Leap Motion’s sensor device that allows users to control computers with hand gestures. That’s because they mimic the tactile feel of real objects by firing pulses of inaudible sound to a spot in mid air.

They also move beyond the latest generation of tactile mobile interfaces, where companies such as Apple and Huawei [HWT.UL] are building more response into the cold glass of a mobile device screen.

Ultrasound promises to move interaction from the flat and physical to the three dimensional and air-bound. And that’s just for starters.

By applying similar theories about waves to light, some companies hope to not only reproduce the feel of a mid-air interface, but to make it visible, too.

Japanese start-up Pixie Dust Technologies, for example, wants to match mid-air haptics with tiny lasers that create visible holograms of those controls. This would allow users to interact, say, with large sets of data in a 3D aerial interface.

‘It would be like the movie ‘Iron Man’,’ says Takayuki Hoshi, a co-founder, referencing a sequence in the film where the lead character played by Robert Downey Jr. projects holographic images and data in mid-air from his computer, which he is then able to manipulate by hand.


Japan has long been at the forefront of this technology. Hiroyuki Shinoda, considered the father of mid-air haptics, said he first had the idea of an ultrasound tactile display in the 1990s and filed his first patent in 2001.

His team at the University of Tokyo is using ultrasound technology to allow people to remotely see, touch and interact with things or each other. For now, the distance between the two is limited by the use of mirrors, but one of its inventors, Keisuke Hasegawa, says this could eventually be converted to a signal, making it possible to interact whatever the distance.

For sure, promises of sci-fi interfaces have been broken before. And even the more modest parts of this technology are some way off. Lee Skrypchuk, Jaguar Land Rover’s Human Machine Interface Technical Specialist, said technology like Ultrahaptics’ was still 5-7 years away from being in their cars.

And Hoshi, whose Pixie Dust has made promotional videos of people touching tiny mid-air sylphs, says the cost of components needs to fall further to make this technology commercially viable. ‘Our task for now is to tell the world about this technology,’ he says.

Pixie Dust is in the meantime also using ultrasound to form particles into mid-air shapes, so-called acoustic levitation, and speakers that direct sound to some people in a space and not others – useful in museums or at road crossings, says Hoshi.


But the holy grail remains a mid-air interface that combines touch and visuals.

Hoshi says touching his laser plasma sylphs feels like a tiny explosion on the fingertips, and would best be replaced by a more natural ultrasound technology.

And even laser technology itself is a work in progress.

Another Japanese company, Burton Inc, offers live outdoor demonstrations of mid-air laser displays fluttering like fireflies. But founder Hidei Kimura says he’s still trying to interest local governments in using it to project signs that float in the sky alongside the country’s usual loudspeaker alerts during a natural disaster.

Perhaps the biggest obstacle to commercializing mid-air interfaces is making a pitch that appeals not just to consumers’ fantasies but to the customer’s bottom line.

Norwegian start-up Elliptic Labs, for example, says the world’s biggest smartphone and appliance manufacturers are interested in its mid-air gesture interface because it requires no special chip and removes the need for a phone’s optical sensor.

Elliptic CEO Laila Danielsen says her ultrasound technology uses existing microphones and speakers, allowing users to take a selfie, say, by waving at the screen.

Gesture interfaces, she concedes, are nothing new. Samsung Electronics had infra-red gesture sensors in its phones, but, she says, ‘people didn’t use it’.

Danielsen says her technology is better because it’s cheaper and broadens the field in which users can control their devices. Next stop, she says, is including touchless gestures into the kitchen, or cars.

(Reporting by Jeremy Wagstaff; Editing by Ian Geoghegan)

Making 3-D objects disappear

I’m a big fan of invisibility cloaks, partly cos they’re cool, and partly because I think the principles behind them could end up in a lot of things. Here’s another step forward: Making 3-D objects disappear:

“Making 3-D objects disappear: Ultrathin invisibility cloak created Date: September 17, 2015 Source: DOE/Lawrence Berkeley National Laboratory Summary: Researchers have devised an ultra-thin invisibility ‘skin’ cloak that can conform to the shape of an object and conceal it from detection with visible light. Although this cloak is only microscopic in size, the principles behind the technology should enable it to be scaled-up to conceal macroscopic items as well.”

(Via ScienceDaily.)

Cook: 3D Touch a Game Changer

I think 3D Touch is the most important thing that Apple has done for a while, and I think as with all such things we don’t really see it until later. Cook seems to agree: 20 Minutes With Tim Cook – BuzzFeed News:

“But he’s most excited by 3D Touch. ‘I personally think 3D Touch is a game changer,’ he says. ‘I find that my efficiency is way up with 3D touch, because I can go through so many emails so quickly. It really does cut out a number of navigational steps to get where you’re going.’ Even with just a quick demo, it’s easy to see his point. It’s a major new interface feature, one that threatens to upend the way we navigate through our phones, especially once third-party developers begin implementing it in their applications. Apple has engineered the hell out of this 3D Touch to ensure they’ll do just that.

For Cook, 3D Touch is a tentpole feature of not just the iPhone 6s series, but of the iPhone itself and one that shows the company isn’t saving marquee innovations for those ‘tick’ years. ‘As soon as products are ready we’re going to release them,’ Cook explains. ‘There’s no holding back. We’re not going to look at something and say ‘let’s keep that one for next time.’ We’d rather ship everything we’ve got, and put pressure on ourselves to do something even greater next time.’”

Force field: Apple’s pressure-based screens promise a world beyond cold glass

A piece looking at the technology behind the pressure sensing. My prediction: once people play with it they’ll find it hard to go back to the old way of doing things. Maybe typing on a touchscreen will one day feel natural, and maybe even enjoyable.

Force field: Apple’s pressure-based screens promise a world beyond cold glass | Reuters:


By adding a more realistic sense of touch to its iPhone, Apple Inc may have conquered a technology that has long promised to take us beyond merely feeling the cold glass of our mobile device screens.

In its latest iPhones, Apple included what it calls 3D Touch, allowing users to interact more intuitively with their devices via a pressure-sensitive screen which mimics the feel and response of real buttons.

In the long run, the force-sensitive technology also promises new or better applications, from more lifelike games and virtual reality to adding temperature, texture and sound to our screens.

‘Force Touch is going to push the envelope of how we interact with our screens,’ says Joel Evans, vice president of mobile enablement at Mobiquity, a mobile consultancy.

The fresh iPhones, unveiled on Wednesday, incorporate a version of the Force Touch technology already in some Apple laptop touchpads and its watches. Apple also announced a stylus that includes pressure sensing technology.

As with previous forays, from touch screens to fingerprint sensors, Apple isn’t the first with this technology, but by combining some existing innovations with its own, it could leverage its advantage of control over hardware, interface and the developers who could wrap Force Touch into its apps.

‘Here we go again. Apple’s done it with gyroscopes, accelerometers, they did it with pressure sensors, they’ve done it with compass, they’ve been great at expediting the adoption of these sensors,’ said Ali Foughi, CEO of US-based NextInput, which has its own technology, trademarked ForceTouch. ‘Apple is at the forefront.’


Haptic technology – a tactile response to touching an interface – isn’t new, even in mobile devices. Phones have long vibrated to alert users of incoming calls in silent mode, or when they touch an onscreen button.

But efforts to go beyond that have been limited.

BlackBerry incorporated pressure sensing into its Storm phone in 2008. And Rob Lacroix, vice president of engineering at Immersion Corp, said his company worked in 2012 with Fujitsu on the Raku-Raku Smartphone, an Android phone that could distinguish between a soft and firm touch to help users unfamiliar with handheld devices.

But most efforts have been hamstrung by either a poor understanding of the user’s needs, or technical limitations. A vibrating buzz, for instance, has negative connotations, causing most people to turn off any vibration feature, says James Lewis, CEO of UK-based Redux, which has been working on similar touch technology for several years.

The technology powering vibrations is also primitive, he said, meaning there’s a slight delay and a drain on the battery. Early versions of pressure-sensing technology also required a slight gap between screen and enclosure, leaving it vulnerable to the elements.

Apple seems to have solved such problems, experts said, judging from their trackpads and the Apple Watch. Indeed, the trackpad carries the same sensation of a physical click of its predecessors, but without the actual pad moving at all.

The result: In the short term, Force Touch may simply make interacting with a screen more like something we’d touch in real life – a light switch, say, or a physical keyboard. With Force Touch, the device should be able to tell not only whether we are pressing the screen, but how firmly. It should in turn respond with a sensation – not just a vibration, but with a click – even if that click is itself a trick of technology.

‘What we’re going to see initially is putting life back into dead display,’ said Redux’s Lewis. ‘We just got used to the cold feel of glass.’


To be sure, mobile is not the first industry to flirt with haptics.

For example, for car drivers, Redux demonstrates a tablet-like display which creates the illusions of bumps and friction when you run your fingers over the glass, mimicking physical buttons and sliders so your eyes don’t need to leave the road.

Mobiquity’s technical adviser Robert McCarthy points to several potential uses of Apple’s technology — measuring the force of touch when entering a password, say, to indicate how confident the user is of their selection, or keying in a numeric passcode using different pressure levels as an extra layer of security.
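That last idea is easy to sketch. The toy example below is mine, not McCarthy’s or Apple’s: it assumes a hypothetical touch API that reports a normalized force per tap in [0.0, 1.0] (Apple’s real API differs), and treats a passcode as a sequence of (digit, pressure band) pairs, so each tap must be both the right digit and in the right band:

```python
# Toy sketch: a passcode where pressure is a second factor.
# Force values and the API shape are hypothetical.

def pressure_band(force):
    """Classify a normalized force reading into a coarse band."""
    return "firm" if force >= 0.5 else "light"

def check_passcode(taps, secret):
    """taps: list of (digit, force); secret: list of (digit, band)."""
    if len(taps) != len(secret):
        return False
    return all(
        digit == s_digit and pressure_band(force) == s_band
        for (digit, force), (s_digit, s_band) in zip(taps, secret)
    )

secret = [(4, "light"), (2, "firm"), (7, "light"), (9, "firm")]

# Right digits, right pressure on every tap:
print(check_passcode([(4, 0.2), (2, 0.8), (7, 0.1), (9, 0.9)], secret))  # True
# Right digits, but the second tap is too light:
print(check_passcode([(4, 0.2), (2, 0.3), (7, 0.1), (9, 0.9)], secret))  # False
```

The point of the second check is the one McCarthy makes: someone shoulder-surfing the digits still doesn’t have the code.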

While Apple’s adoption of the technology has awoken the mobile industry to its possibilities, it was pipped to the post by Chinese handset maker Huawei, which this month unveiled one model with what it also tagged Force Touch technology. Pressing harder in a photo app, for example, allows you to zoom in on a picture without the usual two-finger spread.

Other manufacturers are exploring how to make touching a device more friendly, and more advanced, says Freddie Liu, CFO of Taiwan-based TPK Holding Co Ltd, an Apple supplier.

‘This is just the beginning for Force Touch,’ he said.

(Reporting by Jeremy Wagstaff and Michael Gold, with additional reporting by Reiji Murai in TOKYO; Editing by Ian Geoghegan and Raju Gopalakrishnan)

Factbox: iPhone 3D Touch suppliers and haptics companies | Reuters

BBC: Game of Drones

Here’s the BBC World Service version of my Reuters piece on drones from a few months back. Transcript below:

America may still be the tech centre of the world — and it is — but regulatory dithering over whether and how to allow drones — or unmanned aerial vehicles as most call them — in its airspace is throwing up opportunities for other countries to get a head-start.

And that’s no small thing, for a couple of reasons. One is that drones as an industry is moving amazingly quickly. Some liken it to the PC: the technology is getting better, smaller, cheaper, and prices are falling so rapidly that everyone can have one, and the gap between what constitutes a serious drone and a toy has narrowed considerably.

There’s another element in this, and it’s also comparable to the PC era. Back then we knew we all wanted a PC but we weren’t quite sure what we wanted it for. We bought one anyway, and felt slightly guilty that it sat in the corner gathering dust. Naysayers questioned the future of an industry that seemed to revolve around convincing people to buy something even when they couldn’t give them a reason to do so.

Sound familiar? A lot of folk, including myself, have bought a drone in the past year. Mine was a tiny one which on its maiden flight floated high into the air and disappeared into next door’s garden. Its second flight ended in a gutter that could only be reached by small children, and my wife drew the line at sending our daughter up there. So I’m now drone-less.

This is the bigger issue with drones — not whether to propel reluctant tykes up ladders, but figuring out what drones are good for. And this is where companies in Europe and Asia are stealing a march on their U.S. cousins. The hardware is all well and good but the future of drones, like that of computers, is going to be about harnessing their unique capabilities to solve problems, developing use cases, building ecosystems (sorry, I’m obliged by contract to use that word at least once a week).

So, for example, a company here in Singapore is working with companies and government agencies around the region on a range of interesting things — what they and others are calling drones as a service. So if you’re flying over a palm oil plantation in Malaysia doing something quite basic like mapping where, exactly, the edges of the property are, why not calibrate your cameras so they can also measure moisture level — and likely yield — of individual trees?

And rather than have building engineers hang dangerously out of skyscrapers to check structural damage, why not have a drone do it? Not only do you save on safety, you also have a virtual model of your building you can refer back to. Tired of despatching dog catchers in response to citizens’ complaints? Deploy a drone above the target areas and build a heat map of their movements so you know when best to pounce, and how many leads you’re going to need.

There are lots of other opportunities being explored out there beyond the obvious ones. The trick is going to be building business models around these services so when companies see drones they don’t think ‘toy I play with at the weekend’ but ‘this could really help me do something I’ve long thought impossible’.

No question, of course, that the U.S. will be a centre of drone innovation. It already is, if you think in terms of developing the technologies and absorbing venture capital. But it may yet be the companies beyond American shores making the most of their head-start that emerge as major players as drones become as commonplace in business, if not homes, as computers are.

BBC – Cybercrime: One of the Biggest Ever

My contribution to the BBC World Service – Business Daily, Cybercrime: One of the Biggest Ever

Transcript below. Original Reuters story here

If you think that all this cybersecurity stuff doesn’t concern you, you’re probably right. If you don’t have any dealings with government, don’t work for an organisation or company, and you never use the Internet. Or an ATM. Or go to the doctor. Or have health insurance. Or a pension.

You get the picture. These reports of so-called data breaches — essentially when some bad guy gets into a computer network and steals information — are becoming more commonplace. And that’s your data they’re stealing, and it will end up in the hands of people you try hard not to let into your house, your car, your bank account, your passport drawer, your office, your safe. They may be thieves, or spies, or activists, or a combination of all three.

And chances are you won’t ever know they were there. They hide well, they spend a long time rooting around. And then when they’ve got what they want, they’re gone. Not leaving a trace.

In fact, a lot of the time we only know they were there when we stumble upon them looking for something else. It’s as if you were looking for a mouse in the cellar and instead stumbled across a SWAT team which, in between rifling through your boxes, was cooking dinner and watching TV on a sofa and flat screen they’d smuggled in while you were out.

Take for example, the case uncovered by researchers at a cybersecurity company called RSA. RSA was called in by a technology company in early 2014 to look at an unrelated security problem. The RSA guys quickly realized there was a much bigger one at hand: hackers were inside the company’s network. And had been, unnoticed, for six months.

Indeed, as the RSA team went through all the files and pieced together what had happened, they realised the attack went back even further.

For months the hackers — almost certainly from China — had probed the company’s defenses with software, until they found a small hole.

On July 10, 2013, they set up a fake user account at an engineering website. They loaded what is called malware — a virus, basically — onto another site. The trap was set. Now for the bait. Forty minutes later, the fake account sent emails to company employees, hoping to fool one into clicking on a link which in turn would download the malware and open the door.

Once an employee fell for the email, the hackers were in, and within hours were wandering the company’s network. For the next 50 days they mapped the network, sending their findings back to their paymasters. It would be they who would have the technical knowledge, not about hacking, but about what documents they wanted to steal.

Then in early September they returned, with specific targets. For weeks they mined the company’s computers, copying gigabytes of data. They were still at it when the RSA team discovered them nearly five months later.

Having pieced it all together, now the RSA team needed to kick the hackers out. But that would take two months, painstakingly retracing their movements, noting where they had been in the networks and what they had stolen. Then they locked all the doors at once.

Even then, the hackers were back within days, launching hundreds of assaults through backdoors, malware and webshells. They’re still at it, months later. They’re probably still at it somewhere near you too.

Hunt for Deep Panda intensifies in trenches of U.S.-China cyberwar | Reuters

My piece on what Deep Panda looks like in action: Hunt for Deep Panda intensifies in trenches of U.S.-China cyberwar | Reuters:

Security researchers have many names for the hacking group that is one of the suspects for the cyberattack on the U.S. government’s Office of Personnel Management: PinkPanther, KungFu Kittens, Group 72 and, most famously, Deep Panda. But to Jared Myers and colleagues at cybersecurity company RSA, it is called Shell Crew, and Myers’ team is one of the few who has watched it mid-assault — and eventually repulsed it.

Myers’ account of a months-long battle with the group illustrates the challenges governments and companies face in defending against hackers that researchers believe are linked to the Chinese government – a charge Beijing denies.

‘The Shell Crew is an extremely efficient and talented group,’ Myers said in an interview. Shell Crew, or Deep Panda, is one of several hacking groups that Western cybersecurity companies have accused of hacking into U.S. and other countries’ networks and stealing government, defense and industrial documents. The attack on the OPM computers, revealed this month, compromised the data of 4 million current and former federal employees, raising U.S. suspicions that Chinese hackers were building huge databases that could be used to recruit spies.

China has denied any connection with such attacks and little is known about the identities of those involved in them.  But cybersecurity experts are starting to learn more about their methods.

Researchers have connected the OPM breach to an earlier attack on U.S. healthcare insurer Anthem Inc (ANTM.N), which has been blamed on Deep Panda.

RSA’s Myers says his team has no evidence that Shell Crew were behind the OPM attack, but believes Shell Crew and Deep Panda are the same group.

And they are no newcomers to cyber-espionage. CrowdStrike, the cybersecurity company which gave Deep Panda its name due to its perceived Chinese links, traces its activities to 2011, when it launched attacks on defense, energy and chemical industries in the United States and Japan. But few have caught them in the act.


In February 2014 a U.S. firm that designs and makes technology products called in RSA, a division of technology company EMC (EMC.N), to fix an unrelated problem. RSA realized there was a much bigger one at hand: hackers were inside the company’s network, stealing sensitive data. 

‘In fact,’ Myers recalls telling the company, ‘you have a problem right now.’ Myers’ team could see hackers had been there for more than six months. But the attack went back further than that.

For months Shell Crew had probed the company’s defenses, using software code that makes use of known weaknesses in computer systems to try to unlock a door on its servers. Once Shell Crew found a way in, however, they moved quickly, aware this was the point when they were most likely to be spotted.

SPEARPHISHING

On July 10, 2013, they set up a fake user account at an engineering portal. A malware package was uploaded to a site, and then, 40 minutes later, the fake account sent emails to company employees, designed to fool one into clicking on a link which in turn would download the malware and open the door. 

‘It was very well timed, very well laid out,’ recalls Myers.

Once an employee fell for the email, the Shell Crew were in, and within hours were wandering the company’s network. Two days later the company, aware employees had fallen for the emails – known as spearphish – reset their passwords. But it was too late: the Shell Crew had already shipped in software to create backdoors and other ways in and out of the system. 

For the next 50 days the group moved freely, mapping the network and sending their findings back to base. This, Myers said, was because the hackers would be working in tandem with someone else, someone who knew what to steal.

‘They take out these huge lists of what is there and hand it over to another unit, someone who knows about this, what is important,’ he said. 

Then in early September 2013, they returned, with specific targets. For weeks they mined the company’s computers, copying gigabytes of data. They were still at it when the RSA team discovered them nearly five months later. 

Myers’ team painstakingly retraced Shell Crew’s movements, trying to catalogue where they had been in the networks and what they had stolen. They couldn’t move against them until they were sure they could kick them out for good. 

It took two months before they closed the door, locking the Shell Crew out.  But within days they were trying to get back in, launching hundreds of assaults through backdoors, malware and webshells.

Myers says they are still trying to gain access today, though all attempts have been unsuccessful.  

‘If they’re still trying to get back in, that lets you know you’re successful in keeping them out,’ he said.

(Additional reporting by Joseph Menn; Editing by Rachel Armstrong and Mark Bendeich)

Moleskines Redux


Of course, I claim a lot of the credit for this decade-long trend. Why Startups Love Moleskines:

“The notion that non-digital goods and ideas have become more valuable would seem to cut against the narrative of disruption-worshipping techno-utopianism coming out of Silicon Valley and other startup hubs, but, in fact, it simply shows that technological evolution isn’t linear. We may eagerly adopt new solutions, but, in the long run, these endure only if they truly provide us with a better experience—if they can compete with digital technology on a cold, rational level.”

I have returned to Moleskines recently, partly because I realised I have a cupboard full of them, and partly because of exactly this problem: there’s no digital equivalent experience. 

  • it’s easier to conceptualise on paper
  • you can doodle when the speaker is waffling; those doodles embellish your notes, and can even turn them into Mike Rohde’s sketchnotes
  • you can whip it out in places where an electronic device would be weird, or rude, or impractical
  • there’s a natural timeline to your thoughts
  • there’s something sensual about having a pen in your hands and holding a notebook
  • pen and Moleskine focus your thoughts and attention
  • the cost of the book acts as a brake on mindless note-taking (writing stuff down without really thinking why)
  • no mindmap software has ever really improved the mindmapping experience.
There’s probably more to it. But maybe the point is that this isn’t a fad. People have been using these in the geeky community for more than a decade, suggesting that they have established themselves as a viable tool. Being able to easily digitise them — for saving, or processing, as I did this morning with a chart I sketched out which my graphics colleague wanted to poach from — is a bonus, and saves us from the fear of losing our work. 

(Via Newley Purnell)