Why Won’t Computers Do What We Want Them To?

By | July 15, 2020
Tottenham Court Road. Photo by Aubrey, 2014

(Post updated 2020-07-15 07:37 to include postscript on JK Rowling’s planning technique.) 

Computers and the software that runs them have long denied us the basic right of dictating to them — not letters and grocery lists, but what they should actually do for us, most importantly in the first step of thinking: the art of taking notes.

In the mid 80s I was studying history in London, and one of the first consumer PCs came out: the Amstrad. I was immediately intrigued, though I’m no techie. I remember going into Dixons on Tottenham Court Road one rainy winter afternoon and explaining my problem to the salesman. It was simple, I thought: I am a collector of events, and I want a computer which will do exactly what I currently do, but store it so I don’t have to carry around this pile of paper. And I explained how I took my history notes, involving two or three basic steps. He looked at me blankly and tried to change the subject. “It comes with a printer and three spare disks.” I bought it anyway. But oh, how naive I was.

Because the reality is that 35 years on — 35 years! — there is still no way to do this. No app allows you to draw lines on a page and then add pieces to it wherever you want. I should know, I’ve tried hundreds of them. (And if anyone does read this, I will get responses like ‘Have you tried OneNote?’ or ‘Aeon Timeline allows you to do just that.’ Yes, and no it doesn’t.) No app, in short, is smart enough to ask you what you have in mind and evolve into that, to help you shape the app in the way you want.

This is the fundamental failure of computers and computer software. As a technology it has failed to find a place in our lives that we’re comfortable with, because it has demanded too much change in our behaviours. We are mostly compliant: back in the late 2000s executives at telcos worried 3G was for naught, because people showed no interest in using their phones for anything more than calls and SMS. It took Steve Jobs to change that, by building a consumer device we craved to hold. The rest came naturally, because of a great UI, but no one is claiming that the smartphone adapted to us; we adapted to it. That’s not to say it’s not useful; it’s just not useful in the way we might have envisaged, if we had ever sat down to think about it.

Indeed, the Apple revolution, which I would date from about 2008, cannot be detached from the broader mobile data revolution, which we’re just emerging from. It was a revolution in interfaces, but not a revolution in computing. We have become more productive, in narrow terms — we are online a lot more, we send more messages, we might even finish projects quicker — but no one is claiming that our computers mould themselves to our thinking. It’s apt that movies like Her try to explore what that might mean: that our computers learn our thinking and adapt themselves to it.

So back to me and my history problem. There are, of course, answers to it, but they all require us to understand the mind of the person or people who developed them. And I’m not ungrateful to these apps; they have long been welcome bedfellows. From TheBrain to Roam, MyInfo to Tinderbox, TiddlyWiki to DEVONthink, they have all rewarded the hours — days, weeks, even — I have invested in trying to understand them. But therein lies the problem. The only reward comes if one adapts one’s own mind to the creator’s vision, and, however amazing that vision is, this in itself is an admission of failure. I don’t want to have to conform everything to someone else’s vision; I have one myself. But in 35 years of looking I have found no software on this earth that I could wrestle into submission to my simple vision.

This is not to say the apps in question are a failure. I love them dearly and still use many of them. I have used my pulpits to promote them, and have gotten to know some of the developers behind them. These people are geniuses, without exception, and it’s not their fault their tools cannot be more than interpretations of that genius. We just lack the tools to tell our computers what to do from scratch.

Such as:

‘Take an A4 sheet of paper, turn it horizontally so it’s in landscape, and then draw three vertical lines, equidistant across the page. Allow the user to write anywhere between the lines, and interpret a three-dash line as the end of each nugget. Interpret the digits at the beginning of each nugget as a date, which can be as vague as a decade or as specific as a minute. Order the nuggets chronologically relative to each other, whichever column each sits in, with gaps between them according to the dates. Etc, etc.’

 

If only. 

I still don’t see why I can’t have that software. I don’t see why I couldn’t have it in 1985. I probably could get a developer to whip something up, but then that’s already demonstrated the failure I’m talking about. I want the computer to do it for me, and not being able to, to have to rely on someone else’s coding skills, or even my own, means it’s not doing that.
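To show just how modest the ask is, here is roughly what the heart of that whipped-up version might look like: a minimal sketch in Python, under my own assumptions about the input (plain text, nuggets ended by a three-dash divider, each opening with a date of variable precision). Every name in it is hypothetical, not any existing app’s API.

import re
from dataclasses import dataclass

@dataclass
class Nugget:
    date_key: tuple   # sortable key derived from the (possibly vague) date
    column: int       # which lane between the drawn lines the nugget sits in
    text: str

def parse_date_key(token: str) -> tuple:
    """Turn a date of varying precision into a sortable tuple:
    '1890s' -> (1890,), '1914-08' -> (1914, 8),
    '1914-08-04 23:00' -> (1914, 8, 4, 23, 0)."""
    if re.fullmatch(r"\d{3}0s", token):     # a whole decade, e.g. '1890s'
        return (int(token[:-1]),)
    return tuple(int(p) for p in re.split(r"[-: ]", token) if p.isdigit())

def parse_nuggets(raw: str, column: int = 0) -> list[Nugget]:
    """Split plain text into nuggets wherever a '---' divider appears;
    the first token of each nugget is read as its date."""
    nuggets = []
    for chunk in raw.split("---"):
        chunk = chunk.strip()
        if chunk:
            token = chunk.split()[0]
            nuggets.append(Nugget(parse_date_key(token), column, chunk))
    return nuggets

# Order every nugget chronologically, whichever column it sits in.
notes = parse_nuggets("1914-08 War declared\n---\n1890s Naval race begins\n---")
for n in sorted(notes, key=lambda n: n.date_key):
    print(n.date_key, n.text)

Twenty-odd lines for the core of it, which is rather the point: the logic is trivial, yet there is still no way to simply describe it to the machine and watch it appear.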

This feeds into a broader point. Tiago Forte, a young productivity guru, wrote an interesting thread about the serial failure of hypertext — a precursor (and loser) to the simpler Web — and the lessons we can draw from it, in this case for Roam. The simple truth: taking notes is a niche area because it’s not taken seriously at any stage of the education process. (My history chronology technique was shown to me by the late and excellent Ralph B. Smith, who understood the power of note-taking; I can still remember him demonstrating it in our first class. It has stuck with me ever since.) Note-taking is the essence of understanding, retaining, collating, connecting and propounding. And yet it’s mostly done in dull notebooks, or monochrome apps, none of which really mould themselves to what we write, take pictures of, record or otherwise store. (And no, Clippy doesn’t count.)

Tiago may well be right: the trajectory of knowledge information management apps (and there you have it; already segmented into what sounds like the most boring cocktail party ever) is that they just aren’t sexy enough to break out of a niche. Evernote came closest, but it got dragged down in part by its dependence on a vocal core of users who pushed it one way, and by its desperate need to justify its valuation by moving upmarket. Truth is, people don’t value collecting information, in part because it’s so easy to recall: even with my 60GB DEVONthink databases, I more often than not Google something because I know I can find the document more quickly that way than in my offline library.

But this doesn’t explain the pre-Google world. Why did we let software go in the wrong direction by not demanding it submit to our will, rather than the other way around? Well, the truth is probably that computers were basic things, oversized calculators and typewriters for the most part. Sure, they helped us write snazzier-looking letters, but heaven forbid we doodle on them, or move the address beyond the margins.

We’re still hidebound by our computers, so much so that we don’t realise it. I am rebuilding my life around new tools, like Roam, and old ones like Tinderbox — a wonderful piece of exotica that rewards those of us who like to poke around in a piece of software, though that basically means poking around in the heads of its developers — and I get a lot out of them. But I am keenly aware that I would rather be just telling a blank computer screen to “take an A4 sheet of paper…”

And perhaps, one day, I will.

—

2020-07-15 07:28: I came across a post which describes JK Rowling doing something very similar for her story-planning to what my history tutor taught me. Hers looks like this:

JK Rowling’s hand-drawn plot outline

The similarities are that the page is used horizontally, the dates are organised on the left, and columns allow her to record what other parties are doing at the same time. It’s actually a magnificent example of the use of the technique. 

Covid 2: The Best/Worst of Times

By | March 26, 2020

This is part two of a series on the lessons we should, and shouldn’t be drawing from the Coronavirus Pandemic. Part One is here.

Remembering the Cluetrain. Photo by Anna Shvets, Moscow

A crisis defines us. Perhaps more precisely, a crisis highlights what we have lost, and what defines us is how quickly we can regain it (or not).

First there’s the humanity. One of the redeeming features of Hong Kong’s dense life — and a restorer of faith in human nature — was that once you got into the hills and trails, passing strangers would greet each other as if they were on a jaunt in Kew Gardens. It’s a strange world now, when we venture out alone and nervously pass another lone pedestrian at a safe distance, and bark when they get too close. And you know we are in territory beyond compassion when a family loses one of its own to the coronavirus, and yet doesn’t want to say that is the reason, for fear of stigma. Society, even at its smallest component unit, can break down quickly if social norms don’t catch up with the crisis in its midst. Now that family (friends of a friend) can’t properly mourn, or warn others in the tightly packed neighbourhood where they live of a lethal infection, and those around them can’t find a way to offer support because they can’t speak the killer’s name out loud. Empathy withers, rumours prosper and dog ignores dog.

But there are other trends afoot. When there are practical things to do, air rushes in to fill the gaps. Our obsession with just-in-time supply chains has exposed their Achilles heel — a lack of the things we need the most. But others have stepped up. A group calling itself OSCMS-Mods (Open Source Covid-19 Medical Supplies) has emerged on Facebook “to evaluate, design, validate, and source the fabrication of open source emergency medical supplies around the world.” It has quickly proved its mettle: a document listing needs ranges from hand sanitizer to laryngoscopes, complete with glossary and warnings about safety and liability.

The Facebook group is dynamic. At the time of writing it has nearly 48,000 members. In the past hour or so I’ve seen a post by an electrical engineering instructor at the University of Wisconsin-Madison mulling whether to assign his students an infrared thermometer as a final project. Others chip in, saying they’re working on something similar, offering advice (“don’t forget to program in an emissivity factor”) or help (“I’m pretty fast with 3D modeling and a 3D printer. If you guys need some help with the casing, hit me up!”). In another post a resident of Vancouver Island reports, after driving 500 km, that there is no hand sanitiser or isopropyl alcohol on the island. Some offer alternatives (not all of them wise), but someone in Hong Kong offers a simpler solution: shipping some of the excess from Hong Kong, where emergency orders arrived too late and the price has fallen below its suppliers’ cost. Medical workers from Southampton, New York, post a photo of themselves in full gear to ask for help alleviating a shortage of surgical caps and masks. The jury is still out on this one, but there are some good suggestions, and hopefully the gap is plugged.

Such ad hoc approaches are reassuring. For one thing, people want to help, and the platforms are there to make that a reality — Facebook, with its groups, and a growing mastery of the technology: 3D printing, materials, Arduinos and Raspberry Pis. Of course, these initiatives are only going to be meaningful if they are consistent, and find a way of ensuring that requests for help are met not just by comments and virtue-signalling responses, but by concrete action. Time will tell. Technology can often be a hammer looking for a nail.

I think another side to this that may outlast these emergency responses is that technology will find a way through to real usage, rather than serving a pure business model. In other words, tools will emerge not because people want to make money, but because they can be useful — and there’s nothing like a jolt to the system for us to realise we need different, or better, tools, and to define those needs better. A piece in TechCrunch relates how Jahanzeb Sherwani, who developed a popular screen-sharing app called Screenhero which he sold to Slack, pushed out a follow-up app called Screen ahead of time — and made it free — to help teams stuck at home share their screens. Given how quickly we’ve grown sick of the ‘heads in a box’ conferencing view (where everyone to me looks like baffled seniors, looking around for their false teeth), this tool works well by going the other way, realising that we’re online to work on something, not to look up each other’s noses.

Zoom

This reminds me much more of the 2000-2005 era, when collaboration and thoughtfulness tended to be the norm: a post-boom pause which led to the development (or propagation) of the tools that became Web 2.0, and which in turn provided the (largely unacknowledged) foundations of the social media era. I’m thinking RSS, XML, podcasts, wikis, tagging, web-based apps, microformats, the Cluetrain Manifesto, simple beautiful interfaces. If you think we could have got to Twitter, Facebook etc without the work of the largely unrewarded pioneers of that age, you’re mistaken. But it was born out of a particular era, a transition from the web’s beginnings to the mobile, siloed era we know today. It was a vastly underestimated period, where an explosion of ideas and connectedness led to an explosion of tools to make the most of that. Nearly all those tools were open source; nearly all became the bedrock of these later, frankly more selfish, times. But the spirit, it seems, lives on, and I am hoping that what I’m seeing in initiatives like OSCMS and Sherwani’s generosity is something akin to that: a realisation that technology is the servant, not of viral growth and big payoffs, but of building connections between us — in times of calamity (personal or global) and beyond — by providing tools for free because they might be useful and might lead to something great.

I don’t think for a moment that these initiatives will of themselves be enough; it’s clear that nearly every public medical facility and service was woefully underfunded and hence underprepared. No battery of 3D printers is going to be able to fill that void. But hopefully the level of interest and involvement — call it civic, call it individualist — in trying to address that gap contributes to a broader discussion about our baseline for supplying, funding, equipping and populating such services in the future. And just as Wikipedia arose not out of commissioning existing experts to write entries (an effort that failed abysmally) but out of letting anyone — no need to flash credentials — contribute, allowing the water level to rise by itself, so may we find that out of this Quarantined Era emerges a new sense of how individuals might contribute, and what mechanisms and tools need to be developed or honed to make that happen.

The Changes A-coming

By | March 26, 2020

Covid-19 has reminded us, if we needed reminding, that people behave in unpredictable ways. We are not, it turns out, rational beings. Our leaders lead from the front, the back, not at all, or just feather their own nest first. People defy curfews; they cough on others, smear their saliva on lift buttons, and fight over toilet rolls. Others sacrifice themselves helping strangers, look out for neighbours they barely know, sing and perform to lift others’ spirits. This should give us pause before we start predicting what the world will look like after the virus.

A piece by Politico confirmed my bias that there is a tendency towards confirmation bias among those watching the crisis unfold — nearly all the experts asked to contribute their thoughts on how the world will be changed said, in effect, what you would expect them to say: the author of a book called “The Way Out: How to Overcome Toxic Polarization” said that there would be a, er, decline in polarisation. The author of a book called “The Death of Expertise” said there would be a, um, return to seriousness and respect for expertise. The author of a book about how social infrastructure can help fight inequality said the virus would “force us to reconsider who we are and what we value” and “make substantial new investments in public goods — for health.”

I’m not mocking these writers, or the article itself. It’s natural enough to see in the virus the seeds of the change one is hoping for, or has already predicted would happen. Such predictions rarely stand the test of time. We saw the same phenomenon after 9/11, the last great external shock to the West’s system. People talked then about leaving New York, about embracing a different, simpler life. They bought canoes, bulletproof vests, ammunition, parachutes. Analysts predicted a quite different future for us all. N. R. Kleinfield wrote, a decade on, in the New York Times:

Paul Simon said he didn’t know if he could ever complete another album. A woman wrote on a remembrance site that she regretted that she had had children, that she had brought their innocence into a world no longer fathomable to her.

But there has been a chasm between expectations and reality. The prophecy of more attacks on the United States has not been the case, not yet at least. Bumbling attempts got close — involving underwear and a shoe and a 1993 Nissan Pathfinder — but the actuality has been that terrorist acts on American soil in the succeeding years have been, as always, largely homegrown.

So many things were expected to be different that have not been. Time passes, and passes some more. Exigencies of living hammer away impatiently. People — most of them, at least — began to become themselves. New York, which by its nature accommodates so much, was willing to absorb 9/11 and keep moving.

That day for many of us is as fresh as if it were yesterday, but the way we thought it would change us has grown stale. Yes, we have the security theatre of airport checks — though they, too, might change emphasis once the viral dust has settled — but for most of us our lives didn’t change substantially. (Paul Simon has released six albums, six compilations and one boxed set since 9/11.)

It’s understandable that we feel momentous events have momentous, long-term impacts on our lives, but the reality is that the changes wrought are both less and more than we anticipate, boffins included.

Probably the best way to gauge the impact of Covid-19 is to look at the impact of its predecessor. Not SARS or MERS, although they highlighted how countries with an institutional and collective memory of a recent epidemic are best equipped, mentally and logistically, for a new one; but the Spanish Flu of 1918-20, which affected much of the same territory as Covid has — namely the world.

Beds with patients in an emergency hospital in Camp Funston, Kansas, in the midst of the influenza epidemic. Date: circa 1918. Photo from Wikimedia Commons.
With masks over their faces, members of the American Red Cross remove a victim of the Spanish Flu from a house at Etzel and Page Avenues, St. Louis, Missouri. Photo from Wikimedia Commons.

Firstly, there are significant similarities between the two in the way they played out. As we have seen in Europe, Australia and the U.S., there’s a reluctance on the part of government to impose unpopular measures — most obviously to get people out of pubs, off beaches and indoors. The same was true in France in 1918, where local officials were reluctant to enforce measures such as closing theatres, cinemas, churches and markets “for fear of annoying the public.” Japan happily banned mass gatherings in its Korean colony, but didn’t even consider trying the same thing back home.

People are people. Officials don’t want to do unpopular things (except when they do not actually face the voter — Japan was by then a democracy of sorts). And while during the pandemic itself people behaved much as we’re behaving — most of us with “collective resilience”, as Laura Spinney puts it in her excellent Pale Rider — that group identity eventually splinters, and “bad” behaviour emerges. She points to the 1919 Rio carnival, intended to mark the end of the crisis even while the flu was still claiming lives, where the partying took a dark twist: one historian, Sueann Caulfield, found that in the period after the epidemic there was a surge in reported rapes in the city, temporarily outnumbering other types of crime. The point — beyond the horror of the crimes themselves — is that people behave in strange ways, and crises both fundamentally change their behaviour and amplify existing traits. There is no simple outcome.

So predicting is a dangerous game, or it would be if we were ever held to the predictions we make. And it is, of course, far too soon to even know how this crisis will unfold, how long it will take and how many of us it will take with it. So it’s probably unfair to ask others to predict the lasting impacts, at least at this point, and unfair to mock them for their confirmation bias. I would love a more civil society, one that takes electing its leaders seriously enough to realise it isn’t electing someone to entertain it so much as to operate the levers of government. I would love to believe that the selflessness we’ve seen come out of the crisis thus far will linger after peace returns, that we will properly honour those in and serving the medical professions — from the cleaner to the surgeon. That we will realise it can’t go on like this, that we have to take better care of the planet, not move so selfishly through it and past each other, that Gaia is a complex being that weaves everything into her web, even unseen droplets that can pass between us, which we can use to kill each other if we do not take the utmost care.

But that would probably be asking too much. We have to assume that the crisis brings out both the best in us and the worst in us, and we need to stop virtue-signalling about helping old folk with their groceries or checking in on neighbours and just do it, sotto voce, both during the quarantine and after it. If you need a reason why, it’s because collective resilience is as selfish as looking after yourself alone; during crises we tend to perceive ourselves not as individuals but as members of a group, and hence (so the psychological theory goes) helping others in the group is a form of selfishness. Do it, but don’t pat yourself on the back and post something to Facebook about it. If you were really serious about it you would have been doing it long ago, and keep doing it long after.

So my predictions? I’ll jump off that cliff in a later post, but for now, it seems likely that we will both underestimate and overestimate the length and impact of this crisis. Those of us who think we’re well prepared for it will find that it hits us in other ways. Those of us fearful for the future will probably find fresh reservoirs of strength. The only thing I can predict with any certainty is that it will start to get boring quickly, and that while people are dying, others will be defying curfews and sabotaging efforts to stamp out the virus. At the same time, I believe there will be more quiet heroics that go untold, more quiet domestic solidarity among families that once fought, and the rise of business ideas amidst the lockdown that will make millions for those who nurse them to life. I’ll hang my hat on those predictions, alongside my mask and hand sanitizer gel.

How do subscriptions fare in a recession?

By | March 9, 2020

Source: App Annie, State of Mobile 2020

The subscription model (‘subscription economy’ was a term apparently coined at least four years ago) is becoming de rigueur in many sectors. App Annie’s recent State of Mobile report found that in-app subscriptions accounted for 96% of spend in the top non-gaming apps. As an overall proportion of spend they rose from 18% in 2016 to 28% in 2019 (games, of course, still dominate). It concluded in a recent post: “Clearly companies across industries need to not only be thinking about their mobile strategy, but also their subscription strategy, if they want to succeed in 2020.”

But is this a wise move?

The attention economy, as folk call it, depends on competing for a limited resource — our attention. But it will always be trumped by a resource that determines what can be done with that attention — money. If we have no job, then our attention tends to be focused elsewhere. If we have a job but not much money, or are afraid of losing that job, then our attention to other non-job issues is probably limited.

The other thing the attention economy increasingly relies on is the subscription model. Recurring fees are much more appealing to a company than a one-time payment, which is why everyone is heading that way. But the subscription model has an Achilles heel: most services that used it in the old days did so because of the way they were produced and delivered — electricity, water, telephone, gas, newspapers, cable. And most involved some lock-in: an annual or quarterly contract, say, which hid the overhead costs of connecting, delivering and disconnecting in the subscription. But to disrupt these entrenched subscription services, OTT upstarts like Netflix, which didn’t have those costs, made it really easy to subscribe — and unsubscribe.

And here’s the rub. When a subscription becomes discretionary spend — something you can shed like a skin when the rain comes — you find the weakness of the subscription model. This is why old-guard subscription players like the New York Times have transferred their approach to digital, knowing it’s better to make unsubscribing disproportionately harder than subscribing, absorbing the hit of a few angry folk like me in order to keep the bulk of subscribers who can’t be bothered to jump through the hoops.

So when the Coronavirus Recession hits you, what are you going to shed? Discretionary spend is the first to go, and that usually means monthly outgoings that just don’t seem as important as they did when you were coasting. Indeed, a lot of subscription economy players, like Statista and others, only offer an annual subscription, although they price it per month to make it sound like less. It’s cheaper, and more predictable, to charge per year.

I’m not convinced that software is a good candidate for subscription models. I understand their appeal, and I am as frustrated as the developers are by how the mobile app store has reduced the amount people are willing to pay for good software.

When Fantastical, a calendar on steroids for macOS and iOS from Flexibits, went from a one-time fee to a subscription model, it split the community — especially those on iOS who suddenly had to pay ten times what they were paying before. John Gruber argued $40 a year for a professional calendar app across all Apple platforms was a decent deal, noting that those who don’t want to upgrade can still use the old version, and he’s probably right. But I haven’t upgraded, and have instead shifted over to another calendar app, BusyCal, which is included in Setapp, another subscription model that bundles together multiple apps for $10 a month. In part that was because of the annoyance of finding certain features still listed as menu items in Fantastical but blocked by upgrade popups.

Not the kind of productive experience I am looking for. Hobbling or crippling, as it’s sometimes called, is never a pretty look. You either have the functionality or you hide it.

A better route is to be flexible. Of course, there’s an upside to monthly subscriptions that are really easy to start and stop — when the sun shines, you can easily resubscribe. Indeed, the smartest subscription model in my book is the freemium one, where you can easily move between subscription levels depending on usage and how empty your pockets are. I recently cancelled my paid Calendly subscription, downgrading to the free plan, and was told by a helpful customer service person that “you can certainly choose the monthly plan on your billing page and pay for only the months you need it for! That might work better for you.”

I would recommend that any company moving to the subscription model do this. Or pursue the bundling model — not to lock people in, where one subscription depends on another, but to make what might have been discretionary spend necessary spend, through a compelling use case. Setapp is that model (though sometimes I baulk and wonder if I’m paying over the odds). A lot of the apps I use on Setapp are ones I would not otherwise have found — and I’m an inveterate hunter of new apps. By making the marginal cost of using them zero, I find they worm their way into my workflow. Setapp helps this by taking an interesting route: its app-store-like mothership is so baked into macOS that searching via Spotlight or Alfred for an app installed on my computer will include in the results apps that haven’t been installed but are part of Setapp. So if I’m looking for a photo editor, or screenshot taker, or calendar app, on my Mac, the results will include those in Setapp that I haven’t installed.

This shoehorns productivity into the subscription model. It helps make Setapp more useful by introducing me to new apps it has in its portfolio — thus making all the apps in Setapp more recession-proof, because the more Setapp apps I use, the less likely I am to cancel the subscription overall. (Yes, the apps I don’t install or use won’t get a cut, or will get a smaller cut, but the overall rising tide will help keep all the boats afloat. Or in a tweak of the analogy: all the apps in the Setapp boat, amid the buffeting recessional sea, rely on the size of the boat to keep them all afloat. Only if the boat sinks will they sink.)

Bundling makes a lot of sense in disparate fields — I’ve been advising media clients to seek out bundling options with other subscription companies which might previously have been regarded as competitors. Bundling should not be the cable TV model of putting the good stuff and the crap together and forcing subscribers to pay for both, but an attempt to anticipate — if your customer data is good enough, you shouldn’t have to guess — what else of value is in your customer’s discretionary bucket, and to move both yours and theirs into a necessary one. A tech news site coupling with a tech research service, say.

In the meantime, expect a lot of subscription-based approaches to suffer in the recession. I expect that by the end of it the subscription model won’t be so appealing, or will require more creative thinking to evolve. The key is in not treating the consumer as either stupid (as if we don’t realise $5 a month adds up over a year) or lazy (as if we won’t do what is necessary to cancel a subscription if we have to), but in taking the freemium model seriously: make it really easy to reduce our payment when we need to, and really easy to go back when we’re feeling flush again. Just don’t cripple the quality of the service you have committed to deliver, even if it’s free, with ads beseeching us to pony up or with arbitrary and punitive lines which make the free version more irritating than alluring.

Then just wait out the storm, as we all are, and hopefully you’ll remain useful enough in the free version to stay on our radar when the sun returns.

Swiss to Cheese: Apple Transforms Another Industry

By | February 14, 2020

Another Apple product I’m unlikely to purchase: a smartwatch. I don’t need more screens to look at, frankly, but I doff my smartcap to the company for the way it has usurped an industry that already existed and then doubled it. This approach has some parallels with the AirPods strategy, which I looked at before: take a market that exists, wait until the technology works, have a couple of shots at it, dominate it and then expand it. Here are the latest numbers, courtesy of Strategy Analytics:

Source: Strategy Analytics

In short, Apple has not only grown its shipments by more than a third, it has eaten a sizeable portion of the Swiss watch industry’s cheese lunch. As SA’s Steven Waltzer puts it: “Traditional Swiss watch makers, like Swatch and Tissot, are losing the smartwatch wars. Apple Watch is delivering a better product through deeper retail channels and appealing to younger consumers who increasingly want digital wristwear. The window for Swiss watch brands to make an impact in smartwatches is closing. Time may be running out for Swatch, Tissot, TAG Heuer, and others.” The full report can be purchased here.

So let’s put this in a slightly broader perspective. This is a tipping point in the evolution of the watch and a hammer blow to the Swiss watch industry. While the figures don’t quite tally with Strategy Analytics’, those from the Federation of the Swiss Watch Industry show just how effectively Apple has not only created a market for itself but also usurped another’s. For years the Swiss watch industry had been relatively settled, only to see Apple — and knee-jerk competitors like Huawei and Samsung, who have also carved out a market for themselves on Apple’s coat-tails — gradually erode its business. Last year shows just how far it has gone:

Source: Federation of the Swiss Watch Industry

This is classic Apple in many ways. There were lots of ‘this is make or break for Apple’ stories in the first year, and the overblown predictions of 2015 and 2016 had to be revised. Indeed, while overall shipments of smartwatches rose in 2016 (from 20.8 million to 21.1 million, according to Strategy Analytics), Apple’s shipments actually shrank while others’ rose. But these were teething problems: sensors needed to be more accurate, sales channels with telcos needed to be tweaked. By 2017 Apple had fixed most of this, and the trajectory is clear. Probably more importantly, consumers realised that if you were going to put a smartwatch on your wrist, it had to be a classy one. There was no ‘good enough’ syndrome for that bit of prime real estate. And, like the AirPods, the device needed a seamless relationship with the parent device.

Lessons learned? I once again wasn’t convinced by the smartwatch. I haven’t bought one, and don’t intend to. But I get it; Apple is currently making much of the stories of how these devices may have saved lives. That isn’t the reason people buy these things, but it’s a good argument to win over the spouse, or conscience, and it does point to how, eventually, medtech and consumer devices will merge beyond the hobbyist and the fitness fanatic. And it’s not hard to see how the earpiece and the wrist will eventually become The Device, and we can ditch the smartphone altogether.