Connected cows, cars and crockery prod chip mega mergers

My Reuters piece attempting to place the recent chip mergers in a longer timeline. Yes, I hate the term internet of things too. 

Connected cows, cars and crockery prod chip mega mergers | Reuters:

SINGAPORE/TAIPEI | BY JEREMY WAGSTAFF AND MICHAEL GOLD

Chip companies are merging, signing $66 billion worth of deals this year alone in preparation for an explosion of demand from all walks of life as the next technological revolution takes hold: the Internet of Things.

As cars, crockery and even cows are controlled or monitored online, each will require a different kind of chip of ever-diminishing size, combining connectivity with processing, memory and battery power.

Meeting these demands requires makers to pool resources and intellectual property to produce smaller, faster, cheaper chips for a market that International Data Corp said would grow to $1.7 trillion by 2020 from $650 billion last year.

By comparison, chip markets for personal and tablet computers are stagnant or in decline, and even smartphones are near peaking, said Bob O’Donnell, a long-time consultant to the chip industry.

‘We’re very much done in terms of growth of those traditional markets,’ said O’Donnell. ‘That’s why they are looking at this.’

Last month saw the biggest-ever chip merger with Avago Technologies Ltd agreeing to buy Broadcom Corp for $37 billion. That eclipsed the $17 billion Intel Corp agreed last week for Altera Corp, and the $12 billion NXP Semiconductors NV offered in March for Freescale Semiconductor Ltd.

On Friday, Lattice Semiconductor Corp said it was open to a sale.

 

CONNECTED COWS

The Internet of Things relies on chips in devices wirelessly sending data to servers, which in turn process the data and send results to a user’s smartphone, or automatically tweak the devices themselves.

Those devices range from a light bulb to a nuclear power plant, from a smartwatch to a building’s air-conditioning system. This range presents both opportunity and a challenge for semiconductor companies: their potential customer base is huge, but diverse, requiring different approaches.

Qualcomm Inc, for example, is used to selling chips to around a dozen mobile phone manufacturers. The Internet of Things has brought it business from quite different players, from makers of water meters to street lights that sport modems and traffic-monitoring cameras. All have their own needs.

‘You can’t think the new market is just like the old one,’ Qualcomm Vice President of Marketing Tim McDonough said in an interview.

Qualcomm estimates that the Internet of Things will bring in more than 10 percent of its chip revenue this business year.

And then there are those cows. Instead of monitoring herds by sight, farmers in Japan have tagged them with Internet-connected pedometers from Fujitsu Ltd and partner Microsoft Corp, to measure when they might be ready for insemination. Cows in season, it turns out, tend to pace more.

SPECK OF CHIP

This new business is pushing chip companies together in part to consolidate their expertise onto one chip, a trend forged by mobile phones.

The Avago-Broadcom deal, for instance, brings together motion control and optical sensors from Avago with chips from Broadcom that specialize in connectivity via wireless technologies such as Bluetooth and Wi-Fi.

In the past ‘if you wanted to build a board that has all the components, then you needed to buy three different chips,’ said Dipesh Patel of ARM Holdings PLC, which licenses much of the technology inside mobile phones – and, increasingly, in the Internet of Things.

‘Now you only need to buy one chip. But you’re trying to get more of the same system on the same chip.’

As chips get smaller, they could become tiny enough to ingest, according to Vital Herd Inc. Once a cow swallows the Texas-based startup’s pill-like sensor, it can transmit vital signs, warning farmers of illness and other problems.

Jen-Hsun Huang, co-founder and chief executive officer of graphics chips maker Nvidia Corp, predicts chips will shrink to the size of a speck of dust and find their way into almost anything, from shoes to cups.

‘Those little tiny chips, I think they’re going to be sold by the trillions,’ Huang said in an interview. ‘Maybe even sold by the pound.’

PROCESSING

Installing chips into end products is only one side of the equation. The more things connect, the more numerous and capable the servers needed to process the vast amounts of specialized data those chips transmit.

To meet the demand, Intel could employ chips for its servers designed by new purchase Altera that analyze streams of similar data – specializing in one function, as opposed to multiple functions like chips inside personal computers – industry consultant O’Donnell said.

Combining such strengths is going to be vital, said Malik Saadi of ABI Research, because consolidation is not over yet.

More chip companies ‘will have to make that radical decision to merge,’ said Saadi. ‘This is just the starting point.’ 

(Additional reporting by Liana Baker in New York; Editing by Christopher Cushing)

BBC: Cluetraining Disruption

Has technology, convinced of its own rectitude, lost its sense of moral direction? 

Disruptive innovation is one of those terms that worms its way into our vocabulary, a bit like built-in obsolescence or upselling. It’s become the mantra of the tech world, which sees its author, Clayton Christensen, as a sort of messiah of the changes we’re seeing in industries from taxis and hotels to media. Briefly put, the theory goes: existing companies are undercut and eventually replaced by competitors who leverage technology to come up with inferior but good-enough alternatives — think the transistor radio displacing vacuum tube radios — or who come up with wholly new products that eventually eclipse existing markets — think the iPhone killing off the MP3 player (and radios, and watches, and cameras, and guitar tuners, etc.).

Backlash 

A backlash has emerged against this theory, partly because it’s somewhat flawed — even Prof Christensen himself has misapplied it, as in the case of the iPhone — but also because it’s scary. Uber may be a great idea if you’re looking for a ride, but not if you’re an old-style cabbie. Airbnb is great for a place to crash, but feels like a car crash if you’re running a real b’n’b. And don’t get me started on being a journalist.

But there’s a much bigger problem here. The tech world is full of very inspiring, bright, charismatic people, and that’s one reason I choose to write about it for a living. But it has changed in the past decade or so, undeniably. Fifteen years ago, just before the last dot.com crash, a tome appeared: The Cluetrain Manifesto, and you’d either read it or you hadn’t. It was a collection of writings by some fine thinkers, the great bloggers of the day like Doc Searls and Dave Weinberger. The main thesis: the Internet is unlike ordinary mass media because it allows human-to-human conversations — and that this would transform marketing, business and the way we think. Markets are conversations, it said.

For a while we were giddy with the power this gave us over corporations. We could speak back to them — on blogs, and later on what became known as social media. Even Microsoft hired a blogger and let him be a tiny bit critical of things at Redmond.

Last blast

Looking back, it was probably the last naive blast of the old dying Internet rather than a harbinger of the new. The language, if not the underlying philosophy, lives on in conferences and marketing pitches. Most social media conversations are harsh, mostly inhuman — we refer to deliberate online baiters as trolls, which I suppose makes them subhuman — and we’ve largely given up influencing the companies we do business with except in the occasional diatribe or flash hashtag full frontal mob assault.

And more importantly, there is no longer any of that idealism or utopianism in any startup movement that I can see. For sure, we cheer on these players because they seem to offer something very seductive, from free email, calendars, spreadsheets to cheaper rides, stays, music, video and goodies, to shinier bling, gadgets, wearables and cars. And they all sing the same mantra: we’re disruptive, we’re disintermediating, we’re leveraging technology, we’re removing friction, we’re displacing old cozy cartels, we’re doing it all for you.

The problem is that underneath this lies an assumption, an arrogance, that technology is a natural ally of good, that disruption is always a good thing, that the geeks parlaying it into products are natural leaders, and that those opposing it are reactionaries, doomed to the scrapheap.

Rapid cycle

The result: we’re just getting into a more rapid cycle of replacing one lot of aloof, cloth-eared giants with another lot, who in short order will be replaced by another. Microsoft, IBM, and HP, the giants of when Cluetrain was written, have been replaced by Amazon, Apple, Alibaba, Facebook and Google, all of them as hard to hold a conversation with as Microsoft ever was. And the big players of tomorrow, which may or may not be Uber, Airbnb, Tencent and Twitter, don’t seem particularly interested in a conversation either.

We need to recover some of that old Cluetrain idealism, that naivety, when we thought that what we were doing was building a new platform for anyone to use, to talk back to authority, to feel heard and appreciated — and not just a cult-like celebration of the rugged individuals who dismantled Babel only to build a bigger, shinier and more remote one in its place.

This was a piece I wrote and recorded for the BBC World Service. It’s not Reuters content – JW

BBC: The Decline of Self Expression

Here’s a BBC piece which the World Service broadcast recently. This isn’t Reuters content.

It’s taken us a long time to get here, but I think I can safely declare that, dextrously speaking, we are back behind the caveman.

If we had stumbled into your average cave in about 40,000 BC, we might have chanced upon someone drawing on his bedroom wall, as it were, mixing ochre, hematite and charcoal. We might call this the dawn of manual input of user generated content.

Avail yourself of public transport these days and the best you’re likely to see is a few people swiping upwards on their mobile screens in a now-familiar gesture meaning: I’m reading about my alleged friends on Facebook to check they’re not doing anything as exciting as I am.

You might, if you’re lucky, see someone actually trying to input some user generated content. A caveman would notice with some surprise that this is not as easy as it was in his day. I saw one old fella laboriously typing a missive on his iPad, tapping out each letter with one finger of his left hand, his right hand holding the device. Indeed, for the most part that is how people write on their mobile devices. Some have physical keyboards, but these are an endangered species.

Why is this a problem? Well, let me count the ways. Firstly, it’s kind of distressing to see people peck away at their screens like hens. Fifty years ago we’d have been lovingly writing letters, poems and diaries in longhand, dipping our quills in ink. Or at least gazing out the window composing poetry in our heads.

The other reason is that we think we’re clever, and that somehow each iteration of technology is an advance. It’s an advance for people who make money out of us buying these devices, plugging them into a network and sharing pictures of frowning cats. It’s not an advance in terms of what we’ve come to call interfaces – of making it easier for us to convey our feelings, thoughts and mental creations from our head to others via a permanent or semi-permanent canvas.

In that sense it’s quite a retreat. We’re basically using a century-old technology — the QWERTY typewriter — to enter our thoughts into a device that’s more powerful than the one which put men on the moon. On a keypad the size of a matchbox. And on a piece of glass. That isn’t the sound of keys being hit, it’s the sound of cave people laughing at us.

One of my colleagues, I’m told, feels it necessary to add an apology to the bottom of the overly short emails he sends from his mobile phone: apologies if I sound terse, I’m not, I’m writing this on my phone. I can think of no greater indictment of our devices than having to apologise because entering text into them is so fiddly they don’t allow us to express ourselves adequately.

Now the thing is, it’s not all like this. Apple have recently done another splendid video ad extolling all the wonderful applications other people have come up with for their iPads and iPhones. Architects, artists, marine-debris experts, all love the devices for the things they can do with them.

Which is great. But that doesn’t really help the other 99% of us who are stuck trying to use an anachronistic technology to express ourselves in words. Yes, there’s voice recognition. Yes, there’s software that lets us swipe letters across a keyboard. But there’s no getting away from the fact that mobile devices were not made for writing. Just one percent of changes to Wikipedia articles are made on a mobile device, according to the NYT.

It’s time we recognised a sobering reality: while we blithely talk about this being the age of user generated content, very little of that content is actual text — arguments and thoughts strung together via words. Instead it’s photos, videos, comments and emoticons, or just passing along other people’s content. We may not all be writing with quills, but then again, we’re not exactly writing, either.

When was the last time you did more than click, swipe or pinch on your mobile device?