(Post updated 2020-07-15 07:37 to include postscript on JK Rowling’s planning technique.)
Computers and the software that runs them have long denied us the basic right of dictating to them, not letters and grocery lists, but what they should actually do for us, most importantly in the first step of thinking: the art of taking notes.
In the mid 80s I was studying history in London, and the first consumer PCs were coming out, among them the Amstrad. I was immediately intrigued, though I’m no techie. I remember going into Dixons one rainy winter afternoon on Tottenham Court Road and explaining my problem to the salesman. It was simple, I thought: I am a collector of events, and I want a computer which will do exactly what I currently do, but store it so I don’t have to carry around this pile of paper. It was simple, I told him. And I explained how I took my history notes, involving two or three basic steps. He looked at me blankly and tried to change the subject. “It comes with a printer and three spare disks.” I bought it anyway. But oh, how naive I was.
Because the reality is that 35 years on — 35 years! — there is still no way to do this. No app allows you to draw lines on a page and then add pieces to it wherever you want. I should know, I’ve tried hundreds of them (and if anyone does read this, I will get responses like ‘Have you tried OneNote?’ or ‘Aeon Timeline allows you to do just that.’ Yes, and no it doesn’t.) No app, in short, is smart enough to ask you what you have in mind and evolve into that, to let you shape the app in the way you want.
This is the fundamental failure of computers, and computer software. As a technology it has failed to find a place in our lives that we’re comfortable with, because it has demanded too much change in our behaviours. We are mostly compliant: back in the late 2000s, executives at telcos worried that 3G was for nought, because people showed no interest in using their phones for anything more than calls and SMS. It took Steve Jobs to change that, by building a consumer device we craved to hold. The rest came naturally, because of a great UI, but no one is claiming that the smartphone adapted to us; we adapted to it. That’s not to say it’s not useful; it’s just not useful in a way we might have envisaged, if we had ever sat down to think about it.
Indeed, the Apple revolution, which I would date from about 2008, cannot be detached from the broader mobile data revolution, which we’re just emerging from. This was a revolution in interfaces, but it wasn’t a revolution in computing. We have become more productive, in narrow terms — we are online a lot more, we send more messages, we might even finish projects quicker — but no one is claiming that our computers mould themselves to our thinking. It’s apt that movies like Her try to explore what that might mean: that our computers learn our thinking and adapt themselves to it.
So back to me and my history problem. There are of course answers to it, but they all require us to understand the mind of the person or people who developed them. And I’m not ungrateful to these apps; they have long been welcome bedfellows. From TheBrain to Roam, MyInfo to Tinderbox, TiddlyWiki to DEVONthink, they have all rewarded the hours — days, weeks, even — I have invested in trying to understand them. But therein lies the problem. The only reward comes if one adapts one’s own mind to the creator’s vision, and, however amazing that vision is, this in itself is an admission of failure. I don’t want to have to conform everything to someone else’s vision; I have one myself. But there’s no software on this earth, in 35 years of looking, that I can wrestle into submission to my simple vision.
This is not to say the apps in question are a failure. I love them dearly and still use many of them. I have used my pulpits to promote them, and have gotten to know some of the developers behind them. These people are geniuses, without exception, and it’s not their fault their tools cannot be more than interpretations of that genius. We just lack the tools to tell our computers what to do from scratch.
‘Take an A4 sheet of paper, turn it horizontally so it’s in landscape, and then draw three vertical lines, equally spaced. Allow the user to write anywhere between the lines, and interpret a line of three dashes as the end of each nugget. Interpret the digits at the beginning of each nugget as a date, which can be as vague as a decade and as specific as a minute. Order the nuggets chronologically, whichever line each sits between, relative to each other, with gaps between according to the dates. Etc etc.’
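To be clear about how little is actually being asked for here: the sorting step of that dictation is a few lines of code. This is only an illustrative sketch of one step (not the software itself, which is precisely what doesn’t exist) — the names and the date formats it accepts are my own assumptions.

```python
import re

# A line of three dashes marks the end of a nugget, per the dictated spec.
NUGGET_SEP = re.compile(r"^---\s*$", re.MULTILINE)

# Leading digits form a date: as vague as a decade ("1930s"), as specific
# as a minute ("1936-05-12 14:30"). These formats are an assumption.
DATE_AT_START = re.compile(
    r"(\d{4})s?(?:-(\d{2}))?(?:-(\d{2}))?(?:\s+(\d{2}):(\d{2}))?"
)

def date_key(nugget):
    """Turn a nugget's leading date into a sortable tuple.
    Vaguer dates sort before more precise ones in the same period;
    undated nuggets sink to the end."""
    m = DATE_AT_START.match(nugget.strip())
    if not m:
        return (9999, 0, 0, 0, 0)
    return tuple(int(g) if g else 0 for g in m.groups())

def order_nuggets(text):
    """Split text into nuggets and order them chronologically."""
    nuggets = [n.strip() for n in NUGGET_SEP.split(text) if n.strip()]
    return sorted(nuggets, key=date_key)
```

The point, of course, is not that this is hard to write. It’s that I should be able to say it to the machine and have the machine write it.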
I still don’t see why I can’t have that software. I don’t see why I couldn’t have it in 1985. I could probably get a developer to whip something up, but that already demonstrates the failure I’m talking about. I want the computer to do it for me, and not being able to — having to rely on someone else’s coding skills, or even my own — means it’s not doing that.
This feeds into a broader point. Tiago Forte, a young productivity guru, wrote an interesting thread about the serial failure of hypertext, which was a precursor (and loser) to the simpler Web, and the lessons we can draw from it — in the case he describes, for Roam. The simple truth: taking notes is a niche area because it’s not taken seriously at any stage of the education process. (My history chronology capture was shown to me by the late and excellent Ralph B. Smith, who understood the power of note-taking; I can still remember him demonstrating the technique in our first class. It has stuck with me ever since.) Note-taking is the essence of understanding, retaining, collating, connecting and propounding. And yet it’s mostly done in dull notebooks, or monochrome apps, none of which really mould themselves to what we write, take pictures of, record or otherwise store. (And no, Clippy doesn’t count.)
Tiago may well be right: the trajectory of knowledge information management apps (and there you have it; already segmented into what sounds like the most boring cocktail party ever) is that they just aren’t sexy enough to break out of a niche. Evernote was closest, but it got dragged down in part by its dependence on a vocal core of users who pushed it one way, and by its desperate need to justify its valuation. Truth is, people don’t value collecting information, in part because it’s so easy to recall: even with my 60GB DEVONthink databases, I more often than not Google something because I know I can find the document more quickly that way than in my offline library.
But this doesn’t explain the pre-Google world. Why did we let software go in the wrong direction by not demanding it submit to our will, rather than the other way around? Well, the truth is probably that computers were basic things, oversized calculators and typewriters for the most part. Sure, they helped us write snazzier-looking letters, but heaven forbid we doodle on them, or move the address around beyond the margins.
We’re still hidebound by our computers, so much so that we don’t realise it. I am rebuilding my life around new tools, like Roam, and old ones like Tinderbox — a wonderful piece of exotica that rewards those of us who like to poke around in a piece of software, but which basically means poking around in the head of its developers — and I get a lot out of them. But I am keenly aware that I would rather be telling a blank computer screen to “take an A4 sheet of paper…”
And perhaps, one day, I will.
2020-07-15 07:28: I came across a post which describes JK Rowling planning her stories in a way very similar to the one my history tutor taught me. Hers looks like this:
The similarities are that the page is used horizontally, the dates are organised on the left, and columns allow her to record what other parties are doing at the same time. It’s actually a magnificent example of the use of the technique.
This statement seems crucial:
“I want the computer to do it for me, and not being able to, to have to rely on someone else’s coding skills, or even my own, means it’s not doing that.”
Even if a developer had your exact vision and built it, that’s not the point. Why isn’t there an intermediate tool sitting between programming languages and that vision, so everyone can realise their particular vision?
Maybe this is no-code. I don’t know.
I think this failure of software is caused by the challenges and disincentives of pursuing grand unifying tools. It’s hard to build something so flexible that is also usable. It’s even harder to make it compete on features with a tool optimized for a single use-case or vertical. How, as the entrepreneur, do you avoid simplifying a vision to better meet customers’ needs? We’re told to start small. Abstract tools are harder to sell. It’s like an unstable equilibrium — so much pushes