The leap-day launch of the 64-bit Raspberry Pi 3 yesterday set me thinking: if I’d been born 4 years later, my life in bits could have been very different. I might have been in at the start of the PC revolution and progressed through the 8, 16, 32 and 64 bit Intel architectures with everyone else. I’d be one of those people who considers Windows PCs to be Real Computers. I find that concept hard to grasp.
I actually used 24- and 36-bit computers at university and started work on a 32-bit, virtual-memory machine. We got free software from other people like us, via the DECUS library. I’ve never had to worry about allocation of physical memory, like some kind of primitive savage. Since the mid-80s, I’ve been waiting for the world to realise that it took a wrong turn, and I finally think it might be happening.
Free software happened. Real Operating Systems became available for ‘desktop computers’ (so we could shove them under the desk and stop worrying about regular access to the reboot button). We got always-on Internet access and now we’re starting to think about parallel processing and functional programming again, like we were in the 80s. If I had the chance to choose a time to start computing, it would be now, at the age of 6.
We were worrying about nuclear annihilation, over-population and running out of fossil fuels then too. Maybe we’ll remember those concerns soon, now that our houses are full of stuff.
“What did the capitalist dream ever do for you Grandad?”
“It wasted my precious time, Best Beloved.”
(sweet because stolen from that nice Mr. Kipling)
…and maybe Mr. Dylan:
“I ain’t a-saying you treated me unkind
You could have done better but I don’t mind
You just kinda wasted my precious time
But don’t think twice, it’s all right.”
Between me starting to learn to ‘computer science’ and stopping writing compiled code in about 1985/6, the world of information systems moved from punched cards and teletypes, to on-line editing and batch processing, to full-screen editing, then to windows, code management and source-code-level debugging. I moved from mainframes that made you choose between upper-case letters and maths, to super-minis and graphical workstations. I didn’t move to PCs because they were so obviously THE WRONG WAY to go. This took less than 10 years.
Shortly after I stopped coding, I became aware that the portability of C and Unix wasn’t just snake oil, and that relational databases, network filing systems, the object model and parallel computing, in the shape of Hoare’s Communicating Sequential Processes, were all becoming available.
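For anyone who hasn’t met CSP: the idea is independent sequential processes that share no memory and talk only by passing messages over channels. It survives today most visibly in Go, whose channels are openly descended from Hoare’s paper. A minimal sketch (the `square` process and the numbers are mine, purely for illustration):

```go
package main

import "fmt"

// square is one sequential process: it owns no shared state and
// talks to the world only through its channels, CSP in miniature.
func square(in <-chan int, out chan<- int) {
	for n := range in {
		out <- n * n
	}
	close(out)
}

func main() {
	in := make(chan int)
	out := make(chan int)

	go square(in, out) // run the process concurrently

	// feed it some work, then close the channel to signal we're done
	go func() {
		for i := 1; i <= 5; i++ {
			in <- i
		}
		close(in)
	}()

	for sq := range out {
		fmt.Println(sq) // prints 1 4 9 16 25
	}
}
```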
Later, lost in server-land, I became a ‘user’ of hypertext and browsers for documentation and, a little later, of the Internet. I use ‘social’ tools and believe they can enable a new society based on networks rather than hierarchies. What a pity they’re so clunky.
Inspired to make things better, I decided to learn to code again. It must all be so great by now!
What do you mean, “Do I want to be a front-end or a back-end developer?”
Well, since you ask, I want to be both, and sideways, and up and down, virtually travelling freely through multi-dimensional networks, working with 3D graphical representations of algorithms and business objects moving in object pipelines, using standardised Free tools in the Cloud, with the display and persistence mechanisms abstracted away, out of sight as implementation details to worry about later, but with automated parallelism and fault tolerance.
What? What’s THIS? Why am I expected to work with a tagging language designed for sequential documents? Seriously, you want me to specify a font? I have to write different code for web browsers or phones?! And where is my hover-board? What the hell have you people been doing?
Did Microsoft lead you into the woods to see their unicorn foals? Were they goats in dunce’s caps?
Are we all back now? Hold someone’s hand. We can get through this if we all stay together and look out for the traps. Don’t get too close to the googles. They seem friendly but they’ll pick your packets.