Tag Archives: Unix

Learning to grok Lispen

This is not my late entrance into the Unix editor flame wars. I’ve always disliked vi and emacs about equally. I’m sure that both are amazing if you have a memory and use them every day. I don’t. I am, however, interested in computational models. The Unix ‘small pieces loosely joined’ philosophy had always inclined me towards vi. I knew emacs had ‘Lisp inside’ but I didn’t care. I’d had a bad experience with Lisp at university, but what really put me off was that emacs isn’t just an editor; it’s an environment. It duplicates things that happen elsewhere in Unix. You go in there and you don’t come out until home time. In the Winter, you don’t see daylight. It is neither small nor loose and I didn’t understand why. Was it the first IDE?

Richard M. Stallman hacked on emacs at MIT’s famous AI Lab. The Lab and its culture were torn apart by a war over the intellectual property of its family of Lisp machines. It was a difficult break-up and RMS was abandoned by both halves of his family. In reaction to the creeping commercialisation, he started the GNU project, which later enabled GNU/Linux &c.

I’ve realised only recently how incredibly unimportant Unix was to RMS. He simply wanted somewhere to run a Lisp environment that couldn’t be taken away from him, or from others who subscribed to the original MIT AI hippy culture and its ethic of free sharing of code and information. He ported a ‘C’ compiler so that he could port emacs, and started a movement to maintain everything else he needed.

The latest trend in computing is ‘platforms’. We have gone back to worrying about the ancient concern of application portability. We’ve divided into language tribes: Java, JavaScript, Ruby, Python, .Net, Apple, Google – each with its own library system, designed to free us from the tyranny of operating systems, which were themselves designed to free us from hardware. RMS did all that in the 70s and 80s.

I fought against the idea of Clojure (a Lisp dialect) running on the Java VM rather than a real OS. Another version runs on .Net and one is being ported to JavaScript. I get it now. People want to get stuff done and to do that, they need the support of a tribe (or two).

MIT’s free educational videos contributed to my understanding of these issues. They used Scheme (another Lisp) before moving to Python to get access to more libraries. Perhaps they should move back to Clojure.



Software Life-cycle. Part 1 – From Engineering to Craftsmanship

I graduated just after the Structured Programming War was won. I was probably in the first generation to be taught to program by someone who actually knew how; to be warned of the real and present danger of the GOTO statement and to be exposed first to a language that didn’t need it. I didn’t need to fall back to assembler when the going got tough, to read hex dumps, or to deal with physical memory constraints. I entered the computing profession just as people were starting to re-brand their programmers as ‘software engineers’ and academics were talking of ‘formal methods’, then ‘iterative development’ and ‘prototyping’ as we lost faith and retreated, while the techniques borrowed from other engineering disciplines continued to disappoint when applied to software.

After some years away from software development, I returned to find ‘Agile’, ‘Lean’ and ‘Software Craftsmanship’. We’d surrendered to the chaos, accepted that we weren’t designers of great engineering works but software whittlers. I was pleased that we’d dropped the pretence that we knew what we were doing but disappointed that we’d been reduced to hand-weaving our systems like hipsters.

There had been another good change too: the Object Model. The thrust of software engineering had often been decomposition, but our model had been the parts breakdown structure, the skeletal parts of a dead system. Objects allowed us to model running systems, the process network at the heart of computation. I’ve recently seen a claim that the Unix command line interface, with its pipes and redirection, was the first object system. Unix begat GNU and Free software and Linux and close-to-zero costs for the ‘means of production’ of software. I don’t believe that Agile or Lean start-ups could have happened in a world without objects, the Internet or Free software. Do you know how much work it took to order software on a tape from the US by post? I do.

So here we are, in our loft rooms, on hand-crafted looms made of driftwood and sweat, iterating towards a vague idea emerging from someone’s misty musings and watching our salaries erode towards the cost of production. Is this why I studied Computer Science for three years? Who turned my profession into a hobby activity?

20 Years Since Historic Brum Linux Event – ‘A storm was coming’

I did some Twitts this morning:

“The history of Welsh computing: Inmos Transputer, Raspberry Pi. Impressive. Whatever happened to parallel processing? Or druids.”

[At this point I did a search to check ‘David’ Cox’s name]

“…I should probably have included Alan Cox’s networking contributions to the Linux kernel in between those two.”

“… At least I now know what happened to one of the druids.”

[When I saw the photos, I thought two of them were a young RMS]

“…I just learned that Alan Cox comes from Solihull, which may explain why I think I may have met him at the first Linux event I ever attended”

[Then I went back to the search Window and found this link]

“…Isn’t The Internet good? This meeting!

[18th September but which year: 1994? The first release of the kernel was in 1991.
but http://www.ukuug.org/about/timeline/
shows MH was UKUUG newsletter editor 1995-6.
Are we approaching 20 years of Linux (or “Free Unix”) in Birmingham?]

Yes kids, my first Linux distro was Lasermoon. Martin Houston also wrote the magazine article that caused me to be there and started SBLUG.”

Martin Houston was a quiet, unassuming programmer who first brought Linux to my attention, and probably to that of most people in Britain who’d heard of it at that point. He was “the organiser” of the UK Unix User Group Linux SIG. I think his article in one of the DEC magazines was the first time I ever saw Linux mentioned, and this meeting was at DEC’s office on the Birmingham Business Park, organised by the DEC User Society, DECUS. They must have been trying hard to recover from Ken Olsen’s accusation that Unix was snake-oil.

Soon afterwards I went to either the first or second meeting of the South Birmingham Linux User Group. Martin understood the importance of marketing and coined the phrase “A storm is coming and its name is Linux” which, for 1994, showed remarkable foresight and possibly misplaced confidence. A few years later, Martin turned up at Powergen in Coventry as a contract programmer but I haven’t heard of him since.

I remember that the demonstration of a Linux installation on a “portable PC” (they didn’t fit on your lap then) by Colin Bruce of Coventry University involved floppy disks and a parallel-port network adapter (‘portables’ didn’t have a network connection. What do you think this was, The Future?).

And yes, Linux kernel hacker Alan Cox, famously Welsh, is a Brummie.

Management Summary of ‘Social vs Capital’ parts 1 – 3

  1. Avoid hardware vendor lock-in by using Unix-style operating systems that can run on any appropriate hardware.
  2. Avoid software vendor lock-in by using Free & Open Source Software (FOSS) so development can be done by anyone with appropriate skills and shared for the common good.
  3. Avoid service provider lock-in by only using FOSS on Linux to provide services, operated by multiple service providers in a distributed network with easy data transfer/mirroring between providers.

All 3 suggestions encourage fair competition in a free market, without the potential for abuse of a market-leading position, to the benefit of consumers.

Social vs Capital Part 2

To recap:

Unix happened because companies trying to run their businesses using software didn’t like being dependent on the whims of hardware manufacturers. Each manufacturer defined their own hardware architecture. Customers wanted hardware-independence.

GNU/Linux and the Free & Open-Source Software movement happened because coders didn’t like waiting for someone else to fix their problems or to decide not to fix them. They wanted access to the source code and the legal right to change software and share their changes. They wanted software-supplier independence.

I simplified last time. BSD, GNU & Linux weren’t the only game in town. An important book was published: ‘Operating Systems: Design and Implementation’ by Andrew Tanenbaum. Tanenbaum believed in giving his students access to working source code. He had used Unix but when AT&T pulled up their draw-bridge, he needed a replacement that his readers could use and change – so he wrote one. ‘Minix’ was a minimal rewrite of the key functions of Unix v7 which ran (just about) on a twin-floppy IBM PC. Unfortunately, the publishers, Prentice-Hall, insisted on retaining copyright to the software. To get a copy, you had to buy a computer science book you probably didn’t want. But I bought it, as did a young man called Linus Torvalds. Tanenbaum was also instrumental in the production of the Amsterdam Compiler Kit, which Stallman considered as a starting point for the GNU C compiler. My first sight of the GNU effort was on a listing for a 9-track tape from the DEC User Society: it included emacs, gcc & the ACK, to be run on your own commercial Unix system.

Stallman and the GNU organisation were writing another replacement for Unix, free of copyrighted code. Their aim was to ensure that if you used their software, no-one else could ever prevent you from using the code, particularly if you had contributed to it. Copyleft was born. ‘Free’ (as in beer) licences were not new. The BSD licence allows anyone to take BSD code and do what they wish with it, including building non-Free code on top and selling it. OS X is built on FreeBSD but Apple sells licences and protects “its” intellectual property from re-use by competitors, including those companies that contributed code that Apple used. ‘Strangely’, Stallman didn’t think this was fair, so he worked towards creating the GNU licence, the GPL. The original idea has been described as “viral”: if you used GPL-licensed code then your code was required to be GPL-licensed too, and you were required to make it available to anyone who wanted it, for only the cost of reproduction. The GPL was arguably ‘less free’ because it enforced sharing and prevented commercial exploitation. GPL supporters point out that the code you contribute becomes a marketing tool to sell a future service. You offer a service (typically to write other software) rather than sell a software product licence. There is downward market pressure on price and upward pressure on quality, to provide the best-value service.

Compromises were made to the GPL later, leading to the ‘Lesser’ LGPL. This allowed software libraries to be used in conventionally licensed commercial software.

Having established the new ground-rules, GNU started work, from the top downwards. The bottom layer, the HURD kernel, has still not been delivered. Fortunately, that guy Linus started at the bottom and worked up. When he and his Internet recruits needed to test their kernel, the GNU tools were ready. Because GNU & Linux both implemented the same open system interfaces, they worked together.

The Free Software movement happened because all these individuals knew they couldn’t do everything on their own. If they wrote something new or fixed a problem, they gave it away. In return, someone else would have fixed a future problem before they found it and shared the solution with them, free.

People sometimes question what happened to ‘the hippy generation’. It appears that many of them went into computer science and carried right on with implementing a community based on freedom & love, inside any institution that would pay them a salary. ‘The Community’ developed a culture and distributed processes and tools that anyone was allowed to use. When people who had grown up in this community started their own companies, they didn’t follow the IBM model, as Bill Gates and Steve Jobs had. They adopted the Free tool-kits of the hippies. Facebook, Google, the Nokia Maemo/MeeGo team, Red Hat, Canonical (Ubuntu Linux) and the Steam gaming platform come from a new breed of entrepreneur. They are not in business to sell you hardware or software as a product. They sell you a service on the Internet. Many of these services are ‘free at point of delivery’.

But there are costs and someone has to pay. You are no longer locked into a hardware or software supplier but into a single service provider, and they have your information, probably the most valuable asset of your organisation.

Social vs Capital Part 1

When I joined ‘the computing industry’ (or was it ‘the data processing trade’?), there were two kinds of computers: those made by IBM and the others. The others came in two flavours: IBM mainframe clones and ‘trying to be different’. Trying to be different was so successful that IBM were eventually forced to try being different to themselves. The various hardware families all ran different operating systems. Changing hardware required all your software to be rewritten. Moving from IBMish mainframes meant your data had to be translated into ASCII. The proposed solution to the operating system problem was Unix. Unix was created to give hardware independence through software portability. Porting Unix was made easier by writing it in the C programming language rather than the specific assembly language of each machine.

This revolution happened within AT&T, a company prevented from competing with IBM by anti-trust legislation. Freed from the profit motive, other than the desire to save costs, they did with Unix what was best for everyone. They gave it and its source code away free to anyone who wanted it. More importantly, they allowed its improvement by universities.

Later, the US government started to allow commercial exploitation of Unix by AT&T. Key source code became subject to non-disclosure agreements and the fastest period of cooperative computing innovation up to that point was closed down.

Two important things came out of this disaster – 1) PCs, and hence Microsoft, and 2) the Berkeley System Distribution (BSD) of Unix and GNU’s Not Unix (GNU), led by Richard Stallman, whose frustration at not having the source code to fix his own printer gave him such a mighty itch that he kick-started the whole Free and Open Source Software (FOSS) movement and its biggest success, the Linux operating system, recently made popular by Google. Bill Gates’ biggest competitor was never Steve Jobs; it was an idea set loose by idealistic academics – that people are stronger when they share the product of their labours, and that you pay people for producing, not for the product. This was a harmless ideal at first, because large organisations owned the computers that were a key part of the means of production.

I am indebted to Robin Ince again, for pointing out in his TEDx Dublin talk ‘The Mind is a Chaos of Delight’ http://www.youtube.com/watch?v=0pfOHaWeTr8 that Evolution doesn’t predict only “survival of the fittest” but ‘survival of the just good enough not to die’, which I think explains Microsoft’s success, and for poking me in the profit-motive with his blog entry http://robinince.wordpress.com/2013/11/26/i-was-going-to-jump-in-the-canal-to-save-the-drowning-man-but-then-i-thought-whats-in-it-for-me/, to finally start this troubled tale of open software.

FOSS has been running around in the background, largely unnoticed by the lumbering beasts, much like the early mammals. Apple OS X is built on FreeBSD and Google Android and Chrome OS are based on the Linux kernel. In the long run, Apple and Google may look like the last of the small, fast raptors rather than the first intelligent apes, because somewhere along the way, the sharing became one-directional, and their essentially predatory nature struggled to survive as their more social competitors saw the danger and drove them into the swamp.

In forthcoming posts, I plan to look at the dangers the FOSS communities’ dreams of Freedom are facing in the current collision with Capitalism.

Masters of TeX

What are these things we call ‘posts’, ‘documents’, ‘essays’ or ‘books’? If a picture is worth a thousand words then pictures are clearly CHEATING so, for now, let’s restrict our attention to text-only ‘things’. In an earlier post, I called them “text objects”.

I have an unusual writing technology history. I was late enough for Computer Science to exist but early enough for it not to have had much effect. My final-year project was typed on a manual typewriter, by my Mum, who had the necessary ‘office skills’. I joined the workplace when type-writers were on the wane but before word-processors had taken hold. A time of daisy-wheel printers as an alternative to clickety-clack line-printing onto stripy fan-fold paper. PCs hadn’t happened yet. My first software for writing, after a text editor, was a text processor. I learned to use DEC’s Runoff to format text but I might equally have used roff, nroff or troff on Unix. They were similar tagging languages descended from RUNOFF, which originated on MIT’s CTSS before spreading to Multics, Berkeley and DEC. This was an age in which an idea for software was not patentable. Everyone copied the good ideas and progressed together.

One of the key decisions in the design of the Unix operating system was that all data was a bit-stream. One type of bit-stream was the text file: printable characters, each represented by a unique byte-code. Unix came with a free set of tools that allowed people who thought like programmers to manipulate text files very efficiently. An author created a text stream with embedded ‘markup’ language to give hints about structure and style. It was a technique borrowed from publishing.
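To make that ‘small tools on text streams’ idea concrete, here is a minimal sketch: the classic word-frequency pipeline, built from four standard tools, each doing one small job. The sample sentence is invented for illustration.

```shell
# Count word frequencies in a text stream.
# Each tool does one small job; the pipe joins them loosely.
echo "to be or not to be" |
  tr -s ' ' '\n' |   # split: one word per line
  sort |             # group identical words together
  uniq -c |          # count each group
  sort -rn           # rank: most frequent first
```

The output lists each distinct word with its count; the two words that appear twice (‘to’ and ‘be’) sort to the top. None of the four tools knows anything about the others, which is the whole point.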

Word processors were for typists. They were an incremental upgrade to a type-writer. ‘Normal’ people didn’t think like programmers. Normal people just wanted to print letters, not ‘run off’ a new copy of their maths PhD, full of strange characters, diagrams and correct pagination.

In 1978, mathematician Donald Knuth moved text-processing forward into full, commercial-grade type-setting with TeX. It had all the complexity that always comes free with flexibility. Technical authors no longer needed to risk having their beautiful formulae mangled by an innumerate type-setter. Leslie Lamport introduced a macro package, LaTeX, to make TeX easier to use, but it was too late. The ‘adequate mouse-trap’ had been sold, to the lowest time-investment bidder. You probably came in at WordStar or WordPerfect or Word or even Google Docs and you have my sincere sympathy.
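For readers who came in at WordStar, here is what the Knuth/Lamport approach looks like: a minimal, made-up LaTeX document in which the author tags structure and mathematics, and the type-setting engine decides fonts, spacing and pagination.

```latex
\documentclass{article}
\begin{document}

\section{A formula, safe from the type-setter}

The roots of $ax^2 + bx + c = 0$ are
\[
  x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a} .
\]

\end{document}
```

The source is plain text, so it works with all the Unix tools above: it can be diffed, version-controlled, piped and programmatically generated.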

Now, you are unlikely to use a ‘text processing language’ unless you are a Unix, or more likely Linux, weirdo. Which is a great shame because, even if you are normal, you have probably started to think like a programmer. You may find yourself wanting version control, shared document authorship, or multiple output formats from the same source, or with variants.

We all need to take a few steps back before we can move forward. Word processors and WYSIWYG are wrong-headed. What-You-Get is a many-headed monster. You can’t see it because it may not exist yet.

I’d love to paint a happy picture now, of software tools, available Free, that are going to make it all easy – but I can’t. There are 3 viable competing tagging standards: LaTeX, DocBook XML and DITA. The newer standards have the fewest mature tools. The state of the Free market in this area is ‘broken’. There are commercial tools, and Unix can offer you tagging modes for text editors.

This is about where I came in, in 1982. 30 years of progress lost because accountants picked the wrong path for computer science to take.