Category Archives: General

The importance of a letter, to the BBC

I have entered politics. I asked for this tweet to be corrected and the BBC did so.
https://twitter.com/BBCr4today/status/904958811546554368
It originally said “your”. I think that 1-letter difference is quite important.

Iceland’s foreign minister says countries want free trade deals with the UK #r4today

It could have been a simple typo. It could have been an optimist, wanting to believe some good news about Brexit, or it could have been government propaganda. I wish I could be sure that it wasn’t the latter.

One letter, to turn a flood of international imports into hope of exports and of fixing the UK’s balance of trade deficit. Politics is the art of getting people to agree with you, whether you are right or wrong. Sadly, the reality of Brexit will still be true, whatever it is, and believing it will work doesn’t actually help much.

How Low, Linux Hardware Punks?

I volunteered a couple of times to help at kids’ coding events. One of the first things I noticed was that they had better kit than me. I like keeping old computer hardware going as long as is humanly possible. I’m not retro, I’m ‘careful’: with the Earth’s resources rather than with money. I was giving my time freely in the hope of giving young people a better start in life. I realised I was giving it, almost exclusively, to the children of Middle Class professionals. They were great young people; they’d had their parents’ support.

What the ‘other’ kids need is a punk movement. They need the junk-shop chic of battered second-hand guitars and home re-styled charity shop clothing. They have Raspberry Pis at school, so they’ve seen Linux, but they need their schools to give them access to downloads. More than 15% of UK homes don’t have Internet access. These kids are that 15%. Meanwhile, companies throw away perfectly good kit because it’s cheaper than upgrading. There’s now a Raspbian Linux for PCs, so I’m not the only person aware of this untapped resource for education.

I wrote a couple of tweets this morning:

“What low spec hardware running Linux are people doing things with? Is already a Thing? (please RT: for reach)”

and https://twitter.com/WooTube/status/900646598019080194
“I want to make disposable kit cool for kids without access to disposable income (from my Eee PC 1000 netbook )”

I don’t see this working as a charity. No-one wants to be a charity case.
It might work as a youth movement. I’m both too old and too comfortable to lead a ‘working’ class rebellion. Do any of you young’uns fancy a go?

“A storm is coming and its name is Linux” as we used to say in the olden days.
(CopyLeft Martin Houston. He’s the guy who corrupted me at a DECUS meeting:
http://www.deluxetech.co.uk/history-linux-magazine-cover-disks/
Linux-FT was my first distro because of him.)

Truth is beauty; but is that all?

I was reading about Clojure’s views on truth and falsehood this morning. Some of them are interesting:
(true? true) ; -> true
(false? false) ; -> true – a classic double negative

Clojure also has the value ‘nil’.
(true? nil) ; -> false
(false? nil) ; -> false

Then I went on LinkedIn and someone asked if there is ever an absolute truth. It’s a question I’ve been thinking about, so I wrote this:

I think that, at non-quantum scales, there can be only one set of events which actually happened, but every human is wandering around in their own model of reality, based on their perception of incomplete data. Sometimes a slow-motion replay helps fill in some of the gaps, but it will be the same every time we look and it may not reveal ‘the whole truth’ we seek. We all have a simplified view of what reality is, based on our personal knowledge and beliefs, and we can’t go back in time for more data, so our view of truth is an approximation. Heisenberg’s Uncertainty Principle says it can only ever be an approximation.

In summary: I believe there is one truth but that doesn’t mean we will ever know what it is. Alternative perceptions may be just as valid as our own; possibly more so.

[ The original version of this post said nil meant “I don’t know.” I immediately discovered that was wrong. In Clojure, only ‘false’ returns ‘true’ to ‘false?’ but nil is ‘falsey’, so each of the following forms returns “F”:
(if false “T” “F”)
(if nil “T” “F”)
You can see how that could confuse a stupid person. ]
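For completeness: nil and false are the only ‘falsey’ values in Clojure; everything else, including 0, empty strings and empty collections, counts as truthy. A quick REPL sketch:

(boolean nil)   ; -> false
(boolean false) ; -> false
(boolean 0)     ; -> true
(boolean "")    ; -> true
(boolean '())   ; -> true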

Process + Data + Structure = Engineered Software

I tried to address a question about data structure on Quora. This post is a stand-alone version of the answer I gave.

‘In the beginning’ there was ‘Data Processing’. That is: ‘data’ and ‘process’, expressed in the form of a program. Programs implement algorithms.

In 1976, the practice of ‘Structured Programming’ was trending and Niklaus Wirth wrote a book: Algorithms + Data Structures = Programs (see its Wikipedia entry).

Processes and their structure + Data and its structure = Programs.

If you ignore interactions with the real world, that’s all there is. If you take any working program and ignore the processes and their structure and the raw data, then whatever is left is data structure.
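As a toy Clojure illustration (invented names, nothing more than a sketch): the data and its structure, the process and its structure and, together, a very small program.

(def employees [{:name "Ada"   :hours 30}     ; the data, structured as a vector of maps
                {:name "Grace" :hours 45}])

(defn overtime [people]                       ; the process, structured as a function
  (filter #(> (:hours %) 40) people))

(overtime employees) ; -> ({:name "Grace", :hours 45})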

We structure data because the alternative is data sauce, traditionally only ever served with spaghetti code.

Reality has Levels

It’s been a while since I blogged. I’ve been busy.

A major theme emerging from ‘writing my book’ is that we humans are very bad at separating our models of reality from the reality we are modelling.

I started planning with the ‘Freemind mind-mapping tool for hierarchical brains’ before finding my own creative process had a network architecture and discovering ‘concept mapping’ which uses graphs to represent concepts and propositions. I saw that graphs were what I needed and decided to experiment with building my own software tools from bits I had lying around.

I didn’t have a current programming language, so I set out to learn Clojure. Being a Lisp, Clojure uses tree structures internally to represent lists and extends the idea to abstractions such as collections, but the only native data structures available to me appeared to be one-dimensional. I confidently expected to find ways to extend this to three or more dimensional graphs but, despite much reading and learning lots of other things, I failed to find what I was looking for. I had in mind the kind of structures you can build with pointers, in languages like ‘C’. There are graph libraries, but I was too new to Clojure to believe my first serious program needed to depend on language extensions when I hadn’t securely grasped the basics.

This morning, I think I ‘got it’. I am trying to build a computational model of my graphical view of a mathematical idea which models a cognitive model of reality. There was always scope for confusion. Graphs aren’t really a picture; they are a set of one-dimensional connections, plus potentially further sets of views of those connections, each constructed for a particular purpose. I was trying to build a data structure of a view of a graph, not the graph itself, and that was a really bad idea. At least I didn’t get software confused with ‘actual’ reality, so there’s still hope for me.
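In Clojure terms, the realisation looks roughly like this minimal sketch (invented names, not code from my project): the graph itself is just ordinary data describing connections; any ‘picture’ of it is a separate view, built when needed.

(def graph {:a #{:b :c}    ; each node maps to the set of nodes it connects to
            :b #{:c}
            :c #{}})

(defn neighbours [g node]  ; one possible 'view': the nodes reachable in one step
  (get g node #{}))

(neighbours graph :a) ; -> #{:b :c} (a set, so unordered)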

Yesterday, I used Clojure/Leiningen’s in-built Test-Driven Development tool for the first time. It looks great. The functional model makes TDD simple.
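The built-in tool is clojure.test, which Leiningen runs with `lein test`. A minimal sketch of what a test looks like (the namespace and function names here are just illustrative):

(ns scratch.core-test
  (:require [clojure.test :refer [deftest is]]))

(defn double-up [x] (* 2 x))   ; the function under test, inlined here for brevity

(deftest double-up-works
  (is (= 4 (double-up 2))))    ; run with: lein test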

When your Netbook falls off its Sky-hook

I have an Eee PC 1000 ‘netbook’ that has been with me for a while. It’s not very fast but then I type quite slowly. It’s become a personal challenge to see how long I can keep using it. It’s always run Linux but the latest version of Ubuntu won’t fit into its (ample, in my opinion) 1GB of memory. I can’t upgrade it in place either because it has 2 SSDs, currently configured as / on the 8GB and /user on the 32GB. A couple of days ago my desktop environment went AWOL.

Trying to deal with the space limitations, I’d tried booting it from a Lubuntu Live memory stick. It seemed quicker with LXDE, but Lubuntu also has leaner apps than the ones I’m used to, so I decided not to install it permanently. I may find another way to keep my current apps but replace Unity with LXDE. Afterwards, I think I rebooted it to check it was OK, but I may have shut down from the login screen. The following morning I logged in and got an empty, frozen desktop display. I couldn’t even open a terminal window, but I found I could log in to the Guest account. Odd. I opened a console window from my normal account and rolled up my sleeves. I had a /user (a different one, I later discovered).

To cut a long story short, the answer was:

$ sudo mount /dev/sdb1 /home

My home directory wasn’t there and, because it wasn’t, my login had fallen back to the original /user folder on the system disk. The Guest account logs in on /tmp on the system disk, so didn’t have a problem. Now I just need to work out what was auto-magically mounting it for me, and why it decided to stop.

[ Update: The permanent fix was to find out the ID of my device with
$ sudo blkid

then add the following line to /etc/fstab

UUID=4b18fe5c-2d2a-4d12-938b-a38046a3cf84 /home ext4  errors=remount-ro 0  0
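For the record, a generic way to check the new entry without a reboot:

$ sudo mount -a     # mounts anything listed in /etc/fstab that isn't already mounted
$ findmnt /home     # shows which device /home is now mounted from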

I still haven’t found the hole in the sky where the hook came detached. ]

Living a Virtual Life

There is a Taoist story about it being impossible to know at the time whether an event is lucky or unlucky. At my age, you start to reflect on how things have gone, from a safe distance.

I planned to go to Birmingham University to study mathematics with a side-order of computer science. My ‘A’ Level results were, to put it mildly, ‘below expectation’, so I scraped into Aston through the Clearing process to study mathematics, computer science and physics. The teaching language was Algol 68 and the visionary assumption throughout the course was that within a few years all computers would be virtual memory systems. We would never have to worry about physical restrictions on memory allocation. We had a linear address space to play with that could be as big as the available disk space, and there would be a garbage collector to tidy up after me. A few years later, PCs were to make those assumptions invalid.

I actually graduated into a recession caused by a war to the death between Margaret Thatcher and the unions. Many large companies cancelled their graduate recruitment programmes. I was unemployed until just before Christmas, when I took the first job I was offered, as a programmer in a College of Higher Education in Cambridge. I’d never heard of the computer they used. It was one of the first batch of half a dozen DEC VAXes delivered to the UK: a 32-bit super-mini running the new Virtual Memory System OS, VAX/VMS. I specialised in VMS/OpenVMS for the next 25 years, gradually becoming a system manager and specialist in high-availability clusters and development environments. I had side-stepped Bill Gates’ “No-one needs more than 640K” pronouncement and all the mess that went with it.

I lost direct touch with software development until a few years ago, when I joined an agile team as an analyst and decided I wanted to get back into writing code. Initially I picked Python, until I saw a demonstration of Clojure. I knew I had to have it. Clojure designer Rich Hickey says that we can treat disk space as effectively infinite. That has a huge impact on our ability to design software as temporal flow rather than last known state. Servers have become virtual too. Software is doing everything it can to escape the physical realm entirely. I’m holding on for a free ride, hoping to stay lucky, a link to a virtual copy of ‘The Wizard Book’ on my Cloud-drive. Nothing is Real. I’m not even sure about Time.

Things I used to be Wrong about – Part 1

I get very annoyed about politicians being held to account for admitting they were wrong, rather than being forcefully challenged when they were wrong in the first place. Unless they lied, if someone was wrong and admits it, they should be congratulated. They have grown as a human being.

I am about to do something very similar. I’m going to start confessing some of the ‘wrong’ things I used to think, which the world has since come to agree with me about. I feel I should congratulate you all.

You can’t design a Database without knowing how it will be used

I was taught at university that you could create a single abstract data model of an organisation’s data. “The word database has no plural”, I was told. In my second job, I tried to create a model of all street furniture (signs and lighting) in Staffordshire. I couldn’t do it. I concluded that it was impossible to know which things were entities and which were attributes. I now know this is because models are always created for a purpose. If you aren’t yet aware of that purpose, you can’t design for it. My suspicion was confirmed in a talk at Wolverhampton University by Michael ‘JSD’ Jackson. The revelation seemed to come as a big shock to the large team from the Inland Revenue. I guess they had made unconscious assumptions about likely processes.

Relations don’t understand time

(They would probably say the same about me.) A transaction acting across multiple tables is assumed to be instantaneous. This worried me. A complex calculation requiring reads could not be guaranteed to be consistent unless every table it accessed was locked against writes throughout the transaction. Jackson also confirmed that the Relational Model has no concept of time. A dirty fix is data warehousing, which achieves consistency without locking at the cost of guaranteeing the data is old.

The Object Model doesn’t generalise

I’d stopped developing software by the time I heard about the Object Oriented Programming paradigm. I could see a lot of sense in OOP for simulating real-world objects. Software could be designed to be more modular when the data structures representing the state of a real-world object and the code which handled state-change were kept in a black box with a sign on it that said “Beware of the leopard”. I couldn’t grasp how people filled the space between the objects with imaginary software objects that followed the same restrictions, or why they needed to.

A new wave of Functional Programming has introduced immutable data structures. I have recently learned, through Clojure author Rich Hickey’s videos, that reflecting state-change by mutating the value of variables is now a sin punishable by a career in Java programming. Functional Programmers have apparently always agreed with me that not all data structures belong in an object.
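In Clojure that looks something like this trivial sketch (not anyone’s production code): ‘changing’ a map gives back a new value and leaves the original alone.

(def before {:status :draft})
(def after  (assoc before :status :published)) ; a new map, not a mutation

after  ; -> {:status :published}
before ; -> {:status :draft}   (the original value is untouched)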

There are others I’m still waiting for everyone to catch up on:

The Writable Web is a bad idea

The Web wasn’t designed for this and isn’t very good at it. Throwing complexity bombs at an over-simplified model rarely helps.

Rich Hickey’s Datomic doesn’t appear to have fixed my entity:attribute issue

Maybe that one is impossible.

A Brexit Thought Experiment

I’m a big fan of thought experiments. I like science but I’m too lazy to do real experiments. Why do something when you can think about doing it?

I’ve been observing the political manoeuvring around Brexit and second referendums. I think some people are saying things they don’t really believe, in order to get an outcome they believe to be right, and others are saying things which sound good, to hide the evil swirling beneath the surface.

I asked myself: Which is the greater wrong: doing a good thing for a bad reason or a bad thing for a good reason?

I thought:

‘A good thing’ is highly subjective, depending on your personal values and consequent belief in what is fair. A comparison of ‘bad things’ is probably even more fluid. I see it in terms of the balance between good and harm to self and others. It’s complex.

‘Good’ and ‘bad’ reasons also depend on your personal targets and motivations along with another subjective moral evaluation of those.

An individual may see a good thing as a positive value and a bad thing as a negative value and believe that as long as the sum is positive, so is the whole package. People call this “pragmatism”. They also tell me it is easier to ask for forgiveness than permission. These people get things done and, generally, only hurt other people.

‘A reason’ sounds like dressing up something you feel you want in logic. Is that always reasonable?

We need to balance what we want and our chances of success against the risks and uncertainty of what we might lose or fail to achieve. To measure success objectively, we need to have specified some targets before we start.

Brexit didn’t have either a plan or targets. It appears to be driven by things that people don’t want. How will we know if it has succeeded or failed? We are told the strategy and tactics must be kept secret or the plan will fail and targets will be missed. If this was a project I was working on, I’d be reading the jobs pages every lunch time. I’ve stopped worrying about the thought experiment.

Women’s Day Intuition

The first thing I did yesterday, on International Women’s Day 2017, was retweet a picture of Margaret Hamilton, allegedly the first person in the world to have the job title ‘Software Engineer’. The tweet claimed the pile of printout she was standing beside, as tall as her, was all the tweets asking “Why isn’t there an International Men’s Day?” (There is. It’s November 19th, the first day of snowflake season.) The listings were actually the source code which her team wrote to make the Apollo moon mission possible. She was the first virtual woman on the Moon.

I followed up with a link to a graph showing the disastrous decline of women working in software development since 1985, by way of an explanation of why equal opportunities aren’t yet a done deal. I immediately received a reply from a man, saying there had been plenty of advances in computer hardware and software since 1985, so perhaps that wasn’t a coincidence. This post is dedicated to him.

I believe that the decade 1975 – 1985, when the number of women in computing was still growing fast, was the most productive since the first, starting in the late 1830s, when Ada, Countess of Lovelace, made up precisely 50% of the computer software workforce worldwide. It also roughly coincides with the period between when I first encountered computing, in about 1974, and when I stopped writing software, in about 1986.

1975 – 1985:
As I entered: Punched cards, then a teletype connected to a 24-bit ICL 1900-series mainframe via a 300 baud acoustic coupler and phone line. A trendy new teaching language called BASIC, complete with GOTOs.

As I left: Terminals containing a ‘microprocessor’, screen-addressable via ANSI escape sequences, or bit-mapped graphics terminals, connected to 32-bit super-minis, enabling ‘design’. I used a programming-language-agnostic environment with a standard run-time library and a symbolic debugger. BBC Micros were in schools. The X Window System was about to standardise graphics. Unix and ‘C’ were breaking out of the universities, along with Free and Open culture, functional and declarative programming and AI. The danger of the limits of physics and the need for parallelism loomed out of the mist.

So, what was this remarkable progress in the 30 years from 1986 to 2016?

Good:

Parallel processing research provided Communicating Sequential Processes and the Inmos Transputer.
Declarative, non-functional languages that led to ‘expert systems’. Lower expectations got AI moving.
Functional languages got immutable data.
Scripting languages like Python & Ruby for Rails, leading to the death of BASIC in schools.
Wider access to the Internet.
The read-only Web.
The idea of social media.
Lean and agile thinking. The decline of the software project religion.
The GNU GPL and Linux.
Open, distributed platforms like git, free from service monopolies.
The Raspberry Pi and computer science in schools.

Only looked good:

The rise of PCs to undercut Unix workstations and break the Data Processing department’s control. Microsoft took control instead.
Reduced Instruction Set Computers were invented, providing us with a free 30-year window in which to work out the problem of parallelism, but meaning we didn’t bother.
By 1980, Alan Kay had invented Smalltalk and the Object Oriented paradigm of computing, allowing complex real-world objects to be simulated and everything else to be modelled as though it were a simulation of objects, even if you had to invent them. Smalltalk did no great harm, but in 1983 Bjarne Stroustrup left the lab door open and C++ escaped into the wild. By 1985, objects had become uncontrollable. They were EVERYWHERE.
Software Engineering. Because writing software is exactly like building a house, despite the lack of gravity.
Java, a mutant C++, lends its name to the largely unrelated brand-hybrid JavaScript.
Microsoft re-invents DEC’s VMS and Sun’s Java as 32-bit Windows NT, .NET and C#, then destroys all the evidence.
The reality of social media.
The writeable Web.
Multi-core processors for speed (don’t panic, functions can save us.)

Why did women stop seeing computing as a sensible career choice in 1985, when “mine is bigger than yours” PCs arrived, and reconsider now that everyone at school uses the same Raspberry Pi and multi-tasking is becoming important again? Probably that famous ‘female intuition’. They can see the world of computing needs real, functioning humans again.