Category Archives: General

When your Netbook falls off its Sky-hook

I have an Eee PC 1000 ‘netbook’ that has been with me for a while. It’s not very fast but then I type quite slowly. It’s become a personal challenge to see how long I can keep using it. It’s always run Linux but the latest version of Ubuntu won’t fit into its (ample, in my opinion) 1GB of memory. I can’t upgrade it in place either, because it has two SSDs, currently configured as / on the 8GB and /home on the 32GB. A couple of days ago my desktop environment went AWOL.

Trying to deal with the space limitations, I’d tried booting it from a Lubuntu Live memory stick. It seemed quicker with LXDE, but Lubuntu also has leaner apps than the ones I’m used to, so I decided not to install it permanently. I may find another way to keep my current apps but replace Unity with LXDE. Afterwards, I think I rebooted it to check it was OK, but I may have shut down from the login screen. The following morning I logged in and got an empty, frozen desktop display. I couldn’t even open a terminal window, but I found I could log in to the Guest account. Odd. I opened a console window from my normal account and rolled up my sleeves. I had a /home directory (a different one, I later discovered).

To cut a long story short, the answer was:

$ sudo mount /dev/sdb1 /home

My real home directory wasn’t mounted, so logging in had fallen back to the original /home folder on the system disk. The Guest account logs in under /tmp on the system disk, so it didn’t have a problem. Now I just need to work out what was auto-magically mounting /home for me, and why it decided to stop.

[ Update: The permanent fix was to find out the UUID of my device with
$ sudo blkid

then add the following line to /etc/fstab:

UUID=4b18fe5c-2d2a-4d12-938b-a38046a3cf84 /home ext4 errors=remount-ro 0 0
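
To check the new entry without rebooting, something like this should work (my addition, assuming the usual util-linux tools are installed):

$ sudo mount -a     # mount everything listed in fstab, including the new line
$ findmnt /home     # confirm /home is now on the 32GB SSD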

I still haven’t found the hole in the sky where the hook came detached. ]

Living a Virtual Life

There is a Taoist story about it being impossible to know at the time whether an event is lucky or unlucky. At my age, you start to reflect on how things have gone, from a safe distance.

I planned to go to Birmingham University to study mathematics with a side-order of computer science. My ‘A’ Level results were, to put it mildly, ‘below expectation’, so I scraped into Aston through the Clearing process, to study mathematics, computer science and physics. The teaching language was Algol 68, and the visionary assumption throughout the course was that within a few years all computers would be virtual memory systems. We would never have to worry about physical restrictions on memory allocation. We had a linear address space to play with that could be as big as the available disk space, and there would be a garbage collector to tidy up after us. A few years later, PCs were to make those assumptions invalid.

I actually graduated into a recession caused by a war to the death between Margaret Thatcher and the unions. Many large companies cancelled their graduate recruitment programmes. I was unemployed until just before Christmas, when I took the first job I was offered, as a programmer in a College of Higher Education in Cambridge. I’d never heard of the computer they used. It was one of the first batch of half a dozen DEC VAXes delivered to the UK: a 32-bit super-mini running the new Virtual Memory System OS, VAX/VMS. I specialised in VMS/OpenVMS for the next 25 years, gradually becoming a system manager and a specialist in high-availability clusters and development environments. I had side-stepped Bill Gates’ “no-one needs more than 640K” pronouncement and all the mess that went with it.

I lost direct touch with software development until a few years ago, when I joined an agile team as an analyst and decided I wanted to get back into writing code. Initially I picked Python, until I saw a demonstration of Clojure. I knew I had to have it. Clojure designer Rich Hickey says that we can treat disk space as effectively infinite. That has a huge impact on our ability to design software as temporal flow rather than last known state. Servers have become virtual too. Software is doing everything it can to escape the physical realm entirely. I’m holding on for a free ride, hoping to stay lucky, with a link to a virtual copy of ‘The Wizard Book’ on my Cloud-drive. Nothing is Real. I’m not even sure about Time.
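
As a minimal Clojure sketch of “temporal flow rather than last known state” (my illustration, not Hickey’s code): instead of overwriting a variable, append time-stamped facts and derive any past state from the flow.

(def history (atom []))  ; an ever-growing log; disk is ‘effectively infinite’

(defn record! [fact]     ; append a new fact; nothing is ever overwritten
  (swap! history conj (assoc fact :at (System/currentTimeMillis))))

(defn facts-as-of [t]    ; reconstruct what was known at any time t
  (filter #(<= (:at %) t) @history))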

Things I used to be Wrong about – Part 1

I get very annoyed about politicians being held to account for admitting they were wrong, rather than being forcefully challenged when they were wrong in the first place. Unless they lied, if someone was wrong and admits it, they should be congratulated. They have grown as a human being.

I am about to do something very similar. I’m going to start confessing some of the ‘wrong’ things I used to think, which the world has since come to agree with me about. I feel I should congratulate you all.

You can’t design a Database without knowing how it will be used

I was taught at university that you could create a single abstract data model of an organisation’s data. “The word database has no plural”, I was told. In my second job, I tried to create a model of all the street furniture (signs and lighting) in Staffordshire. I couldn’t do it. I concluded that it was impossible to know which things were entities and which were attributes. I now know this is because models are always created for a purpose. If you aren’t yet aware of that purpose, you can’t design for it. My suspicion was confirmed in a talk at Wolverhampton University by Michael ‘JSD’ Jackson. The revelation seemed to come as a big shock to the large team from the Inland Revenue. I guess they had made unconscious assumptions about likely processes.
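
A hypothetical street-furniture example (mine, not from the talk), in Clojure map notation, shows how purpose decides the split. For an asset-maintenance system the street light is the entity and its lamp is an attribute; for an energy audit the lamp type is the entity and the lights carrying it are an attribute:

{:type :street-light, :id 101, :lamp :sodium, :road "A34"}  ; maintenance view

{:lamp :sodium, :wattage 250, :fitted-to [101 102 207]}     ; energy-audit view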

Relations don’t understand time

(They would probably say the same about me.) A transaction acting across multiple tables is assumed to be instantaneous. This worried me. A complex calculation requiring several reads cannot be guaranteed to be consistent unless all the accessed tables are locked against writes throughout the transaction. Jackson also confirmed that the Relational Model has no concept of time. A dirty fix is data warehousing, which achieves consistency without locking, at the cost of guaranteeing that the data is old.
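
One escape route, sketched below in Clojure (my illustration, not from the post), is to read an immutable snapshot: the calculation is consistent without any locks, at the price of possibly being slightly stale, which is exactly the data-warehouse trade-off.

(def accounts (atom {:alice 100, :bob 50}))  ; current state, an immutable map

(defn total-balance []
  (let [snapshot @accounts]        ; one consistent, point-in-time value
    (reduce + (vals snapshot))))   ; no concurrent write can disturb this read

(defn transfer! [from to amount]   ; writers swap in a *new* map;
  (swap! accounts                  ; existing snapshots are untouched
         #(-> % (update from - amount) (update to + amount))))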

The Object Model doesn’t generalise

I’d stopped developing software by the time I heard about the Object Oriented Programming paradigm. I could see a lot of sense in OOP for simulating real-world objects. Software could be designed to be more modular when the data structures representing the state of a real-world object, and the code which handled its state-changes, were kept in a black box with a sign on it that said “Beware of the Leopard”. I couldn’t grasp how people filled the space between the objects with imaginary software objects that followed the same restrictions, or why they needed to.

A new wave of Functional Programming has introduced immutable data structures. I have recently learned, through Clojure author Rich Hickey’s videos, that reflecting state-change by mutating the value of variables is now a sin punishable by a career in Java programming. Functional Programmers have apparently always agreed with me that not all data structures belong in an object.
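
In Clojure, for example (a minimal sketch of what immutability means in practice):

(def before {:name "Eee PC", :ram-gb 1})
(def after (assoc before :ram-gb 2))  ; ‘change’ produces a new map

before  ;=> {:name "Eee PC", :ram-gb 1}  ; the original value is untouched
after   ;=> {:name "Eee PC", :ram-gb 2}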

There are others I’m still waiting for everyone to catch up on:

The Writable Web is a bad idea

The Web wasn’t designed for this and isn’t very good at it. Throwing complexity bombs at an over-simplified model rarely helps.

Rich Hickey’s Datomic doesn’t appear to have fixed my entity:attribute issue

Maybe that one is impossible.

A Brexit Thought Experiment

I’m a big fan of thought experiments. I like science but I’m too lazy to do real experiments. Why do something when you can think about doing it?

I’ve been observing the political manoeuvring around Brexit and second referendums. I think some people are saying things they don’t really believe, in order to get an outcome they believe to be right, while others are saying things which sound good, to hide the evil swirling beneath the surface.

I asked myself: Which is the greater wrong: doing a good thing for a bad reason or a bad thing for a good reason?

I thought:

‘A good thing’ is highly subjective, depending on your personal values and your consequent belief in what is fair. A comparison of ‘bad things’ is probably even more fluid. I see it in terms of a balance between good and harm to self and others. It’s complex.

‘Good’ and ‘bad’ reasons also depend on your personal targets and motivations along with another subjective moral evaluation of those.

An individual may see a good thing as a positive value and a bad thing as a negative value and believe that as long as the sum is positive, so is the whole package. People call this “pragmatism”. They also tell me it is easier to ask for forgiveness than permission. These people get things done and, generally, only hurt other people.

‘A reason’ sounds like dressing up something you feel you want in logic. Is that always reasonable?

We need to balance what we want and our chances of success against the risks and uncertainty of what we might lose or fail to achieve. To measure success objectively, we need to have specified some targets before we start.

Brexit had neither a plan nor targets. It appears to be driven by things that people don’t want. How will we know if it has succeeded or failed? We are told the strategy and tactics must be kept secret or the plan will fail and the targets will be missed. If this were a project I was working on, I’d be reading the jobs pages every lunchtime. I’ve stopped worrying about the thought experiment.

Women’s Day Intuition

The first thing I did yesterday, on International Women’s Day 2017, was retweet a picture of Margaret Hamilton, allegedly the first person in the world to have the job title ‘Software Engineer’. The tweet claimed the pile of printout she was standing beside, as tall as her, was all the tweets asking “Why isn’t there an International Men’s Day?” (There is. It’s November 19th, the first day of snowflake season.) The listings were actually the source code which her team wrote to make the Apollo moon mission possible. She was the first virtual woman on the Moon.

I followed up with a link to a graph showing the disastrous decline of women working in software development since 1985, by way of an explanation of why equal opportunities aren’t yet a done deal. I immediately received a reply from a man, saying there had been plenty of advances in computer hardware and software since 1985, so perhaps that wasn’t a coincidence. This post is dedicated to him.

I believe that the decade 1975 – 1985, when the number of women in computing was still growing fast, was the most productive since the first, which began in the late 1830s, when Ada Lovelace made up precisely 50% of the computer software workforce worldwide. It also happens to coincide, approximately, with my own time in the field: I first encountered computing in about 1974 and stopped writing software in about 1986.

1975 – 1985:
As I entered: Punched cards, then a teletype connected to a 24-bit ICL 1900-series mainframe via a 300 Baud acoustic coupler and phone line. A trendy new teaching language called BASIC, complete with GOTOs.

As I left: Terminals containing a ‘microprocessor’, screen-addressable via ANSI escape sequences, or bit-mapped graphics terminals, connected to 32-bit super-minis, enabling ‘design’. I used a programming-language-agnostic environment with a standard run-time library and a symbolic debugger. BBC Micros were in schools. The X Window System was about to standardise graphics. Unix and ‘C’ were breaking out of the universities, along with Free and Open culture, functional and declarative programming, and AI. The danger of the limits of physics and the need for parallelism loomed out of the mist.

So, what was this remarkable progress in the 30 years from 1986 to 2016?

Good:

Parallel processing research provided Communicating Sequential Processes and the Inmos Transputer (see the sketch after this list).
Declarative, non-functional languages that led to ‘expert systems’. Lower expectations got AI moving.
Functional languages got immutable data.
Scripting languages like Python and Ruby (on Rails), leading to the death of BASIC in schools.
Wider access to the Internet.
The read-only Web.
The idea of social media.
Lean and agile thinking. The decline of the software project religion.
The GNU GPL and Linux.
Open, distributed platforms like git, free from service monopolies.
The Raspberry Pi and computer science in schools.
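
For the CSP item above, a minimal sketch using Clojure’s core.async library (my example, assuming the org.clojure/core.async dependency is available):

(require '[clojure.core.async :as a])

(let [ch (a/chan)]                         ; an unbuffered channel
  (a/go (a/>! ch "hello from process 1"))  ; one sequential process sends...
  (a/go (println (a/<! ch))))              ; ...another communicates by receiving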

Only looked good:

The rise of PCs to under-cut Unix workstations and break the Data Processing department’s control. Microsoft took control instead.
Reduced Instruction Set Computers were invented, providing us with a free 30-year window in which to work out the problem of parallelism, but meaning we didn’t bother.
By 1980, Alan Kay had invented Smalltalk and the Object Oriented paradigm of computing, allowing complex real-world objects to be simulated and everything else to be modelled as though it were a simulation of objects, even if you had to invent them. Smalltalk did no great harm, but in 1983 Bjarne Stroustrup left the lab door open and C++ escaped into the wild. By 1985, objects had become uncontrollable. They were EVERYWHERE.
Software Engineering. Because writing software is exactly like building a house, despite the lack of gravity.
Java, a mutant C++, lends its name to the largely unrelated brand-hybrid JavaScript.
Microsoft re-invents DEC’s VMS and Sun’s Java as 32-bit Windows NT, .NET and C#, then destroys all the evidence.
The reality of social media.
The writeable Web.
Multi-core processors for speed (don’t panic, functions can save us; see the sketch below.)
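
To unpack “functions can save us”: pure functions share no mutable state, so spreading them across cores is safe. A minimal Clojure sketch (mine, not from the post):

(defn slow-square [x] (Thread/sleep 100) (* x x))  ; a pure, if sleepy, function

(time (doall (map slow-square (range 8))))   ; roughly 800 ms on one core
(time (doall (pmap slow-square (range 8))))  ; roughly 100 ms across cores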

Why did women stop seeing computing as a sensible career choice in 1985, when “mine is bigger than yours” PCs arrived, and reconsider it now that everyone at school uses the same Raspberry Pi and multi-tasking is becoming important again? Probably that famous ‘female intuition’: they can see that the world of computing needs real functioning humans again.

My New Model of Left-Right Politics

Because (I like to think) I’m human, I make models of the world around me. Because I’m a computer scientist/a bit weird, I write them down or draw pictures of them. Since I got interested, a couple of years ago, in why some intelligent people have different political views from mine, I’ve been trying to model the values which underlie people’s belief systems and which, I believe, determine their political views.

My working model for the values of Left-Right politics (I’m a fluffy compromise near the middle of this scale but I have other scales, upon which I weigh myself a dangerous radical) has been that The Left believe in Equality and The Right in Selfishness. As a radical liberal, I obviously think both extremes are the preserve of drivelling idiots – compromise is all. The flies in my ointment have been the selfishness of the Far Left and the suicidal economic tendencies of working class nationalists in wanting to #Brexit. My model clearly had flaws.

This morning I was amusing myself with a #UKIP fan who countered being told by a woman that it was best to have type O blood (presumably because it is the universal donor) by saying it was best to be AB, so he could receive any blood (the universal recipient). On the surface this seems to confirm the selfishness theory, but I made an intuitive leap: he thought he was too special to lose, which was far from the conclusion I’d arrived at during our discussion.

My new, modified theory is that the Left think ‘no-one should get special treatment’ and the Right think ‘my DNA is special; I deserve more’. This belief that “I am/am not special” has almost no correlation with the evidence, or even with class. I have no evidence of whether the characteristic is inherited or learned, but Michael Gove and members of the BNP clearly decided that they were special and deserved to be treated better than other people. Tony Benn, on the other hand, argued himself out of believing that he had a God-given right to a place in the House of Lords. Please let me know why I’m wrong.

Hybrid Vigour

Having been forced by Mrs. Woo to take a week of holiday from what she usually refers to as “staying at home, doing nothing”, I found myself on the Snowdrop Trail tour in the garden at https://www.nationaltrust.org.uk/sunnycroft. I was quite surprised to discover that the leaflet I was handed had photographs of 17 of the forty-odd different varieties of snowdrop in the garden. Later in the day I was talking to our guide’s long-suffering wife, who explained that there were over 400 varieties in his own garden, of the approximately four and a half thousand types currently known to be in existence.

People with a passion often interest me. Quite why these smallish, mostly white flowers, which I would previously have assumed were all the same, had become the focus of this man’s life was not clear, but his obvious fascination was infectious. It almost made me wish that I were a little less promiscuous in my obsessions. For our second snowdrop trip of the week, we visited https://www.nationaltrust.org.uk/attingham-park to see snowdrops at scale, lining the woodland floor. They all looked the same. I have much to learn.

My favourite Sunnycroft snowdrop fact was that, when chasing the holy grail of a large ‘double’ snowdrop, gardeners had used the advantage of ‘hybrid vigour’ to mix and match the desirable characteristics of different species and create a giant amongst snowdrops. Take that, racial supremacists: you’re all snowflakes, by comparison!

https://en.wikipedia.org/wiki/Heterosis