
Things I used to be Wrong about – Part 1

I get very annoyed when politicians are held to account for admitting they were wrong, rather than forcefully challenged when they were wrong in the first place. Unless they lied, someone who was wrong and admits it should be congratulated. They have grown as a human being.

I am about to do something very similar. I’m going to start confessing some wrong things I used to think that the world has since come to agree with me about. I feel I should congratulate you all.

You can’t design a Database without knowing how it will be used

I was taught at university that you could create a single abstract data model of an organisation’s data. “The word database has no plural”, I was told. In my second job, I tried to create a model of all the street furniture (signs and lighting) in Staffordshire. I couldn’t do it. I concluded that it was impossible to know which things were entities and which were attributes. I now know this is because models are always created for a purpose. If you aren’t yet aware of that purpose, you can’t design for it. My suspicion was confirmed in a talk at Wolverhampton University by Michael ‘JSD’ Jackson. The revelation seemed a big shock to the large team from the Inland Revenue. I guess they had made unconscious assumptions about likely processes.

Relations don’t understand time

(They would probably say the same about me.) A transaction acting across multiple tables is assumed to be instantaneous. This worried me. A complex calculation requiring reads cannot be guaranteed to be consistent unless every table it touches is locked against writes for the duration of the transaction. Jackson also confirmed that the Relational Model has no concept of time. A dirty fix is data warehousing, which achieves consistency without locking at the cost of guaranteeing that the data is old.

The Object Model doesn’t generalise

I’d stopped developing software by the time I heard about the Object Oriented Programming paradigm. I could see a lot of sense in OOP for simulating real-world objects. Software could be made more modular when the data structures representing the state of a real-world object and the code which handled state-change were kept in a black box with a sign on it that said “Beware of the leopard”. I couldn’t grasp how people filled the space between the objects with imaginary software objects that followed the same restrictions, or why they needed to.

A new wave of Functional Programming has introduced immutable data structures. I have recently learned, through Clojure author Rich Hickey’s videos, that reflecting state-change by mutating the value of variables is now a sin punishable by a career in Java programming. Functional Programmers have apparently always agreed with me that not all data structures belong in an object.
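
To make that concrete, here is a minimal Clojure sketch of my own (the names are invented for illustration): ‘changing’ an immutable vector gives you a new value and leaves the original alone.

    ;; conj returns a *new* vector; the original is never modified
    (def signs ["give-way" "stop"])           ; an immutable vector
    (def more-signs (conj signs "no-entry"))  ; a fresh vector with one more element

    signs       ;=> ["give-way" "stop"]
    more-signs  ;=> ["give-way" "stop" "no-entry"]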

There are others I’m still waiting for everyone to catch up on:

The Writable Web is a bad idea

The Web wasn’t designed for this and isn’t very good at it. Throwing complexity bombs at an over-simplified model rarely helps.

Rich Hickey’s Datomic doesn’t appear to have fixed my entity:attribute issue

Maybe that one is impossible.


Women’s Day Intuition

The first thing I did yesterday, on International Women’s Day 2017, was retweet a picture of Margaret Hamilton, allegedly the first person in the world to have the job title ‘Software Engineer’. The tweet claimed the pile of printout she was standing beside, as tall as her, was all the tweets asking “Why isn’t there an International Men’s Day?” (There is. It’s November 19th, the first day of snowflake season.) The listings were actually the source code which her team wrote to make the Apollo moon mission possible. She was the first virtual woman on the Moon.

I followed up with a link to a graph showing the disastrous decline of women working in software development since 1985, by way of an explanation of why equal opportunities aren’t yet a done deal. I immediately received a reply from a man, saying there had been plenty of advances in computer hardware and software since 1985, so perhaps that wasn’t a coincidence. This post is dedicated to him.

I believe that the decade 1975 – 1985, when the number of women in computing was still growing fast, was the most productive since the first, starting in the late 1830s, when Ada, Countess of Lovelace made up precisely 50% of the computer software workforce worldwide. It also roughly coincides with my own time in the field: I first encountered computing in about 1974 and stopped writing software in about 1986.

1975 – 1985:
As I entered: Punched cards, then a teletype connected to a 24-bit ICL 1900-series mainframe via a 300 baud acoustic coupler and phone line. A trendy new teaching language called BASIC, complete with GOTOs.

As I left: Terminals containing a ‘microprocessor’, with screens addressable via ANSI escape sequences, or bit-mapped graphics terminals, connected to 32-bit super-minis, enabling ‘design’. I used a programming-language-agnostic environment with a standard run-time library and a symbolic debugger. BBC Micros were in schools. The X Window System was about to standardise graphics. Unix and ‘C’ were breaking out of the universities, along with Free and Open culture, functional and declarative programming and AI. The danger of the limits of physics and the need for parallelism loomed out of the mist.

So, what was this remarkable progress in the 30 years from 1986 to 2016?

Good:

Parallel processing research provided Communicating Sequential Processes and the Inmos Transputer.
Declarative, non-functional languages that led to ‘expert systems’. Lower expectations got AI moving.
Functional languages got immutable data.
Scripting languages like Python and Ruby (on Rails), leading to the death of BASIC in schools.
Wider access to the Internet.
The read-only Web.
The idea of social media.
Lean and agile thinking. The decline of the software project religion.
The GNU GPL and Linux.
Open, distributed platforms like git, free from service monopolies.
The Raspberry Pi and computer science in schools.

Only looked good:

The rise of PCs to under-cut Unix workstations and break the Data Processing department control. Microsoft took control instead.
Reduced Instruction Set Computers were invented, providing us with a free 30 year window to work out the problem of parallelism but meaning we didn’t bother.
By 1980, Alan Kay had invented Smalltalk and the Object Oriented paradigm of computing, allowing complex real-world objects to be simulated and everything else to be modelled as though it were a simulation of objects, even if you had to invent them. Smalltalk did no great harm, but in 1983 Bjarne Stroustrup left the lab door open and C++ escaped into the wild. By 1985, objects had become uncontrollable. They were EVERYWHERE.
Software Engineering. Because writing software is exactly like building a house, despite the lack of gravity.
Java, a mutant C++, lends its name to the largely unrelated brand-hybrid JavaScript.
Microsoft re-invents DEC’s VMS and Sun’s Java as 32-bit Windows NT, .NET and C#, then destroys all the evidence.
The reality of social media.
The writeable Web.
Multi-core processors for speed (don’t panic, functions can save us.)

Why did women stop seeing computing as a sensible career choice in 1985 when “mine is bigger than yours” PCs arrived and reconsider when everyone at school uses the same Raspberry Pi and multi-tasking is becoming important again? Probably that famous ‘female intuition’. They can see the world of computing needs real functioning humans again.

Becoming Functional

I’ve been playing with the idea of doing some functional programming for a while now. I’ve been trying to learn, paddling around in the shallows, but this week I dived right into the emacs/CIDER pool. I was aware of some dangers lurking beneath the surface: recursion, immutable data structures and the functional holy trinity of map, reduce & filter, so I came up with some ideas to face my fears. I’ve also realised my maths has got rusty, so: Some of That Too.

  1. I’ve ‘done recursion’ before but I thought I’d read that my chosen weapon, Clojure, didn’t do tail recursion. This isn’t true. What it can’t do is automatic optimisation of tail recursion, to stop it blowing the stack after a few thousand iterations, but Clojure has a ‘recur’ expression to manually signal tail recursion and fix that. I knocked off the programme in a couple of hours and went to bed happy. My code was happily printing the first n numbers of the Fibonacci sequence but a day later I still couldn’t get it to return the numbers as a sequence (there’s a sketch of where that ended up after this list).
  2. I was finding out about immutable data the hard way. You can’t build up an immutable vector one element at a time; you get to keep the empty vector you created first. It’s a big mind-set change to not have variables that can vary. In my next post, I’ll try to say what I’ve learned. On this occasion the answer was lazy sequences (also sketched after this list).
  3. I mentioned the Algorave in my last post. I only found out about that because of an idea I had for improving my theoretical understanding of music. I realised that I could write, for example, a function that would return the 1st, 3rd and 5th notes in a major scale, using a map function (sketched after this list). While working the theory out, I found out that Lisps are already popular in the live-coding world.
  4. At Algorave, I was inspired by the live-coded graphics to try automatically generating some graphics too, to work out the maths of mapping triangular grids onto Cartesian co-ordinates (one possible convention is sketched after this list). I need that for another idea.
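
For the record, here is a sketch of roughly where the Fibonacci experiment (item 1) ended up; this is my reconstruction, not the original programme. loop/recur keeps the recursion in constant stack space, and the accumulator is a fresh vector bound on every pass rather than a mutated variable:

    (defn fib-seq
      "Return the first n Fibonacci numbers as a vector."
      [n]
      (loop [a 0N b 1N acc []]
        (if (= (count acc) n)
          acc
          (recur b (+ a b) (conj acc a)))))

    (fib-seq 10) ;=> [0N 1N 1N 2N 3N 5N 8N 13N 21N 34N]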
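
The lazy-sequence penny from item 2, once it dropped, looked something like this (again, my own sketch): instead of building a vector at all, define the whole infinite sequence and take as much of it as you need.

    (def fibs
      "An infinite, lazy Fibonacci sequence."
      (lazy-cat [0N 1N] (map + fibs (rest fibs))))

    (take 10 fibs) ;=> (0N 1N 1N 2N 3N 5N 8N 13N 21N 34N)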
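
The music-theory function from item 3, sketched in Clojure; the note numbering and semitone arithmetic are my own assumptions. A major scale is a pattern of semitone offsets, so the 1st, 3rd and 5th degrees fall out of a map over the indices you want:

    (def major-scale-steps [0 2 4 5 7 9 11]) ; semitone offsets within a major scale

    (defn major-triad
      "Return the 1st, 3rd and 5th notes of the major scale starting at root,
      where root is a MIDI note number (60 = middle C)."
      [root]
      (map #(+ root (major-scale-steps %)) [0 2 4]))

    (major-triad 60) ;=> (60 64 67), i.e. C, E and G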
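
Item 4 is still a work in progress, but the grid-to-plane mapping I expect to use is the usual one (an assumption on my part, not the finished maths): each row sits half a step across from the one below it, and rows are √3/2 apart, so the triangles come out equilateral.

    (defn tri->xy
      "Map triangular-grid coordinates [row col] onto Cartesian [x y],
      assuming unit spacing between neighbouring points."
      [row col]
      [(+ col (/ row 2.0))
       (* row (/ (Math/sqrt 3) 2))])

    (tri->xy 2 3) ;=> [4.0 1.7320508075688772]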

Three basic working programmes in about a week. They aren’t ‘finished’ but is software ever? They have delivered value via increased Clue.

My First Algorave


On Saturday night I went to ‘Algorave Birmingham’, curated by Antonio Roberts at Vivid Projects. I said I might write ‘a review’ but I’m not going to, because I wouldn’t know how. This is ‘a reaction’ – a digital feedback loop, an emission from the event horizon (I should have worn my ‘Big Bang’ T-shirt – the noughties Brum band, not the nerd show).

My background is information technology. My current work is writing. I use the word ‘work’ in the artistic sense: something I spend my time on but may never get paid for. Themes recur. Are science and art actually different things? Is maths real or a model? Is software any different to magic, existing only outside the physical realm and communicating via intermediary objects?

Q: How much can you strip away from music and have it still exist as an idea: melody, scales, pitch?

I came to Algorave via my functional programming experiments. I’m trying to learn Clojure, a member of the Lisp family of languages but with added time-travel. It messes with whether time is a wave or a set of discrete steps that can be retraced. Not real time, obviously but the model of time our software deals with. Time travel outside of the magical realm would be crazy-talk.

Dance music is often first. Drum machines. I got really frustrated the first time I saw how hard it was to programme beats. Where was the programmatic interface? Sampling, pitch-shifting, the ‘sound’ being manipulated by code. Digits being manipulated by digits, like the higher order functions of functional programming. I wondered a few weeks ago if processors had got fast enough to generate live noises. They have. A Raspberry Pi can run Sonic Pi: http://sonic-pi.net/. From there I discovered that Clojure has its own equivalent, ‘Overtone’ on top of ‘SuperCollider’ (http://sam.aaron.name/), which resonates with my theory of a super-massive idea collider to mash-up memes.
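
In case anyone wants to hear digits manipulating digits, the Overtone hello-world is roughly this (a sketch, assuming Overtone is installed and can boot its own SuperCollider server):

    (ns noise.core
      (:use [overtone.live])) ; boots an embedded SuperCollider server

    ;; An 'instrument': a sine wave shaped by a short percussive envelope
    (definst beep [freq 440]
      (* (env-gen (perc 0.01 0.4) :action FREE)
         (sin-osc freq)))

    (beep 440) ; play an A at 440 Hz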

Algorave Birmingham presented live coders generating sound and visuals. At times I felt that the graphics were pulsing to the beats but I don’t know if that really happened. I saw two pixelated women on the screen typing on ‘real’ laptops and a live drummer on digital drums. Virtuality virtuosos. I had a chat about how to make a hit record and forgot the name of the Kaiser Chiefs but remembered Black Wire who were the first band with a drum machine that I actually liked, because it didn’t sound mechanical, then The Kills who insisted everything was analogue, but now I’m looping.

A: I enjoyed the pulsing white noise. Software can do things that aren’t possible in Reality.

Vacuous Thoughts

A minute ago, I juxtaposed 2 phrases on a Slack chat:

I listened to Rich Hickey’s video on Hammock Driven Development a couple of days ago. It’s about modification of mind mode without resorting to chemicals. There’s a long tradition in hacker-lore that points to Zen and the martial arts too. I find showers, lawn-mowing and writing down what I think I know so far (a variation of the cardboard coder trick) all help. The poets seem to prefer long walks. ‘Empty Mind’. “Nature abhors a vacuum”.

As a result of my subsequent wanderings, I learned two new words, “plenist” and “plenism”: http://englishdictionary.education/en/plenism (the usual suspects)

and I saw the word “idiom”. I’ve heard “idiomatic” a lot recently, in relation to styles associated with programming languages but I wasn’t sure precisely what it meant:

I think the intended meaning is Google’s 2nd choice:

a characteristic mode of expression in music or art.

but the alternative is interesting too:

a group of words established by usage as having a meaning not deducible from those of the individual words (e.g. over the moon, see the light ).

How often does a ceremony get associated with an idea, long after anyone remembers why? I ask this after reading a thought-provoking comparison of the functional and object paradigms that only partly agrees with the ideas I mapped out in stickies on a paper table-cloth yesterday.

I’m “still not working”, as people say. My Dad kept a dictionary beside his chair. I continue his work with ‘tear-off here’ computer science. At least my inherited etymology is idiomatic of the Clojure community.

Asynchronicity Traps

Don’t you love it when lack of planning comes together?

Last Thursday, I learned about JavaScript callbacks, which reminded me of OpenVMS Asynchronous Traps (ASTs). I learned to code in a world of single-threaded processing, so this was an advanced topic, along with my interest in occam and Communicating Sequential Processes. Back then, only real-time coders and those looking to the future cared about parallelism. I remember not really seeing the point of Yourdon ‘state diagrams’. I’d never experienced the complex state network that a GUI with a few option buttons can generate.

Last night I came across debates about the advisability of abandoning JS callbacks for the JavaScript ‘Promise’ construct; “callback hell”, they called it. Promises wrap callbacks up in composable objects. This is another area where the elegant simplicity of functional programming appears to offer hope. Functions are mathematical constructs, so in functional languages perhaps all the possible states that code might enter can be identified.
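
For what it’s worth, the functional answer I keep bumping into is to treat the eventual result as a value rather than a callback. In Clojure that can be as simple as a future or a promise; this is a sketch of my own, not a translation of the JavaScript:

    ;; A future runs its body on another thread; deref blocks until the value exists
    (def answer (future
                  (Thread/sleep 1000) ; stand-in for a slow network call
                  42))

    @answer ;=> 42, after roughly a second

    ;; A promise is an empty slot that some other code fills exactly once
    (def p (promise))
    (future (deliver p :done))
    @p ;=> :done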

Alongside this, I’ve been reading about research into the energy requirements of computation. For a long time, computer scientists thought that every logic operation would have a cost in terms of energy and hence entropy, but that appears not to be true. It is information deletion that costs energy, so immutable data is more energy efficient. I’m only up (down?) to quantum bits, so I’ll have to let you know how the cat gets on another day. I worry when physics starts to look like mystical religions.
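
The result I’m alluding to is Landauer’s principle: it is erasing a bit, not computing with it, that carries a minimum thermodynamic cost of

    E ≥ k_B T ln 2

per bit erased, where k_B is Boltzmann’s constant and T the absolute temperature; logically reversible operations have no such lower bound.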

Odd-numbered Bits

The leap-day launch of the 64-bit Raspberry Pi 3 yesterday set me thinking: if I’d been born 4 years later, my life in bits could have been very different. I might have been in at the start of the PC revolution and progressed through the 8, 16, 32 and 64-bit Intel architectures with everyone else. I’d be one of those people who consider Windows PCs to be Real Computers. I find that concept hard to grasp.

I actually used 24 and 36 bit computers at university and started work on a 32-bit, virtual memory machine. We got free software from other people like us, via the DECUS library.  I’ve never had to worry about allocation of physical memory, like some kind of primitive savage. Since the mid-80s, I’ve been waiting for the world to realise that they took a wrong turn and I finally think it might be happening.

Free software happened. Real Operating Systems became available for ‘desktop computers’ (so we could shove them under the desk and stop worrying about regular access to the reboot button.) We got always-on Internet access and now we’re starting to think about parallel processing and functional programming again, like we were in the 80s. If I had the chance to choose a time to start computing, it would be now, at the age of 6.

We were worrying about nuclear annihilation, over-population and running out of fossil fuels then too. Maybe we’ll remember those soon too, now our houses are full of stuff.

“What did the capitalist dream ever do for you Grandad?”

“It wasted my precious time, Best Beloved.”
(sweet because stolen from that nice Mr. Kipling)

….and maybe Mr. Dylan:

“I ain’t a-saying you treated me unkind
You could have done better but I don’t mind
You just kinda wasted my precious time
But don’t think twice, it’s all right.”