I got into computers in the “Golden Age” of computing, when 64K was trumpeted as an elephantine amount of memory, microprocessors were universally 8-bit, blazing along at anywhere between 1 and 4 MHz, colour was a luxury, and well over 90% of the software available was in the form of games - on cassette tapes, no less!
Most of these machines plugged directly into the TV, turned on with a simple rocker switch, took at most two seconds to boot up, and were then ready to ‘use’. Of course, what most of us did next was to execute whatever key combination was necessary to start loading a game!
For the more curious, programming beckoned. Apart from a few of the more obscure home micros of the day, most systems were sold with a built-in BASIC interpreter, meaning you could be programming as soon as the cursor appeared. For many, the curiosity went no further than the classic two-line ‘program’:
10 PRINT "JAX IS COOL!"
20 GOTO 10
You could even rock up to W.H. Smith or Boots and get it running on the display computers for instant fame and gratification. Yep, Boots (the chemist) used to sell computers and software!
The more adventurous would go on to fully explore the BASIC language, often writing non-trivial programs to scratch a particular itch or just for the challenge. Magazines of the day often included “program listings” for readers to type in. Invariably, a printing issue or shoddy typing resulted in a program that didn’t work, but fixing these was fun too (sometimes). The problem with BASIC, however, was that it was inherently limited in terms of speed and access to the underlying machine. Although there were often other programming languages available for home micros, the only way to really use these machines to their full potential was to program in assembly language…
Nowadays, very few programmers dabble with assembly language. To be fair, as processors and hardware have advanced, they have become many orders of magnitude more complicated to work with, let alone to truly understand, and in truth there isn’t much need to go to such a low level for anything beyond device drivers or extreme optimisation for speed or memory usage. But I believe the processors of the late ’70s and early ’80s were a great platform for learning the low-level fundamentals of how computers actually work. If you were programming a home computer in the early to mid-’80s, you were probably using a MOS Technology 6502 (BBC, Commodore, Apple, Atari), a Zilog Z80 (Spectrum, Amstrad) or a Motorola 6809 (Dragon). All of these microprocessors had relatively small instruction sets and limited registers and addressing modes, meaning there wasn’t a huge amount to learn. Indeed, much of the learning was machine-specific – how to interact with the other chips in the computer (graphics, sound, peripheral access, hardware interrupts and so on) – along with getting comfortable with binary and hexadecimal number systems.
A positive side-effect of learning to program a computer at this low level is that everything at a higher level ultimately rests on the same mechanisms. Even modern, multi-core, super-pipelined processors still execute tiny instructions, interact with buses and peripherals, read and write memory (granted, through any number of caches, but you get the idea) and select what to execute next based on simple yes/no, true/false decisions.
Learning to work with a minimal set of instructions and a finite amount of memory is an invaluable skill. It might not seem that relevant today, given machines with multiple gigabytes of RAM and multi-core processors operating in the gigahertz range, but relying on advances in raw CPU power and ever-cheaper RAM leads to unnecessarily inefficient, bloated software. Being conscious of memory and power consumption still has value, especially with the emergence of the Internet of Things (IoT) and its very low-power, low-resource and (most importantly) low-cost components. While there are languages that let almost anybody enter the fray, a programmer working in assembly language (or plain C) will make better use of the machine’s resources, use less power, and manage their own memory. Oh yeah – with C and assembly language, there is no garbage collector – you control everything!