From the Antikythera Mechanism to Spacewar!: Computer Programming History for Dunces

Without programming, the software and the code, the hardware wouldn’t have meant a thing. From the Greeks to the modern day, we’ve got a lot to be thankful for

James Dargan
Zero Equals False



Stranger Things

When I started my short career as a ten-year-old ‘computer programmer’ back in the mid-1980s, when the vibe was certainly ‘Stranger Things’ in spirit at least, I didn’t know anything about the history that lay behind the scenes, or how the Amstrad CPC 464 I used actually worked. Or how the geniuses who had lived before me had laid the groundwork for a mega technological revolution of bits and bytes. One that is still going on. And one that will, I hope, continue with my current passion:

Quantum computing.

Since the early days, programming and computers have changed beyond their first models, mutated and morphed into so many areas of specialization that being a modern-day programmer takes more skill and education than I could ever muster.

Let me put it out there: I’m no programmer. I don’t know C++ from C#, nor Python from ‘Anaconda’, nor JavaScript from the Java Sea, nor any of the others that exist in the space. I wish I did, but I don’t.


Like I don’t know Russian, either. Yet that doesn’t mean I don’t want to learn it, or that I can’t admire it for what it is and the pleasure it has given me while reading Dostoyevsky and Solzhenitsyn.

But that’s something else, for another day, another publication.

What I do know, though, and this took root in my nerdy passion for history, is how computers started, what code they used in their basic form and, more importantly, how they evolved through the ages.

Togaland

We could start, in essence, with the toga-garbed Greeks and their ancient analogue ‘Antikythera Mechanism’, designed to predict astronomical positions. Yet that would be cheating in some way: the device was a sublime piece of engineering, but programming as we know it was still some two thousand years away.

French Steampunk Hero

So let’s fast forward to France. The year is 1804. Napoleon has been crowned Emperor of the French at Notre-Dame. Meriwether Lewis and William Clark have begun their epic expedition across the American West. The Industrial Revolution is at full pace. A time of great men and greater inventions, none more so than Joseph Marie Jacquard’s programmable loom.


This unsung genius, born in Lyon in 1752, improved on a device at the centre of the age’s industry, an innovation so important that it changed the lives of countless people. For centuries before, weavers had worked looms by hand. Textile production was slow. It was hard, time-consuming work. Jacquard’s loom changed everything.

Jacquard’s invention was based on a chain of punchcards mounted on top of the loom that could program intricate and complex patterns into the fabric and save time in production.

This was the introduction of the punchcard to the world.

And then came the vacuum. A dark space of nothingness, where humanity’s mind was blank to the future of computers.

The Georgian period passed to the Regency era, which handed the baton to the Victorians, where Babbage and Lovelace took the stage with the Analytical Engine.

Change was afoot.

Hollerith Cards

In 1889 Herman Hollerith invented the ‘electric tabulating system’, which used his own version of punchcards, known as Hollerith cards. This ingenious device could read the data punched into them.

Wow.

Not only wow. His company went on to be one of the founding companies that merged to form IBM.

Punchcards ruled until the mid-20th century and the era of Alan Turing, the Polish Cipher Bureau’s codebreakers, IBM and ENIAC.

Alan Turing. Source: Wikipedia

The way they worked was simple. Data was punched onto cards by a programmer, a human, and the cards were then fed into the machine to carry out the required operation.

Colossus, the World War II machine designed to break German encryption, belongs to this era too.

A Bug’s Life

Know where the word ‘bug’ comes from in computer language?

Blame the moth, I say.

The poor thing got stuck in a relay of the Mark II Aiken calculator and was discovered by navy computer programmer Grace Murray Hopper, who took it out and ‘debugged’ the machine.

Oh, and we can’t forget the Atanasoff-Berry Computer, or the ABC, of 1942, the first-ever electronic digital computer. Its main function was to solve systems of linear equations. Unfortunately, the machine wasn’t programmable, but it could boast a separate memory and parallel processing, amongst other things.

The EDSAC followed, the world’s first practical stored-program electronic computer; among its first demonstration programs was one that calculated prime numbers.

A few years later, the first widely used high-level programming language appeared. Invented by the self-professedly lazy John Backus, FORTRAN, as it was called, was unique in its user-friendliness, far removed from what had preceded it.

‘Much of my work has come from being lazy. I didn’t like writing programs, and so, when I was working on the IBM 701 (an early computer), writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs.’

- John Backus

There were others, as well — ALGOL, BASIC.

Cold War or Spacewar!

The 1950s moved into the 1960s. The Cold War was in full swing. People were scared of the Mushroom Nightmare. Humanity thought a nuclear war was imminent. Amid this jingoistic tension, one of the first computer games was born: Spacewar!, in 1962.

Its programmer, Steve Russell, wrote it on a DEC PDP-1 computer. What’s more important, however, is what followed: Russell met one Nolan Bushnell at Stanford University, showed him Spacewar! and a revolution began. Bushnell, an entrepreneurial and technical genius, went on to co-found Atari, as well as designing the first coin-operated arcade video game.

The following decade belonged to hippie Jobs’ Apple and the bespectacled nerd Gates’ Microsoft. To Watergate and the end of the Vietnam War.

Like computer viruses?

Me neither.

The Worm

But there’s one man to blame for that: computer scientist Fred Cohen. While a graduate student at the University of Southern California, he designed a program that could infect a host computer, copy itself, and then spread to other computers via floppy disk.

I bet that hurt!

But exhale a sigh of relief, because Cohen was a caped crusader, a geek with good intentions. His virus was never meant to be harmful. He only wanted to prove he could do it.

And he did.

Brownie points, boy.

Evil Empire

Computer languages of this era included C++, with its object-oriented programming, and Niklaus Wirth’s Pascal from the previous decade, designed to encourage good ‘programming practices using structured programming and data structuring’.
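
For a rough, non-historical flavour of what ‘object-oriented’ actually means, here’s a tiny sketch in modern TypeScript rather than period C++; the Rectangle class is purely illustrative, not anything Stroustrup or Wirth wrote:

```typescript
// Purely illustrative: the data (width, height) and the behaviour
// that uses it (the area() method) are bundled together in one object.
class Rectangle {
  constructor(private width: number, private height: number) {}

  area(): number {
    return this.width * this.height;
  }
}

const r = new Rectangle(3, 4);
console.log(r.area()); // prints 12
```

The point is simply that data and the behaviour that works on that data travel together, which is the habit C++ pushed into the mainstream.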

The ten years of The Breakfast Club and Reagan’s Evil Empire rants brought us to the 1990s and Python. Tim Berners-Lee developed the HTML language, the basis of the modern web. By mid-decade we had Java, JavaScript and Ruby. The first web browsers, too.


There was a cascade of computer programming languages. A shift had occurred. Programmers started thinking about software in a new way. Application development cycles were faster. Code reviews became the norm to reduce or eliminate errors.

Club 2000

We pierced the new millennium just before the Dot Com Bubble burst. Yet that didn’t hinder the developers, those futurists with an eye for what was not yet reality.

The internet gained more users while companies, hungry for sales, adopted the web for their commercial needs. The economics made sense, but only the ballsiest CEOs made a move.

AJAX and Web 2.0, which some say was popularized because of the business potential it promised, made all this all the more possible.

‘Web 2.0 is the network as platform, spanning all connected devices; Web 2.0 applications … [are] delivering software as a continually-updated service that gets better the more people use it, consuming and remixing data from multiple sources, including individual users, while providing their own data and services in a form that allows remixing by others, creating network effects through an ‘architecture of participation,’ and … deliver rich user experiences.’

- Tim O’Reilly, who popularized the term Web 2.0

What followed was the Cloud and the web-based email service Gmail, which re-popularized, quite accidentally, the programming language JavaScript.
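
For a rough sense of what the AJAX idea looks like in code, here’s a minimal browser-style sketch in modern TypeScript; the /api/inbox endpoint is a made-up placeholder, not a real Gmail or Google API:

```typescript
// A minimal sketch of the AJAX idea: fetch fresh data from the server
// in the background, then update the page instead of reloading it.
// The "/api/inbox" endpoint below is a made-up placeholder.
async function refreshInbox(): Promise<void> {
  const response = await fetch("/api/inbox");        // background request
  const messages: string[] = await response.json();  // parse the JSON reply
  // In a real page you would now update the DOM with `messages`
  // rather than forcing the browser to reload everything.
  console.log(`You have ${messages.length} new messages.`);
}

refreshInbox();
```

That background-request-then-update pattern is what made web apps like Gmail feel fast and desktop-like, and it put JavaScript back at the centre of things.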

Next came the smartphone: iOS, Android, Windows Phone.

Go figure.

A Thank-You Note

So, thank you, Greeks, for the start.

Babbage, and your Lovelace.

Jacquard with your loom, in bloom.

And all the rest of you, too — for the world owes you big time!
