
Ada Lovelace: more than the world’s first computer programmer

Though articles such as the one in The Guardian like to remind us all that women originally coded because it was considered a menial task, there are also examples of exceptional advances by women in computer programming.

The aforementioned article touches on some of these briefly, before moving on – intent on missing the point – to a justifiably impassioned stance.

Ada Lovelace, known as the ‘first computer programmer’, once said, “Imagination is the Discovering Faculty, pre-eminently. It is that which penetrates into the unseen worlds around us, the worlds of Science.”

I’m sure many scientists would still agree with that same understanding of imagination today, despite it having been uttered even before Darwin published On the Origin of Species. And as it’s that pioneering attitude which has given humanity many (maybe even all?) of its advances, proponents of it deserve more than a sentence or two.

Ada Lovelace was the daughter of Anne Milbanke and Lord Byron. The pan-amorous, bohemian poet had fled the country in the spring of 1816, before Lovelace was six months old, leaving Ada and her mother to their own devices, along with one other woman, Mary Somerville, who tutored the fledgling intellect.

Somerville’s tuition of Lovelace was mainly based around music, languages and mathematics. Somerville was, upon her death, hailed as “The Queen of Nineteenth-Century Science”, and was the first signatory on John Stuart Mill’s petition to parliament to give women the right to vote. Her connections as a polymath extended well into mathematical academic circles, and she used them to introduce Lovelace to Charles Babbage.

Babbage and Lovelace’s correspondence began in 1835 and continued until her death in 1852. After five years of writing back and forth, Babbage went to Turin in 1840 to give a seminar on his “Analytical Engine”, which would produce mathematical tables automatically and, supposedly, error-free.

Off the back of his visit, an Italian engineer, Luigi Menabrea, wrote up the discussion in French and published it. Lovelace was commissioned to translate the paper into English. This took her a full year… because she didn’t just translate the paper. Unsatisfied that the paper stopped short of its potential, Lovelace supplemented her translation with 65 pages of “footnotes”, two and a half times the length of the original paper.

Essentially what had happened was that Babbage, “the father of the computer”, had actually produced a calculator. No small feat. But it was Lovelace who made the more seminal jump: that if a machine is capable of performing simple calculations, then it must be capable of performing any number of other logical functions.
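To make that jump concrete, here is a toy sketch in modern Python. It is my own illustration, not anything Lovelace or Babbage wrote: once a machine can add, subtract and multiply, the familiar logical operations can be built out of nothing more than arithmetic on 0s and 1s.

```python
# A toy illustration (mine, not Lovelace's): logic built from arithmetic on
# 0s and 1s, the leap from "calculator" to general symbol-manipulating machine.

def logical_not(a: int) -> int:
    return 1 - a            # negation as subtraction

def logical_and(a: int, b: int) -> int:
    return a * b            # conjunction as multiplication

def logical_or(a: int, b: int) -> int:
    return a + b - a * b    # disjunction via inclusion-exclusion

# Any truth table can be composed from these three operations, so a device
# that can do the arithmetic can, in principle, do the logic as well.
if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "AND:", logical_and(a, b),
                  "OR:", logical_or(a, b), "NOT a:", logical_not(a))
```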

The idea of Lovelace being the first programmer is up for discussion. But, to be frank, that’s how science works – each builds on what has come before. Any half-witted pedant can look at a certain invention or theory and say “Ah! But before that you had…”; these arguments are not to be blessed with assumptions of intellectual sincerity.

For me the case is clear: Babbage created a bloody good calculator for the time; this much is apparent and commendable in those terms, but no more. The first “calculator” was designed and constructed in 1623 by Wilhelm Schickard, and refinements and improvements were made from then onwards. But the case is not about what “coding” or “programming” is, nor about who was first to do it.

Lady Lovelace was capable of what Babbage was not: she saw what could be rather than what was. Her mind took the properties of calculators and applied them to a bigger idea.

Her notes documented the potential hardware and software necessary to get a machine to manipulate any data represented by numbers, along with an example of the kind of thing it might be able to do: a step-by-step method for computing Bernoulli numbers. And, as we later found out, her code was flawless. The notes then went on to ponder the idea that we could get said machine to do anything we knew how to order it to perform.
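For the curious, the sketch below is a minimal modern Python rendering of the kind of calculation her final note (Note G) describes, generating Bernoulli numbers from the standard recurrence. It is an illustration under my own conventions, not a transcription of Lovelace’s actual table of operations, and the naming and indexing differ from hers.

```python
from fractions import Fraction
from math import comb


def bernoulli_numbers(n):
    """Return B_0 .. B_n as exact fractions.

    Uses the standard recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1,
    with B_0 = 1 (so B_1 = -1/2). A modern restatement of the kind of
    calculation Note G lays out, not Lovelace's own sequence of operations.
    """
    b = [Fraction(0)] * (n + 1)
    b[0] = Fraction(1)
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * b[k] for k in range(m))
        b[m] = -acc / (m + 1)
    return b


if __name__ == "__main__":
    for i, value in enumerate(bernoulli_numbers(8)):
        print(f"B_{i} = {value}")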

What Lovelace had proposed was the first example of the means by which our modern world now functions. We are still working from her definition of computing; Turing’s dream is still a pipe dream.

Put it this way – when Turing published his famous paper on machines “thinking”, he devoted an entire section to “Lady Lovelace’s Objection”. Lady Lovelace didn’t ever publish any objections to the idea of a computer “thinking”, since it was Turing who came up with that idea a hundred years later.

But she had published the first example of the definition of a computer: “The Analytical Engine has no pretensions to originate anything. It can do whatever we know how to order it to perform” (her italics).

So what Turing referred to as “Lady Lovelace’s Objection” was really a reference to her statement, which had stood as the definition of computing from the Analytical Engine’s inception until then, and which his own idea of machine “thinking” proposed to fundamentally change. He introduced it thus: “Our most detailed information of Babbage’s Analytical Engine comes from a memoir by Lady Lovelace…”

He was referencing her work because it was a crucial component of his own thinking, and of the community’s, on the subject. He gave only a cursory, token reference to Babbage, since it was merely his machine onto which Lovelace projected the vital concept.

In other words, it was a reply to what he could assume Lady Lovelace’s objections might have been, based on her writings. So formidable was her mind that the ideas which sprang from it had to be taken with the utmost sincerity, lest his own work be thought incomplete or facetious.

She was not just the first computer programmer, a profession which women went on to fill because “it was considered repetitive, unglamorous ‘women’s work’”. Even today it can be done at entry level with a bit of training.

Lady Lovelace was part of a far more exalted and noble discipline. She was a scientist, and – having offered the first definition of a programmable machine – the world’s first in the field of computing.

Ben Allen

Ben Allen is a traveller, a millennial and a Brit. He worked in the London startup world for a while but prefers commenting on it to working in it. He has huge faith in the tech industry and enjoys talking and writing about the social issues inherent in its development.
