Heroes of Progress, Pt. 49: Babbage and Lovelace

Blog Post | Computing

Introducing the two 19th-century English mathematicians who pioneered early computing, Charles Babbage and Ada Lovelace.

Today marks the 49th installment in a series of articles by HumanProgress.org titled Heroes of Progress. This bi-weekly column provides a short introduction to heroes who have made an extraordinary contribution to the well-being of humanity. You can find the 48th part of this series here.

This week, our heroes are Charles Babbage and Ada Lovelace—two 19th-century English mathematicians and pioneers of early computing. Babbage is often called “The Father of Computing” for conceiving the first automatic digital computer. Building on Babbage’s work, Lovelace was the first person to recognize that computers could have applications beyond pure calculation. She has been dubbed the “first computer programmer” for creating the first algorithm for Babbage’s machine. Babbage and Lovelace’s work laid the groundwork for modern-day computers. Without their contributions, much of the technology we have today would likely not exist.

Charles Babbage was born on December 26, 1791, in London, England. His father was a successful banker and Babbage grew up in affluence. As a child, Babbage attended several of England’s top private schools, and his father engaged many tutors to assist with his education. As a teenager, Babbage joined the Holmwood Academy in Middlesex, England. The academy’s large library helped Babbage develop a passion for mathematics. In 1810, Babbage began studying mathematics at the University of Cambridge.

Before arriving at Cambridge, Babbage had already learned much of contemporary mathematics and was disappointed by the level of mathematics being taught at the university. In 1812, Babbage and several friends created the “Analytical Society,” which aimed to introduce to England the new mathematical developments then occurring elsewhere in Europe.

Babbage’s reputation as a mathematical genius quickly developed. By 1815, he had left Cambridge and begun lecturing on astronomy at the Royal Institution. The following year, he was elected as a Fellow of the Royal Society. Despite several successful lectures at the Royal Institution, Babbage struggled to find a full-time position at a university. Throughout early adulthood, therefore, he had to rely on financial support from his father. In 1820, Babbage was instrumental in creating the Royal Astronomical Society, which aimed to reduce astronomical calculations into a more standard form.

In the early 19th century, mathematical tables (lists of numbers showing the results of calculations) were central to engineering, astronomy, navigation, and science. However, at the time, all the calculations in the mathematical tables were done by humans, and mistakes were commonplace. Given this problem, Babbage wondered whether he could create a machine to mechanize the calculation process.

In 1822, in a paper to the Royal Astronomical Society, Babbage outlined his idea for creating a machine that could automatically calculate the values needed in astronomical and mathematical tables. The following year, Babbage was successful in obtaining a government grant to build a machine that would be able to automatically calculate a series of values of up to twenty decimal places, dubbed the “Difference Engine.”
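The Difference Engine’s core trick can be sketched in a few lines of modern code. A polynomial’s nth-order differences are constant, so once the initial value and its differences are known, every further table entry follows by repeated addition alone—no multiplication required. The snippet below is an illustrative modern re-expression of that method (the polynomial x² + x + 41, which Babbage is reported to have used to demonstrate his prototype, is chosen here as an example):

```python
def tabulate(initial_diffs, n):
    """Generate n table values using only repeated addition --
    the method of finite differences that the Difference Engine mechanized."""
    diffs = list(initial_diffs)  # [f(0), 1st difference, 2nd difference, ...]
    values = []
    for _ in range(n):
        values.append(diffs[0])
        # add each higher-order difference into the order below it
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# Tabulating p(x) = x^2 + x + 41 needs only its value at 0 (41),
# its first difference (2), and its constant second difference (2):
print(tabulate([41, 2, 2], 5))  # [41, 43, 47, 53, 61]
```

Because each new row is produced purely by additions, the whole process could be carried out by meshing gear wheels, which is exactly what made mechanization feasible.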

In 1828, Babbage became the Lucasian Professor of Mathematics at Cambridge University. He was largely inattentive to his teaching responsibilities and spent most of his time writing papers and working on the Difference Engine. In 1832, Babbage and his engineer Joseph Clement produced a small working model of the Difference Engine. The following year, plans to build a larger, full-scale engine were scrapped when Babbage began to turn his attention to another project.

In the mid-1830s, Babbage started to develop plans for what he called the “Analytical Engine,” which would become the forerunner of the modern digital computer. Whereas the Difference Engine was designed for mechanized arithmetic (essentially an early calculator capable only of addition), the Analytical Engine would be able to perform any arithmetical operation by taking instructions from punched cards—stiff pieces of paper that encode data through the presence or absence of holes in predefined positions. The punched cards would deliver instructions to the mechanical calculator as well as store the results of its calculations.

Like modern computers, the Analytical Engine’s design separated program memory from data memory. It featured a control unit capable of conditional jumps, separate input and output units, and instruction-based operation. Babbage initially envisioned the Analytical Engine as having applications relating only to pure calculation. That soon changed thanks to the work of Ada Lovelace.
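The combination of features just described—a program held separately from a data store, executed instruction by instruction with conditional jumps—is what makes a machine programmable rather than a fixed calculator. A toy modern analogy (not Babbage’s actual notation; the instruction names and example program here are invented for illustration) makes the idea concrete:

```python
def run(program, store):
    """Execute a card-deck-like list of instructions against a separate data store."""
    pc = 0  # program counter: which "card" is being read
    while pc < len(program):
        op, *args = program[pc]
        if op == "add":          # store[dst] = store[a] + store[b]
            a, b, dst = args
            store[dst] = store[a] + store[b]
        elif op == "sub":        # store[dst] = store[a] - store[b]
            a, b, dst = args
            store[dst] = store[a] - store[b]
        elif op == "jnz":        # conditional jump: branch if store[reg] != 0
            reg, target = args
            if store[reg] != 0:
                pc = target
                continue
        pc += 1
    return store

# Multiply 6 x 7 by repeated addition, with a conditional jump forming the loop:
program = [
    ("add", "acc", "x", "acc"),  # acc += x
    ("sub", "y", "one", "y"),    # y -= 1
    ("jnz", "y", 0),             # repeat while y != 0
]
store = run(program, {"x": 6, "y": 7, "one": 1, "acc": 0})
print(store["acc"])  # 42
```

The conditional jump is the crucial ingredient: without it, a machine can only follow a fixed sequence of operations; with it, loops and decisions—and hence general programs—become possible.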

Augusta Ada King, Countess of Lovelace (née Byron) was born on December 10, 1815, in London, England. Lovelace was the only legitimate child of the poet and Member of the House of Lords, Lord Byron, and the mathematician Anne Isabella Byron. However, just a month after her birth, Byron separated from Lovelace’s mother and left England. Eight years later, he died from disease while fighting on the Greek side in the Greek War of Independence.

Throughout Ada’s early life, her mother raised her on a strict regimen of science, logic, and mathematics. Although frequently ill and, aged fourteen, bedridden for nearly a year, Lovelace was fascinated by machines. As a child, she would often design fanciful boats and flying machines.

As a teenager, Lovelace honed her mathematical skills and quickly became acquainted with many of the top intellectuals of the day. In 1833, Lovelace’s tutor, Mary Somerville, introduced her to Charles Babbage. The pair quickly became friends. Lovelace was fascinated with Babbage’s plans for the Analytical Engine, and Babbage was so impressed with Lovelace’s mathematical ability that he once described her as “The Enchantress of Numbers.”

In 1840, Babbage visited the University of Turin to give a seminar on his Analytical Engine. Luigi Menabrea, an Italian engineer and future Prime Minister of Italy, attended Babbage’s seminar and published an account of it in French. In 1842, Lovelace spent nine months translating Menabrea’s article into English. She added her own detailed notes, which ended up being three times longer than the original article.

Published in 1843, Lovelace’s notes described the differences between the Analytical Engine and previous calculating machines—chiefly the Engine’s ability to be programmed to solve any mathematical problem. Lovelace’s notes also included a new algorithm for calculating a sequence of Bernoulli numbers (a sequence of rational numbers that appear frequently in number theory). Because Lovelace’s algorithm was the first created specifically for use on a computer, she became the world’s first computer programmer.
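Lovelace’s Note G laid out the Bernoulli computation as a step-by-step sequence of operations on stored variables. The sketch below is not her original program but a compact modern equivalent, computing the same numbers via the standard recurrence (using today’s convention, under which B₁ = −1/2):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """First n+1 Bernoulli numbers via the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0  for m >= 1."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # solve the recurrence for B_m given B_0 .. B_{m-1}
        B[m] = -Fraction(1, m + 1) * sum(comb(m + 1, k) * B[k] for k in range(m))
    return B

B = bernoulli(8)
print(B[2], B[4], B[6], B[8])  # 1/6 -1/30 1/42 -1/30
```

Exact rational arithmetic (`Fraction`) matters here: Bernoulli numbers are ratios of integers, and floating-point rounding would corrupt later terms of the sequence.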

Whereas Babbage designed the Analytical Engine for purely mathematical purposes, Lovelace was the first person to see potential uses for computers that went far beyond number-crunching. Lovelace realized that the numbers within the computer could represent other entities, such as letters or musical notes. Consequently, she prefigured many of the concepts associated with modern computers, including software and subroutines.

The Analytical Engine was never built in Babbage’s or Lovelace’s lifetimes. However, that failure was due to funding problems and personality clashes between Babbage and potential donors, rather than to any design flaws.

Throughout the remainder of his life, Babbage dabbled in many different fields. Several times, he attempted to become a Member of Parliament. He wrote several books, including one on political economy that explored the commercial advantages of the division of labor. He was fundamental in establishing the English postal system. He also invented an early type of speedometer and the locomotive cowcatcher (i.e., the metal frame that attached to the front of trains to clear the track of obstacles). On October 18, 1871, Babbage died at his home in London. He was 79 years old.

After her work translating the account of Babbage’s lecture, Lovelace began working on several different projects, including an attempt to create a mathematical model of how the brain gives rise to thoughts and nerves to feelings—although she never achieved that objective. On November 27, 1852, Lovelace died from uterine cancer. She was just 36 years old.

During his lifetime, Babbage declined both a knighthood and a baronetcy. In 1824, he received the Gold Medal from the Royal Astronomical Society “for his invention of an engine for calculating mathematical and astronomical tables.” Since their deaths, many buildings, schools, university departments and awards have been named in Babbage’s and Lovelace’s honor.

Thanks to the work of Babbage and Lovelace, the field of computation was changed forever. Without Babbage’s work, the world’s first automatic digital computer wouldn’t have been conceived when it was. Likewise, many of the main elements that modern computers use today would likely not have been developed until much later. Without Lovelace, it may have taken humanity much longer to realize that computers could be used for more than just mathematical calculations. Together, Babbage and Lovelace laid the groundwork for modern-day computing, which is used by billions of people across the world and underpins much of our progress today. For these reasons, Charles Babbage and Ada Lovelace are our 49th Heroes of Progress.

The Guardian | Computing

Microsoft’s New Chip Could Bring Quantum Computing Within Years

“Quantum computers could be built within years rather than decades, according to Microsoft, which has unveiled a breakthrough that it said could pave the way for faster development.

The tech firm has developed a chip which, it says, echoes the invention of the semiconductors that made today’s smartphones, computers and electronics possible by miniaturisation and increased processing power.

The chip is powered by the world’s first topoconductor, which can create a new state of matter that is not a solid, liquid, or gas – making it possible to design quantum systems that fit in a single chip smaller than the palm of a hand, and to create more reliable hardware, a peer-reviewed paper published in Nature reports.”

From The Guardian.

Live Science | Computing

Supercomputer Runs Largest Simulation of the Universe Ever

“The potential for our understanding of the universe has taken a giant leap forward after Frontier, a supercomputer based in the Oak Ridge National Laboratory (ORNL), created a simulation of the universe at a scale never before achieved.

Frontier used a software platform called the Hardware/Hybrid Accelerated Cosmology Code (HACC) as part of ExaSky, a project that formed part of the U.S. Department of Energy’s (DOE) $1.8 billion Exascale Computing Project — the largest software R&D initiative backed by the DOE.

Under ExaSky, scientific applications were required to run up to 50 times faster than previous benchmarks, but Frontier and HACC quickly raced ahead of expectations—running almost 300 times faster than similar simulations on Frontier’s predecessor, the Titan supercomputer. The DOE/HACC team had spent seven years since the first simulation enhancing the capabilities on exascale supercomputers like Frontier.

This allowed for hydrodynamic cosmology simulations, a far more computationally intensive computer model that incorporates principles like the expansion of the universe and the influence of dark matter. Previous models only incorporated measures of gravity, gas or plasma.”

From Live Science.

Live Science | Computing

Hybrid Quantum Supercomputer Goes Online in Japan

“Engineers in Japan have switched on the world’s first hybrid quantum supercomputer.

The 20-qubit quantum computer, called Reimei, has been integrated into Fugaku — the world’s sixth-fastest supercomputer. The hybrid platform will work to tackle calculations that can take classical supercomputers much longer to process.

The machine, which is housed at the Riken scientific institute in Saitama, near Tokyo, will be used primarily for physics and chemistry research, representatives from Quantinuum, the makers of Reimei, and Riken said in a joint statement.

Quantum computers could one day overtake classical computers, with the potential to complete calculations in minutes or seconds that would otherwise take today’s most powerful machines millions of years. However, until quantum computers are large and reliable enough, scientists say that integrating their capabilities into supercomputers can be a stopgap.”

From Live Science.

Wall Street Journal | Computing

New “Automated Reasoning” to Reduce AI’s Hallucinations

“Amazon is using math to help solve one of artificial intelligence’s most intractable problems: its tendency to make up answers, and to repeat them back to us with confidence.

The issue, known as hallucinations, has been a problem for users since AI chatbots hit the mainstream over two years ago. They’ve caused people and businesses to hesitate before trusting AI chatbots with important questions. And they occur with any AI model—from those developed by OpenAI and Meta Platforms to those from the Chinese firm DeepSeek.

Now, Amazon.com’s cloud-computing unit is looking to ‘automated reasoning’ to provide hard, mathematical proof that AI models’ hallucinations can be stopped, at least in certain areas. By doing so, Amazon Web Services could unlock millions of dollars worth of AI deals with businesses, some analysts say.

Simply put, automated reasoning aims to use mathematical proof to assure that a system will or will not behave a certain way. It’s somewhat similar to the idea that AI models can ‘reason’ through problems, but in this case, it’s used to check that the models themselves are providing accurate answers.”

From Wall Street Journal.