Heroes of Progress, Pt. 49: Babbage and Lovelace

Blog Post | Computing


Introducing the two 19th-century English mathematicians who pioneered early computing, Charles Babbage and Ada Lovelace.

Today marks the 49th installment in a series of articles by HumanProgress.org titled Heroes of Progress. This bi-weekly column provides a short introduction to heroes who have made an extraordinary contribution to the well-being of humanity. You can find the 48th part of this series here.

This week, our heroes are Charles Babbage and Ada Lovelace, two 19th-century English mathematicians and pioneers of early computing. Babbage is often called "The Father of Computing" for conceiving the first automatic digital computer. Building on Babbage's work, Lovelace was the first person to recognize that computers could have applications beyond pure calculation. She has been dubbed the "first computer programmer" for creating the first algorithm for Babbage's machine. Babbage and Lovelace's work laid the groundwork for modern-day computers. Without their contributions, much of the technology we have today would likely not exist.

Charles Babbage was born on December 26, 1791, in London, England. His father was a successful banker, and Babbage grew up in affluence. As a child, Babbage attended several of England's top private schools, and his father ensured that he had many tutors to assist with his education. As a teenager, Babbage joined the Holmwood Academy in Middlesex, England, whose large library helped him develop a passion for mathematics. In 1810, Babbage began studying mathematics at the University of Cambridge.

Before arriving at Cambridge, Babbage had already learned much of contemporary mathematics and was disappointed by the level of mathematics being taught at the university. In 1812, Babbage and several friends created the “Analytical Society,” which aimed to introduce new developments in mathematics that were occurring elsewhere in Europe to England.

Babbage’s reputation as a mathematical genius quickly developed. By 1815, he had left Cambridge and begun lecturing on astronomy at the Royal Institution. The following year, he was elected as a Fellow of the Royal Society. Despite several successful lectures at the Royal Institution, Babbage struggled to find a full-time position at a university. Throughout early adulthood, therefore, he had to rely on financial support from his father. In 1820, Babbage was instrumental in creating the Royal Astronomical Society, which aimed to reduce astronomical calculations into a more standard form.

In the early 19th century, mathematical tables (lists of numbers showing the results of calculations) were central to engineering, astronomy, navigation, and science. However, at the time, all the calculations in these tables were done by humans, and mistakes were commonplace. Given this problem, Babbage wondered whether he could create a machine to mechanize the calculation process.

In 1822, in a paper to the Royal Astronomical Society, Babbage outlined his idea for a machine that could automatically calculate the values needed in astronomical and mathematical tables. The following year, Babbage obtained a government grant to build a machine, dubbed the "Difference Engine," that would automatically calculate series of values to up to twenty decimal places.

In 1828, Babbage became the Lucasian Professor of Mathematics at Cambridge University. He was largely inattentive to his teaching responsibilities and spent most of his time writing papers and working on the Difference Engine. In 1832, Babbage and his engineer Joseph Clement produced a small working model of the Difference Engine. The following year, plans to build a larger, full-scale engine were scrapped when Babbage turned his attention to another project.

In the mid-1830s, Babbage started to develop plans for what he called the "Analytical Engine," which would become the forerunner of the modern digital computer. Whereas the Difference Engine was designed for mechanized arithmetic (essentially an early calculator capable only of addition), the Analytical Engine would be able to perform any arithmetical operation by reading instructions from punched cards: stiff pieces of paper that store data through the presence or absence of holes in predefined positions. The punched cards could both deliver instructions to the mechanical calculator and store the results of the computer's calculations.

Like modern computers, the Analytical Engine's design separated data memory from program memory. It had a control unit capable of conditional jumps, separate input and output units, and instruction-based operation. Babbage initially envisioned the Analytical Engine as having applications relating only to pure calculation. That soon changed thanks to the work of Ada Lovelace.

Augusta Ada King, Countess of Lovelace (née Byron), was born on December 10, 1815, in London, England. She was the only legitimate child of the poet and Member of the House of Lords Lord Byron and the mathematician Anne Isabella Byron. Just a month after her birth, however, Byron separated from Lovelace's mother and left England. Eight years later, he died of disease while fighting on the Greek side in the Greek War of Independence.

Throughout her early life, Lovelace’s mother raised Ada on a strict regimen of science, logic, and mathematics. Although frequently ill and, aged fourteen, bedridden for nearly a year, Lovelace was fascinated by machines. As a child, she would often design fanciful boats and flying machines.

As a teenager, Lovelace honed her mathematical skills and quickly became acquainted with many of the top intellectuals of the day. In 1833, Lovelace's tutor, Mary Somerville, introduced her to Charles Babbage. The pair quickly became friends. Lovelace was fascinated by Babbage's plans for the Analytical Engine, and Babbage was so impressed with her mathematical ability that he once described her as "The Enchantress of Numbers."

In 1840, Babbage visited the University of Turin to give a seminar on his Analytical Engine. Luigi Menabrea, an Italian engineer and future Prime Minister of Italy, attended Babbage's seminar and wrote up an account of it in French. In 1842, Lovelace spent nine months translating Menabrea's article into English. She added her own detailed notes, which ended up being three times longer than the original article.

Published in 1843, Lovelace's notes described the differences between the Analytical Engine and previous calculating machines, chiefly the former's ability to be programmed to solve any mathematical problem. The notes also included a new algorithm for calculating a sequence of Bernoulli numbers (a sequence of rational numbers common in number theory). Because Lovelace's algorithm was the first created specifically for execution on a computer, she became the world's first computer programmer.
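Lovelace's published notes targeted Babbage's engine rather than any modern language, but the recurrence her algorithm exploited is easy to state today. The sketch below is a modern Python illustration of that recurrence (not her original program), computing Bernoulli numbers as exact fractions:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return Bernoulli numbers B_0..B_n via the classic recurrence:
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        # Solve the recurrence for B_m using the previously computed terms.
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-acc / (m + 1))
    return B
```

With this convention, `bernoulli(4)` yields B_0 = 1, B_1 = -1/2, B_2 = 1/6, B_3 = 0, and B_4 = -1/30.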

Whereas Babbage designed the Analytical Engine for purely mathematical purposes, Lovelace was the first person to see potential uses of computers that went far beyond number-crunching. She realized that the numbers within the computer could represent other entities, such as letters or musical notes. Consequently, she prefigured many of the concepts associated with modern computers, including software and subroutines.
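Encodings such as ASCII and MIDI arrived more than a century later, but they realize exactly the idea Lovelace described. A minimal Python sketch of both (the note-to-number table follows the common MIDI convention of middle C = 60):

```python
# Lovelace's insight in modern form: the same integers can encode
# text (character code points) or music (MIDI note numbers).
letters = [ord(c) for c in "Ada"]      # code points: A=65, d=100, a=97

midi = {"C4": 60, "E4": 64, "G4": 67}  # MIDI convention: middle C (C4) = 60
chord = [midi[n] for n in ["C4", "E4", "G4"]]  # a C-major triad as numbers
```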

The Analytical Engine was never actually built in Babbage's or Lovelace's lifetime. However, that failure was due to funding problems and personality clashes between Babbage and potential donors, not to any design flaw.

Throughout the remainder of his life, Babbage dabbled in many different fields. Several times, he attempted to become a Member of Parliament. He wrote several books, including one on political economy that explored the commercial advantages of the division of labor. He was fundamental in establishing the English postal system. He also invented an early type of speedometer and the locomotive cowcatcher (i.e., the metal frame that attached to the front of trains to clear the track of obstacles). On October 18, 1871, Babbage died at his home in London. He was 79 years old.

After her work translating Menabrea's article, Lovelace took up several different projects, including an attempt to create a mathematical model of how the brain and nerves give rise to thoughts, although she never achieved that objective. On November 27, 1852, Lovelace died of uterine cancer. She was just 36 years old.

During his lifetime, Babbage declined both a knighthood and a baronetcy. In 1824, he received the Gold Medal from the Royal Astronomical Society “for his invention of an engine for calculating mathematical and astronomical tables.” Since their deaths, many buildings, schools, university departments and awards have been named in Babbage’s and Lovelace’s honor.

Thanks to the work of Babbage and Lovelace, the field of computation was changed forever. Without Babbage’s work, the world’s first automatic digital computer wouldn’t have been conceived when it was. Likewise, many of the main elements that modern computers use today would likely have not been developed until much later. Without Lovelace, it may have taken humanity much longer to realize that computers could be used for more than just mathematical calculations. Together, Babbage and Lovelace laid the groundwork for modern-day computing, which is used by billions of people across the world and underpins much of our progress today. For these reasons, Charles Babbage and Ada Lovelace are our 49th Heroes of Progress.

Axios | Science & Technology

OpenAI Releases “Strawberry” Model with Better Reasoning

“OpenAI on Thursday released a new model, previously code named Strawberry, which is capable of evaluating its steps before proceeding.

Why it matters: In addition to being better at complex math, science and coding questions, OpenAI says this approach is more explainable and adheres more closely to intended safety guardrails.”

From Axios.

MIT Technology Review | Computing

Google Says It’s Made a Quantum Computing Breakthrough

“One major challenge has been that quantum computers can store or manipulate information incorrectly, preventing them from executing algorithms that are long enough to be useful. The new research from Google Quantum AI and its academic collaborators demonstrates that they can actually add components to reduce these errors. Previously, because of limitations in engineering, adding more components to the quantum computer tended to introduce more errors.”

From MIT Technology Review.

Blog Post | Science & Technology

Digital Representation Drives Progress

This technological revolution is pushing humanity forward on countless interconnected fronts.

Summary: Digital representation has revolutionized the economy and daily life by converting analog information into digital formats, enabling vast efficiencies and new business models. This has made products like music and services more accessible and has advanced production and waste management in contexts such as agriculture and recycling. As digital representation evolves, it plays a crucial role in addressing global challenges, from poverty in developing countries to environmental change.


Back in the "good old days," if someone wanted to listen to Michael Jackson's Thriller, they had to visit their local music store and see if it had the album on cassette. Alternatively, they could pick up a vinyl record to play on their record player. Nowadays, cassettes and vinyl records have become relics of the past thanks to one gigantic innovation: digital representation.

Digital representation starts with digitization, the process of converting analog information into digital signals. While analog signals are continuous waves, digital signals are discrete units of information consisting of binary values, 0s and 1s. Because computers operate on binary logic, the conversion from analog to binary enables computers and mobile devices to process, store, and transfer data digitally.
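As an illustrative sketch of what this conversion involves (the sample values and the 8-bit width here are invented for the example), digitization amounts to sampling the continuous wave and quantizing each sample into a fixed-width binary word:

```python
def quantize(sample, bits=8):
    """Map an analog sample in [-1.0, 1.0] to an unsigned integer
    of the given bit width, rendered as a binary string."""
    levels = 2 ** bits
    # Scale [-1, 1] onto [0, levels - 1] and round to the nearest level.
    code = round((sample + 1) / 2 * (levels - 1))
    code = max(0, min(levels - 1, code))
    return format(code, f"0{bits}b")

# A few invented samples from a continuous waveform:
waveform = [0.0, 0.5, -1.0, 1.0]
digital = [quantize(s) for s in waveform]
```

Real formats apply the same idea at scale; CD audio, for instance, uses 16-bit samples taken 44,100 times per second.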

Digitization is just one component of digital representation, the broader process of converting real-world objects, processes, and information into a digital format. Examples include medical images, cryptocurrency, and YouTube videos. Businesses have moved toward digital representation applications such as Excel for data visualizations, Microsoft Azure for cloud storage, and SAP for automating business procedures.

One reason for the extensive use of digital representation is that businesses built on it face near-zero marginal costs for each new unit produced or user added. Spotify and Amazon are prime examples. Spotify's streaming service uses a "freemium" model, offering a limited free service to casual listeners and a premium option for music connoisseurs. This model has expanded Spotify's user base, allowing it to spread its fixed costs across 615 million users and minimizing the cost of adding more. Spotify also maintains a data center containing millions of digitized songs, all of which retain their audio quality indefinitely and are distributed at no additional cost. Amazon, for its part, has its Prime subscription. Unlike a traditional business model, with fixed costs at each layered step, Amazon runs its distribution digitally, enabling the "massive bundling" strategy of Prime, which combines movie streaming, music streaming, groceries, and more. Both corporations have revolutionized entertainment and shopping for millions of people worldwide.

Besides innovative business models, digital representation's greatest benefit has come in advancing the efficiency of production. One example is the spread of digital mobile phones, which has helped the populations of developing countries overcome infrastructure bottlenecks, most prominently in Sub-Saharan Africa. According to a 2015 Pew Research Center study, 97 percent of those surveyed in countries such as Senegal, Kenya, and Nigeria did not have a working landline telephone, a consequence of failed landline development across the continent. This lack of service has been alleviated by the massive increase in mobile phone usage: in 2022, the number of mobile cellular subscriptions per 100 people in Sub-Saharan Africa was nearly 52 times what it was in 2000, growing from less than 2 to around 89.

Mobile cellular subscriptions, per 100 people

Not only has mobile phone usage skyrocketed, but the mobile money movement has also taken over Africa. Fintech, which comprises the software, mobile applications, and other technologies that automate and digitalize financing, facilitated this process. Fintech arrived in Africa in 2008 with M-Pesa in Kenya, which enables users to withdraw, transfer, and deposit funds through a mobile device. A 2016 study found that, since its inception, M-Pesa had helped increase the daily per capita consumption of 194,000 households, and that number has grown since. With M-Pesa's rapid success, the number of fintechs has increased across Africa; examples include Fawry in Egypt, Yoco in South Africa, and Interswitch in Nigeria. Mobile money helps millions of unbanked people in Africa gain access to financial services, including the small business owners who make up 90 percent of businesses on the continent. Gaining access to tools such as M-Pesa and other mobile money services has expanded the possibilities for Africans to contribute to the continent's growth.

Digital representation has also generated greater efficiencies in production. In agriculture, for example, advances in farming tools combined with the adoption of digital technologies have caused production to skyrocket. According to US Department of Agriculture data, between 1996 and 2017 the average US corn yield increased 42 percent, from 130 to 185 bushels an acre. It is not just corn, either: apples, wheat, and other crops have also seen significant growth in production worldwide since the mid-20th century. The increased use of technologies such as yield monitors and digital mapping has helped farmers manage their crop yields more precisely by highlighting which areas are prone to soil erosion and pests, promoting uniform crop growth, and reducing the risk of lost crop production.

With increased production comes a significant amount of waste. In response, digital representation has revolutionized recycling and waste management, a development especially prominent in Europe. Some of the technologies involved improve logistics, such as routing systems and centralized Enterprise Resource Planning databases that manage and track waste data. Others help sort waste for recycling, including robotic sorters and artificial intelligence image processing that recognize and pick out different types of waste. This digital revolution has led to improved waste management and an increase in recycling across Europe.

Beyond shaping businesses, digital representation creates a technological revolution that contributes to alleviating poverty, reducing waste, and improving lives across the globe. As humanity progresses, the power of digital representation will only expand as people continue to find ways to achieve what was not possible before.

Wall Street Journal | Science & Technology

Novel Ideas to Cool Data Centers: Liquid in Pipes or a Dunking Bath

“One of the latest innovations at artificial-intelligence chip maker Nvidia has nothing to do with bits and bytes. It involves liquid.

Nvidia’s coming GB200 server racks, which contain its next-generation Blackwell chips, will mainly be cooled with liquid circulated in tubes snaking through the hardware rather than by air. An Nvidia spokesman said the company was also working with suppliers on additional cooling technologies, including dunking entire drawer-sized computers in a nonconductive liquid that absorbs and dissipates heat.

Cooling is suddenly a hot business as engineers try to tame one of the world’s biggest electricity hogs. Global data centers—the big computer farms that handle AI calculations—are expected to gobble up 8% of total U.S. power demand by 2030, compared with about 3% currently, according to Goldman Sachs research.”

From Wall Street Journal.