
Blog Post | Pregnancy & Birth

Heroes of Progress, Pt. 22: Virginia Apgar

Introducing the anesthesiologist who created the "Apgar Score," a test that has saved millions of babies' lives.

Today marks the 22nd installment in a series of articles by HumanProgress.org titled Heroes of Progress. This bi-weekly column provides a short introduction to heroes who have made an extraordinary contribution to the well-being of humanity. You can find the 21st part of this series here.

This week, our Hero of Progress is Virginia Apgar, an American anesthesiologist and medical researcher who created a test that is used to quickly assess the health of newborn babies and to determine whether they need immediate neonatal care. The test, named the ‘Apgar Score,’ remains standard practice across the world and is credited with saving the lives of millions of babies since its introduction in 1952.

Virginia Apgar was born in Westfield, New Jersey on June 7, 1909. She had two older brothers, one of whom died at a young age from tuberculosis, while the other lived with a chronic illness. Inspired by her brothers’ medical problems, Apgar opted for a career in medicine. In 1929, she earned a degree in zoology with minors in physiology and chemistry from Mount Holyoke College, and in the same year, she began her medical training at Columbia University’s College of Physicians and Surgeons (P&S).

Apgar obtained her MD in 1933 and began a two-year surgical internship at P&S’ Presbyterian Hospital. In spite of her good performance, P&S’ chairman, who was worried about the economic prospects of new women surgeons during the Great Depression, advised Apgar to pursue a career in anesthesiology – a new field of study that was beginning to take shape as a medical, rather than a strictly nursing, specialty.

Apgar accepted the advice, and after her internship ended in 1936, she began a year-long anesthetist training course at the Presbyterian Hospital. After completing the course, Apgar performed residencies in anesthesiology at the University of Wisconsin and at Bellevue Hospital in Manhattan. In 1938, she returned to the Presbyterian Hospital and became the director of its newly established division of anesthesia. Apgar was the first woman to hold the position of director at the Presbyterian Hospital.

In 1949, Apgar also became the first woman to hold a full professorship at P&S. The professorship in anesthesiology freed Apgar from many of her administrative duties, enabling her to devote more of her time to research.

Apgar noticed that infant mortality (i.e., deaths of babies under the age of one) in the United States declined rapidly between the 1930s and the 1950s. However, the death rate for babies in the first 24 hours after birth stayed the same. Perplexed by this discrepancy, Apgar began documenting the differences between healthy newborns and newborns requiring medical attention.

In 1952, Apgar created a test called the “Apgar score” that medical professionals could use to assess the health of newborn infants. The scoring system gives each newborn a score of 0, 1, or 2 in each of five categories: activity (muscle tone), pulse, grimace (reflex irritability), appearance (skin color), and respiration. Zero denotes the worst possible condition and two denotes the ideal condition. To make the assessment easy to remember, the first letters of the five categories spell “APGAR.”

The test is usually performed on newborn babies one minute and again five minutes after birth. A cumulative score of 3 or below is typically categorized as critically low and a cause for immediate medical action. Apgar’s test soon became common practice across the world, and it remains a standard procedure for assessing the health of newborn babies today.
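To make the arithmetic concrete, below is a minimal sketch in Python (purely illustrative; the function name and interface are hypothetical, not drawn from any clinical software) showing how the five category scores described above combine into a single total and how the critically low threshold of 3 would be flagged.

```python
# Illustrative sketch of the Apgar-score arithmetic, assuming the five
# categories and the threshold described above; not clinical software.

def apgar_score(activity, pulse, grimace, appearance, respiration):
    """Sum the five category scores; each must be 0, 1, or 2."""
    scores = [activity, pulse, grimace, appearance, respiration]
    if any(s not in (0, 1, 2) for s in scores):
        raise ValueError("each category score must be 0, 1, or 2")
    return sum(scores)

# Example: a hypothetical newborn scored one minute after birth.
total = apgar_score(activity=2, pulse=2, grimace=1, appearance=1, respiration=2)
print(total)                                          # 8 out of a possible 10
print("critically low" if total <= 3 else "not critically low")
```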

In 1959, Apgar graduated with a Master of Public Health degree from Johns Hopkins University and began working for the March of Dimes Foundation – an American nonprofit organization that works to improve the health of mothers and babies – directing its research program with a focus on the treatment and prevention of birth defects.

While working at the March of Dimes, Apgar also became an outspoken advocate for universal vaccination to prevent the mother-to-child transmission of rubella. Later in life, she became a lecturer and then a clinical professor of pediatrics at Cornell University. She died on August 7, 1974.

Throughout her career, Apgar received numerous honorary doctorates and was awarded the Distinguished Service Award from the American Society of Anesthesiologists (1966) and the Woman of the Year in Science by Ladies Home Journal (1973). In 1995, she was inducted into the U.S. National Women’s Hall of Fame.

The use of the Apgar Score is credited with lowering the infant mortality rate by considerably increasing the likelihood of babies’ survival in the first 24 hours after birth. The invention and use of Apgar’s test has saved millions of lives and continues to save thousands more every day. For that reason, Virginia Apgar is our 22nd Hero of Progress.

Blog Post | Health & Medical Care

Halloween: More Walking Dead, Fewer Dead Walkers

Today’s trick-or-treaters have far less to fear than past generations.

Summary: Halloween is a celebration of death and fear, but it also reveals how much safer and healthier life has become. This article shows how child mortality, especially from pedestrian accidents, has declined dramatically in recent decades. It also explores how other causes of death, such as disease and violence, have become less common thanks to human progress.


This Halloween, you might see your neighbors’ front yards decorated with faux tombstones and witness several children dressed as ghosts, skeletons, zombies, or other symbols of death. Thankfully, today’s trick-or-treaters can almost all expect to remain among the living until old age. But back when the holiday tradition of children going door-to-door in spooky costumes originated, death was often close at hand, and the young were particularly at risk.

Halloween’s origins are closely linked to concerns about death. The holiday arose out of All Souls’ Day, a Christian commemoration for the deceased falling on November 2 that is also simply called the Day of the Dead. In the Middle Ages, this observance was often fused with another church feast called All Saints’ Day or All Hallows’ Day on November 1. On the night before, called All Hallows’ Eve (now shortened to Halloween), children and the poor in parts of medieval Britain would visit their wealthier neighbors and receive “soul cakes,” round pastries with a cross shape on them. In exchange, they promised to pray for the cake-givers’ dead relatives. This practice was called “souling.”

In Ireland and Scotland, Halloween also incorporated some aspects of an old Celtic pagan tradition called Samhain, including bonfires and masquerades. Samhain was also associated with death and sometimes called the feast of the dead. Eventually the traditions of wearing masks and of going door-to-door for treats combined, and young people in Ireland and Scotland took part in a practice called “guising” that we now call trick-or-treating. Dressing as ghouls and other folkloric incarnations of death became popular.

In the 1800s, an influx of Irish immigrants is thought to have popularized this Halloween tradition in the United States. The phrase “trick-or-treating” dates to at least the 1920s, when Halloween pranks or tricks also became a popular pastime. But according to National Geographic, “Trick-or-treating became widespread in the U.S. after World War II, driven by the country’s suburbanization that allowed kids to safely travel door to door seeking candy from their neighbors.”

And just how safe today’s trick-or-treaters are, especially compared to the trick-or-treaters of years past, is underappreciated. Despite the occasional public panic about razor blades in candy, malicious tampering with Halloween treats is remarkably rare, especially given that upward of 70 percent of U.S. households hand out candy on Halloween each year.

The biggest danger to today’s trick-or-treaters is simply crossing streets. But while Halloween is the deadliest night of the year for children being struck by cars, there is heartening news: annual child pedestrian deaths have declined dramatically. The number of pedestrian deaths among children aged 13 or younger fell from 1,632 in 1975 to 144 in 2020. The steep decline is even more impressive when one considers that it occurred as the total number of people and cars in the country has increased substantially.

Today’s children are thus safer as they venture out on Halloween than the last few generations of trick-or-treaters were. And, of course, when compared to the world of the very first children to celebrate Halloween, the modern age is by many measures less dangerous, especially for the young. In medieval England, when “souling” began, the typical life expectancy for ducal families was merely 24 years for men and 33 for women. While data from the era is sparse, among non-noble families in Ireland and Scotland, where “guising” began, living conditions and mortality rates may have been far worse.

It is estimated that between 30 and 50 percent of medieval children did not survive infancy, let alone childhood, with many dying from diseases that are easily preventable or treatable today. Given that context, the medieval preoccupation with death that helped give rise to traditions like Halloween is quite understandable. Life expectancy was lower for everyone, even adult royalty: the mean life expectancy of the kings of Scotland and England who reigned between the years 1000 and 1600 was 51 and 48 years, respectively. Before the discovery of the germ theory of disease, the wealthy, along with “physicians and their kids lived the same amount of time as everybody else,” according to Nobel laureate Angus Deaton.

In 1850, during the wave of Irish immigration to the United States that popularized Halloween, little progress had been made for the masses: white Americans could expect to live only 25.5 years—similar to what a medieval ducal family could expect. (And for African Americans, life expectancy was just 21.4 years.)

But the wealth explosion after the Industrial Revolution soon funded widespread progress in sanitation. That reduced the spread of diarrheal diseases, a major killer of infants—and one of the top causes of death in 1850—improving children’s survival odds and lengthening lifespans. By 1927, the year when the term “trick-or-treating” first appeared in print, there had been clear progress: U.S. life expectancy was 59 years for men and 62 years for women. The public was soon treated to some innovative new medical tricks: the following year, antibiotics were discovered, and the ensuing decades saw the introduction of several new vaccines.

In 2021, U.S. life expectancy was 79.1 years for women and 73 years for men. That’s slightly down from recent years but still decades longer than life expectancy for the aforementioned medieval kings who ruled during Halloween’s origins. Life expectancy has risen for all age groups, but especially for children, thanks to incremental progress in everything from infant care to better car-seat design.

So as you enjoy the spooky festivities this Halloween, take a moment to appreciate that today’s trick-or-treaters inhabit a world that is in many ways less frightening than when Halloween originated.

Blog Post | Overall Mortality

The Canadian Child Deaths Would Not Have Shocked Our Ancestors

Half of all children died before adulthood in archaic societies, one quarter before their first birthday and another quarter before the age of 15.

Summary: The recent discoveries of unmarked graves at former residential schools in Canada have shocked the world and exposed the brutal legacy of colonialism. However, for most of human history, such atrocities were common and accepted. This article argues that we should appreciate the human progress that has made us more sensitive to this suffering.


Revelations of graveyards containing the bodies of some 4,000 children from the First Nations in Canada have shocked the world. The dead were some of the 150,000 indigenous children sent or forcibly taken to residential schools meant to divorce them from their birth culture.

The graves represent the injustice and misery of the past, but our reaction to them is proof of our advancement. In pre-industrial times, child deaths were so common that those graves wouldn’t have shocked anyone. In fact, a death rate of some 3 percent of children is low by historical standards. It was only in the past 50 or 60 years that the child mortality rate fell below 3 percent – even in rich countries.

The usual estimation is that half of all children died before adulthood in archaic societies, one quarter before their first birthday and another quarter before the age of 15, which is the end of puberty and our reasonable definition of becoming an adult. That seems to hold over all societies examined, including the Roman Empire, 18th century Britain, and all other groups of humans over time. (It is also, roughly speaking, true of the other Great Apes.)

This sorry state of affairs was brought to an end in three stages. The first stage was the discovery of infectious disease. John Snow, for example, showed that cholera cycled through the sewage and water systems. His discovery led to the single greatest aid to human health ever: the development of proper water systems, which provide fresh water and carry away sewage. Essentially, drains were the first step in reducing child mortality.

The second stage was the development of antibiotics. As late as 1924, an infected blister killed the U.S. President’s son. It wasn’t until the late 1930s that effective antibiotics were deployed at any scale. It took another decade to discover a treatment for tuberculosis, which was one of the great killers of the first half of the 20th century.

The third stage was the development of vaccines for common childhood diseases. Smallpox had been preventable since the 1790s with vaccination and through variolation before that. But polio remained a problem until the 1950s and measles into the 1960s. That last disease could, if unleashed against a population with no resistance at all, kill over 10 percent of people infected.

The combined effect of these discoveries – alongside better medical care, nutrition, shelter, and heating – has been a 100-fold reduction in youth mortality over the 20th century. The process isn’t finished yet. Far too many children still die due to a lack of access to clean water, antibiotics, and desirable immunizations. But those discoveries are all spreading, and, in that sense, the world is getting better at record speed.

In rich nations, such as Canada, Britain, and the United States, the child mortality rate ranges from 0.5 percent to 0.8 percent – down from the roughly 50 percent that prevailed through most of human history. A significant portion of the remaining mortality is due to accidents, not disease.

This article is not meant to diminish the pain of losing a child or to suggest that pain was lighter in the past because it was reasonable to expect to lose a child. Rather, it is to point out how that loss is so much less common today.

Nor is it implying that the First Nations children in Canada were treated acceptably. Rather, the above data is meant to remind the reader that previous generations of humans would have found nothing strange at all about graveyards full of children. That we find them shocking today is proof of human progress.

Today’s expectation, an entirely reasonable one, is that any child born today will live between 70 and 80 years. We’re in the first two or three generations of humans who ever existed where the assumption of reaching adulthood is better than a 50/50 break. How can that not be thought of as progress?

Blog Post | Health & Medical Care

U.S. President's Son Dies of an Infected Blister?

No amount of wealth or power could save a patient from this minor ailment.

The year was 1924, the 16-year-old son of the president of the United States lay dying, and a bacterial infection in a blister on the third toe of his right foot was to blame. The blister had developed earlier in the week, while he was out playing tennis on the White House lawn with his brother. Many of the best doctors of the day were consulted, multiple diagnostic tests were run, and he was admitted to one of the top hospitals in the country. Despite all that, he died within a week of the infection. Sadly, the case of Calvin Coolidge’s son was not unusual. Deaths from sepsis following the infection of a minor cut or blister were extremely common at the time, and no amount of wealth or power could save a patient.

Calvin Coolidge’s 16-year-old son stands on the far left. Image taken shortly before his infection. 

Just four years later, Alexander Fleming discovered penicillin – the world’s first antibiotic. Since then, one study estimates, penicillin has saved around 82 million lives. Antibiotics and other medical advancements have helped raise global average life expectancy to an all-time high. Today, antibiotics are readily available for a few dollars at drugstores around the world, and death from sepsis is much rarer than it once was. Access to antibiotics is just one of many ways in which an average person today is better off than the rich and powerful were a century ago.