Blog Post | Health & Medical Care

Some Historical Perspective for Anxious Parents

Victorians regularly prescribed opium to treat infant teething, with deadly results.

As every parent knows, infancy and childhood can be fraught with peril. From Sudden Infant Death Syndrome to suffocation, the list of dangers to infants is extensive. But the risks today pale in comparison to those that children faced in the past.

Judith Flanders’ exhaustively detailed book, Inside the Victorian Home: A Portrait of Domestic Life in Victorian England, offers a disturbing glimpse of early childhood in the Victorian era. It should fill everyone with gratitude for the sheer scale of medical advancement in the last century or so.

The benefits of vaccines and proper sanitation are, of course, well-known. However, the sheer extent of their positive impact on the lives of children bears repeating. Before the age of five, 35 out of every 45 Victorian children had experienced either smallpox, measles, scarlet fever, diphtheria, whooping cough, typhus or enteric fever — or some combination of those illnesses — and many of them did not survive.

As late as 1899, more than 16 percent of children died before their first birthday; today in the United Kingdom that figure is 0.35 percent.

What is less well known is that it was not just disease, but also primitive medicine that killed infants.

Consider teething and one of its alleged cures. Today, teething is accepted as a routine stage of development. In Victorian England, it was not. According to the common wisdom of the age, teething was a potentially deadly disorder, sometimes involving convulsions, and should be treated with opium.

Giving opium to an infant is a very bad idea, causing — you guessed it — convulsions and often death. A shocking 16 percent of child deaths in Victorian England were the result of well-meaning attempts to treat teething with opium. Parents attributed these opium-induced deaths to teething and blamed themselves for failing to administer a high enough dose of the opium “cure”.

Childhood ailments, real and imagined, were often treated with hard liquor, such as brandy, or milder alcoholic drinks, like wine. Patent medicines were also popular. John Collis Browne’s Chlorodyne, for example, which was supposed to cure everything from colds and coughs to stomach aches and sleeplessness, contained cannabis and a hypnotic drug called chloral hydrate in addition to opium.

Other common cures included ipecacuanha, a powdered root that induces vomiting; and the laxative calomel, made of mercury chloride, which is highly toxic. Newborns were also routinely given castor oil (a laxative later famously force-fed to prisoners in Mussolini’s Italy) shortly after birth.

It should be noted that, as horrid as Victorian medical practices were, child mortality rates fell dramatically as better urban sanitation and scientific understanding of disease spread. Medical practices weren’t the only part of Victorian childhoods that might today be considered abusive, or at least troubling. Childhood discipline practices, it turns out, have also evolved quite a bit since the Victorian era.

As Flanders writes, our understanding of “what seemed harsh” changed over time. Some Victorian parents held their children’s fingers against hot fireplace grates and cut them with knives to teach them the dangers of fire and sharp objects. Corporal punishment, even including whipping, was still common, although it gradually vanished over the course of a century.

Children were denied any food that they enjoyed. Food that tasted good enough to be consumed because of desire rather than hunger was considered morally damaging to the young. Eggs, bacon, and other flavorful foods were thought to be the gateway drugs to a life of hedonism and sin.

Even children from prosperous families grew up on a Spartan diet of bread, porridge and watered-down milk. The wealthy Gwen Raverat, née Darwin, a granddaughter of Charles Darwin and the daughter of a Cambridge University don, recalled that during her childhood, twice a week she was allowed toast “spread with a thin layer of that dangerous luxury, jam. But of course, not butter too. Butter and jam on the same bit of bread would have been an unheard-of indulgence — a disgraceful orgy.”

Many children had to contend with actual hunger rather than just bland diets, and child labor, often in dangerous conditions, was common. Child labor is not a practice unique to the Victorian era—however linked the two may be in the popular imagination.

It had been ubiquitous since time immemorial and finally began to come under scrutiny during the Victorian age. As the wealth generated by industrialisation improved working conditions and wages rose, fewer and fewer children had to work.

None of this is to deny the existence of things worth worrying about today. But parents shouldn’t forget that their kids are growing up in a far safer and gentler world than they would have been just a few generations ago.

This first appeared in CapX.

EIN Presswire | Child Abuse & Bullying

Zambia Passes Legislation Setting Marriageable Age at 18

“In a momentous stride towards safeguarding children’s rights, Zambia’s parliament passed the Marriage (Amendment) Act of 2023 on December 22, 2023. This landmark legislation unequivocally sets the marriageable age at 18, without exception, for all marriages, including customary marriages, representing a significant shift in the nation’s commitment to eradicating child marriage.”

From EIN Presswire.

Blog Post | Human Development

1,000 Bits of Good News You May Have Missed in 2023

A necessary balance to the torrent of negativity.

Reading the news can leave you depressed and misinformed. It’s partisan, shallow, and, above all, hopelessly negative. As Steven Pinker from Harvard University quipped, “The news is a nonrandom sample of the worst events happening on the planet on a given day.”

So, why does Human Progress feature so many news items? And why did I compile them in this giant list? Here are a few reasons:

  • Negative headlines get more clicks. Promoting positive stories provides a necessary balance to the torrent of negativity.
  • Statistics are vital to a proper understanding of the world, but many find anecdotes more compelling.
  • Many people acknowledge humanity’s progress compared to the past but remain unreasonably pessimistic about the present—not to mention the future. Positive news can help improve their state of mind.
  • We have agency to make the world better. It is appropriate to recognize and be grateful for those who do.

Below is a nonrandom sample (n = ~1000) of positive news we collected this year, separated by topic area. Please scroll, skim, and click. Or—to be even more enlightened—read this blog post and then look through our collection of long-term trends and datasets.

Agriculture

Aquaculture

Farming robots and drones

Food abundance

Genetic modification

Indoor farming

Lab-grown produce

Pollination

Other innovations

Conservation and Biodiversity

Big cats

Birds

Turtles

Whales

Other comebacks

Forests

Reefs

Rivers and lakes

Surveillance and discovery

Rewilding and conservation

De-extinction

Culture and tolerance

Gender equality

General wellbeing

LGBT

Treatment of animals

Energy and Natural Resources

Fission

Fusion

Fossil fuels

Other energy

Recycling and resource efficiency

Resource abundance

Environment and pollution

Climate change

Disaster resilience

Air pollution

Water pollution

Growth and development

Education

Economic growth

Housing and urbanization

Labor and employment

Health

Cancer

Disability and assistive technology

Dementia and Alzheimer’s

Diabetes

Heart disease and stroke

Other non-communicable diseases

HIV/AIDS

Malaria

Other communicable diseases

Maternal care

Fertility and birth control

Mental health and addiction

Weight and nutrition

Longevity and mortality 

Surgery and emergency medicine

Measurement and imaging

Health systems

Other innovations

Freedom

Technology

Artificial intelligence

Communications

Computing

Construction and manufacturing

Drones

Robotics and automation

Autonomous vehicles

Transportation

Other innovations

Science

AI in science

Biology

Chemistry and materials

Physics

Space

Violence

Crime

War

Blog Post | Health & Medical Care

Halloween: More Walking Dead, Fewer Dead Walkers

Today’s trick-or-treaters have far less to fear than past generations.

Summary: Halloween is a celebration of death and fear, but it also reveals how much safer and healthier life has become. This article shows how child mortality, especially from pedestrian accidents, has declined dramatically in recent decades. It also explores how other causes of death, such as disease and violence, have become less common thanks to human progress.

This Halloween, you might see your neighbors’ front yards decorated with faux tombstones and witness several children dressed as ghosts, skeletons, zombies, or other symbols of death. Thankfully, today’s trick-or-treaters can almost all expect to remain among the living until old age. But back when the holiday tradition of children going door-to-door in spooky costumes originated, death was often close at hand, and the young were particularly at risk.

Halloween’s origins are closely linked to concerns about death. The holiday arose out of All Souls’ Day, a Christian commemoration for the deceased falling on November 2 that is also simply called the Day of the Dead. In the Middle Ages, this observance was often fused with another church feast called All Saints’ Day or All Hallows’ Day on November 1. On the night before, called All Hallows’ Eve—now shortened to Halloween—children and people who were poor in parts of medieval Britain would visit their wealthier neighbors and receive “soul cakes,” round pastries with a cross shape on them. In exchange, they promised to pray for the cake-givers’ dead relatives. This was called “souling.”

In Ireland and Scotland, Halloween also incorporated some aspects of an old Celtic pagan tradition called Samhain, including bonfires and masquerades. Samhain was also associated with death and sometimes called the feast of the dead. Eventually the traditions of wearing masks and of going door-to-door for treats combined, and young people in Ireland and Scotland took part in a practice called “guising” that we now call trick-or-treating. Dressing as ghouls and other folkloric incarnations of death became popular.

In the 1800s, an influx of Irish immigrants is thought to have popularized this Halloween tradition in the United States. The phrase “trick-or-treating” dates to at least the 1920s, when Halloween pranks or tricks also became a popular pastime. But according to National Geographic, “Trick-or-treating became widespread in the U.S. after World War II, driven by the country’s suburbanization that allowed kids to safely travel door to door seeking candy from their neighbors.”

And just how safe today’s trick-or-treaters are, especially compared to the trick-or-treaters of years past, is underappreciated. Despite the occasional public panic about razor blades in candy, malicious tampering with Halloween treats is remarkably rare, especially given that upward of 70 percent of U.S. households hand out candy on Halloween each year.

The biggest danger to today’s trick-or-treaters is simply crossing streets. But while Halloween is the deadliest night of the year for children being struck by cars, there is heartening news: annual child pedestrian deaths have declined dramatically. The number of pedestrian deaths among children aged 13 or younger fell from 1,632 in 1975 to 144 in 2020. The steep decline is even more impressive when one considers that it occurred while the total number of people and cars in the country increased substantially.

Today’s children are thus safer as they venture out on Halloween than the last few generations of trick-or-treaters were. And, of course, when compared to the world of the very first children to celebrate Halloween, the modern age is by many measures less dangerous, especially for the young. In medieval England, when “souling” began, the typical life expectancy for ducal families was merely 24 years for men and 33 for women. While data from the era is sparse, among non-noble families in Ireland and Scotland, where “guising” began, living conditions and mortality rates may have been far worse.

It is estimated that between 30 and 50 percent of medieval children did not survive infancy, let alone childhood, with many dying from diseases that are easily preventable or treatable today. Given that context, the medieval preoccupation with death that helped give rise to traditions like Halloween is quite understandable. Life expectancy was lower for everyone, even adult royalty: the mean life expectancy of the kings of Scotland and England who reigned between the years 1000 and 1600 was 51 and 48 years, respectively. Before the discovery of the germ theory of disease, the wealthy, along with “physicians and their kids lived the same amount of time as everybody else,” according to Nobel laureate Angus Deaton.

In 1850, during the wave of Irish immigration to the United States that popularized Halloween, little progress had been made for the masses: white Americans could expect to live only 25.5 years—similar to what a medieval ducal family could expect. (And for African Americans, life expectancy was just 21.4 years.)

But the wealth explosion after the Industrial Revolution soon funded widespread progress in sanitation. That reduced the spread of diarrheal diseases, a major killer of infants—and one of the top causes of death in 1850—improving children’s survival odds and lengthening lifespans. By 1927, the year when the term “trick-or-treating” first appeared in print, there had been clear progress: U.S. life expectancy was 59 years for men and 62 years for women. The public was soon treated to some innovative new medical tricks: the following year, antibiotics were discovered, and the ensuing decades saw the introduction of several new vaccines.

In 2021, U.S. life expectancy was 79.1 years for women and 73 years for men. That’s slightly down from recent years but still decades longer than life expectancy for the aforementioned medieval kings who ruled during Halloween’s origins. Life expectancy has risen for all age groups, but especially for children, thanks to incremental progress in everything from infant care to better car-seat design.

So as you enjoy the spooky festivities this Halloween, take a moment to appreciate that today’s trick-or-treaters inhabit a world that is in many ways less frightening than when Halloween originated.

Blog Post | Child Abuse & Bullying

Eradicating FGM Requires Persuasion, Not Punishment

Classrooms, not jail cells, offer the best hope of changing norms around female genital mutilation.

Summary: Female genital mutilation is a harmful practice that affects millions of girls and women around the world. Although there has been some progress in eradicating these procedures, punitive legislation is not enough to stop them. Read more about how an education-based approach might be the answer in this article by Rifal Imam.

Historically, the cultures that carry out female genital mutilation/circumcision (FGM/C) on a wide scale associate the unfortunate practice with enhanced marriage potential and social acceptance. Sudan, the country of my family, is no exception. In Sudan, the most common term referencing FGM/C is “tahura,” the Arabic word roughly translating to “pure” or “cleanliness.” Around 87 percent of Sudanese women have undergone circumcision. They number among the recorded 200 million girls and women worldwide who have undergone the procedure, which is practiced in more than 31 countries. This mutilation consists of “all procedures involving the partial or total removal of the external female genitalia or other injuries to the female genital organs for non-medical reasons.”

There has been some progress. Sudan traditionally practiced “pharaonic purification,” the most radical form of genital mutilation. Recently the practice has been medicalized, with 63.6 percent of women being cut by a trained midwife and 28.7 percent by a traditional cutter lacking medical training. This is connected to the social shift of practicing the less extensive “sunna” version of circumcision rather than the more extreme pharaonic form. The sunna cut is usually found among better-educated, wealthier, younger urban women rather than in older generations, thus signifying changing traditions. Furthermore, the refusal of the practice is no longer considered detrimental to a woman’s entrance to society or her family’s honor. Many still hold the practice, however, as a positive or neutral Sudanese tradition connected to their national identity and ideals of female purity.

The first time I heard of tahura I was eight and visiting family in Sudan. My cousins were excitedly talking about the parties that come after. When I asked my mother, who was born and raised in Sudan, about the practice, she told me off, stating how tahura hindered women. My mother, and now my cousins, shaped my attitudes and encouraged my hope for a changed Sudan. Female genital mutilation reinforces negative social and economic structures, as can be seen in the stark difference between its prevalence in rural versus urban areas, and it overall hinders the progress of women in the nation.

Legislation aimed at tackling FGM/C on a national level, such as the 2020 criminalization of the practice, has failed for decades. Studies illustrate that punitive laws and international intervention campaigns have failed to combat the problem in its many forms in Sudan. Better-informed efforts must be taken.

I argue that the best approach to eradicating FGM/C in a shame-based collectivist nation such as Sudan is not through punitive legislation and the use of government force, but through cultural channels and persuasion. Punishment-based legislation, particularly in collectivist nations, is largely ineffective. Instead, the best policy option is to take nonpunitive measures against practitioners of FGM/C, thus making it more probable for people to report cases without fear of punishment. Education against circumcision and its consequences, aimed at those found to practice it, is far more effective than trying to change traditional practices with a heavy-handed approach through the criminal justice system.

The British took the education-based approach in the 1900s, and it proved effective through a midwifery training school that was meant to teach a less radical form of FGM/C. The school endeavored to train and convince local women to “abandon harmful customs” and took the non-prohibitionary route, claiming to work with and not against local customs. When the British introduced criminalization of the practice and punished midwives for performing it, the majority of Sudanese midwives did not comply, and circumcision continued in secret with unhygienic procedures and no proper supplies. That created further risk to the circumcised women’s well-being instead of gradually phasing out the practice under the harm-reducing midwifery training school.

The education-based approach admittedly requires more effort than the punitive status quo, but of the available options I believe it’s the most effective way forward.

Eradicating FGM/C requires joint initiatives of various actors, including community-based programs led by local residents rather than outside activists. In a shame-based country such as Sudan, changes in culture and values are a vital part of the process, as highlighted by FGM/C’s continued prevalence for thousands of years and the difficulty of eradicating it. The issue has been on the agenda for almost a century and receives much attention, yet is still a persistent problem. UNICEF has found that culturally sensitive education and public awareness–raising activities are effective at contributing to the practice’s decline in many communities.

It should be emphasized that there is no true quick-fix solution in Sudan, considering how long the issue has been on the agenda. Education campaigns, including those originating from nonpunitive legislation, can be structured to be culturally sensitive to local value systems and thus persuade the people who do the cutting to adopt a different approach. Classrooms, not jail cells, offer the best hope of changing norms around FGM/C in Sudan.