Blog Post | Child Abuse & Bullying

Eradicating FGM Requires Persuasion, Not Punishment

Classrooms, not jail cells, offer the best hope of changing norms around female genital mutilation.

Summary: Female genital mutilation is a harmful practice that affects millions of girls and women around the world. Although there has been some progress in eradicating these procedures, punitive legislation is not enough to stop them. Read more about how an education-based approach might be the answer in this article by Rifal Imam.


Historically, the cultures that carry out female genital mutilation/circumcision (FGM/C) on a wide scale associate the unfortunate practice with enhanced marriage potential and social acceptance. Sudan, my family’s home country, is no exception. There, the most common term for FGM/C is “tahura,” an Arabic word roughly meaning “purity” or “cleanliness.” Around 87 percent of Sudanese women have undergone circumcision. They number among the recorded 200 million girls and women worldwide who have undergone the procedure, which is practiced in at least 31 countries. This mutilation consists of “all procedures involving the partial or total removal of the external female genitalia or other injuries to the female genital organs for non-medical reasons.”

There has been some progress. Sudan traditionally practiced “pharaonic purification,” the most radical form of genital mutilation. Recently the practice has been medicalized, with 63.6 percent of women being cut by a trained midwife and 28.7 percent by a traditional cutter lacking medical training. This is connected to a social shift toward the less extensive “sunna” version of circumcision rather than the more extreme pharaonic form. The sunna cut is found mostly among younger, better-educated, wealthier, urban women rather than among older generations, a sign of changing traditions. Furthermore, refusing the practice is no longer considered detrimental to a woman’s entry into society or to her family’s honor. Many, however, still regard the practice as a positive or neutral Sudanese tradition connected to national identity and ideals of female purity.

The first time I heard of tahura, I was eight and visiting family in Sudan. My cousins were excitedly talking about the parties that follow the procedure. When I asked my mother, who was born and raised in Sudan, about the practice, she spoke against it sharply, explaining how tahura held women back. My mother, and now my cousins, shaped my attitudes and encouraged my hope for a changed Sudan. Female genital mutilation reinforces harmful social and economic structures, as the stark difference between its prevalence in rural and urban areas shows, and it hinders the progress of women across the nation.

National legislation aimed at tackling FGM/C has failed for decades, and there is little reason to expect the 2020 criminalization of the practice to fare better. Studies show that punitive laws and international intervention campaigns have not curbed the problem in its many forms in Sudan. Better-informed efforts are needed.

I argue that the best approach to eradicating FGM/C in a shame-based collectivist nation such as Sudan is not punitive legislation and the use of government force, but cultural channels and persuasion. Punishment-based legislation, particularly in collectivist nations, is largely ineffective. Instead, the best policy option is to take nonpunitive measures toward practitioners of FGM/C, making it more likely that people will report cases without fear of punishment. Educating practitioners about circumcision and its consequences is far more effective than trying to change traditional practices through the heavy hand of the criminal justice system.

The British took an education-based approach in the 1900s, establishing a midwifery training school that taught a less radical form of FGM/C, and it proved effective. The school endeavored to train and persuade local women to “abandon harmful customs” and took a non-prohibitionist route, claiming to work with rather than against local customs. When the British later criminalized the practice and punished midwives for performing it, the majority of Sudanese midwives did not comply, and circumcision continued in secret, with unhygienic procedures and no proper supplies. That put circumcised women’s well-being at greater risk instead of gradually phasing out the practice under the harm-reducing midwifery school.

The education-based approach admittedly requires more effort than the punitive status quo, but of the available options I believe it’s the most effective way forward. 

Eradicating FGM/C requires joint initiatives by various actors, including community-based programs led by local residents rather than outside activists. In a shame-based country such as Sudan, changes in culture and values are a vital part of the process, as the practice’s persistence over thousands of years and the difficulty of eradicating it make clear. The issue has been on the agenda for almost a century and receives much attention, yet it remains a persistent problem. UNICEF has found that culturally sensitive education and public awareness-raising activities have contributed to the practice’s decline in many communities.

It should be emphasized that there is no true quick-fix solution in Sudan, considering how long the issue has been on the agenda. Education campaigns, including those originating from nonpunitive legislation, can be structured to be culturally sensitive to local value systems and thus persuade the people who do the cutting to adopt a different approach. Classrooms, not jail cells, offer the best hope of changing norms around FGM/C in Sudan. 

EIN Presswire | Child Abuse & Bullying

Zambia Passes Legislation Setting Marriageable Age at 18

“In a momentous stride towards safeguarding children’s rights, Zambia’s parliament passed the Marriage (Amendment) Act of 2023 on December 22, 2023. This landmark legislation unequivocally sets the marriageable age at 18, without exception, for all marriages, including customary marriages, representing a significant shift in the nation’s commitment to eradicating child marriage.”

From EIN Presswire.

Blog Post | Human Development

1,000 Bits of Good News You May Have Missed in 2023

A necessary balance to the torrent of negativity.

Reading the news can leave you depressed and misinformed. It’s partisan, shallow, and, above all, hopelessly negative. As Steven Pinker from Harvard University quipped, “The news is a nonrandom sample of the worst events happening on the planet on a given day.”

So, why does Human Progress feature so many news items? And why did I compile them in this giant list? Here are a few reasons:

  • Negative headlines get more clicks. Promoting positive stories provides a necessary balance to the torrent of negativity.
  • Statistics are vital to a proper understanding of the world, but many find anecdotes more compelling.
  • Many people acknowledge humanity’s progress compared to the past but remain unreasonably pessimistic about the present—not to mention the future. Positive news can help improve their state of mind.
  • We have agency to make the world better. It is appropriate to recognize and be grateful for those who do.

Below is a nonrandom sample (n = ~1000) of positive news we collected this year, separated by topic area. Please scroll, skim, and click. Or—to be even more enlightened—read this blog post and then look through our collection of long-term trends and datasets.

Agriculture

Aquaculture

Farming robots and drones

Food abundance

Genetic modification

Indoor farming

Lab-grown produce

Pollination

Other innovations

Conservation and Biodiversity

Big cats

Birds

Turtles

Whales

Other comebacks

Forests

Reefs

Rivers and lakes

Surveillance and discovery

Rewilding and conservation

De-extinction

Culture and tolerance

Gender equality

General wellbeing

LGBT

Treatment of animals

Energy and Natural Resources

Fission

Fusion

Fossil fuels

Other energy

Recycling and resource efficiency

Resource abundance

Environment and pollution

Climate change

Disaster resilience

Air pollution

Water pollution

Growth and development

Education

Economic growth

Housing and urbanization

Labor and employment

Health

Cancer

Disability and assistive technology

Dementia and Alzheimer’s

Diabetes

Heart disease and stroke

Other non-communicable diseases

HIV/AIDS

Malaria

Other communicable diseases

Maternal care

Fertility and birth control

Mental health and addiction

Weight and nutrition

Longevity and mortality 

Surgery and emergency medicine

Measurement and imaging

Health systems

Other innovations

Freedom

Technology

Artificial intelligence

Communications

Computing

Construction and manufacturing

Drones

Robotics and automation

Autonomous vehicles

Transportation

Other innovations

Science

AI in science

Biology

Chemistry and materials

Physics

Space

Violence

Crime

War

Blog Post | Health & Medical Care

Halloween: More Walking Dead, Fewer Dead Walkers

Today’s trick-or-treaters have far less to fear than past generations.

Summary: Halloween is a celebration of death and fear, but it also reveals how much safer and healthier life has become. This article shows how child mortality, especially from pedestrian accidents, has declined dramatically in recent decades. It also explores how other causes of death, such as disease and violence, have become less common thanks to human progress.


This Halloween, you might see your neighbors’ front yards decorated with faux tombstones and witness several children dressed as ghosts, skeletons, zombies, or other symbols of death. Thankfully, today’s trick-or-treaters can almost all expect to remain among the living until old age. But back when the holiday tradition of children going door-to-door in spooky costumes originated, death was often close at hand, and the young were particularly at risk.

Halloween’s origins are closely linked to concerns about death. The holiday arose out of All Souls’ Day, a Christian commemoration of the deceased falling on November 2 that is also simply called the Day of the Dead. In the Middle Ages, this observance was often fused with another church feast called All Saints’ Day or All Hallows’ Day on November 1. The night before, called All Hallows’ Eve and now shortened to Halloween, was when, in parts of medieval Britain, children and the poor would visit their wealthier neighbors and receive “soul cakes,” round pastries with a cross shape on them. In exchange, they promised to pray for the cake-givers’ dead relatives. This was called “souling.”

In Ireland and Scotland, Halloween also incorporated some aspects of an old Celtic pagan tradition called Samhain, including bonfires and masquerades. Samhain was also associated with death and sometimes called the feast of the dead. Eventually the traditions of wearing masks and of going door-to-door for treats combined, and young people in Ireland and Scotland took part in a practice called “guising” that we now call trick-or-treating. Dressing as ghouls and other folkloric incarnations of death became popular.

In the 1800s, an influx of Irish immigrants is thought to have popularized this Halloween tradition in the United States. The phrase “trick-or-treating” dates to at least the 1920s, when Halloween pranks or tricks also became a popular pastime. But according to National Geographic, “Trick-or-treating became widespread in the U.S. after World War II, driven by the country’s suburbanization that allowed kids to safely travel door to door seeking candy from their neighbors.”

And just how safe today’s trick-or-treaters are, especially compared to the trick-or-treaters of years past, is underappreciated. Despite the occasional public panic about razor blades in candy, malicious tampering with Halloween treats is remarkably rare, especially given that upward of 70 percent of U.S. households hand out candy on Halloween each year.

The biggest danger to today’s trick-or-treaters is simply crossing streets. But while Halloween is the deadliest night of the year for children being struck by cars, there is heartening news: annual child pedestrian deaths have declined dramatically. The number of pedestrian deaths among children aged 13 or younger fell from 1,632 in 1975 to 144 in 2020. The steep decline is even more impressive considering that it occurred while the total number of people and cars in the country increased substantially.

Today’s children are thus safer as they venture out on Halloween than the last few generations of trick-or-treaters were. And, of course, when compared to the world of the very first children to celebrate Halloween, the modern age is by many measures less dangerous, especially for the young. In medieval England, when “souling” began, the typical life expectancy for ducal families was merely 24 years for men and 33 for women. While data from the era is sparse, among non-noble families in Ireland and Scotland, where “guising” began, living conditions and mortality rates may have been far worse.

It is estimated that between 30 and 50 percent of medieval children did not survive infancy, let alone childhood, with many dying from diseases that are easily preventable or treatable today. Given that context, the medieval preoccupation with death that helped give rise to traditions like Halloween is quite understandable. Life expectancy was lower for everyone, even adult royalty: the mean life expectancy of the kings of Scotland and England who reigned between the years 1000 and 1600 was 51 and 48 years, respectively. Before the discovery of the germ theory of disease, the wealthy, along with “physicians and their kids lived the same amount of time as everybody else,” according to Nobel laureate Angus Deaton.

In 1850, during the wave of Irish immigration to the United States that popularized Halloween, little progress had been made for the masses: white Americans could expect to live only 25.5 years—similar to what a medieval ducal family could expect. (And for African Americans, life expectancy was just 21.4 years.)

But the wealth explosion after the Industrial Revolution soon funded widespread progress in sanitation. That reduced the spread of diarrheal diseases, a major killer of infants—and one of the top causes of death in 1850—improving children’s survival odds and lengthening lifespans. By 1927, the year when the term “trick-or-treating” first appeared in print, there had been clear progress: U.S. life expectancy was 59 years for men and 62 years for women. The public was soon treated to some innovative new medical tricks: the following year, antibiotics were discovered, and the ensuing decades saw the introduction of several new vaccines.

In 2021, U.S. life expectancy was 79.1 years for women and 73 years for men. That’s slightly down from recent years but still decades longer than life expectancy for the aforementioned medieval kings who ruled during Halloween’s origins. Life expectancy has risen for all age groups, but especially for children, thanks to incremental progress in everything from infant care to better car-seat design.

So as you enjoy the spooky festivities this Halloween, take a moment to appreciate that today’s trick-or-treaters inhabit a world that is in many ways less frightening than when Halloween originated.

Blog Post | Overall Mortality

The Canadian Child-Deaths Would Not Have Shocked Our Ancestors

Half of all children died before adulthood in archaic societies, one quarter before their first birthday and another quarter before the age of 15.

Summary: The recent discoveries of unmarked graves at former residential schools in Canada have shocked the world and exposed the brutal legacy of colonialism. However, for most of human history, such atrocities were common and accepted. This article argues that we should appreciate the human progress that has made us more sensitive to this suffering.


Revelations of graveyards containing the bodies of some 4,000 children from the First Nations in Canada have shocked the world. The dead were some of the 150,000 indigenous children sent or forcibly taken to residential schools meant to divorce them from their birth culture.

The graves represent the injustice and misery of the past, but our reaction to them is proof of our advancement. In pre-industrial times, child deaths were so common that those graves wouldn’t have shocked anyone. In fact, a death rate of some three percent of children is low by historical standards. It was only in the past 50 or 60 years that the child mortality rate fell below 3 percent – even in rich countries.

The usual estimation is that half of all children died before adulthood in archaic societies, one quarter before their first birthday and another quarter before the age of 15, which is the end of puberty and our reasonable definition of becoming an adult. That seems to hold over all societies examined, including the Roman Empire, 18th century Britain, and all other groups of humans over time. (It is also, roughly speaking, true of the other Great Apes.)

This sorry state of affairs was brought to an end in three stages. The first stage was the discovery of how infectious diseases spread. John Snow, for example, showed that cholera cycled through the sewage and water systems. His discovery led to the single greatest aid to human health ever: the development of proper water systems, which provide fresh water and carry away sewage. Essentially, drains were the first step in reducing child mortality.

The second stage was the development of antibiotics. As late as 1924, an infected blister killed the U.S. President’s son. It wasn’t until the late 1930s that effective antibiotics were deployed at any scale. It took another decade to discover a treatment for tuberculosis, which was one of the great killers of the first half of the 20th century.

The third stage was the development of vaccines for common childhood diseases. Smallpox had been preventable since the 1790s with vaccination and through variolation before that. But polio remained a problem until the 1950s and measles into the 1960s. That last disease could, if unleashed against a population with no resistance at all, kill over 10 percent of people infected.

The combined effect of these discoveries – alongside better medical care, nutrition, shelter, and heating – has been a 100-fold reduction in youth mortality over the 20th century. The process isn’t finished yet. Far too many children still die due to a lack of access to clean water, antibiotics, and desirable immunizations. But those discoveries are all spreading, and, in that sense, the world is getting better at record speed.

In rich nations such as Canada, Britain, and the United States, the child mortality rate ranges from 0.5 percent to 0.8 percent – down from the roughly 50 percent that prevailed for most of human history. A significant portion of the remaining mortality is due to accidents, not disease.

This article is not meant to diminish the pain of losing a child or to suggest that the pain was lighter in the past because it was reasonable to expect to lose a child. Rather, it is to point out how much less common that loss is today.

Nor does it imply that the First Nations children in Canada were treated acceptably. Rather, the above data is meant to remind the reader that previous generations of humans would have found nothing strange at all about graveyards full of children. That we find them shocking today is proof of human progress.

Today’s entirely reasonable expectation is that a child born now will live between 70 and 80 years. We are among the first two or three generations of humans ever for whom the odds of reaching adulthood are better than 50/50. How can that not be thought of as progress?