Blog Post | Health Systems

Centers of Progress, Pt. 37: Dubrovnik (Public Health)

In the midst of the Black Death, Dubrovnik kept its port open with innovations in public health.

Today marks the 37th installment in a series of articles by HumanProgress.org called Centers of Progress. Where does progress happen? The story of civilization is in many ways the story of the city. It is the city that has helped to create and define the modern world. This bi-weekly column will give a short overview of urban centers that were the sites of pivotal advances in culture, economics, politics, technology, etc.

The 37th Center of Progress is now called Dubrovnik, but was known historically as Ragusa. The picturesque port city is nicknamed “the pearl of the Adriatic” for its beauty. But the city has also been called “the Hong Kong of the Mediterranean” for its historic embrace of personal and economic freedom and its maritime trade-based prosperity. Not only was the small city-state of the Republic of Ragusa at the forefront of freedom for its time, being one of the earliest countries to ban slavery, but the glittering merchant city on the sea was also the site of an early milestone in the history of public health: quarantine waiting periods, which were first implemented in 1377. In 1390, Dubrovnik also created the world’s first permanent public health office. Perhaps more than any other city, Dubrovnik can claim to have helped create the idea of public health.

Today, Dubrovnik is best known for its exquisite sights, including many historic buildings and museums. It is located in the southern Croatian region of Dalmatia, which gave its name to the Dalmatian dog breed, a breed recorded as far back as 1375. Tourism dominates the economy. Much of the city’s layout remains largely unchanged from the year 1292, with narrow winding stone-paved streets; innumerable medieval monuments, towers, and monasteries; and charming garden-surrounded villas and orange groves. The old city is a designated UNESCO World Heritage Site, boasting well-preserved Gothic, Renaissance, and Baroque architecture in the form of numerous churches and palaces. The city is often considered a major artistic center of Croatia and the site of many cultural activities, theatrical and musical performances, festivals, and museums. The city’s Banje Beach is also popular, and the Gruz Port is now busy with cruise ships.

The Irish playwright George Bernard Shaw (1856–1950) claimed, “Those who seek paradise on Earth should come to Dubrovnik.” Fans of Game of Thrones may recognize Dubrovnik as the set bringing to life the fictional seaside city of King’s Landing. But whereas King’s Landing was the capital of a despotic absolute monarchy, in reality Dubrovnik was devoted to freedom to an unusual degree from its inception, and is proud to have had no king. “The city-republic was liberal in character, affording asylum to refugees of all nations—one of them, according to legend, was King Richard I (the Lionheart) of England, who landed on the offshore island of Lokrum in 1192 on his return from the Crusades,” according to the Encyclopedia Britannica.

Dubrovnik was a tributary city-state under Venetian suzerainty from 1205 to 1358, retaining substantial independence and growing prosperous as a mercantile power. It was during that period, in 1348, that the bubonic plague first reached the city. Within four years the disease extinguished the lives of perhaps two-thirds of the city’s citizens. And that was just the first wave. During the Black Death pandemic, periodic lulls were often followed by new outbreaks.

In 1358, Hungary pressured Venice to surrender control of Dubrovnik, and the Republic of Ragusa (1358–1808) was born. It was during the republican era that the city created the novel public health measure of quarantine, and practiced it from 1377 to 1533. While not perfect—outbreaks of plague occurred in 1391 and 1397—the measure was nonetheless revolutionary. Other cities soon implemented similar protocols, such as Geneva in 1467.

“It should not be a surprise to find Dubrovnik at the heart of quarantine’s origin-story, because the city was a seafaring supernova for much of the medieval era,” notes British journalist Chris Leadbeater. An aristocratic republic with fewer than 10,000 people living within its walls and a constitution resembling Venice’s, Dubrovnik was ruled by a council of merchant princes selected from the patrician families that comprised about a third of the city’s population. Unlike in Venice, the ranks of the nobility were never formally closed, meaning that newly successful merchant families could gain patrician status. Term limits restricted the top government official, the rector, from serving for more than a month, after which he could not seek the role again for two years. Dubrovnik also “never saw an elaborate increase in bureaucratic functions or felt the great weight of government intervention as Venetians did,” opting for relatively limited government interference with the city’s robust trade. 

If you could visit Dubrovnik during its maritime golden years (1350–1575), you would enter a vibrant coastal city awash in art and commerce, filled with stone architecture and diverse travelers speaking languages ranging from German to Turkish to Italian. You might glimpse noblewomen wearing fine jewelry—women who were free to trade their jewels without male permission even in that age of extreme gender inequality, thus contributing to a lucrative export market.

The Croatian economic historian Vladimir Stipetić has noted, “Dubrovnik traded like Hong Kong, Singapore, Taiwan . . . but did so some five hundred years before . . . [and like those countries] became prosperous . . . because of [its] adopted economic policy.” As a result of the city’s relative economic freedom, and the resources saved by the city’s disinterest in military expansionism, Dubrovnik’s fleet of hundreds of merchant ships at times outnumbered those of Venice, despite the latter boasting perhaps 10 times the population of Dubrovnik. Dubrovnik’s economic expansion is also, of course, owed to the innovativeness of its people. In the 15th century, a Dubrovnik humanist, merchant, and nobleman named Benedetto Cotrugli (1416–1469) published Della mercatura e del mercante perfetto (Trade and the Perfect Merchant), which is thought to be the first work on bookkeeping in the world. It was also a trade manual advocating for honesty in all dealings.

The republic mediated trade between the Ottoman Empire and what was popularly called Christendom. Located at the intersection of territories practicing Islam, Catholicism, and Orthodox Christianity, Dubrovnik maintained a policy of friendly trade with people of all faiths in an era when religious tensions were high, while internally endorsing Catholicism. The city’s culture was unusually “secular, sophisticated, individualistic,” and cosmopolitan for its time. During its republican era, Dubrovnik became a major center of Slavic literature and art, as well as philosophy, particularly in the 15th, 16th, and 17th centuries—earning the city the nickname “the Slavic Athens.” It produced notable writers, such as Cerva (1463–1520), Šiško Menčetić (1457–1527), Marin Držić (1508–1567), and Ivan Gundulić (1589–1638), now regarded as Croatia’s national poet. His most famous poem is the “Hymn to Freedom”:

O liepa, o draga, o slatka slobodo,

dar u kom sva blaga višnji nam Bog je dô,

uzroče istini od naše sve slave,

uresu jedini od ove Dubrave,

sva srebra, sva zlata, svi ljudcki životi

ne mogu bit plata tvôj čistoj lipoti.

O beautiful, o precious, o sweet Liberty,

the greatest gift of all the treasures God has given us,

the truth of all our glory,

the decoration of Dubrovnik,

all silver, all gold, all human lives

are not worth as much as your pure beauty.

Despite its lack of military power and its minuscule size, Dubrovnik’s economic freedom and remarkable political and social stability helped the tiny republic to survive for almost half a millennium before Napoleon conquered it in 1808. While Dubrovnik was at times compelled to provide tribute to its more powerful neighbors to maintain political independence, the republic’s citizens were proud of their relative liberty. In fact, the republic’s Latin motto was Non bene pro toto libertas venditur auro, meaning, “Liberty is not sold for all the gold in the world.” The republic’s flag was simply the word Libertas (Latin for “liberty”) in red on a white background. From 1792 to 1795, Dubrovnik also issued silver coins called libertinas, featuring the word Libertas in the design’s central position. Moreover, the republic was among the first European countries to abolish slavery, outlawing the slave trade in 1416. The city’s governing council voted that “none of our nationals or foreigners, and everyone who considers himself or herself from Dubrovnik, can in any way or under any pretext to buy or sell slaves . . . or be a mediator in such trade.”

Recognizing the threat that recurring outbreaks of bubonic plague posed to their city, the people of Dubrovnik took action to preserve their trading prosperity and their very existence. Thanks to Dubrovnik’s public health measures, the city managed to prevent many deaths and even achieve significant mercantile expansion during the plague period.

Bubonic plague is a bacterial disease that, when left untreated, is usually fatal within days of symptoms appearing. The bubonic plague has ravaged humanity many times, and has even been found in human skeletons dating to 3000 BC. Bubonic plague cases still occur even today. The first outbreak of the illness that was widespread enough to be termed a pandemic occurred in the 6th century AD, during the reign of the Byzantine emperor Justinian I. But the bubonic plague pandemic that devastated Asia, Africa, and Europe in the 14th century—named the Black Death or the Great Pestilence—proved to be the most fatal pandemic in recorded history, killing perhaps as many as 200 million people, including up to 60 percent of Europe’s population. 

That outbreak first emerged in western China. In just three years, between 1331 and 1334, bubonic plague killed more than 90 percent of the people in Hebei Province, which covers an area of land slightly bigger than Ireland. More than 5 million corpses in Hebei presented a preview of the deaths to come.

The scale of the devastation is difficult to imagine. The Black Death laid waste to Europe from 1346 to 1353. In 1348, the disease wiped out 60 percent of Florence’s population. That same year, the plague reached France, and within four years at least a third of Parisians were in the grave. The following year, the plague arrived in London and halved that city’s populace. In practically every city and town, the tragedy repeated itself.

One firsthand account of the devastation notes: “[T]his mortality devoured such a multitude of both sexes that no one could be found to carry the bodies of the dead to burial, but men and women carried the bodies of their own little ones to church on their shoulders and threw them into mass graves, from which arose such a stink that it was barely possible for anyone to go past a churchyard.”

Survivors were haunted by grief and loneliness. In 1349, the Italian writer Francesco Petrarch, who lost many companions to the plague, including his muse Laura, wrote:

Where are our dear friends now? Where are the beloved faces? Where are the affectionate words, the relaxed and enjoyable conversations? . . . What abyss swallowed them? There was a crowd of us, now we are almost alone. We should make new friends—but how, when the human race is almost wiped out; and why, when it looks to me as if the end of the world is at hand? Why pretend? We are alone indeed.

Despite life’s hardships, survival was nonetheless preferable to death, and people made a great number of innovative attempts to prevent and treat the disease that was decimating humanity. Many of those measures were tragically ineffective, such as bloodletting and avoiding baths. (Bathing was thought to expand the pores and make one vulnerable to disease.) Some measures helped a little in the prevention of illness—such as avoiding foul smells, including rotting corpses, and encouraging better home ventilation. 

Famously, medieval understanding of how disease spread left much to be desired. Many assumed that the Black Death was a divine punishment for mankind’s sins, giving rise to the distressing flagellant movement, and some of the brightest minds of the day at the University of Paris, when commissioned by the king of France to explain the plague, concluded that the movements of Saturn were to blame. Others blamed witchcraft. Reprehensibly, still others violently scapegoated religious minorities: “Hygienic practices limited the spread of plague in Jewish ghettos, leading to the Jews being blamed for the plague’s spread, and widespread massacres, especially in Germany and Central Europe.” 

However, while they may not have grasped the cause of the illness, medieval people did possess the general concept of contagion. They knew that the plague disseminated from one place to another and that transmission was occurring in some way: the suspected vectors ranged from the wind to the gaze of an infected person. 

Fortunately, medieval people did not need to know that the bubonic plague spreads mainly via fleas to figure out that limiting contact with people and objects from known outbreak sites was the most prudent course of action. This idea became widespread in part through the works of various physicians publishing medical pamphlets or tractates throughout Europe that may have represented “the first large-scale effort at popular health instruction in history.” The Catalan doctor Jaume d’Agramont (d. 1350 of plague), for example, advised the public against eating food from “pestilential regions,” and wrote that “association with a sufferer of a pestilential disease” could cause the illness to spread from one person to another “like a wildfire.” The possibility of interpersonal transmission became widely suspected, even if few guessed at the flea’s role as an intermediary. 

Dubrovnik’s strides toward better public health began even before the plague. While we now take basic hygiene measures for granted, Dubrovnik was something of a medieval outlier when it limited the disposal of garbage and feces in the city in 1272. The city went on to ban swine from city streets in 1336, hire street cleaners in 1415, and complete a sewage system in the early 15th century. Dubrovnik’s relative prosperity allowed it to offer competitive wages to draw physicians from other cities, such as Salerno, Venice, Padua, and the home of the first university, Bologna. In 1390, Dubrovnik also created the world’s first permanent public health office to enforce its various public health rules.

Economic incentives helped motivate the trade-dependent city’s innovations in public health and sanitation: “Sanitary measures in Dubrovnik were constantly improved because the city was forced to find a way to protect itself from diseases and at the same time retain the lucrative trade relations which formed its economic base.” During the outbreak of 1347, the Dubrovnik writer and nobleman Nikola Ragnina (1494–1582) claimed that people first attempted to banish the plague with fire: “There was no cure and everyone was dying. When people saw that their physicians could not defend them, they decided to . . . purify the air with fire.” The fires may have helped to kill off some of the plague-carrying fleas, but were ultimately a failed experiment. So, they tried something new.

Even a primitive understanding of how the illness spread proved sufficient for the people of Dubrovnik to attempt a radical and historic experiment in disease prevention. In 1374, Venice first put in place waiting periods for ship passengers seeking to enter the city, but these were imposed purely at the discretion of health bureaucrats, leading to irrational, selective enforcement. But in 1377, Dubrovnik’s council implemented a much more logical system: all passengers on incoming ships and members of trade caravans arriving from infected areas were to wait for 30 days in the nearby town of Cavtat or on the island of Mrkan before entering Dubrovnik’s city walls. The quarantine period was soon expanded to 40 days (the word “quarantine” derives from the Italian for “40 days”)—a number likely reached through experience, as the full course of the bubonic plague from contraction to death was typically around 37 days.

“Dubrovnik’s administration arrived at the idea of quarantine as a result of its experience isolating leprosy victims to prevent spread of the disease,” notes historian Ana Bakija-Konsuo. “Historical science has undoubtedly proved Dubrovnik’s priority in the ‘invention’ of quarantine. Isolation, as a concept, had been applied even before 1377, as mentioned in the Statute of the City Dubrovnik, which was written in 1272 and . . . is the first mention of the isolation of the patients with leprosy.” Dubrovnik’s stone seaside quarantine shelters, sometimes considered the first plague hospitals in Europe, were called lazarettos after Lazarus, the patron saint of lepers. Today the city’s lazarettos serve as tourist attractions and concert venues.

Devastating plague outbreaks eventually forced Venice to implement a complete ban on anyone entering its walls, bringing trade and city life to a halt, but Dubrovnik’s limited waiting periods let the republic keep its doors open to people and merchandise from abroad. “Hence, Dubrovnik implemented a method that was not only just and fair, but also very wise and successful, and it [eventually] prevailed around the world,” according to historian Ante Milošević. Quarantine procedures remain the standard policy to this day when dealing with certain contagious diseases.


The Black Death pandemic is sometimes viewed as the end of medieval civilization and the beginning of the Renaissance period. Faced with a disease that would not become treatable until the advent of antibiotics in the 1940s, Dubrovnik certainly underwent a rebirth, recovering from the initial wave of deaths to become the first city to implement a coherent public health response to the bubonic plague. Dubrovnik’s invention of quarantine represents not only perhaps the highest achievement of medieval medicine but also the emergence of one of humanity’s oldest disease-prevention tools and a turning point in the history of public health. With its strong ideals of liberty and devotion to public health, Dubrovnik during its republican era has earned its place as our 37th Center of Progress.

Blog Post | Science & Technology

How Many Lives Are Lost Due to the Precautionary Principle?

New research suggests that allowing our fears to prevent action can be deadly.

No matter how well intentioned, sometimes hyper-precautionary rules can be deadly. By defaulting public policies to super-cautious mode and curtailing important innovations, laws and regulations can actually make the world less safe.

A new NBER working paper finds exactly this: the authors examined the “unintended effects from invoking the precautionary principle after the Fukushima Daiichi nuclear accident,” which occurred in Japan in March 2011 after an earthquake and tsunami. The authors conclude that the Japanese government’s decision to entirely abandon nuclear energy following the incident resulted in many unnecessary deaths, primarily due to increased energy costs and corresponding cold weather-related welfare effects. Japan’s decision has also had potentially serious environmental implications.

The precautionary principle, in other words, can cost more lives than it saves.

How Excessive Regulation Costs Lives

The precautionary principle refers to the idea that public policies should limit innovations until their creators can prove they will not cause any potential harms or disruptions. Where there is uncertainty about future risks, the precautionary principle defaults to play-it-safe mode by disallowing trial-and-error progress, or at least making it far more difficult.

The problem with the precautionary principle is that uncertainty about the future and risks always exists. Worse yet, defaulting to super-safe mode results in a great deal of forgone experimentation with potentially new and better ways of doing things.

As I summarized in my last book, “living in constant fear of worst-case scenarios—and premising public policy on them—means that best-case scenarios will never come about. When public policy is shaped by precautionary principle reasoning,” I argued, “it poses a serious threat to technological progress, economic entrepreneurialism, social adaptation, and long-run prosperity.”

But can the precautionary principle really lead to deaths? Yes, it can. The aforementioned NBER paper by Matthew J. Neidell, Shinsuke Uchida, and Marcella Veronesi finds that, in the four-year period following the Fukushima accident, there were 1,280 cold-related deaths due to the government’s decision to completely end nuclear power production in Japan.

In the wake of that decision, Japanese citizens experienced immediate electricity price hikes as the country went from 30 percent nuclear power production to zero in just 14 months. Japan had to increase its reliance on fossil fuels to offset that shortfall, which resulted in a rapid increase in electricity prices and a corresponding rise in fatalities from cold weather-related problems.

“This suggests that ceasing nuclear energy production has contributed to more deaths than the accident itself,” the authors find. In fact, the authors note, “[n]o deaths have yet to be directly attributable to radiation exposure, though projections estimate a cumulative 130 deaths.” But that total would still fall well short of the number of lives lost due to increased electricity prices overall.

Again, the study covers just four years, from 2011 to 2014. The authors say that fatalities due to higher electricity prices likely grew in the years beyond that because the effects of the nuclear ban continued to be felt—and those effects continue right up to the present.

The authors also note that there were likely significant health impacts associated with replacing nuclear power with fossil fuels due to the deterioration of local air quality, although they did not model those results in this study. Taken together, however, “the total welfare effects from ceasing nuclear production in Japan are likely to be even larger than what we estimate, and represents a fruitful line for future research,” they conclude.

The Golden Rice Case Study

This isn’t the only example of how the precautionary principle can undermine public health or lead to death. There are many others. A particularly powerful example involves Golden Rice, a form of rice that was genetically engineered to contain beta-carotene, which helps combat vitamin A deficiency.

Science writer Ed Regis recently published Golden Rice: The Imperiled Birth of a GMO Superfood, which provides a history of this superfood. It serves as a cautionary tale of how the precautionary principle can cause unnecessary suffering and cost lives. Scientists in Germany developed the modified rice in the early 2000s to address global vitamin A deficiency, which led to blindness and an estimated one million deaths each year, primarily among children and pregnant women in developing countries.

Unfortunately, anti-GMO resistance among environmental activists and regulatory officials held up the diffusion of this miracle food. Regis argues that one of the primary reasons it took 20 years to develop the final version of Golden Rice was “the retarding force of government regulations on GMO crop development.” He continues:

Those regulations, which cover plant breeding, experimentation, and field trials, among other things, are so oppressively burdensome that they make compliance inordinately time-consuming and expensive. Such regulations exist because of irrational fears of GMOs, ignorance of the science involved, and overzealous adherence to the precautionary principle. Ingo Potrykus, one of the co-inventors of Golden Rice, has estimated that compliance with government regulations on GMOs caused a delay of up to ten years in the development of his final product. 

Ironically, in view of all the good that Golden Rice could have been doing in ameliorating vitamin A deficiency, blindness, and death during those ten years, it was precisely the government agencies that were supposed to protect people’s health that turned out to be the major impediments to faster development of this life-saving and sight-saving superfood. As it was, countless women and children died or went blind in those intervening years as a result of government-imposed regulatory delays. While that is not a ‘crime against humanity,’ it is nevertheless a modern tragedy.

Regis points out that the real problem with the precautionary principle is that it treats innovations like Golden Rice as “guilty until proven innocent.” This is the essential danger associated with the precautionary principle that I documented in my last book and all my writing on this issue. Risk analysts and legal scholars have also criticized the precautionary principle because they argue it “lacks a firm logical foundation” and is “literally incoherent.” They argue the principle is, in essence, a non-principle because it fails to specify a clear standard by which to judge which risks are most serious and worthy of preemptive control.

But the precautionary principle really is rooted in a principle, or at least a preference. It is an implicit preference for stasis, or preservation of the status quo. Advocates of the precautionary principle might believe that doing nothing in the face of uncertainty seems like the safer choice. But to borrow a line from the rock band Rush, “If you choose not to decide, you still have made a choice,” and by opting for stasis and disallowing ongoing innovation, precautionary principle advocates make a choice for us that leaves the world less safe in the long-run.

The late political scientist Aaron Wildavsky dedicated much of his life’s work to proving how efforts to create a risk-free society would instead lead to an extremely unsafe society. In his important 1988 book, Searching for Safety, Wildavsky warned of the dangers of “trial without error” reasoning, and contrasted it with the trial-and-error method of evaluating risk and seeking wise solutions to it. He argued that wisdom is born of experience and that we can learn how to be wealthier and healthier as individuals and a society only by first being willing to embrace uncertainty and even occasional failure:

The direct implication of trial without error is obvious: If you can do nothing without knowing first how it will turn out, you cannot do anything at all. An indirect implication of trial without error is that if trying new things is made more costly, there will be fewer departures from past practice; this very lack of change may itself be dangerous in forgoing chances to reduce existing hazards. . . . Existing hazards will continue to cause harm if we fail to reduce them by taking advantage of the opportunity to benefit from repeated trials.

This is the most crucial and most consistently overlooked lesson about the precautionary principle. When taken too far, precaution makes us less safe. It can even cause us suffering and lead to deaths. The burden of proof, therefore, is on advocates of the precautionary principle to explain why stopping experimentation is good for us, because it almost never is in practice.

“The Hidden Cost of Saying No”

More generally, the two case studies discussed above once again illustrate the simple truth that trade-offs exist and policy incentives matter. Regulation is not a magic wand that instantly grants society cost-free blessings. Every policy action has potential costs, many of which are hard to foresee up front or even to estimate after the fact. The precautionary principle is static and short-sighted, focusing only on mitigating some direct, obvious risks. By stopping one potentially risky activity, policymakers can assure citizens that no danger will arise from that particular activity again.

But sometimes the greatest risk of all is inaction. Progress and prosperity are impossible without constant trial-and-error experimentation and a certain amount of risk-taking. Without risk, there can be no reward. In a new book, scientist Martin Rees refers to this truism about the precautionary principle as “the hidden cost of saying no.”

That hidden cost of precautionary regulations on Golden Rice resulted in “a modern tragedy” for the countless people who have suffered blindness or died as a result. That hidden cost was also quite profound for Japanese citizens following the Fukushima incident. If regulation forbids one type of energy production, something else must take its place to maintain living standards. The country’s decision to forbid nuclear power apparently led to unnecessary deaths after other substitutes had to be used.

To be clear, the Fukushima incident was a horrible accident that had many other costs in its own right. Well over 100,000 residents were evacuated from the communities surrounding the plant due to contamination fears. But it remains unclear how much harm came about due to the release of radioactive materials relative to either the destructive power of the tsunami itself or the resulting regulatory response.

The International Atomic Energy Agency maintains a site dedicated to ongoing Fukushima Daiichi status updates and notes that cleanup efforts are ongoing. Regarding sea area monitoring, the IAEA says that the “levels measured by Japan in the marine environment are low and relatively stable.” “The situation with regard to the safety of the food supply, fishery and agricultural production continues to remain stable,” as well.

There may also be long-term health care issues due to radiation exposure, even though that has not been proven thus far. Importantly, however, some lives were lost during evacuation of the area, especially among elderly individuals. Official reports from the Japanese government’s Reconstruction Agency found over 1,600 “indirect” deaths attributable to stress and other illnesses during the evacuation phase, which was more than those directly attributable to the disaster itself.

Risk Analysis Is Complicated but Essential

The dynamic nature of regulatory trade-offs such as these is what makes benefit-cost analysis so challenging yet essential. Policymakers must do a better job trying to model the costs of regulatory decisions—especially those involving sweeping precautionary controls—precisely because the costs of getting things wrong can be so profound.

A 2017 Mercatus Center working paper entitled, “Death by Regulation: How Regulations Can Increase Mortality Risk,” by James Broughel and W. Kip Viscusi found that “regulations costing more than $99.3 million per life saved can be expected to increase mortality risk. A cost-per-life-saved cutoff of approximately $100 million is a threshold cost-effectiveness level beyond which life-saving regulations will be counterproductive—where rules are likely to cause more expected fatalities than they prevent.” In other words, at some point regulation can become so costly that it actually does more harm than good. Where we find rules that impose costs beyond such a threshold, we should look for alternative solutions that will be more cost-effective and life-enriching.
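
To see how such a threshold plays out, consider a minimal back-of-the-envelope sketch in Python. It rests on one reading of the Broughel–Viscusi result—roughly one statistical death induced for every $100 million of regulatory cost—and the regulations and figures below are hypothetical, invented purely for illustration:

```python
# Hypothetical sketch of the ~$100 million per-life-saved cutoff discussed above.
# Assumption (one reading of the Broughel-Viscusi result): roughly one statistical
# death is induced for every ~$100 million of regulatory cost imposed on society.

COST_PER_INDUCED_DEATH = 100_000_000  # dollars of cost per expected induced fatality


def net_lives_saved(cost_dollars: float, lives_saved: float) -> float:
    """Expected lives saved minus expected fatalities induced by the rule's cost."""
    induced_deaths = cost_dollars / COST_PER_INDUCED_DEATH
    return lives_saved - induced_deaths


# A hypothetical rule costing $1 billion that saves 5 lives:
# $200 million per life saved, above the threshold, so expected to be counterproductive.
print(net_lives_saved(1_000_000_000, 5))  # -5.0 (five more expected deaths than lives saved)

# A hypothetical rule costing $300 million that saves 10 lives:
# $30 million per life saved, below the threshold, so it passes this screen.
print(net_lives_saved(300_000_000, 10))   # 7.0
```

On those invented numbers, the first rule is exactly the kind of regulation the paper warns about: one expected to cost more lives than it saves.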

In the aggregate, it is impossible to know how many lives are lost due to the application of the precautionary principle. There are just too many regulatory scenarios and dynamic effects to model. But when some critics decry efforts to estimate the potential costs associated with precautionary regulations, or insist that any cost is worth bearing, we must remind them that no matter how difficult it is to model risk trade-offs and uncertain futures, we must try to ensure that regulation is worth it. We live in a world of resource constraints and tough choices.

Indirect Opportunity Costs Matter Deeply

Generally speaking, it is almost never wise to completely foreclose important types of innovation that might offer society benefits that are difficult to foresee. In the case of nuclear power, however, the benefits were quite evident from the start, yet many countries opted to tightly control or stifle its development anyway.

Conversations about nuclear power in the United States were always tainted by worst-case thinking, especially following the Three Mile Island incident in 1979. Although that incident resulted in no deaths, it severely curtailed nuclear power as a major energy source in the US. Since that time, few new nuclear power plants have successfully been built and put online in the United States. That trend only worsened following the Fukushima accident, as regulatory requirements intensified. Political wrangling over nuclear waste disposal also holds up progress.

But the costs of those policy decisions are more evident today as we face questions about how to combat climate change and reduce carbon emissions. In a recent Wall Street Journal essay, Joshua S. Goldstein and Staffan A. Qvist argue that “Only Nuclear Energy Can Save the Planet” by offsetting fossil fuel consumption fast enough. Concerns about disasters and waste management persist even though, relatively speaking, nuclear power has had a fairly remarkable safety record.

Waste disposal concerns are also overstated. “An American’s entire lifetime of electricity use powered by nuclear energy would produce an amount of long-term waste that fits in a soda can,” they note. That is certainly a challenge we can handle relative to the massive carbon footprint all of us currently produce.

This case study about how the precautionary principle held back nuclear power innovation is instructive in a couple of ways. First, as suggested by the new NBER study and other research, the precautionary principle has imposed significant direct costs in the form of higher electricity prices and increased carbon emissions, owing to the forced continued reliance on fossil fuels.

Second, there have likely been many indirect costs in the form of forgone innovation. We simply do not know how much better nuclear power plants would be today if experimentation with new methods had been allowed over the past four decades. The promise of making power “too cheap to meter” via nuclear production might have become more than just a catchphrase or utopian dream. At a minimum, we would likely have had more thorium-based reactors online that could have significantly improved efficiency and safety.

Conclusion

This points to the need for greater humility in policymaking. We do not possess crystal balls that will allow us to forecast the technological future, or all our future needs. Many countries (especially the United States) likely made a serious mistake by discouraging nuclear technologies, and now we, and the rest of the world, are stuck living with the ramifications of that precautionary miscalculation. Likewise, the Golden Rice case study points to the dangers of regulatory hubris on the global stage, as policymakers in many countries held back life-saving innovations that could have alleviated suffering and death.

It is time to reject the simplistic logic of the precautionary principle and move toward a more rational, balanced approach to the governance of technologies. Our lives and well-being depend upon it.

This originally appeared in The Bridge. 

Blog Post | Communicable Disease

Impending Defeat for the Four Horsemen of the Apocalypse

Pestilence, war, famine, and death are all on the decline.

Most of you think that the world, in general, is getting worse. You are wrong. Citing uncontroversial data on major global trends, I will prove to you that this dark view of humanity’s prospects is, in large part, badly mistaken.

First, though: How do I know most of you believe that things are bad and getting worse? Because that’s what you tell pollsters. A 2016 survey by the public opinion firm YouGov asked folks in 17 countries, “All things considered, do you think the world is getting better or worse, or neither getting better or worse?” Fifty-eight percent answered worse, and 30 percent chose neither. Only 11 percent thought things are getting better. In the United States, 65 percent thought that the world is getting worse and 23 percent said neither. Only 6 percent responded that the world is getting better.

A 2015 study in the journal Futures polled residents of the U.S., the U.K., Canada, and Australia; it reported that a majority (54 percent) rated the risk of our way of life ending within the next 100 years at 50 percent or greater, and a quarter (24 percent) rated the risk of humans being wiped out in the next 100 years at 50 percent or greater. Younger respondents were more pessimistic than their elders.

So why are so many smart people like you wrong about the improving state of the world? For starters, almost all of us have a couple of psychological glitches that cause us to focus relentlessly on negative news.

Way back in 1965, Johan Galtung and Mari Holmboe Ruge of the Peace Research Institute Oslo observed “a basic asymmetry in life between the positive, which is difficult and takes time, and the negative, which is much easier and takes less time.” They illustrated this by comparing “the amount of time needed to bring up and socialize an adult person and the amount of time needed to kill him in an accident; the amount of time needed to build a house and to destroy it in a fire, to make an airplane and to crash it, and so on.” News is bad news; steady, sustained progress is not news.

Smart people seek to be well-informed and so tend to be more voracious consumers of news. Since journalism focuses on dramatic events that go wrong, the nature of news thus tends to mislead readers and viewers into thinking that the world is in worse shape than it really is. This mental shortcut is called the availability bias, a name bestowed on it in 1973 by the behavioral scientists Amos Tversky and Daniel Kahneman. “People tend to assess the relative importance of issues by the ease with which they are retrieved from memory—and this is largely determined by the extent of coverage in the media,” explains Kahneman in his book Thinking, Fast and Slow.

Another reason for the ubiquity of mistaken gloom derives from evolutionary psychology. A Stone Age person hears a rustle in the grass. Is it the wind or a lion? If he assumes it’s the wind and the rustling turns out to be a lion, then that person does not live to become one of our ancestors. We are the descendants of the worried folks who tended to assume that all rustles in the grass were dangerous predators. Due to this instinctive negativity bias, most of us attend far more to bad rather than to good news.

Of course, not everything is perfect. Big problems remain to be addressed and solved. As the Harvard psychologist Steven Pinker says, “it’s essential to realize that progress does not mean that everything gets better for everyone, everywhere, all the time. That would be a miracle, that wouldn’t be progress.”

For example, man-made climate change arising largely from increasing atmospheric concentrations of carbon dioxide released from the burning of fossil fuels could become a significant problem for humanity during this century. The spread of plastic marine debris is a big and growing concern. Many wildlife populations are declining, and tropical forest area continues to shrink. And far too many people are still malnourished and dying in conflicts around the globe.

But many of those problems are already in the process of being ameliorated. For example, the falling prices of renewable energy sources offer ever-stronger incentives to switch away from fossil fuels. And hyperefficient agriculture is globally reducing the percentage of people who are hungry—while simultaneously freeing up land, so that forests are now expanding in much of the world.

The fact that we denizens of the early 21st century are much richer than any previous generation accounts for much of the good news. Thanks to technological progress and expanding global markets, the size of the world’s economy since 1820 has grown more than 100-fold while world population grew somewhat less than eightfold. In concrete terms, world gross product grew from $1.2 trillion (in 2011 dollars) to more than $116 trillion now. Global per capita GDP has risen from $1,200 per year in 1820 to more than $15,000 per person currently.
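
As a rough consistency check on those figures, the per capita numbers can be recovered from the totals, assuming a world population of roughly 1 billion in 1820 and roughly 7.7 billion today (approximations of my own, not figures from the text):

```python
# Back-of-the-envelope check of the world GDP figures cited above.
# Population values are rough assumptions: ~1 billion in 1820, ~7.7 billion today.
gwp_1820 = 1.2e12   # world gross product in 1820, in 2011 dollars
gwp_now = 116e12    # world gross product now, in 2011 dollars
pop_1820 = 1.0e9    # assumed world population in 1820
pop_now = 7.7e9     # assumed world population now

print(gwp_1820 / pop_1820)  # ~1,200 dollars per person in 1820
print(gwp_now / pop_now)    # ~15,000 dollars per person today
print(gwp_now / gwp_1820)   # ~97x, i.e. roughly the hundredfold growth in total output
print(pop_now / pop_1820)   # ~7.7x, i.e. somewhat less than eightfold population growth
```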

The astonishing result of this increase in wealth is that the global rate of absolute poverty, defined as living on less than $1.90 per person per day, fell from 84 percent in 1820 to 55 percent in 1950. According to the World Bank, 42 percent of the globe’s population was still living in absolute poverty as late as 1981. The latest World Bank assessment reckons that the share of the world’s inhabitants living in extreme poverty fell to 8.6 percent in 2018. In 1990 about 1.9 billion of the world’s people lived in extreme poverty; by 2018, that number had dropped to 660 million.

In Christian tradition, the four horsemen of Famine, Pestilence, War, and Death usher in the apocalypse. Compared to 100 years ago, deaths from infectious diseases are way down; wars are rarer and kill fewer people; and malnutrition has steeply declined. Death itself is in retreat, and the apocalypse has never looked further away.

Death

Average life expectancy at birth hovered around 30 years for most of human history. This was mostly due to the fact that about a third of all children died before they reached their fifth birthday. Demographers estimate that in 16th century England, 60 out of 100 children died before age 16. Some fortunate people did have long lives, but only 4 percent of the world’s population lived to be older than 65 before the 20th century.

In 1820, global average life expectancy was still about 30 years. Then, remarkably, life expectancy in Europe and North America began rising at the sustained rate of about 3 months annually. That was largely a consequence of better nutrition and the rise of public health measures such as filtered water and sewers.

During the past 200 years, global life expectancy more than doubled, now reaching more than 72, according to the World Bank. Worldwide, the proportion of folks who are 65 years and older has also more than doubled, to 8.5 percent. By 2020, for the first time in human history, there will be more people over the age of 64 than under the age of 5.

Even in the rapidly industrializing United States, average life expectancy was still only 47 years in 1900, and only 4 percent of Americans were 65 years and older. U.S. life expectancy is now 78.7 years. And today 15.6 percent of Americans are 65 or older, while only 6.1 percent are under age 5.

The historic rate of rising life expectancy implies a global average of 92 years by 2100. But the United Nations’ medium fertility scenario rather conservatively projects that average global life expectancy at the end of the century will instead be 83.
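
The 92-year figure follows from simple arithmetic on the trend described above: roughly three months of added life expectancy per calendar year, projected forward from today’s level of about 72 years (the start year below is my assumption):

```python
# Projecting the historical trend of roughly three months of added life expectancy per year.
start_year = 2019          # assumed "today" for the projection
life_expectancy = 72.0     # current global life expectancy, in years
gain_per_year = 0.25       # three months, expressed in years

projection_2100 = life_expectancy + gain_per_year * (2100 - start_year)
print(projection_2100)     # ~92.25 years, matching the ~92-year figure cited above
```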

A falling infant mortality rate accounts for the major share of increasing longevity. By 1900, infant mortality rates had fallen to around 140 per 1,000 live births in modernizing countries such as the United Kingdom and the United States. Infant mortality rates in the two countries continued to fall to around 56 per 1,000 live births in 1935 and down to about 30 per 1,000 live births by 1950. In 2017, the U.K. and U.S. infant mortality rates were 3.8 and 5.9 per 1,000 live births, respectively. Since 1900, in other words, infant mortality in those two countries has fallen by more than 95 percent.
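
The “more than 95 percent” figure follows directly from the rates quoted above:

```python
# Decline in infant mortality since 1900, using the per-1,000-live-births rates cited above.
rate_1900 = 140.0  # approximate UK/US rate around 1900
uk_2017 = 3.8
us_2017 = 5.9

print(1 - uk_2017 / rate_1900)  # ~0.973, a roughly 97 percent decline in the UK
print(1 - us_2017 / rate_1900)  # ~0.958, a roughly 96 percent decline in the US
```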

Infant mortality rates have also been falling steeply in the rest of the world. The World Health Organization estimates that the global infant mortality rate was just under 160 per 1,000 live births in 1950. In 2017, it was down to 29.4 per 1,000 live births, about the level of the U.K. and the U.S. in 1950. Vastly fewer babies are dying today because rising incomes have enabled improved sanitation and nutrition and more resources for educating mothers.

According to the World Bank, the global crude death rate stood at 17.7 per 1,000 in 1960. That is, about 18 people out of every 1,000 persons in a community would die each year. That number has fallen to 7.6 per 1,000 as of 2016. The global death rate, in other words, has fallen by more than half in less than six decades.

Famine

Food production since 1961 has essentially quadrupled while global population has increased two and a half times, according to the World Bank. As a result, the Food and Agriculture Organization reports, the global average food supply per person per day rose from 2,225 calories in 1961 to 2,882 calories in 2013. As a general rule, men and women need around 2,500 or 2,000 calories per day, respectively, to maintain their weight. Naturally, these values vary depending on age, metabolism, and levels of physical activity, among other things.

Food availability, of course, is not equally distributed across the globe. Nevertheless, rising agricultural production has caused undernourishment in poor developing countries to fall dramatically. The Food and Agriculture Organization regularly estimates the “proportion of the population whose habitual food consumption is insufficient to provide the dietary energy levels that are required to maintain a normal active and healthy life.” It reports that this undernourishment fell from 37 percent of the population in 1969–71 to just under 15 percent by 2002, reaching a low of 10.6 percent in 2015 before ticking up to 10.9 percent in 2017.

Famines caused by drought, floods, pests, and conflict have collapsed whole civilizations and killed hundreds of millions of people over the course of human history. In the 20th century, the biggest famines were caused by communist regimes in the Soviet Union and mainland China. Soviet dictator Josef Stalin’s famines killed up to 10 million people; China’s despot, Mao Zedong, starved 45 million between 1958 and 1962.

In the 21st century, war and political violence are still major causes of hunger around the world. Outbreaks of conflict in Syria, Yemen, Somalia, South Sudan, Afghanistan, and Nigeria are largely responsible for the recent uptick in the rate of global undernourishment. In other words, famines have disappeared outside of war zones. Much progress has been made, and the specter of famine no longer haunts the vast majority of humankind.

Pestilence

Prior to its eradication in 1979, smallpox was one of humanity’s oldest and most devastating scourges. The disease, which can be traced all the way back to pharaonic Egypt, was highly contagious. A 1775 French medical textbook estimated that 95 percent of the population contracted smallpox at some point during their lives.

In the 20th century alone, the disease is thought to have killed between 300 and 500 million people. The smallpox mortality rate among adults was between 20 and 60 percent. Among infants, it was 80 percent. That helps explain why life expectancy remained between 25 and 30 years for so long.

Edward Jenner, an English country doctor, noted that milkmaids never got smallpox. He hypothesized that the milkmaids’ exposure to cowpox protected them from the disease. In 1796, Jenner inserted cowpox pus from the hand of a milkmaid into the arm of a young boy. Jenner later exposed the boy to smallpox, but the boy remained healthy. Vacca is the Latin word for a cow—hence the English word vaccination.

The World Health Organization estimates that vaccines prevented at least 10 million deaths between 2010 and 2015 alone. Many millions more lives were protected from illness. As of 2018, global vaccination coverage remains at 85 percent, with no significant changes during the past few years. That said, an additional 1.5 million deaths could be avoided if global immunization coverage improves.

Improved sanitation and medicine account for many of the other wins against pestilence. Before the 19th century, people didn’t know about the germ theory of disease. Consequently, most people did not pay much attention to the water they drank. The results were often catastrophic, since contaminated water spreads infectious diseases, including diarrhea, dysentery, typhoid, polio, and cholera.

From 1990 to 2015, access to improved water sources rose from 76 percent of the world’s population to 91 percent. Put differently, 285,000 people gained access to clean water each day over that time period.
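
The 285,000-per-day figure squares with those percentages once world population is factored in; here is a rough check using approximate population figures of my own (about 5.3 billion in 1990 and 7.3 billion in 2015):

```python
# Rough check of the "285,000 people per day" claim from the access percentages above.
# World population values are approximate assumptions, not figures from the text.
pop_1990 = 5.3e9
pop_2015 = 7.3e9

with_access_1990 = 0.76 * pop_1990   # ~4.0 billion people with improved water in 1990
with_access_2015 = 0.91 * pop_2015   # ~6.6 billion people with improved water in 2015

days = 25 * 365                      # days between 1990 and 2015, ignoring leap days
print((with_access_2015 - with_access_1990) / days)  # ~287,000 people per day
```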

As a result of growing access to clean water and improved sanitation, along with the wider deployment of rehydration therapy and effective rotavirus vaccines, the global rate of deaths from diarrheal diseases stemming from rotavirus, cholera, and shigella has fallen from 62 per 100,000 in 1985 to 22 per 100,000 in 2017, according to The Lancet’s Global Burden of Disease study that year. And thanks to constantly improving medicines and pesticides, malaria incidence rates decreased by 37 percent globally and malaria mortality rates decreased by 60 percent globally between 2000 and 2015.

War

Your chances of being killed by your fellow human beings have also been dropping significantly. Lethal interpersonal violence was once pervasive. Extensive records show that the annual homicide rate in 15th century England hovered around 24 per 100,000 residents, while Dutch homicide rates are estimated as being between 30 and 60 per 100,000 residents. Fourteenth century Florence experienced the highest known annual homicide rate: 150 per 100,000. The estimated homicide rates in 16th century Rome range from 30 to 80 per 100,000. Today, the intentional homicide rate in all of those countries is around 1 per 100,000.

The Cambridge criminologist Manuel Eisner notes that “almost half of all homicides worldwide occurred in just 23 countries that account for 10 per cent of the global population.” Unfortunately, medieval levels of violence still afflict such countries as El Salvador, Honduras, and South Africa, whose respective homicide rates are 83, 57, and 34 per 100,000 persons.

Nonetheless, the global homicide rate is falling: According to the Institute for Health Metrics and Evaluation, it has dropped from 6.4 per 100,000 in 1990 to 5.3 per 100,000 in 2016. That’s a reduction of 17 percent during a remarkably short period of 26 years, or 0.7 percent per year.
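
Those percentages are straightforward to reproduce from the two rates:

```python
# Reproducing the decline in the global homicide rate cited above.
rate_1990 = 6.4  # homicides per 100,000 people
rate_2016 = 5.3  # homicides per 100,000 people
years = 2016 - 1990

total_decline = 1 - rate_2016 / rate_1990
annual_decline = 1 - (rate_2016 / rate_1990) ** (1 / years)

print(total_decline)   # ~0.17, a reduction of about 17 percent
print(annual_decline)  # ~0.007, roughly 0.7 percent per year compounded
```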

Another way to measure the general decline in violence is the global battle death rate per 100,000 people. Researchers at the Peace Research Institute Oslo have documented a steep post–World War II decline in the rate at which soldiers and civilians are killed in combat. The rate of battle deaths per 100,000 people reached a peak of 23 in 1953. By 2016, that had fallen by about 95 percent.

Apocalypse Later?

Some smart people acknowledge that considerable social, economic, and environmental progress has been made but worry that the progress will not necessarily continue.

“Human beings still have the capacity to mess it all up. And it may be that our capacity to mess it up is growing,” claims Cambridge political scientist David Runciman in The Guardian. He adds, “For people to feel deeply uneasy about the world we inhabit now, despite all these indicators pointing up, seems to me reasonable, given the relative instability of the evidence of this progress, and the [unpredictability] that overhangs it. Everything really is pretty fragile.”

Runciman is not alone. The worry that civilization is just about to go over the edge of a precipice has a long history. After all, many earlier civilizations and regimes have collapsed, including the Babylonian, Roman, Tang, Mayan, and, more recently, Ottoman and Soviet empires.

Yet there are good reasons for optimism. In their 2012 book Why Nations Fail, economists James Robinson of the University of Chicago and Daron Acemoglu of the Massachusetts Institute of Technology persuasively outline an explanation for the exponential improvement in human well-being that started about two centuries ago.

Before then, they argue, most societies were organized around “extractive” institutions—political and economic systems that funnel resources from the masses to the elites. In the 18th century, some countries—including Britain and many of its colonies—shifted from more extractive to more inclusive institutions.

“Inclusive economic institutions that enforce property rights, create a level playing field, and encourage investments in new technologies and skills are more conducive to economic growth than extractive economic institutions that are structured to extract resources from the many by the few,” the authors write. “Inclusive economic institutions are in turn supported by, and support, inclusive political institutions.”

Inclusive institutions are similar to one another in their respect for individual liberty. They include democratic politics, strong private property rights, the rule of law, enforcement of contracts, freedom of movement, and a free press. Inclusive institutions are the bases of the technological and entrepreneurial innovations that produced a historically unprecedented rise in living standards in those countries that embraced them, including the United States, Western Europe, Japan, and Australia.

While uneven and occasionally reversed, the spread of inclusive institutions to more and more countries is responsible for what the University of Illinois at Chicago economist Deirdre Nansen McCloskey calls the “Great Enrichment,” which has boosted average incomes 10- to 30-fold in those countries where they have taken hold.

The most striking examples of social disintegration—Roman, Tang, Soviet—occurred in extractive regimes. Despite crises such as the Great Depression, there are no examples so far of countries with long-established inclusive political and economic institutions suffering similar collapses.

In addition, major confrontations between relatively inclusive regimes and extractive regimes, such as World War II and the Cold War, have been won by the former. That suggests that liberal free market democracies harbor reserves of resilience that enable them to forestall or rise above shocks that destroy countries with brittle extractive systems.

If inclusive liberal institutions can continue to be strengthened and if they further spread across the globe, the auspicious trends documented here will extend their advance, and those that are currently negative will turn positive. By acting through inclusive institutions to increase knowledge and pursue technological progress, past generations met their needs and hugely increased our generation’s ability to meet our needs. We should do no less for future generations. That is what sustainable development looks like.

This article is based on data and analysis drawn from the author’s forthcoming book Ten Global Trends Every Smart Person Should Know (Cato), co-authored with HumanProgress.org editor and Cato Institute Senior Policy Analyst Marian L. Tupy.

This first appeared in Reason.