Blog Post | Conservation & Biodiversity

Technological Progress Versus Degrowth as Solutions to the Sixth Mass Extinction

Assuring the long-term future of Earth’s wildlife requires more economic and technological development, not less.

Summary: Many environmentalists warn that human activity is causing a sixth mass extinction, and advocate economic stagnation or degrowth to slow or reverse the damage. Contrary to their assumptions, the long-term survival of Earth’s wildlife likely depends not on shrinking humanity’s footprint, but on expanding its technological reach. Rather than retreating from nature, advancing human civilization is probably the surest way to protect and even proliferate life of all kinds—on Earth and beyond.


Rising ocean temperatures, deforestation, the replacement of grassland with agricultural land, and many other environmental effects of human activity have been taking an enormous toll on wildlife. Around one million plant and animal species are now threatened with extinction, according to a 2019 United Nations (UN) report, which states, “The average abundance of native species in most major land-based habitats has fallen by at least 20%, mostly since 1900.” The most heavily affected are insect species, among which the extinction rate is about eight times faster than among mammals, reptiles, or birds.

But despite this, the popular idea that Earth is seeing a “sixth mass extinction” is premature. According to conservation biologist Chris D. Thomas, we will only reach mass extinction after about 10,000 years if current trends continue. In response to the UN report, National Geographic explains that only “if all species currently designated as critically endangered, endangered, or vulnerable go extinct in the next century, and if that rate of extinction continues without slowing down” will we experience a mass extinction within the next few centuries. It is extremely unlikely that all the “critically endangered” species will go extinct so soon, to say nothing of the “endangered” and “vulnerable” species. And even National Geographic’s projection is probably overly pessimistic. As science journalist Ronald Bailey has shown, reports of mass extinction, including the UN’s, tend to assume worst-case instead of most-likely scenarios and therefore probably overestimate extinction rates.

But even if “mass extinction” fears are overblown, the increased rate of species extinctions is worth taking seriously. That is why it’s important to clear up some widespread confusion about the interests of Earth’s wildlife.

One particularly dangerous misconception is that the interests of human beings are fundamentally at odds with those of wildlife. According to this belief, continued economic growth and industrialisation may benefit humans, but only at the expense of the long-term wellbeing of nonhuman life.

The truth is quite different. In the long run, the dangers of disordered nature are so pervasive, and humanity’s potential solutions so indispensable, that advancing human wealth and economic flourishing is necessary, not detrimental, if we want to protect wildlife and maintain or even increase biodiversity.

The Zero-Sum “Man Versus Nature” Premise

The zero-sum view of the relationship between humanity and wildlife often leads humanists and environmentalists to think of each other as enemies. On the environmentalist side, perhaps the most explicit statement of this binary thinking comes from the late David M. Graber (1948–2022). In 1989, before taking up his post as the Chief Scientist for the Pacific West Region of the National Park Service, where he served for over three decades, he elucidated his priorities in the Los Angeles Times:

We are not interested in the utility of a particular species, or free-flowing river, or ecosystem, to mankind. They have intrinsic value, more value—to me—than another human body, or a billion of them. Human happiness, and certainly human fecundity, are not as important as a wild and healthy planet. I know social scientists who remind me that people are part of nature, but it isn’t true. Somewhere along the line—at about a billion years ago, maybe half that—we quit the contract and became a cancer. We have become a plague upon ourselves and upon the Earth. … Until such time as Homo sapiens should decide to rejoin nature, some of us can only hope for the right virus to come along.

Dave Foreman, founder of the environmental group “Earth First!” and “a leading figure among a generation of activists,” according to the New York Times, likewise advocated what the Times describes as “aggressive protection of the environment for its own sake”: “a … philosophy, known as deep ecology, which holds that nature has inherent value, not just in its utility to people” and whose proposals include “returning vast swaths of land to nature, ripping out any trace of human intervention.”

In his bestselling 2022 book Regenesis: Feeding the World Without Devouring the Planet, George Monbiot writes:

The more land that farming occupies, the less is available for forests and wetlands, savannas and wild grasslands, and the greater is the loss of wildlife and the rate of extinction. All farming, however kind and careful and complex, involves a radical simplification of natural ecosystems. This simplification is required to extract something that humans can eat. In other words, farming inflicts an ecological opportunity cost. Minimizing our impact means minimizing our use of land.

I have come to see land use as the most important of all environmental questions. I now believe it is the issue that makes the greatest difference to whether terrestrial ecosystems and Earth systems survive or perish. The more land we require, the less is available for other species and the habitats they need, and for sustaining the planetary equilibrium states on which our lives might depend.

Unlike Graber and Foreman, Monbiot pays lip service to the necessity of taking human needs into account in any efforts to protect wildlife by scaling back food production. But the clear implication of the “counter-agricultural revolution” he advocates is the mass impoverishment and undernourishment of vast swaths of humanity to lessen human impact on wildlife.

Even the United Nations, in the “Summary for Policymakers” of its biodiversity report, suggests reducing economic activity and human population growth as strategies for protecting biodiversity: “The negative trends in biodiversity and ecosystem functions are projected to continue or worsen in many future scenarios in response to indirect drivers such as rapid human population growth, unsustainable production and consumption and associated technological development.” Therefore, the summary claims that, “Transformations towards sustainability are more likely when efforts are directed at … lowering total consumption and waste, including by addressing both population growth and per capita consumption.”

Among those on the pro-human side of the zero-sum premise is Alex Epstein, founder of the Center for Industrial Progress. “I hold human life as the standard of value,” Epstein writes in his 2014 book The Moral Case for Fossil Fuels: “This is the essence of the conflict: the humanist, which is the term I will use to describe someone on a human standard of value, treats the rest of nature as something to use for his benefit; the nonhumanist treats the rest of nature as something that must be served.”

Epstein is generally careful to propose positive-sum policies, but he is so focused on the case for humanism that even he often speaks in zero-sum terms about the relationship between human and non-human animals. In his 2022 book Fossil Future: Why Global Human Flourishing Requires More Oil, Coal, and Natural Gas—Not Less, Epstein writes: “To the extent one’s primary goal is animal equality one will be morally driven to eliminate all human impacts on animals, including human-benefitting impacts such as the use of animals for medical research.”

Graber, Foreman, Monbiot, and Epstein don’t all agree on whether to prioritise humanity or wildlife, but not one of them questions the assumption that wildlife’s flourishing depends on human retreat and non-intervention.

Mother Nature Is a Grim Reaper

While human activity has accelerated the rate of species extinction in recent history, extinction has been the rule, not the exception, since the dawn of life on Earth. Over 99.9 percent of species that have existed on this planet are now extinct—a story overwhelmingly written prior to the ascension of humankind. Most species have perished in so-called “background extinctions,” but at least five mass extinction events have also occurred, each wiping out over 75 percent of species on Earth at the time. And the rest of non-human life is nearly certain to follow if left to its own devices. Countless threats exist, each of which may be highly unlikely to manifest in any given century, but let the clock tick long enough and the odds eventually become high.

Take asteroids, for example. The New York Times reported in 2023 that, “The world’s family of asteroid-hunting telescopic surveys have so far found more than 32,000 near-Earth asteroids.” They added, reassuringly, that “[m]ost of those capable of inflicting planet-scale devastation have been found because it’s easier to spot bigger rocks glinting in sunlight.”

Still, there are likely to be many Earth-threatening asteroids out there that have not been found. A 2022 National Geographic article reporting on new asteroid research explains that life on Earth is at risk of extinction by “a largely unseen population of space rocks—one that could threaten life as we know it.” Summarising discoveries published in the journal Science, the article explains that, “A group of space rocks stays mostly inside the orbit of Earth, making them difficult to pick out in the glare of the sun—and potentially a threat to our planet.”

As I write this, reports of an asteroid designated “2024 YR4” are all over the news. “The object, first detected in December, is 130 to 300 feet long and expected to make a very close pass of the planet in 2032,” the New York Times reported on 18 February. “Its odds of impacting Earth on Dec. 22 of that year currently stand at 3.1 percent.” (The risk has since been downgraded to a more reassuring 0.004 percent.) This particular asteroid is not large enough to destroy more than a city, but it illustrates the fact that a planetary threat could be discovered at any moment.

Asteroids are just one of many known threat categories that we Earthlings are counting on future technological advances to avert. Marian L. Tupy, Senior Fellow at the Cato Institute’s Center for Global Liberty and Prosperity, has surveyed these natural existential threats in his article “Degrowth Means Certain Death for Humanity.” The list includes asteroid and comet impacts, the weakening or reversal of the magnetosphere, supervolcano eruptions, plate tectonics and continental drift, ice ages, ocean current disruption, methane hydrate release, supernova explosions, nearby hypernovas, gamma-ray bursts, solar flares, coronal mass ejections, rogue planets or stars, black holes, solar evolution, and Milky Way collisions.

That list may be just the beginning. In his book The Precipice: Existential Risk and the Future of Humanity, philosopher Toby Ord notes:

It is striking how recently many of these risks were discovered. Magnetic field reversal was discovered in 1906. Proof that Earth had been hit by a large asteroid or comet first emerged in 1960. And we had no idea gamma ray bursts even existed until 1969. For almost our entire history we have been subject to risks to which we were completely oblivious.

And there is no reason to think that the flurry of discovery has finished—that we are the first generation to have discovered all the natural risks we face. Indeed, it would surely be premature to conclude that we have discovered all of the possible mechanisms of natural extinction while major mass-extinction events remain unexplained.

In any one century, the odds of a catastrophe threatening all life on Earth may be tiny. But there are 10,000,000 centuries in a billion years. And even if no existential catastrophe strikes in that time, complex life will still be gone roughly a billion years from now, when the planet’s oxygen levels will have greatly diminished. That is, unless some technologically advanced species changes the odds in time.
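
To get a feel for how such small risks compound, here is a minimal back-of-the-envelope sketch; the per-century probabilities below are illustrative assumptions, not estimates from the article or any cited source:

```python
# Illustrative sketch: the per-century probabilities are assumptions,
# not figures from the article or its sources.

def cumulative_risk(p_per_century: float, centuries: int) -> float:
    """Chance that at least one catastrophe occurs over the span,
    assuming independent and equally likely centuries."""
    return 1 - (1 - p_per_century) ** centuries

CENTURIES_IN_A_BILLION_YEARS = 10_000_000  # 1e9 years / 100 years per century

for p in (1e-8, 1e-7, 1e-6):
    risk = cumulative_risk(p, CENTURIES_IN_A_BILLION_YEARS)
    print(f"per-century odds {p:.0e} -> odds over a billion years: {risk:.1%}")
```

Even a one-in-a-million chance per century compounds to a near-certainty (about 99.995 percent) over ten million centuries, which is the arithmetic behind the worry above.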

Expanding the Scope of Human and Non-Human Life

In the future, humans may develop the technological capacity to affordably travel to outer space and terraform previously uninhabited worlds. Since many known extinction-level threats are restricted to a single planet, creating self-sustaining habitats outside Earth’s biosphere would significantly reduce the chance of some catastrophe ending all known life. Therefore, if life could become multiplanetary, species-level extinctions need not be the inevitable, almost universal rule.

This is among Elon Musk’s long-term goals at SpaceX. “If we’re a multi-planet species, it’s like life insurance for life itself. Not just for humans, but for all the creatures on Earth, because we would bring them with us. And they can’t build spaceships, so we are in effect the steward of life,” Musk has explained.

Perhaps in addition to or instead of terraforming other planets, humans will engineer giant artificial worlds, which physicist Gerard K. O’Neill first conceptualised in detail, to circumvent the many technical challenges of interplanetary travel. Progress toward the development of O’Neill cylinders, which would contain entire ecosystems and fine-tuned environments for the flourishing of life, is Jeff Bezos’s long-term goal for his aerospace company Blue Origin. These “O’Neill colonies,” as Bezos calls them, which would rotate to create artificial gravity using centrifugal force, would be large enough to comfortably hold at least a million people each and would provide attractive environments that people would want to live in, according to Bezos’s vision. He doesn’t believe that he’ll live to see O’Neill colonies himself, but he believes that future generations will.
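
To make the rotation idea concrete, here is a minimal sketch of the underlying physics, using the centripetal-acceleration relation a = ω²r; the 3.2-kilometre radius is an assumed figure roughly in the spirit of O’Neill’s larger designs, not a number taken from Bezos, Blue Origin, or the article:

```python
import math

# Assumed example radius (not a figure from the article or Blue Origin).
RADIUS_M = 3_200   # cylinder radius in metres
TARGET_G = 9.81    # desired artificial gravity at the hull, in m/s^2

# For a rotating habitat, apparent gravity at radius r is a = omega^2 * r,
# so the required angular velocity is omega = sqrt(a / r).
omega = math.sqrt(TARGET_G / RADIUS_M)   # angular velocity, rad/s
period_s = 2 * math.pi / omega           # seconds per full rotation
rpm = 60.0 / period_s                    # rotations per minute

print(f"angular velocity: {omega:.4f} rad/s")
print(f"rotation period:  {period_s / 60:.1f} minutes")
print(f"spin rate:        {rpm:.2f} rpm")
```

At kilometre scales the required spin works out to well under one rotation per minute (about 0.5 rpm in this example); the larger the radius, the gentler the spin needed for Earth-normal gravity, which is one reason O’Neill favoured very large habitats.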

By these or other means, humans could offer a long-term future to Earth’s lifeforms and increase the number and well-being of animal and plant species, unconstrained by the resources of Earth or of any other single planet. Once space travel and terraformation become cheap enough, entire planets or artificial worlds could be fine-tuned to benefit specific species. Perhaps this process could even be automated and expanded exponentially with the help of AI, which could carry out the necessary research and manage the construction many orders of magnitude faster and more accurately than human beings.

Scientists are already conducting preliminary research on terraforming space habitats. In 2022, a team of University of Rochester scientists laid out a theoretical plan to terraform asteroids into Manhattan-scale space habitats. In 2023, researchers at NASA’s Johnson Space Center discovered how to turn moon dust into breathable oxygen. And several recent papers have offered strategies for heating up Mars for eventual habitation. Those are just a few of many recent examples.

Even sceptics tend to question only the timeline—not the possibility—of space colonisation. Jonathan McDowell, physicist at the Harvard-Smithsonian Center for Astrophysics, recently commented that Musk is “not at all” close to building “a human civilization on Mars that is self-sustaining,” but added that “towards the end of this century” we might have Mars settlements whose inhabitants could even bring their pets with them from Earth. I have not been able to find any experts who seriously doubt that we will be able to build Earth-like habitats outside Earth’s atmosphere sometime this millennium, which is a short period in evolutionary terms—short enough to prevent the next mass extinction.

A Positive-Sum Environmental Strategy

As philosopher William MacAskill has asked, “if it really is of value to have greater diversity of species, why do we not actively try and promote a greater amount of biodiversity above merely preventing loss of biodiversity?” The opportunity to terraform extraterrestrial environments is just one reason why environmentalists should set their sights far higher than mere conservation.

Areas of Earth that are currently of limited habitability, such as deserts and tundra, could be transformed into comparatively biodiverse paradises. We could create countless new species through practices like hybridisation and genetic modification—as has already been done with the Italian sparrow (Passer italiae), the apple fly (Rhagoletis pomonella), the yellow-flowered Yorkwort (Senecio eboracensis), and scores of others—as biologist Chris D. Thomas describes in his book Inheritors of the Earth: How Nature Is Thriving in an Age of Extinction. We could also resurrect extinct species. That is the goal of Colossal Biosciences, a Dallas-based startup valued at ten billion USD, which is currently working to bring back the woolly mammoth, the dodo, and the Tasmanian tiger.

Attempts to conserve Earth’s current ecosystems through non-intervention are doomed because change is a constant of nature and of Earth’s geology. And in addition to being futile, it is pessimistic to think that mere conservation should be the highest hope of a forward-looking environmentalist movement. Human non-intervention may benefit non-human life in an unsustainable, short-term way. But acting in the long-term interest of Earth’s wildlife means protecting species from exogenous existential threats and investing in technological and scientific advances that will enable them to thrive at unprecedented levels.

As Tupy explains, “In the long run, the only way to ensure the future of our (hopefully interplanetary) species is through exponential increase in wealth and technological sophistication.” The kind of wealthy, innovative economy that humans must continue to build cannot be sustained without significant environmental change. But there is no reason to assume that the damage to wildlife caused by human industry outweighs the benefits—especially possible salvation from extinction—that could accrue to wildlife through the advancement of human knowledge.

If people had not been growing the world economy by using fossil fuels, agriculture, and other ecologically disruptive industries since at least the Industrial Revolution, humans would not be much closer than chimpanzees to colonising space and securing the future of life on Earth and beyond. And as MacAskill argues in his book What We Owe the Future, “if society stagnates technologically, it could remain stuck in a period of high catastrophic risk for such a long time that extinction or collapse would be all but inevitable.” Since we can only guess when and how existential threats will manifest, every extra dollar of research, development, or education might be the dollar that gets Earth’s humanity and wildlife through the technological bottleneck and into the next phase of consciousness in the universe.

The UN’s projections suggest that population growth and economic activity have negative effects on biodiversity because their statistical models can only estimate the destructive aspects of these phenomena. Due to the intrinsic unpredictability of future knowledge, creative transformations such as terraforming other planets and reviving extinct species are outside the scope of UN projections. But this means that their suggestions about reducing production, consumption, and population growth are biased against positive-sum solutions to the point of probably being counterproductive to the advancement of human and wildlife interests alike.

It is true that technological progress is itself a source of existential risks, but unlike the alternative (stagnation in ignorance), it has salvational potential as well, and thus the demise of life on Earth is more likely without technological progress than it is with it. Plus, a positive-sum (technological accelerationist) environmentalism is more politically achievable than a zero-sum (degrowth) environmentalism. The former merely requires that technologists, scientists, and entrepreneurs are left to their own devices within a market economy, while the latter requires everyone to make major sacrifices of their own comfort and prosperity.

As Elon Musk commented in 2022, “I think it’s important that we become a multi-planet species and a spacefaring civilization, because eventually the sun will expand and destroy all life on earth. So if one is a true environmentalist or cares about the future of life, it is obviously important that life become multiplanetary and ultimately multi-stellar.” That is a positive-sum environmentalist agenda with a decent shot at benefiting both human and non-human life. Conversely, when Dave Foreman advocates “ripping out any trace of human intervention” for nature’s sake, or Alex Epstein implies that achieving animal equality would require eliminating “all human impacts on animals,” or David M. Graber hopes “for the right virus to come along” to wipe out humans, they are falsely assuming a zero-sum relationship between humans and other lifeforms. They are missing the larger point that on a long enough timeline the interests of all living beings are aligned and are best served by technological and scientific progress.

This article was originally published at Quillette on 3/13/2025.

Blog Post | Human Development

Grim Old Days: A. Roger Ekirch’s At Day’s Close, Part 2

What was the world really like when nightfall meant fear, filth, and fire?

Summary: A. Roger Ekirch’s book offers a vivid and unsettling portrait of life after dark before the modern era. In a world lit dimly by candles and haunted by both real and imagined dangers, the setting sun marked the beginning of fear, vulnerability, and isolation. From rampant crime to ghostly superstitions, nocturnal life was fraught with hardship, mystery, and menace that shaped how generations lived.


The historian A. Roger Ekirch’s book At Day’s Close: Night in Times Past provides a fascinating window into our ancestors’ world. It offers insight into everything from the nocturnal dangers they faced, such as the threats of crime and fire, to their deeply uncomfortable sleeping arrangements. For excerpts from the book on that last subject, click here.

Nighttime in the past was far darker than today. Lighting was of poor quality and prohibitively expensive. “Preindustrial families were constrained by concerns for both safety and frugality.” Indeed, “even the best-read people remained sparing with candlelight. In his diary for 1743, the Reverend Edward Holyoke, then president of Harvard, noted that on May 22 and 23 his household made 78 pounds of candles. Less than six months later, the diary records in its line-a-day style, ‘Candles all gone.’”

Use of candles during the day was widely considered so extravagantly wasteful that it was avoided even by the wealthy. In 1712, the rich Virginia planter William Byrd II recorded finding an enslaved woman on his plantation named Prue “with a candle by daylight” for which he barbarically “gave her a salute with [his] foot” (in other words, kicked her). Jonathan Swift advised servants to never light candles “until half an hour after it be dark” to avoid facing wrath.

Most people, of course, had no servants (free or enslaved) and even fewer candles to spare. “At all hours of the evening, families often had to navigate their homes in the dark, carefully feeling their way” and relying on familiarity with the house. “Individuals long committed to memory the internal topography of their dwellings, including the exact number of steps in every flight of stairs.” The wood stair railing of a plantation in colonial Maryland features a distinctive notch to alert candle-less climbers of an abrupt turn.

“All would be horror without candles,” noted a 16th-century writer. Yet “light from a single electric bulb is one hundred times stronger than was light from a candle or oil lamp.” Although they were the best form of artificial lighting our ancestors knew, candles created only small and flickering areas of light. Rather than completely filling a room as artificial light does today with the flick of a light switch, candle light merely “cast a faint presence in the blackness,” not reaching the ceiling or the end of a room and leaving most of one’s surroundings still drenched in darkness. Even objects within the reach of the pitifully small pool of light could appear distorted. A French saying mocking the poor quality of candle illumination stated, “By candle-light a goat is lady-like.”

“Prices fluctuated over time, but never did wax . . . candles become widely accessible. . . . Tallow candles, by contrast, offered a less expensive alternative. The mainstay of many families, their shaft consisted of animal fat, preferably rendered from mutton that was sometimes mixed with beef tallow. (Hog fat, which emitted a thick black smoke, did not burn nearly as well, though early Americans were known to employ bear and deer fat.)” Vermin found such candles delectable. “Tallow candles required careful storage so that they would neither melt nor fall prey to hungry rodents.” Unpleasantly, candles “made from tallow gave off a rancid smell from impurities in the fat. . . . Wicks not only flickered, but also spat, smoked and smelled. . . . Still, despite such drawbacks, even aristocratic households depended upon them for rudimentary needs,” as wax candles were so expensive.

“Only toward the eighteenth century did cities and towns take half-steps to render public spaces accessible at night.” The average person remained indoors after sunset. “For most persons, the customary name for nightfall was ‘shutting-in,’ a time to bar doors and bolt shutters.”

Centuries later, little had changed. “Across the preindustrial countryside, fortified cities and towns announced the advance of darkness by ringing bells, beating or blowing horns from atop watchtowers, ramparts, and church steeples.” As rural peasants retreated into their homes, “townspeople hurried home before massive wooden gates, reinforced by heavy beams, shut for the evening and guards hoisted drawbridges wherever moats and trenches formed natural perimeters.” The writer Jean-Jacques Rousseau wrote of his panic as he rushed toward Geneva’s barred gates: “About half a league from the city, I hear the retreat sounding; I hurry up; I hear the drum being beaten, so I run at full speed: I get there all out of breath, and perspiring; my heart is beating; from far away, I see the soldiers from their lookouts; I run, I scream with a choked voice. It was too late.” When the Swiss writer Thomas Platter (1499–1582) found himself locked outside Munich’s city gate, he was reduced to seeking overnight shelter at a “leper-house.” In one French town, when a guard rang the bell signaling the gates were closing a half-hour too early, “Such was the mad crush of panicked crowds as they neared the gate that more than one hundred persons perished, most trampled in the stampede, others pushed from the drawbridge, including a coach and six horses. For his rapacity, the guardsman was broken upon the wheel. . . . Just to approach ramparts without warning at night constituted a crime.”

The time of shutting-in varied with the length of the day. “In winter, when darkness came on quickly, they could shut as early as four o’clock.” Laws even banned leaving one’s home at night. “In 1068, William the Conqueror (ca. 1028–1087) allegedly set a national curfew in England of eight o’clock.” Streets were blockaded to further discourage venturing outside after nightfall. “Lending weight to curfews, massive iron chains, fastened by heavy padlocks, blocked thoroughfares in cities from Copenhagen to Parma . . .  Nuremberg alone maintained more than four hundred. In Moscow, instead of chains, logs were laid across lanes to discourage nightwalkers. Paris officials in 1405 set all of the city’s farriers to forging chains to cordon off not just streets but also the Seine.” In the early 1600s, one writer noted of the French town of Saint-Malo: “In the dusk of the evening a bell is rung to warn all that are without the walls to retire into the town: then ye gates are shut, and eight or ten couple of hungry mastiffs turn’d out to range about town all night. . . . Courts everywhere exacted stiffer punishments for nighttime offences” than daytime ones. For example: “For thefts committed after the curfew bell, towns in Sweden decreed the death penalty.”

Toward the end of the Middle Ages, 9 p.m. or 10 p.m. became the standard “hour for withdrawing indoors” in much of Europe.

After nightfall, “for the most part, streets remained dark.” Even where early attempts at street lighting were made, they were seldom adequate. “As late as 1775 a visitor to Paris noted, ‘This town is large, stinking, & ill lighted.’ . . . Lamps in Dublin, as late as 1783, were spaced one hundred yards apart, just enough, complained a visitor, to show the ‘danger of falling into a cellar.’”

Sunsets were seldom considered beautiful. “Rarely did preindustrial folk pause to ponder the beauty of day’s departure.” Instead, most surviving descriptions of sundown were characterized by anxiety. “Begins the night, and warns us home repair,” wrote one Stuart poet.

Most ordinary people feared nighttime. “We lie in the shadow of death at night, our dangers are so great,” noted one English author in 1670. Shakespeare’s Lucrece calls nighttime a “black stage for tragedies and murders” and “vast sin-concealing chaos.” “According to Roman poet Juvenal, pedestrians prowling the streets of early Rome after sunset risked life and limb” because the darkness hid so many threats. Centuries later, similar warnings are recorded: “Except in extreme necessity, take care not to go out at night,” advised the Italian writer Sabba da Castiglione (c. 1480–1554).

Many cultures widely believed that demons, ghosts, evil spirits, and other supernatural threats would emerge after sundown, hiding in the all-encompassing darkness. “Evil spirits love not the smell of lamps,” noted Plato. “In African cultures like the Yoruba and Ibo peoples of Nigeria and the Ewe of Dahomey and Togoland, spirits assumed the form of witches at night, sowing misfortune and death in their wake.” The most feared time of night was often the “dead of night,” between midnight and the crowing of roosters (roughly 3 a.m.), which the Ancient Romans called intempesta, “without time.” The crowing was thought to scare away nocturnal demons.

Hence, “in the centuries preceding the Industrial Revolution, evening appeared fraught with menace. Darkness in the early modern world summoned the worst elements in man, nature, and the cosmos. Murderers and thieves, terrible calamities, and satanic spirits lurked everywhere.”

The night was filled with terrors both real and imagined. Fear of the night was ancient. In Greek mythology, Nyx, the personification of night and daughter of Chaos, counted among her children Disease, Strife, and Doom.

The Talmud, an ancient religious text, warns, “Never greet a stranger in the night, for he may be a demon.” After all, darkness hid “vital aspects of identity in the preindustrial world.” At night, “friends were taken for foes, and shadows for phantoms.” Ghostly nighttime encounters were widely reported throughout the preindustrial age, as widely held superstitions combined with a dearth of proper lighting to create traumatic experiences in the minds of many of our ancestors. “There was not a village in England without a ghost in it, the churchyards were all haunted, every large common had a circle of fairies belonging to it, and there was scarce a shepherd to be met with who had not seen a spirit,” an 18th-century writer in the Spectator claimed. “The late eighteenth-century folklorist Francis Grose estimated that the typical churchyard contained nearly as many ghosts at night as the village had parishioners.” Fear of such folkloric creatures was near-universal. Most ordinary people felt genuine, acute distress regarding the pantheon of evil spirits they feared lurked in the night:

Especially in rural areas, residents were painfully familiar with the wickedness of local spirits, known in England by such names as the “Barguest of York,” “Long Margery,” and “Jinny Green-Teeth.” Among the most common tormenters were fairies. In England, their so-called king was Robin Good-fellow, a trickster. . . . “The honest people,” if we may believe a visitor to Wales, “are terrified about these little fellows,” and in Ireland Thomas Campbell reported in 1777, “The fairy mythology is swallowed with the wide throat of credulity.” . . . Dobbies, who dwelt near towers and bridges, reportedly attacked on horseback. An extremely malicious order of fairies, the duergars, haunted parts of Northumberland in northern England, while a band in Scotland, the kelpies, bedeviled rivers and ferries. Elsewhere, the people of nearly every European culture believed in a similar race of small beings notorious for nocturnal malevolence.

In the minds of our ancestors, every shadow might hide trolls, elves, sprites, goblins, imps, foliots, and more. A favorite prank of young men was to affix “candles onto the backs of animals to give the appearance of ghosts.” The impenetrable darkness of the night before humanity harnessed electricity gave rise to imagined horrors beyond modern comprehension.

Other denizens of the nocturnal world included banshees in Ireland whose dismal cries warned of impending death; the ar cannerez, French washwomen known to drown passersby who refused to assist them; and vampires in Hungary, Silesia, and other parts of Eastern Europe who sucked their victims’ blood. . .  As late as 1755, authorities in a small town in Moravia exhumed the bodies of suspected vampires in order to pierce their hearts and sever their heads before setting the corpses ablaze. During the sixteenth and seventeenth centuries, reports of werewolves pervaded much of Central Europe and sections of France along the Swiss border, notably the Jura and the Franche-Comté. The surgeon Johann Dietz witnessed a crowd of villagers in the northern German town of Itzehoe chase a werewolf with spears and stakes. Even Paris suffered sporadic attacks. In 1683, a werewolf on the Notre-Dame-de-Grâce road supposedly savaged a party that included several priests.

And that is not all that the darkness ostensibly hid. “Known as boggles, boggarts, and wafts, ghosts reportedly resumed their mortal likenesses at night.” It was popularly believed that those who died by suicide were doomed to wander the night for all eternity as ghosts, and such ghosts were sometimes thought to assume the form of animals such as dogs.

Ghosts afflicted numerous communities, often repeatedly, like the Bagbury ghost in Shropshire or Wiltshire’s Wilton dog. Apparitions grew so common in the Durham village of Blackburn, complained Bishop Francis Pilkington in 1564, that none in authority dared to dispute their authenticity. Common abodes included crossroads fouled by daily traffic, which were also a customary burial site for suicides. After the self-inflicted death in 1726 of an Exeter weaver, his apparition appeared to many at a crossroads. “‘Tis certain,” reported a newspaper, “that a young woman of his neighbourhood was so scared and affrighted by his pretended shadow” that she died within two days. Sometimes no spot seemed safe. Even the urbane [English writer Samuel] Pepys feared that his London home might be haunted. The 18th-century folklorist John Brand recalled hearing many stories as a boy of a nightly specter in the form of a fierce mastiff that roamed the streets of Newcastle-upon-Tyne.

Material problems sometimes exacerbated such anxieties. Amid an episode of widespread starvation in Poland, one observer in 1737 opined, “This calamity has sunk the spirits of the people so low, that at [Kamieniec], they imagine they see spectres and apparitions of the dead, in the streets at night, who kill all persons they touch or speak to.”

Such superstitions inspired a feeling of terror that was all too real and could result in actual deaths. Sometimes our forebears literally died of fright, experiencing cardiac arrest from the sheer shock of glimpsing sights in the darkness that they interpreted to be fairies or other such entities. And ordinary people accused of being witches or werewolves could face execution. “In Cumberland, of fifty-five deaths arising from causes other than ‘old age’ reported in the parish register of Lamplugh during a five-year period from 1658 to 1662, as many as seven persons had been ‘bewitched.’ Four more were ‘frighted to death by fairies,’ one was ‘led into a horse pond by a will of the wisp,’ and three ‘old women’ were ‘drownd’ [sic] after being convicted of witchcraft.” (Note that fairies were considered dangerous, not adorable; an 18th-century rebel group of agrarian peasants in Ireland even adopted the moniker of fairies “to intimidate their adversaries”).

Many deaths attributed to legendary beings hiding in the darkness were caused by the darkness itself. Lethal nighttime accidents were common because of the poor state of lighting. “On most streets before the late 1600s, the light from households and pedestrians’ lanterns afforded the sole sources of artificial illumination. Thus the Thames and the Seine claimed numerous lives, owing to falls from wharves and bridges, as did canals like the Leidsegracht in Amsterdam and Venice’s Grand Canal.” Canals, unguarded ditches, ponds, and open pits of varying kinds were far more commonplace in the past, as concern for safety was considerably lower than in the present. “Many people fell into wells, often left unguarded with no wall or railing. If deep enough, it made little difference whether dry”—the fall was sufficient to cause death. Straying from a familiar route could prove lethal. “In Aberdeenshire, a fifteen-year-old girl died in 1739 after straying from her customary path through a churchyard and tumbling into a newly dug grave.”

“Even the brightest torch illuminated but a small radius, permitting one, on a dark night, to see little more than what lay just ahead.” Wind could blow out a torch or lantern in an instant. William Shakespeare described the frequent horror of “night wand’rers” upon seeing their “light blown out in some mistrustful wood” in his poem Venus and Adonis (1593). Traveling when the moon was bright could be the difference between life and death; by the 1660s, one in every three English families bought almanacs forecasting the lunar phases, and in colonial America, such almanacs “represented the most popular publication after the Bible.” In parts of England, the evening star (the planet Venus) was known as the Shepherd’s Lamp for its role in helping the poor navigate the night. An overcast sky could, of course, deprive a traveler of any celestial light from the stars or moon. Spaniards called such occasions noche ciega, blind nights.

Making nocturnal navigation even harder, ordinary people in the past were rarely fully sober. This lack of sobriety, when combined with darkness, could lead to confusion and accidents. “A New England newspaper in 1736 printed a list of more than two hundred synonyms for drunkenness. Included were ‘knows not the way home’ and ‘He sees two moons’ to describe people winding their way in the late evening.” In some cases, intoxication contributed to hallucinations of the supernatural and to deadly accidents. In Derby in England, one preindustrial “inebriated laborer snored so loudly after falling by the side of a road that he was mistaken for a mad dog and shot.” Similarly tragic episodes abounded. “On a winter night in 1725, a drunken man stumbled into a London well, only to die from his injuries after a neighbor ignored his cries for help, fearing instead a demon.”

When natural phenomena illuminated the night unexpectedly, our forebears often reacted with distress. Examples of such sources of illumination included comets, the aurora borealis, and swamp gas lights (produced when gases given off by decaying matter in marshlands oxidize and emit a faint glow). Many people took swamp gas lights to be a supernatural occurrence, termed will-o’-the-wisps.

All unusual nocturnal lights inspired terror and wonder in the people of the past, who often understood the lights as supernatural signs or portents. A comet in 1719 “struck all that saw it into great terror,” according to an English vicar, who noted that “many” people “fell to [the] ground” and “swooned” in fear. “All my family were up and in tears . . . the heavens flashing in perpetual flames,” wrote George Booth of Chester in 1727, when the aurora borealis, usually only visible farther north, made a rare appearance in England’s night sky and caused panic. One colonist in Connecticut “reportedly sacrificed his wife,” killing her in the hope that a human sacrifice might appease the heavens, upon seeing an unexpected light overhead (likely a comet). Occasionally, unexpected natural light sources could prove helpful. “Only the flash from a sudden bolt of lightning, one ‘very dark’ August night in 1693, kept the merchant Samuel Jeake from tumbling over a pile of wood in the middle of the road near his Sussex home.” More often, unanticipated lights in the darkness led to tragedy. “‘Pixy led’ was a term reserved in western England . . . for nocturnal misadventures attributed to will-o’-the-wisps.” Many deaths by drowning resulted from our forebears’ rash reactions to the sight of such “pixies” (in actuality, swamp gas).

Other nocturnal dangers were all too human, although they might pretend otherwise. “In Dijon during the fifteenth century, it was common for burglars to impersonate the devil, to the terror of both households and their neighbors. Sheep-stealers in England frightened villagers by masquerading as ghosts.” In 1660, the German legal scholar Jacobus Andreas Crusius claimed, “Experience shows that very often famous thieves are also wizards.” Many criminals indeed attempted to perform magic through grotesque superstitious rituals. “Some murderers hoped to escape capture by consuming a meal from atop their victim’s corpse. In 1574, a man was executed for slaying a miller one night and forcing his wife, whom he first assaulted, to join him in eating fried eggs from the body.” And that was not all.

The most notorious charm, the “thief’s candle,” found ready acceptance in most parts of Europe. The candle was fashioned from either an amputated finger or the fat of a human corpse, leading to the frequent mutilation of executed criminals. Favored, too, were fingers severed from the remains of stillborn infants. . . . To enhance the candle’s potency, the hands of dead criminals, known as Hands of Glory, were sometimes employed as candlesticks. Not unknown were savage attacks on pregnant women whose wombs were cut open to extract their young: In 1574, Nicklauss Stiller of Aydtsfeld was convicted of this on three occasions, for which he was “torn thrice with red-hot tongs” and executed upon the wheel (In Germany, a thief’s candle was called a Diebeherze.). . . . Before entering a home in 1586 a German vagabond ignited the entire hand of a dead infant, believing that the unburned fingers signified the number of persons still awake. Even in the late eighteenth century, four men were charged in Castlelyons, Ireland, with unearthing the recently interred corpse of a woman and removing her fat for a thief’s candle.

Many households also turned to attempts at magic to defend against thieves and monsters, using “amulets, ranging from horse skulls to jugs known as ‘witch-bottles,’ which typically held an assortment of magical items. Contents salvaged from excavated jugs have included pins, nails, human hair, and dried urine.” Some hung wolves’ heads over doors. “To keep demons from descending chimneys, suspending the heart of a bullock or pig over the hearth, preferably stuck with pins and thorns, was a ritual precaution in western England. . . . In Somerset, the shriveled hearts of more than fifty pigs were discovered in a single fireplace.”

Fear not only of evil spirits but also of such flesh-and-blood criminals lurking in the darkness kept most people indoors. In 1718, London’s City Marshal noted, “It is the general complaint of the taverns, the coffeehouses, the shopkeepers and others, that their customers are afraid when it is dark to come to their houses and shops for fear that their hats and wigs should be snitched from their heads or their swords taken from their sides, or that they may be blinded, knocked down, cut or stabbed. . . . As late as the mid-eighteenth century, a Londoner complained of the ‘armies of Hell’ that ‘ravage our streets’ and ‘keep possession of the town every night.’” Almost anyone who ventured outside did so armed. “As soon as night falls, you cannot go out without a buckler and a coat of mail,” opined a visitor to Valencia in 1603.

On a night in Venice, a young English lady suddenly heard a scream followed by a “curse, a splash and a gurgle,” as a body was dumped from a gondola into the Grand Canal. “Such midnight assassinations,” her escort explained, “are not uncommon here.” First light in Denmark revealed corpses floating in rivers and canals from the night before, just as bloated bodies littered the Tagus and the Seine. Parisian officials strung nets across the water to retrieve corpses. . . . In Moscow, so numerous were street murders that authorities dragged corpses each morning to the Zemskii Dvor [Zemsky Court] for families to claim. In London . . . Samuel Johnson warned in 1739, “Prepare for death, if here at night you roam, and sign your will before you sup from home.”

“On moonless nights in many Italian cities, young men called ‘Bravos’ prowled as paid assassins.” In some cases, affluent and highborn youths roamed the night looking for a fight: “Some cities saw the rise of nocturnal gangs composed of blades with servants and retainers in tow.” Most ruffians and thieves hiding in the darkness were common people out to commit robbery, not bored young noblemen hoping to enter a swordfight. “During the late sixteenth century, pedestrians in Vienna or Madrid rarely felt safe after dark. Foot-pads [thieves] rendered Paris streets menacing, a visitor discovered in 1620; one hundred years later, a resident wrote that ‘seldom not a night passes but some body is found murdered.’”

In London in 1712, a gang called the Mohocks terrorized the population: “Besides knifing pedestrians in the face, they stood women on their heads, ‘misusing them in a barbarous manner.’” The poet Jonathan Swift so feared that gang that he made a point of coming home early. “They shan’t cut mine [face],” he reasoned.

A lack of proper lighting afforded criminals ample cover to commit crimes. In 1681, the British dramatist John Crowne observed that night is “The time when cities are set on fire; / When robberies and murders are committed.”

Indeed, nocturnal crime was so common that a dictionary in 1585 defined thieves as felons “that sleepeth by day” so that they “may steale by night.” Surviving records suggest most preindustrial crimes occurred at night. “In the eighteenth century, nearly three-quarters of thefts in rural Somerset occurred after dark, as did 60 percent in the Libournais region of France.” “Of Italian peasants, a poem, ‘De Natura Rusticorum,’ railed: ‘At night they make their way, as the owls, / and they steal as robbers.’”

Even indoors, nocturnal thefts were so common as to be unremarkable. In 1666, Samuel Pepys awoke “much frighted” by the noise of a theft, but upon realizing the thief was merely robbing a neighbor and not Pepys’s own home, he went back to sleep feeling relieved. Urban areas were not the only sites of crime. Bands of thieves roamed the countryside. “Bands of a half-dozen or more members were typical, as were violent break-ins. . . . Wooden doors were smashed open with battering rams and shutters bashed apart by staves. Gaping holes were cut through walls of wattle and daub. Nine thieves in 1674 stormed into the Yorkshire home of Samuel Sunderland. After binding every member of the household, they escaped with £2500.” Criminal gangs were more common in some areas than others. “French gangs, known as chauffeurs, grew notorious for torturing families with fire.” Criminals carried either no lights or “dark lanterns,” which emitted light from only one side. (Merely possessing such a lantern constituted a crime in Rome and could lead to imprisonment.)

In preindustrial societies, violence left few realms of daily life unscathed. Wives, children and servants were flogged, bears baited, cats massacred, and dogs hanged like thieves. Swordsmen dueled, peasants brawled, and witches burned. . . . Short tempers and long draughts made for a fiery mix, especially when stoked by the monotony and despair of unremitting poverty. The incidence of murder during the early modern era was anywhere from five to ten times higher than the rate of homicide in England today. Even recent murder rates in the United States fall dramatically below those for European communities during the sixteenth century. While no social rank was spared, the lower orders bore the brunt of the brutality.

The thieves of the past were not picky and would even pry “lead from the roofs of dwellings.” After all:

Economic necessity begot most nocturnal license. With subsistence a never-ending struggle, impoverished households naturally turned to poaching, smuggling, or scavenging food and fuel. “The common people are thieves and beggars,” wrote Tobias Smollett, “and I believe this is always the case with people who are extremely indigent and miserable.”

“The working poor also took precautions, for even the most mundane items—food, clothing, and household goods—attracted thieves.” Each household, however humble, barricaded itself as night fell. “Doors, shutters, and windows were closed tight and latched.” Throughout most of history, locks were feeble and easily picked. “Not until the introduction of the ‘tumbler’ lock in the eighteenth century would keyholes better withstand the prowess of experienced thieves. In the meantime, families resorted to double locks on exterior doors, bolstered from within by padlocks and iron bars. . . . Also common, naturally, for those who could afford the expense, was the practical use of candlelight to ward off thieves. . . . In the Auvergne of France, so alarmed by crime were peasants in the mid-1700s that an official reported, ‘These men keep watch with a lamp burning all night, afraid of the approach of thieves.’”

While darkness caused lethal accidents, offered cover for crimes, and terrified our ancestors with the fear that the night might hide supernatural threats, fire could also kill. Understandable fear of fire motivated brutal punishments for arsonists and would-be arsonists. “A mob in 1680, upon learning that a woman had threatened to burn the town of Wakefield, carried her off to a dung heap, where she lay all night after first being whipped. A worse fate befell a Danish boatman and his wife, upon trying to set the town of Randers ablaze. After being dragged through every street and repeatedly ‘pinched’ with ‘glowing tongs,’ they were burned alive.” A 24-year-old University of Paris student was burned alive for arson in 1557. In Denmark, beheading was the usual punishment for arson. After a Stockholm bellringer failed to sound the alarm when a fire flared in 1504, he “was ordered to be broken on the rack, until pleas for mercy resulted instead in his beheading.”

Candles, hearth flames, and poorly cleaned or designed chimneys all posed constant fire hazards. “Some homes lacked chimneys altogether, to the consternation of anxious neighbors. Complaining that John Taylor, both a brewer and a baker, had twice nearly set his Wiltshire community ablaze from not having a chimney, petitioners in 1624 pleaded that his license be revoked. Of their absence in an Irish village, John Dunton observed, ‘When the fire is lighted, the smoke will come through the thatch, so that you would think the cabin were on fire.’”

Most ordinary homes among the impoverished masses were infested with vermin, and rats and candles proved a highly combustible combination. Flickering candles “made tempting targets for hungry rats and mice. Samuel Sewall of Boston attributed a fire within his closet to a mouse’s taste for tallow.” The Old Farmer’s Almanack advised placing candles “in such a situation as to be out of the way of rats.”

“Despite the introduction of fire engines in cities by the mid-seventeenth century, most firefighting tools were primitive,” the fire engines being mere tubs of water transported by runners on long poles or wheels. Rather than assisting in fighting the flames, neighbors often robbed burning homes. “Fireside thefts were endemic.” In England, “So routine was this form of larceny that Parliament legislated in 1707 against ‘ill-disposed persons’ found ‘stealing and pilfering from the inhabitants’ of burning homes.” “There was much thieving at the fire,” noted the Pennsylvania Gazette of a 1730 Philadelphia blaze.

“Often, barely a year passed before some town or city in England experienced disaster. From 1500 to 1800, at least 421 fires in provincial towns consumed ten or more houses apiece with as many as 46 fires during that period destroying one hundred or more houses each.” England was hardly unique in this regard. Across the preindustrial world, fires raged:

Fires spread terror from Amsterdam to Moscow, where an early morning blaze in 1737 took several thousand lives. Few cities escaped at least one massive disaster. . . . Toulouse was all but consumed in 1463, as was Bourges in 1487, and practically a quarter of Troyes in 1534. The better part of Rennes was destroyed in 1720 during a conflagration that raged for seven days. . . . Boston lost 150 buildings in 1679 after a smaller blaze just three years before. Major fires again broke out in Boston in 1711 and in 1760 when flames devoured nearly 400 homes and commercial buildings. . . . While New York and Philadelphia each suffered minor calamities, a fire gutted much of Charleston in 1740.

Rural areas were not necessarily safer from the threat of fires. The Danish writer Ludvig Holberg (1684–1754) observed, “Villages were laid out with the houses so close together that, when one house burned down, the entire village had to follow suit.” After all, rural construction materials were highly flammable. “Once ignited, a thatch roof, made from reeds or straw, was nearly impossible to save.”

New Atlas | Natural Disasters

Earth Fire Alliance Satellite for Detecting Wildfires Is Now in Orbit

“The first satellite in a constellation designed specifically to locate wildfires early and precisely anywhere on the planet has now reached Earth’s orbit, and it could forever change how we tackle unplanned infernos.

The FireSat constellation, which will consist of more than 50 satellites when it goes live, is the first of its kind that’s purpose-built to detect and track fires. It’s an initiative launched by nonprofit Earth Fire Alliance, which includes Google and Silicon Valley-based space services startup Muon Space as partners, among others.

According to Google, current satellite systems rely on low-resolution imagery and cover a particular area only once every 12 hours to spot significantly large wildfires spanning a couple of acres. FireSat, on the other hand, will be able to detect wildfires as small as 270 sq ft (25 sq m) – the size of a classroom – and deliver high-resolution visual updates every 20 minutes.”

From New Atlas.

Yahoo Finance | Natural Disasters

AI Breakthrough Is “Revolution” in Weather Forecasting

“Cambridge scientists have made a major breakthrough in weather forecasting after developing a new AI prediction model that is tens of times better than current systems.

The new model, called Aardvark Weather, replaces the supercomputers and human experts used by forecasting agencies with a single artificial intelligence model that can run on a standard desktop computer.

This turns a multi-stage process that takes hours to generate a forecast into a prediction model that takes just seconds…

Tests of the Aardvark model revealed that it is able to outperform the United States national GFS forecasting system using just 10 per cent of the input data.”

From Yahoo Finance.