In 1900, the material condition of the human race was unquestionably better than it had been a century earlier. In a country like England two hundred years ago, the poor lived in conditions worse than those endured by all but the poorest in the world today. Smallpox, cholera and typhoid were rife. Judicial punishments were often unbelievably harsh. Violence was part of the everyday landscape. Even children were required to work, in factories and down the mines, and education was only for the privileged.
Although the main beneficiaries, in financial terms, of much of the progress achieved during the nineteenth century in England and other industrializing countries were the grande bourgeoisie, the owners of capital, many advances also benefited the general population: public sewerage systems; railways; changes in medical practice, including the use of antiseptics and anaesthetics; and universal education are among the most obvious examples. Hence the positive appraisal of the human condition framed in the opening sentence.
A hundred years later, and despite two catastrophic world wars, the human race was once again in a position to assert that real progress had been made in the intervening century, with inventions such as motor cars, aeroplanes, computers, antibiotics, radio and television, and double glazing to set against the nuclear weapons and general brutality of the most violent century in the entire history of civilization.
Now fast forward to 2100. Will the human race be able to say that, on balance, progress has been made in the twenty-first century? The portents are not good. Although the problem had been brewing quietly for a couple of decades, the rise of militant Islam registered with the general public only after the attacks on the World Trade Center and Pentagon in 2001. There have been plenty of reminders since.
It is unlikely that this problem will be resolved anytime soon, and the potential for a global conflagration, originating in the Middle East with terrorism as the spark, should not be discounted. Islam and Christianity have been in conflict since the emergence of Islam in the seventh century, and the grievances that motivate modern Islamist terrorists are unlikely to disappear overnight.
However, there are even more serious threats to global stability. The most critical is probably population: the world’s human population is currently 6.9 billion. It was 2.3 billion in the year I was born. The rate of growth peaked at 22 percent per decade in the 1960s but has since declined. It is projected to continue to decline, but world population is still likely to exceed nine billion by mid-century, by which time the total will have stabilized, according to most projections.
If these projections are even reasonably accurate, then a major problem presents itself. The example provided by developed countries suggests that a population will stabilize only in response to increased prosperity, which is in effect a proxy for improvements in healthcare and thus increased life expectancy. As birth rates fall, the population pyramids of individual countries, then regions and finally the world will become increasingly top-heavy. A numerically stable population is an ageing population.
The pressures imposed by an expanding population impact heavily on resources, a catch-all term that includes food, energy and raw materials. The availability of food, in particular, is a serious problem: fish stocks have already collapsed in many parts of the world’s oceans, famine is an ever-present threat in some poor countries, and commodity speculators have entered the market, pushing up food prices beyond what many can afford. In addition, huge quantities of processed food are discarded in rich countries merely because they have reached their sell-by dates, while in many parts of sub-Saharan Africa up to one-third of all food produced is lost through spoilage because harvesting and distribution systems are woefully inadequate.
Sell-by dates are a blunt instrument as a measure to protect public health. A few years ago, I visited my local supermarket about ten minutes before closing time. It was selling Galia melons at five pence each, because once those ten minutes were up it would be forced to dump them. I bought six, precisely because they weren’t even ripe; it took between two and three weeks for that to happen. They were delicious, but that’s not the point. It’s just one small example of the kind of inefficiency in our food distribution systems that, if sorted out, would go some distance towards addressing what is probably the most shocking statistic on food: an estimated one billion people in poor countries don’t have enough to eat and often go to sleep hungry. Meanwhile, I can buy Galia melons in my local supermarket, even if I usually have to pay the full price. I feel vaguely uncomfortable with this kind of privilege.
The availability of energy is also likely to be an increasing problem. Eradication of poverty worldwide, a laudable goal that was set by the G8 group of countries just a couple of years ago, is not possible without almost unlimited supplies of cheap energy, yet energy costs are skyrocketing, principally because there isn’t enough to go round. Consequently, we hear the familiar exhortation by environmental activists to conserve energy. This may be the correct short-term strategy, but it is counterproductive in terms of alleviating global poverty. Indeed, there is no solution to this problem without a radical new source of energy. And there is only one candidate that fits this description: nuclear fusion.
I’d like to be able to predict that success in harnessing the vast amounts of energy available from nuclear fusion is just a decade or so away. Unfortunately, I’m not optimistic. Here’s the problem: it’s relatively easy to smash atomic nuclei to pieces, but it’s much harder to stick them together, and it requires a lot of energy to get started, because nuclei fuse only at very high temperatures. Fusing hydrogen into helium, the simplest of all fusion reactions, requires a temperature of tens of millions of degrees, at which point individual atoms no longer exist. Everything becomes a kind of subatomic soup, known as a plasma. As you can imagine, the behaviour of this nuclear fireball is difficult to control, yet controlling it is precisely the object of the exercise.
And here’s the rub: you may think of someone who studied physics at university as a physicist, but by the time they graduate they will already have started down the path into one specialism or another. In this case, because nuclear fusion is a nuclear process, research in the field is conducted by nuclear physicists, yet they are attempting to control a plasma, something that plasma physicists know a lot about. But because there is so much to keep up with in their own field, nuclear physicists don’t read plasma physics journals. Oh my.
Using raw materials unsustainably has been the default position since the start of the Industrial Revolution, and little has changed even though reserves of some key commodities are now perilously low. Although there is little likelihood that the ores of common metals like iron, copper and aluminium will run out soon, the availability of rare earth elements, which are vital in the electronics industry, is already severely constrained, mainly because, as their name implies, they were scarce to begin with. There are few minerals of which it can truly be said that supplies are plentiful.
Many raw materials can be produced sustainably, including timber and natural fibres, but this is never enough to meet demand. So we still see clear felling of old-growth boreal forests around the world, and tropical hardwoods such as teak and mahogany cannot be replaced fast enough to meet demand for luxury toilet seats from consumers in developed countries.
Natural fibres would seem, self-evidently, to present a ‘natural’ and sustainable alternative to their man-made counterparts, most of which derive ultimately from petroleum. However, it is instructive to look at two examples of the kind of environmental damage that can result from using land to produce these materials.
The mountains of the English Lake District were once covered with trees, which were cut down during the Neolithic period, when it must have seemed like a good idea. By the time the Middle Ages rolled around, wool had become an important commodity, with lots of money to be made, which is why sheep were introduced to the area by monks from local abbeys. It is a matter of regret that the monks’ successors continue to graze sheep on the mountains in large numbers, because one effect has been to prevent the regrowth of trees. Sheep are aliens in this fragile ecosystem because they will eat anything—except bracken. Nothing likes this highly adaptable fern, except fungi. So it continues to gain ground, choking everything in its path that hasn’t been eaten by the sheep.
Cotton is even more environmentally unfriendly, mainly because of the quantities of irrigation water required. Cotton cultivation can even be said to be behind one of the world’s worst man-made environmental disasters: the slow disappearance of the Aral Sea, which currently has an area only 10 percent of what it was in the mid-twentieth century. Planners in the Soviet Union in the 1940s decided that it would be a good idea to divert the two principal rivers flowing into the Aral Sea, the Amu Darya and the Syr Darya, in order to irrigate what had previously been desert and grow rice, other cereals and, principally, cotton. In keeping with this ramshackle plan, many irrigation canals were appallingly badly built, and up to 75 percent of the diverted water was (and still is) lost through either evaporation or leakage. And cotton is a notoriously thirsty crop anyway.
This sorry tale is one facet of a critical resource problem that is likely to become increasingly serious in the coming decades: the availability of water. In addition to individual requirements for drinking, cooking and washing, water is needed in large quantities for both agriculture and manufacturing industry. To take two examples, 6,000 litres of water is needed to produce a pair of denim jeans, while between 15,000 and 30,000 litres is required to produce a kilogram of beef. And, as usual, when there isn’t enough to go round, the potential for international conflict increases. We can safely assume that China knew what it was doing when it started building dams on the headwaters of the Brahmaputra in Tibet, and that India didn’t when it recognized Chinese sovereignty over Tibet.
One aspect of the water supply crisis is inextricably linked to climate change as a result of global warming: as the temperature of the atmosphere rises, turbulence increases, and precipitation events (rain, snow, hail) become shorter and more intense. More water disappears as runoff, resulting in more catastrophic floods, and less percolates into the soil, so the water needed for agriculture has to come from other sources.
The Ogallala Aquifer, which underlies most of the Great Plains of the United States, is the type example of environmental mismanagement here. This huge source of what is essentially fossil water has been tapped on a large scale since the 1950s to transform a vast plain whose natural climate is semi-arid (this is ‘dust bowl’ country) into the most productive agricultural region in the United States. Unfortunately, the level of the water table has been falling by as much as 1.5 metres per year in some places. And recharge rates are extremely slow. Does America have a Plan B?
It will be obvious that all these factors—population, food, water, raw materials and climate change—are interrelated, so any measure designed to address one must take into account the effect it will have on all the others. And nobody will want to make concessions, so nothing significant will get done; taking our own bags to the supermarket, in case you hadn’t noticed, hardly counts.
A cosy theory has been doing the rounds since the early 1970s. It’s called the Gaia Hypothesis, after the Greek goddess of the Earth, and it postulates the planet as a gigantic, self-regulating super-organism. It’s a seductive idea. However, a counter-theory has emerged recently. Palaeontologists studying mass extinctions in the Palaeozoic era noted that in the run-up to each extinction event, the dominant life-forms progressively despoiled their environments until, within a surprisingly short period, complete ecosystems collapsed. They have therefore proposed the Medea Hypothesis. In Greek mythology, at least according to Euripides, Medea killed her own children. Oh my.