Monday, December 29, 2008

How Many People Can the Earth Support... Really?

This is not simply a question of how many people can be crammed onto the dry surface of the planet. For example, it is easy to calculate (as I indicate below) that if the present number of 6.7 billion of us were each allowed an area of one square metre, we would collectively occupy just 6,700 square kilometres, or an area enclosed within a square about 50 miles by 50 miles, which would fit with spare room within a county the size of Yorkshire.
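The sum is easy to check; a toy calculation in Python, using only the figures above:

```python
import math

# 6.7 billion people, each allotted one square metre of ground.
population = 6.7e9
area_m2 = population * 1.0   # one square metre each
area_km2 = area_m2 / 1e6     # 1 km^2 = 1e6 m^2

side_km = math.sqrt(area_km2)      # side of an equivalent square
side_miles = side_km / 1.609

print(area_km2)           # 6700.0 km^2
print(round(side_miles))  # ~51 miles per side
```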

This is a comically naive piece of arithmetic, but not much more so than many sums I have seen done as to how many might exist on Earth given a daily diet of, say, 2,500 Calories, in which case a figure of around 12 billion can be deduced. So, whoopee, we may well pass that WHO estimate of 9 billion by 2050 and maybe get to 12 billion by the end of the century. Beyond simple issues of how much food we might need, and water for that matter, along with fertilizers and other material resources to build shelters and clothe ourselves, are more complex but equally fundamental questions centred around quality of life and human dignity.

So, what do we mean by living, and what standards of it might be considered acceptable? Would we in the West want to "fall" materially to the standards, say, of western Africa? Or is the economic dream more wishful: to raise the living standards of the majority world to those of the West? The answer is really the proverbial elephant in the room. If the whole existing number of people on earth lived at the standard of an average American (if there is really an average anybody), it is said we would need five planets' worth of resources.

It's about 3.5 planets for a typical European (an even less average scenario), and four earths if everyone lived like Australians. Simply put, the latter prospect is not viable, and in the longer run neither is it for Americans, Europeans or Australians, let alone the whole world. Modern, chemically fertilized, mechanised farming is very successful. We can also kill off pests with synthetic pesticides, and so crop yields on Western farms are the best on earth, but they are unlikely to be sustainable.

So, in the absence of plenty of cheap oil and natural gas, what is the upper limit of population, or conversely, the lower limit of material quality of life? Well, O.K., let's say that we are all allowed that 2,500 Calories every day.

6.7 x 10^9 people x 2500 Cal/day. 1 Cal = 1000 cal, and 1 cal = 4.18 J, so 1 Cal = 4180 J.
So they would all eat: 6.7 x 10^9 x 2500 x 4180 x 365 = 2.56 x 10^19 J/year.

So, what might be grown in total?
We have 15 x 10^6 km^2 of arable land for crops. If we assume that 2 tonnes of edible food can be grown (from maybe 5 tonnes of crops mass) per hectare/year, and that this is in the form of simple sugars, i.e. C6H12O6, with an energy content of 2800 kJ/mol, we get:

(2 x 10^6 g/year / 180 g/mol) x 2800 x 1000 J/mol = 3.11 x 10^10 J/ha/year. Converting 1 km^2 = 100 ha, that is grown on 15 x 10^6 x 100 ha of land, so we have:
1.5 x 10^9 ha x 3.11 x 10^10 = 4.67 x 10^19 J/year.

Now, this might be seen as good news, in that we can produce around 80% more food energy than is consumed by the present 6.7 billion humans, leading to the conclusion that the earth can support around 12 billion of us. So, as I said, maybe we can meet the WHO targets of 9 billion by 2050 and 12 billion by 2100.

But, we would all be living on the proverbial "bowl of rice a day" [(2500 x 4180)/(2800 x 1000) = 3.7 mol of sugar, i.e. about 672 grams].

(This ignores growing any food for animals, assumes that these crop yields could be obtained without artificial fertilizers and pesticides, and assumes that no crops would be grown for biofuels.)
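For anyone who wants to play with the numbers, the whole back-of-envelope budget above can be reproduced in a few lines of Python (all figures are those assumed in the text):

```python
# Demand side: 6.7 billion people at 2,500 Calories a day.
people = 6.7e9
cal_per_day = 2500          # "big" Calories (kcal) per person per day
J_per_Cal = 4180            # 1 Cal = 1000 cal, 1 cal = 4.18 J

demand = people * cal_per_day * J_per_Cal * 365   # J/year eaten, ~2.56e19

# Supply side: 15e6 km^2 of arable land, 2 tonnes of edible food per ha/year,
# treated as glucose (C6H12O6, 180 g/mol, 2800 kJ/mol).
arable_ha = 15e6 * 100                            # 100 ha per km^2
mol_glucose_per_ha = 2e6 / 180                    # 2 tonnes -> moles of glucose
supply = arable_ha * mol_glucose_per_ha * 2800e3  # J/year grown, ~4.67e19

surplus = supply / demand                         # ~1.8, i.e. ~80% more than is eaten
daily_ration_g = (cal_per_day * J_per_Cal) / 2800e3 * 180  # ~672 g of sugar a day
```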

I have mentioned before a Hubbert-type analysis that can be applied to human population growth, which is already slowing down. This predicts a maximum of 7.1 billion people in 2024 (not many more than the 6.7 billion now), after which there will be a decline to 2.5 billion by 2100. I would not be at all surprised, because the resources to support population are limited.

Now, that 2.5 billion at the end of the century may all be living equally and equitably on one planet's worth of resources. More likely, there will be even more poverty all over the world, but with a smaller differential between the developing and industrialized nations, if either category still applies by then. The level of poverty can only fall so far, because the lower limit of destitution is death.

I think many in the West particularly would sooner die than return to the conditions of a pre-industrialized society, even if that could be relatively well provided for by agriculture.

Tuesday, December 23, 2008

The End Times.

The end times (end of days) are most often worked into a message that God is set to bring Armageddon upon a sinful humanity, who must repent their sins; even so, only "one hundred and forty and four thousand" of them will be spared into heaven, according to what is written in the Book of Revelation. The meaning of the complex, confusing and fearful scenes described in that closing book of the Bible is open to interpretation, and biblical scholars vary in their opinion and offer different readings of it; yet without doubt, it seems to point to a chain of events leading to an outcome in which the Earth is utterly and forever changed. If we believe that the description of these end times is a forewarning of human destruction, then it is easy to look at the many and interconnected troubles of a sick planet with sicker humans on it, and to find in Revelation a map in which our actions of greed and disrespect for each other, ourselves and the Earth are simply predicted by divine will. On that reading, we should simply carry on as we are, contemptuously consuming ever more, laying waste to all resources and carving the way to majority dissolution and death, in the cause of the will of a creator who wrote the end times of humanity into His code for our destiny.

Other religious teachers read Revelation as an historical record, in which the Whore of Babylon was the Roman Empire rather than an agent of Satan, although to the persecuted of the time it may well have seemed that the Romans were themselves agents of the devil. The subject is fascinating: the psychic Edgar Cayce, for example, saw the Book of Revelation as symbolic of the body, so that each emblem, emotion and condition relates to the person. Thus the elders of Revelation 4:4 relate to the 12 pairs (24) of cranial nerves, and the seven churches of Asia in Revelation Ch. 1 are symbolic of the seven chakras, i.e. spiritual centres. On this view the end times, if that is what they are, reflect changes within individuals, and the second coming of Jesus Christ, rather than being part of an almighty cataclysm, is an event for each of us singularly: the end times of particular attitudes, beliefs and behaviours within each and every one of us, through which we enter a state of enlightenment. Whatever the truth of any of these interpretations, most of the cheap and readily available resources of materials and energy will not be so for much longer.

Thus, these are the end times of casual regard for resources, e.g. for many metals, and for fuels like oil, gas, uranium and even coal (if the end days last 50 years or more), and the beginning of an enlightenment to the reality that we must use them less and use them well. The alternative may be that we are living in the lead-up to an apocalypse of war, famine, disease and death, and that those four horsemen are on their way, at least if we do not begin to care for the earth, for one another and indeed for ourselves as individuals, by curbing greed and complacency - the principal sins of disregard. We are indeed running low on cheap resources of metals and energy, and the spiritual values of family and community have been plundered in the process that has led to that which we call progress. It is beyond dispute that there is much to be applauded in the advances of the past century and more, in healing sickness and raising countless millions out of abject poverty, but there has been little symbiosis with the natural world, from which we are largely isolated in a vulnerable and fragile bubble, which could so easily be popped by any number of resource shortages, new strains of disease, or famine.

Even money, in the new economic miracle, is proving to be a phantom, spirited away like a will-o'-the-wisp on a financial system "built" on credit - i.e. on nothing. It is the issue of money as a false god, consuming both material resources and human inner resources, that I read most closely in Revelation, but that is just an impression. If the end times are not to become the end of days, we need to address the problem, and readapt our attitudes, beliefs, behaviours and actions to avert this outcome. This will involve both a material and a spiritual transformation, taking us into an enlightened new age. I have great faith in and respect for the better qualities of humans, and a belief that we can find an earth-centred haven from the current white noise of fear and despair. I am an optimist.

“Then he showed me the river of the water of life, sparkling like crystal, flowing from the throne of God and of the Lamb down the middle of the city's street. On either side of the river, stood a tree of life, which yields twelve crops of fruit, one for each month of the year. The leaves of the tree serve for the healing of the nations, and every accursed thing shall disappear.”

Related Reading.
Revelation 21-22: "The New English Bible: New Testament", Popular Edition, Oxford University Press; Cambridge University Press, 1961.

Friday, December 19, 2008

"Peak oil: postponed"? Dr Richard Pike.

This is the title of the transcript of a recent interview with Dr Richard Pike, the CEO of the Royal Society of Chemistry, by Andrew Orlowski. For many years, the RSC (not the Royal Shakespeare Company) has been a fairly dormant animal, and I used to wonder what exactly its purpose was, even though I am a Fellow of the Royal Society of Chemistry, which I became shortly after I was awarded a research professorship in chemistry. However, its role is to sound the voice of chemistry in the United Kingdom, especially during these tough and inclement times, in regard to funding, the relative unpopularity of hard subjects such as the sciences (on which students are reluctant to enrol for degree courses), and the health of the chemical industry. Hence I applaud Dr Pike for his proactive stance on important issues where chemistry and a sound chemical training really do matter, namely those concerning the many challenges posed by our energy and environmental demands, such as peak oil and global warming.

He has previously pointed out that growing crops to make biofuels is a non-starter, at least on a petroleum-significant scale, since otherwise there is a conflict between growing crops to feed cars or humans. I couldn't agree with him more, and have said as much on various occasions, in appropriate postings on here and in my regular monthly column. Dr Pike is also of the opinion (as I noted in yesterday's posting) that there is most likely far more oil in the ground to be recovered than the 1.2 trillion barrels that is generally quoted. I don't disagree, but I stress that it is the rate of recovery that is the most pressing issue, not so much how big the reserve is in total, and we will experience a demand-supply gap within the next decade, for sure, as even the CEO of Shell concurs.

Dr Pike points out that the figures given by the oil companies tend toward the conservative side, and that if a probabilistic analysis is done, based on the P50 estimate (see yesterday's posting), which refers to "proved plus probable" oil reserves, rather than the P90 (90% chance of the oil being recovered), the ultimately recoverable resource (URR in Hubbert terms), or size of the oil bounty, can be "two or three times" greater. Thus, we might expect to recover a grand total of 2.4 trillion barrels, not 1.2 trillion, even if the P90 figures given by some nations are suspect.

As Pike says, "P90 is a lower bound, and companies have a duty to report what the lower bound is to statutory bodies, such as the Securities and Exchange Commission, and BERR in the UK. And that figure is conservative. Over time, "lower bound" has come to mean "proven reserves". But it's actually the extreme left hand side of the probability curves."

Dr Pike makes some good points about peak oil doomsayers ("eschatologists", i.e. those who believe in the end times usually attributed literally to the "events" described in the biblical Book of Revelation), who think that "the end of the world is nigh". In fact there is a great deal of misunderstanding about what peak oil means, and this is where Pike's point is particularly salient. Many think that peak oil = end of oil, but that is not what it means, and nobody versed in the Hubbert analysis has, to the best of my knowledge, claimed as much.

Peak oil = end of "cheap" oil (the title of my posting here on May 13th, 2008), and we will indeed be producing oil for many decades yet, and many more in accord with Dr Pike's analysis. The Hubbert curve (or its adaptation, the Hubbert Linearization) refers to the production of a particular field, and to date this has meant readily available, cheap oil. It is not strictly within its remit to refer to a global peak, since to derive such a thing means averaging over the production of many different oil fields in the countries of the world that produce oil. All fields have different capacities, and are at different stages in their production (or depletion). One consequence of this is that some countries will run out of oil more quickly than others, which will shift economic power and stir up geopolitical tensions. Russia comes to mind. The New World Order will be determined by who has the oil, and the power attendant on it, while those like the UK who are short on oil will be accordingly weakened.

However, even if the world is not about to run out of oil, producing oil against a rising demand for it will raise the price, and there will be economic fallout, probably a recession and then a Long Emergency scenario according to Kunstler. Eventually there will be an effective global "peak" when overall oil supplies do decline and this will either widen the supply-demand gap or create it if it has not already come about. Pike concedes that there will be a peak eventually but no one knows when exactly; however, he considers that the analysis done so far is "ill informed" and that it will not come about immediately. It is true that we will only know the date of peak oil retrospectively, but most analyses suggest it will be with us by 2012 if not before. Even if peak oil is "postponed" the gap will not be, though it will certainly widen after the peak.

Once a supply-demand gap manifests, the price of oil will increase relentlessly, or at least so far as the market can bear. Pike notes that "you can buy your way out of capacity constraints". This is also quite correct, but it is a dicey business. I agree with him that if you invest in overcoming "surface constraints", as in the number of wells, gas/oil separators, pipelines, storage tanks or jetties, the supply of oil is improved, but the cost of a barrel of oil inevitably and accordingly increases. He says, "The rough rule of thumb that applied three or four years ago was that to get 1 million barrels a day extra, you needed to spend in the order of $10 billion, very approximately."

He notes that the money is recovered very soon, and that at $140 a barrel, the payback is 100 days. Yes, but we have seen the economic consequences of such high oil prices, coupled with a distinctly dodgy global financial system, and while the output of oil might be increased, it will cost.
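Pike's rule of thumb is easy to sanity-check. A rough sketch in Python (gross revenue only, ignoring operating costs, taxes and field decline, which is presumably part of why the quoted payback is nearer 100 days than the naive figure):

```python
# Pike's rule of thumb: ~$10 billion of investment buys ~1 million barrels/day
# of extra capacity. At $140/barrel, how fast is that capital recovered?
capex = 10e9                            # $ per 1 Mbpd of new capacity
extra_bpd = 1e6                         # barrels per day gained
price = 140                             # $/barrel (summer 2008 peak territory)

revenue_per_day = extra_bpd * price     # $1.4e8/day gross
payback_days = capex / revenue_per_day

print(round(payback_days))  # ~71 days gross; with real-world costs, of the order of 100
```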

In terms of renewables and chemistry, Pike states that "no economies are yet geared-up for electricity as a direct heating source, or as automotive fuel, or for hydrogen storage." This is absolutely true. He thinks that solar-energy is the answer, but there is the scale-up problem I have commented on before. In other words, even if solar/PV can be done using thin-film cells and using organic conductors (otherwise the shortage of platinum metal will scupper the whole enterprise, along with fuel-cell technology), it will still take decades to install enough to run the world on. This will indeed need to be "putting things together on a grand scale which requires leadership, because we're in a position where some of these decisions are not made by individuals or individual companies. It's going to require a lot of collaboration." Yes, and collaboration between entire nations and continents, probably, which might prove a longer job.

Pike also stresses the issue of scientific ignorance among the public, and refers to the level of questions being set to 14-year-olds on science courses. He comments that, while the course material is often comprehensive, the examinations barely skim it, and are almost fail-proof. Without better education, the next generation of policy makers is likely to be as scientifically illiterate as the present one; the cycle needs to be broken.

I agree, but it is quite ironic that in this age of huge university expansion, most of the ex-polytechnics (which did a fine job of teaching science to technicians from industry), now that they are the new universities, don't teach chemistry. In my opinion these institutions should be restored to the technical colleges they once were, and well, because in the time to come, as the energy crunch bites, we will need people who know how to do useful things, not a rising army of pharmacists, psychologists or media-studies graduates, taught by "new professors", some with no published work in the subject they are supposed to be professors of. A professor of "Chemical Education" is one lamentable example that comes to mind.

Related reading.
"Peak oil: postponed". By Andrew Orlowski.

Thursday, December 18, 2008

Oil Reserves.

The term "oil reserve" refers to quantities of crude oil that are claimed or estimated to be recoverable given a prevailing set of economic and operating conditions, i.e. mainly in regard to price. If the price of oil increases, then more may be transferred from the resource to the reserve, as is true of all commodities.

The term "oil in place" is that amount of oil which is estimated to be held by a given oil reservoir, including that which will prove unrecoverable, in consequence of the particular geology and other properties (e.g. degree of fracture) of the reservoir. It is thus to be classified as the resource, while the fraction that can be produced is the reserve.

The term "recovery factor" is the ratio of producible oil reserves to total oil in place for a given field. Recovery factors vary greatly from field to field, and the recovery factor of any particular field may also change over time, according to price and as new technologies for extracting oil, e.g. enhanced recovery methods, are introduced.
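As a minimal sketch of the definition, with invented numbers for a hypothetical field:

```python
# Recovery factor = producible reserves / total oil in place, for a given field.
def recovery_factor(reserves_bbl: float, oil_in_place_bbl: float) -> float:
    return reserves_bbl / oil_in_place_bbl

# Hypothetical field: 10 billion barrels in place, 3.5 billion judged recoverable.
rf = recovery_factor(3.5e9, 10e9)
print(rf)  # 0.35, i.e. a 35% recovery factor
```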

There are four criteria that must be fulfilled for the classification of a reserve, namely that it must be:

(1) discovered through one or more exploratory wells,

(2) recoverable using existing technology,

(3) commercially viable (given the contemporary economic climate),

(4) remaining in the ground.

All reserve estimates carry a degree of uncertainty, according to the available geological data and how these are interpreted. Accordingly, a further subdivision is introduced to indicate the relative degree of that uncertainty, using the classifications proved and unproved, as defined below.

Proved Reserves.

These are reserves that are claimed to have a reasonable certainty (usually at a confidence of 90%) of being recoverable under existing economic and political conditions, and using existing technology. In the industry, this is known as P90 (i.e. with a 90% certainty of being produced). Proved reserves are also known in the industry as 1P.

Proved reserves are further sub-classified as Proved Developed (PD) and Proved Undeveloped (PUD). PD are reserves that can be produced from existing wells, or from additional reservoirs where any additional investment (operating expense) is minimal. PUD reserves require additional capital investment, e.g. drilling new wells and introducing gas-pressurisation in order to bring the oil and gas to the surface.

Companies listed on U.S. stock exchanges must substantiate their claims; however, there are governments and national oil companies which do not do this, leading to some speculation that, e.g., the Saudi fields may hold less oil than is claimed.

Unproved Reserves.

Probable reserves are based on median estimates, and claim a 50% confidence level of recovery, which is referred to in the industry as P50 (i.e. with a certainty of being produced of 50%). This case is referred to in the industry as 2P (i.e. proved plus probable).

Possible reserves have a lower probability of being recovered than probable reserves. The term P10 is often used for reserves with at least a 10% certainty of being produced. Reasons for classifying reserves as possible include varying interpretations of the geology; reserves not producible at commercial rates; uncertainty due to reserve infill (seepage from adjacent areas); and projected reserves based on future recovery methods. The term in the industry is 3P (proved plus probable plus possible).

Unproved reserves are used internally by oil companies and government agencies for future planning purposes, but do not normally feature among the numbers quoted in external publications, which tend to err on the side of caution. This is perhaps no surprise since in 2004 Shell got itself into a lot of trouble when it was found to have considerably overestimated the amount of oil in its holdings.

Dr Richard Pike, the CEO of the Royal Society of Chemistry, and an "oil-man" of some 24 years' experience, has made the case that the normal procedure of simply adding together the individual oil holdings to make a grand world total of 1.2 trillion barrels is inaccurate, and that a probabilistic analysis (i.e. weighting the amount likely to be recovered by the probability of recovery from each field) is a better approach. The result is significantly different, since it suggests that the amount of recoverable oil is most likely more than double the accepted estimate, i.e. in excess of 2.4 trillion barrels.
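Why summing each holding's conservative (P90) figure understates the probabilistic world total can be illustrated with a toy Monte Carlo: the fields are unlikely all to come in at their lower bounds at once, so the P90 of the sum is much larger than the sum of the P90s. (The distributions below are invented for illustration; they are not Pike's actual data.)

```python
import random

random.seed(1)
N_FIELDS, N_TRIALS = 50, 20_000

# Hypothetical fields: recoverable oil lognormally distributed (arbitrary units).
fields = [(0.0, 0.5)] * N_FIELDS   # (mu, sigma) of ln(recoverable)

def pctile(values, pct):
    """Value below which pct% of samples fall."""
    return sorted(values)[int(pct / 100 * len(values))]

per_field = [[random.lognormvariate(mu, s) for _ in range(N_TRIALS)]
             for mu, s in fields]

# Industry practice: report each field's P90 (90% chance of at least this much,
# i.e. the lower decile of its distribution), then add them up.
sum_of_p90s = sum(pctile(v, 10) for v in per_field)

# Honest aggregate: sum across fields within each trial, then take the lower decile.
totals = [sum(per_field[f][t] for f in range(N_FIELDS)) for t in range(N_TRIALS)]
p90_of_sum = pctile(totals, 10)

print(p90_of_sum / sum_of_p90s)  # comfortably > 1, of the order of 2 here
```

With these invented distributions the ratio comes out at roughly two, which is at least of the same flavour as Pike's "two or three times".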

I have no dispute with what Dr Pike is saying, and there may well be more oil down there than is generally spoken of. However, if that oil cannot be recovered at a sufficient rate (given the prevailing economic situation) to match rising demand for it, a demand-supply gap will ensue. Economically this will push up the price of oil and encourage further development of even previously non-economic sources, but a rocketing oil price is likely to force an economic downturn overall, as we have seen in these last months of 2008. Indeed, a number of oil-development projects have been put on-hold because the contemporary oil-price is so low as to not make them worthwhile.

Peak oil will come, as it must, if it has not done so already, but it is the gap between demand and supply that is the real issue: the actual peak will simply make matters worse, by drawing down the supply side further and enlarging the chasm between the two. Then the price of oil will increase relentlessly.

Related Reading.

"Peak oil: postponed". By Andrew Orlowski.

Wednesday, December 10, 2008

Water Vapour Heating Planet.

Just a quick note that I thought might be of interest to you. NASA have made the most detailed measurements yet of water vapour in the lowest ten miles of the atmosphere, using a satellite with specific sensors for this gas, which strongly absorbs heat radiated from the surface of the earth. It is thought that the heat-trapping ability of water vapour could amplify the global warming caused by carbon dioxide to as much as twice that in its absence.

Andrew Dessler from Texas A & M University has employed data gathered from the Atmospheric Infrared Sounder (AIRS) on the NASA Aqua satellite, measured over the period 2003 - 2008. The device is the first with the ability to differentiate between the amounts of water vapour present at different altitudes.

Through a combination of the satellite data and global average surface temperature readings, information has been garnered to determine exactly how water vapour changes with, and in turn influences, temperature. It appears that a warming of the planet by 1 degree C will cause an elevation in humidity that traps an additional 2 watts of heat per square metre, which is in line with the predictions of climate models. Thus, the feedback effect of water vapour on global warming is both large and positive.
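To put a scale on that figure, a back-of-envelope sketch (taking the standard value of about 5.1 x 10^14 m^2 for the Earth's total surface area):

```python
# Scale of the measured feedback: an extra 2 W/m^2 trapped per 1 deg C of warming,
# multiplied over the Earth's surface.
feedback_per_degree = 2.0       # W/m^2 per deg C (the AIRS result quoted above)
earth_surface_m2 = 5.1e14       # standard figure for Earth's surface area

extra_watts = feedback_per_degree * earth_surface_m2
print(extra_watts)  # ~1e15 W of additional trapped heat per degree of warming
```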

Related Reading.
"Water Vapour Warming," By Olive Heffernan.

Saturday, December 06, 2008

A Recent History of Oil Prices.

In January 1999 the price of a barrel of oil reached a low point of $16, when Iraq increased its oil production just as the Asian Financial Crisis cut demand for oil. Prices then increased rapidly, reaching $35 in September 2000, and after a temporary fall reached $40-50 by September 2004. Crude oil prices surged to a record high above $60 in June 2005, and by early August 2005 hit $65 as consumer demand was maintained. In September 2007, the price of US crude oil broke the $80 barrier. In October 2007 a barrel of US light crude exceeded $90 for the first time, due to a combination of tensions in eastern Turkey and a fall in the value of the US dollar. The next psychological watershed of $100 was briefly breached in early 2008; the price then fell back until the end of February, after which it remained above, and rose well beyond, this new level. A visible ramping effect then became evident: the price exceeded $110 on March 12, 2008; $125 on May 9, 2008; $130 on May 21, 2008; $140 on June 26, 2008; and $145 on July 3, 2008. The record was reached on July 11, 2008, at $147.27, as a consequence of geopolitical tensions over Iranian missile tests.

The above data stress the point that the price of oil is highly sensitive to the world political situation and to a general sense of confidence, including that in the stock markets. When the $147 barrel arrived, it seemed there would be no stopping the escalating price of oil, and that by December 2008 a barrel of oil might cost around $150 or more, amid speculation that by the end of 2009 it would be nearer $200. However, oil prices declined by more than $20 over the next two weeks in July 2008, and seemed to stabilise at near $125 a barrel on July 24, 2008. A forcing factor had come into play: the very high price of oil had changed people's behaviour, and they were now driving less, with a reduced demand for oil. Oil prices then dipped further, reaching $112 a barrel on August 11, 2008.

On September 15 the $100 psychological barrier was again broken, but in reverse, when the price fell below $100 for the first time in seven months. On October 11 there occurred a massive crash in the value of global equities, with a barrel of oil falling by 10% to $77.70. In consequence of further economic slowdown the price continued to slide, and today (December 4, 2008) it is trading at around $45 a barrel. Rather than the $200 predicted last summer, some analysts are now predicting a $20 barrel sometime during 2009. I must stress, however, that even if this does happen it will be a short-lived event, because the underlying facts remain: geological limits to production, increased production costs to obtain more difficultly recovered oil, and a demand that is still rising (demand is simply rising less steeply during this economic recession, but it is still in the ascendant).

The upshot will be a gap between the demand for crude oil and the limits of how much of it can be produced, a quantity that must inevitably fall beyond the point of arrival of "peak oil", which will force the price up again. Thus oil will become a commodity that is both increasingly scarce and relentlessly expensive. The price of oil (and that of all commodities) is subject to major variations over time, since it is inextricably linked to the overall business cycle. When the demand-supply gap is reached, oil prices will soar, but the commitments and habits that determine the energy use of oil-users will take time to adjust. It is time-consuming and expensive to introduce more production capacity in the near term, but in the longer run, both businesses and individuals will act to cut back their oil use in response to the driver of high prices. An optimistic economist might argue that high prices promote new investment in production, and so new sources of oil will emerge on the market, gradually restoring a supply-demand balance.

The remarkable hike in the price of oil in the summer of 2008 was driven partly by a brief period when global demand for oil outran its supply. The OPEC nations were encouraged to ramp up their production to get the price down for western nations, who would be unable to maintain their demand for oil if its price remained at such high (and relentlessly increasing) levels. My suspicion is that the summer spike in oil prices led to a fall in oil use and a significant curb in demand for cars and other goods, which became notably more expensive, and precipitated the present economic downturn. Job losses and less disposable income then meant that many in the low-income bracket were unable to pay back their mortgages, and a credit crunch ensued, with a lack of confidence in and among banks which had lent money rather carelessly. The increased output of oil by OPEC, along with a contraction and the threat of further slowdown in the business sector (hence less oil being used), has created a minor glut, forcing down the price of oil, increasingly so along with the declining value of all equities. While a low oil price should help businesses to invest and expand, the credit crunch and the banks' fear of lending money have acted in the reverse, and prompted a recession.

On the basis of microeconomic theory, when supply exceeds demand, the price of a commodity should reduce to the nominal cost of production of the most expensive source. In the case of oil, as its price drops, the most expensive wells become uneconomical and are shut down, at least temporarily. A price equilibrium is met at a point near the production cost of the most expensive source required to meet global demand. The variation between what the market can bear in the first days of shortage and the marginal cost of the last well in times of surplus can be enormous. Indeed, the prices of the majority of commodities, including metals and food, are subject to equivalently large swings over time. As global oil production begins to decline, following "peak oil", oil prices are likely to become more volatile than before the peak, because the range of production costs among all sources supplying the market will be much wider. Major oil fields exist where the cost of production is comfortably below US$10 per barrel, and for decades their output was sufficient to meet the whole of global demand for oil. Indeed, many such cheap wells still provide a substantial proportion of the world's oil.
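The equilibrium argument can be sketched as a toy "supply stack" (all capacities and costs below are invented for illustration): sort the sources by production cost, and the clearing price settles at the cost of the marginal source still needed to meet demand.

```python
# Toy marginal-cost supply stack. The clearing price is set by the most
# expensive source required to satisfy demand. All numbers are invented.
sources = [  # (capacity in Mbpd, production cost in $/bbl)
    (30, 8),    # cheap conventional fields
    (25, 25),   # other conventional onshore/offshore
    (15, 50),   # deep water
    (10, 70),   # oil sands
    (8, 100),   # enhanced recovery, shale
]

def clearing_price(demand_mbpd: float) -> float:
    supplied = 0.0
    for capacity, cost in sorted(sources, key=lambda s: s[1]):
        supplied += capacity
        if supplied >= demand_mbpd:
            return cost  # the marginal source sets the price
    raise ValueError("demand exceeds total capacity")

print(clearing_price(50))  # cheap and conventional fields suffice: $25/bbl
print(clearing_price(85))  # the expensive end of the stack is needed: $100/bbl
```

Note how a demand shift from 50 to 85 Mbpd swings the price from $25 to $100: a steep supply curve means modest demand changes produce violent price changes, which is exactly the post-peak volatility argument above.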

The shortages and high prices that are inevitable in the future will render viable the extraction of oil sources that cost $50, $70, $100 or more a barrel, including offshore/deep-water fields, oil sands, oil shale, and enhanced/secondary recovery from depleted fields. Couched in the jargon of microeconomic theory, the supply curve will be much steeper than in past years. Shifts in demand, either up or down, will hence cause swings of relatively greater amplitude in the market price. Nonetheless, even the most expensive sources of oil will be unable to provide anywhere close to the 30 billion barrels of crude oil that the world currently depends on each year. It is the rate of supply (variously termed the rate of flow, rate of conversion or rate of recovery) that is at issue. Put simply, it doesn't matter how big the volume of the resource is: if oil cannot be recovered at a rate of 85 million barrels a day to meet present (and rising) demand, we must learn to live by using less oil. This poses a challenge that is simple but not easy, since it must involve curbing our reliance on personalised transport, mainly cars, which most of the world's crude oil is currently used to run. The corollary is the need to develop, rapidly, more localised communities that depend far less on cheap oil-based transportation, which will no longer exist.
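The rate arithmetic behind those two figures is quickly verified:

```python
# 85 million barrels a day of present world demand, expressed per year.
daily_demand = 85e6                  # barrels/day
annual_demand = daily_demand * 365   # ~3.1e10 barrels/year

print(annual_demand / 1e9)  # ~31 billion barrels, matching the ~30 Gb/year in the text
```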

Related Reading.

This is the final section of an article entitled "The Oil Question: Nature and Prognosis", which will be published in the popular science journal "Science Progress", probably before Christmas. I thought I would put it up as a posting here for any general interest and/or comments.

Tuesday, December 02, 2008

Climate Change Not Caused by CO2?

I have put a question mark at the end of the title, since that sentence, made as a statement, is highly controversial. Indeed, the term "denier" has been applied to the still substantial faction, including some serious and respected scientists, who challenge the assertion, based on computer models of the earth's climate, that the increase in atmospheric CO2 derived from fossil fuel carbon is causing the planet to warm-up, perhaps by an additional 5 - 6 degrees C by 2100.

Let me emphasise my own perspective here. I am not a climatologist, but I do see that the most immediate impact on human life on earth will be caused by the dearth of cheap oil, then gas, and finally coal. Unless fast-breeder reactors, including thorium-based systems (the simpler liquid fluoride reactors rather than the vastly complex accelerator-driven systems), are introduced fast and on a large scale, or other deposits of uranium are found, nuclear power too will outrun its fuel supply within 40 - 50 years, or sooner if nuclear power is proliferated as most western governments aspire to do.

If the climate models are correct in their predictions, the impact of carbon emissions will be felt later than that, and it is debatable how much we can moderate this simply by strategies to reduce carbon emissions (see yesterday's posting). Once we begin to eliminate fossil fuels, either through new "clean" technologies or by simply running out of them, the anthropogenic burden of CO2 will begin to level off. However, if the oceans are becoming saturated with CO2 in the surface layers and are less able to dissolve more of the gas, the consequences of carbon emissions both past and future may haunt us for millennia.

Since the various computer models give different answers, e.g. warming by as little as 1 degree or as much as 5 degrees by the end of the century, it would be handy to have some experimental data to draw upon. Obviously, we can't know the future, which will only unfold as time passes. However, there is the geological record, which gives some clues as to past behaviour. In a nutshell, according to ice-core samples, rather than the CO2 increasing and then the earth heating up, what may be deduced is the reverse: the planet warms and then, with a lag of around 800 years, the CO2 levels in the atmosphere increase.

The observables are a little less direct than this, since the temperature at a particular time in history is deduced from the ratio of heavy (deuterium) to light (protium) hydrogen isotopes (and O-18/O-16) in the ice-core water, i.e. the ratio of heavy water to (ordinary) light water. Since heavy water tends to evaporate relatively more when it is hot, an increase in the heavy/light water ratio is observed in warm periods, and vice versa for cold periods. There has been some speculation as to the accuracy of such isotopic thermometers, and corrections have been proposed that close the gap somewhat between the temperature and CO2 records, suggesting a closer correspondence between the two. Nonetheless, the initial heating period cannot be explained as being caused by rising CO2 levels, even though it may well be that once the CO2 concentration increases, its heat-trapping effect does introduce a thermal feedback to the climate.
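For readers unfamiliar with how such isotope ratios are reported: they are conventionally expressed in "delta" notation, a per-mil deviation from a standard, which is then converted to temperature via an empirical linear calibration. A minimal sketch; the calibration slope and intercept below are illustrative placeholders, not values from the text or from any particular ice core:

```python
# Delta notation for isotope ratios, as used in ice-core "thermometers".
def delta_permil(r_sample, r_standard):
    """Per-mil deviation of a sample isotope ratio (e.g. O-18/O-16)
    from a reference standard such as mean ocean water."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Illustrative linear calibration of delta-O-18 (per mil) against
# temperature (deg C). Real calibrations take this general form,
# but these particular numbers are hypothetical.
def temperature_from_delta(d18o, slope=0.67, intercept=-13.6):
    return (d18o - intercept) / slope

# A sample slightly depleted in the heavy isotope reads as a cold period:
d = delta_permil(0.0019877, 0.0020052)
print(f"delta-O-18 = {d:.2f} per mil")  # -8.73 per mil
```

The point of the notation is that the raw ratios differ only in the fourth decimal place; the delta value magnifies these tiny shifts into a workable signal, which is why "corrections" to the calibration can noticeably move the inferred temperatures.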

One might speculate as to the mechanism for that initial warming, if it cannot be explained simply in terms of CO2. Possibly, Milankovitch cycles (changes in the amount of solar energy received by the Earth according to periodic changes in its orbit over time) play some role. For example, the roughly 100,000 year period between ice-ages corresponds to an "orbital forcing factor" arising from the difference between the closest and most distant approach of the Earth to the Sun, as the slightly undulating ellipse of its solar orbit changes shape.

Some recent papers provide evidence that the global climate has been subject to substantial variation even over the past few centuries, i.e. before humans began burning carbon fuels at the present rate of around 7 billion tonnes of carbon each year. Global temperatures declined abruptly by around 2 degrees C from the Medieval warm period (1200 AD to 1500 AD) to the little ice-age (1500 AD to 1850 AD), when skaters built fires to roast chestnuts on the frozen surface of the river Thames in London. More detrimentally, there were widespread and frequent crop failures during the latter cold period, which led to perhaps a million deaths from famine and disease. After 1850, average temperatures rose by around 4 degrees C, before humans began releasing huge amounts of CO2 into the air. This warming was about 7 times the temperature increase observed during the past century, and cannot be attributed to CO2 emissions, since CO2 levels in the atmosphere were then relatively low.

Since 1977, both the Earth's mean temperature and the atmospheric CO2 level have increased, the latter usually being taken as the cause of the former; and yet during the 30 years prior to then, the temperature actually fell, despite the fact that CO2 levels were increasing monotonically year on year. Surely, if CO2 levels determined the temperature, the latter should have risen between 1945 and 1977, rather than the converse.

Although some 80% of the total manmade CO2 has been emitted since 1945, more than half of the warming observed in the past century occurred between 1890 and 1945, which could not have been a result of CO2 emissions. Of course, we may have more warming to come from those later emissions of CO2. We don't know yet.

It may be concluded that CO2 alone does not determine planetary temperature, and that other regulatory systems are at work. Now, it is impossible to state that the present unprecedentedly high levels of CO2 will not cause the earth to warm to the extent that the climate models indicate, but there is no direct evidence that recent global warming is caused by rising CO2 levels.

It has been argued that if global warming is not caused by CO2, then mathematical predictions of global catastrophe are meaningless, and we enter the red-zone of an arrogant belief that we can re-engineer the earth's systems by curbing our carbon emissions and, hey presto, global warming will be switched-off. As noted, the inertia in the system may be far greater than these models indicate if the ocean sinks for CO2 are becoming saturated.

There are, however, two imperatives. Firstly, we have to use less oil, gas and ultimately coal, because these are present only in finite reserves, and so energy efficiency and conservation in relocalised (less transport-intensive) societies appear a must. It becomes arguable just how much of our effort and resource should be spent on actual carbon-remediation schemes (e.g. biochar production and other carbon capture technologies), if CO2 is not the problem, and the earth has its own agenda irrespective of what we do.

Secondly, if we cannot ameliorate global warming, surely it makes more sense to devise strategies for surviving the stresses that it will impose upon us, in terms of sea-level rise, keeping those in vulnerable health comfortable during especially hot periods, and providing enough water for all of us. We need to take realistic and practical action. The rest is theory, untested and probably untestable until nature's own plan is revealed to us in full.

Related Reading.
"CO2 might not be cause of climate change." By Dr Don J. Easterbrook. htt://

Monday, December 01, 2008

No Quick Fix to Greenhouse CO2 levels.

The surface layers of the oceans are becoming saturated with CO2, and less able to absorb more of it; hence, policies to curb carbon emissions will not have an immediate impact, and it may take hundreds of thousands of years for the natural (pre-industrial) balance of the gas to be restored. I have tried to keep an open mind about the facts of anthropogenic global warming and climate change, whereby it is assumed that our profligate use of fossil fuels has contributed to an excess of CO2 in the atmosphere, which will cause the planet to heat-up, with concomitant climate change that will prove detrimental to life on Earth. The general consensus is that we should reduce our use of fossil fuels, in the hope that the planet will recover its equilibrium through natural absorption processes. Around 40% of the carbon burned from fossil fuels since 1950 has been absorbed, while the remainder has accumulated in the atmospheric burden of CO2.
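The 40% absorption figure, combined with the roughly 7 billion tonnes of fossil carbon burned each year (mentioned in an earlier posting), lets us estimate how fast the atmospheric concentration rises. A back-of-envelope sketch; the conversion factor of about 2.13 GtC per ppm of CO2 is a standard approximation, not a figure from the text:

```python
# Back-of-envelope: how much does a year's fossil carbon raise
# atmospheric CO2, given the absorption fraction quoted above?
emissions_gtc = 7.0        # GtC burned per year (figure from the text)
absorbed_fraction = 0.40   # fraction taken up by oceans etc. (from the text)
gtc_per_ppm = 2.13         # standard approximation: ~2.13 GtC per ppm of CO2

airborne_gtc = emissions_gtc * (1 - absorbed_fraction)
rise_ppm = airborne_gtc / gtc_per_ppm
print(f"~{rise_ppm:.1f} ppm/year")  # ~2.0 ppm/year
```

A rise of roughly 2 ppm per year is in line with what is actually observed in the atmosphere, which lends some coherence to the 40% absorption figure.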

More proactive measures are proposed too: for instance, growing plants to absorb CO2 from the atmosphere through photosynthesis, then pyrolysing them in the absence of oxygen to form various gaseous and liquid products that might feed markets for fuel and organic chemicals, leaving a residue of carbon ("biochar") which could be dug into soil, both to improve its fertility and to bury some of that miscreant greenhouse carbon. The latter would pose a considerable undertaking, however: to restore CO2 levels to those of the pre-industrial era would require pyrolysing most of the world's biomass for the next 40 years, and burying its thermal residue of biochar.

Smaller-scale operations could pull down enough CO2 from the atmosphere to hold it below the putative 450 ppm tipping-point, beyond which the climate runs out of control; the level is around 390 ppm now. However, a new study offers little cheer that even such gargantuan feats of political cooperation and engineering might save us, even if they could really be done. The findings contradict prevailing views that CO2 will be cleaned out of the atmosphere during the next century or so by natural forces, if we simply curb our emissions by around 80% by 2050, as Britain has pledged to do, by switching over to carbon-free sources of energy. That is a massive imperative in its own right, and I wonder as to the viability or veracity of such a policy.
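The gap between the present level and the putative tipping-point is easily put on a timescale. A minimal sketch; the rise of about 2 ppm per year is my assumption, roughly the recently observed rate, not a figure from the text:

```python
# How long until the putative 450 ppm tipping-point, at the current
# rate of rise of atmospheric CO2?
current_ppm = 390       # atmospheric CO2 now (from the text)
tipping_ppm = 450       # putative tipping-point (from the text)
rise_per_year = 2.0     # ppm/year -- assumed, roughly the recent observed rate

years_left = (tipping_ppm - current_ppm) / rise_per_year
print(f"~{years_left:.0f} years")  # ~30 years
```

On that simple reckoning, business as usual reaches 450 ppm around 2040, which is why the 2050 emissions pledges are framed with such urgency.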

Enter Professor David Archer of the University of Chicago, who is quoted as saying: "the climatic effects of releasing fossil fuel carbon dioxide into the atmosphere will last longer than Stonehenge, longer than time capsules, far longer than the age of human civilization so far. Ultimate recovery takes place on timescales of hundreds of thousands of years, a geologic longevity typically associated with public perceptions of nuclear waste."

Most of the CO2 that is removed from the atmosphere is absorbed by the oceans, but the process is becoming slower. The ocean waters to a depth of around 100 metres, which is where CO2 is principally and initially dissolved, are becoming saturated with the gas, which increases their acidity and discourages further uptake. To re-activate the solvent power of the surface layers, fresh seawater from lower depths needs to recycle upward, but this process takes "centuries or a millennium". If the surface waters are becoming warmer too, the process is slowed down yet further.

In a forthcoming paper (to be published in Annual Reviews of Earth and Planetary Sciences), it is further posited that the recycling process per se is insufficient to absorb all the unwanted CO2 from the atmosphere, and that the snail-pace weathering of rocks, which locks up CO2 as solid limestone, will be required, taking thousands of years to be accomplished. Professor Ken Caldeira, a co-author of the report, goes still further: he concludes that even after the "pollution" by CO2 stops, the mean temperature of the Earth will settle at a new, higher level, rather than falling.

This makes sense, if the heating effect of CO2 is as severe as climate-models indicate, and the concentration of the gas remains steady and relatively high over long periods. James Hansen, director of the NASA Goddard Institute for Space Studies, concludes that the "long lifetime of CO2 emitted by fossil fuel burning" means that simply reducing emissions does not provide a solution, that some of these fuels must "be left in the ground" forever, and that gas must actually be removed from the air. Now, that latter point does sound potentially like the biochar strategy.

What Hansen in fact proposes is removing CO2 by growing trees (you'd need an awful lot of them!), and then burning them to produce electricity and capturing the CO2 before it is emitted. Interesting that BP have just pulled out of the carbon capture game. He also thinks that there should be no more coal-fired power plants built.

Growing trees, other biomass, algae etc. is probably the only way we can withdraw significant amounts of CO2 from the atmosphere. The question is what do we do with that carbon-rich material once we have it? Pyrolyse it and bury the carbon biochar, or burn it and bury the carbon dioxide? Leaving more of the fossil fuels in the ground (i.e. not using them) makes a lot of sense, and seems to point to some combined strategy of energy efficiency and curbing inefficient oil-based transportation. We will need to do these things anyway, as fossil and other resources run expensive and eventually short, but if these scientists are right, it just won't do much to alleviate global warming and climate change.

If the predictions prove true, then we are stuck with the consequences of our actions and will need to weather climatic shocks and changes of various kinds. That noted, it is the instance of running-out of plentiful oil and then natural gas that will impact most immediately on human life and indeed all life, along with limiting supplies of clean water. Testing the theoretical models of scientists may be a luxury for future generations. Our practical actions now will contribute to how that future may unfold, and whether it contains civilisation as we have come to know it.

Related Reading.
"Greenhouse gases will heat up planet 'for ever'."