Wednesday, November 29, 2006
Peak Oil Unlikely in the Short Term?
Projected world demand for petroleum liquids indicates an increase from approximately 85 million barrels per day in 2005 to 115 million barrels per day in 2030, according to ExxonMobil estimates. This can only be sustained if there is enough oil actually in the ground to be extracted, and if it can be recovered at the necessary rate. Dr Richard Vierbuchen, vice-president of the Caspian/Middle East region for ExxonMobil, said that supply can adequately meet the increasing demand. I guess he would say that though, wouldn't he? He stated further that "estimates of the liquids resource base have been increased over the last 50-100 years, and are likely to continue to do so." Now why is that, exactly? Well, he says that "Forecasts of an imminent peak in global production appear to underestimate major sources of growth in the resource base, particularly improved recovery and resources made economic by new capabilities." I presume the latter is a veiled allusion to "unconventional oil", for example that recovered from oil sands ("tar sands" in reality), or produced by coal liquefaction. He then went on to attack the fundamental analysis made by M. King Hubbert in 1956, stating that it is not readily applicable to forecasting global liquids production, while conceding that it did work in predicting that US Lower-48 oil production would peak in 1965-70; it did in fact peak in 1971.
His criticism of the Hubbert method is that it cannot account for an increasing resource base, and this much is true. In effect, what Hubbert did was to estimate how much oil was in the ground and how much had been drawn off, and hence how quickly the peak in production was likely to arrive. One consequence of his analysis was that there is a lag of about 40 years between the peak in oil discovery and the peak in oil production. Hubbert's method amounted to counting the squares on a sheet of graph paper: the area under the curve of extraction rate against time represents the total volume of oil in the reserve, and in its simplest form the curve is "bell-shaped", so that production leading up to the peak is a symmetrical mirror-image of production after the peak. Of course, it will never be so simple, as extracting oil beyond the peak point (when the reserve is half empty) is a more difficult matter than when the first well is sunk into eager, virgin territory.
He has a point, but the issue of peak oil is not about running out of oil; it is that cheap oil will run out, and the price thereafter increase to some imponderable level (with consequences that may be pondered only too clearly). The central and underpinning feature of any prediction is the quantity of oil there really is down there; further issues are how readily it may be extracted, which depends on the precise geology of particular regions, and the quality of that oil, in terms of the refining necessary before it can be used as a fuel.
According to Vierbuchen, "although annual global production has exceeded annual discoveries since the early 1980's, annual global reserve additions still exceed annual production because of reserve growth in existing fields." I think he is referring to methods of improved oil extraction - that previously unyielding wells can be made to yield, e.g. by blasting steam into them. Such enhanced recovery methods are believed to damage the well geology, and it is thought that the Saudi reserve may be partly inaccessible because the damaged rock will hinder extraction of the oil. However, this is simply getting more of what is contained out, not increasing the volume of the reserve. He also refers explicitly to oil from gas, coal, very heavy oil, bitumen and shale, but as I have pointed out before, these are much harder to convert into oil, in terms of the energy that needs to be invested, potentially reducing the EROEI (Energy Returned On Energy Invested) to an unfavourable ratio (i.e. it takes so much energy to get the stuff out that it isn't worth it for the energy actually recovered from the final oil product).
Now, in support of this, Michael Huston, a professor at the University of Houston (and also a managing partner of a petroleum consulting firm), reckons that peak oil will not strike for "at least the next three centuries". This is by far the most optimistic estimate I have seen, but let's see what it means in reality.
We are now extracting 85 million barrels a day x 365 = 31 billion barrels a year. If this increases to the projected daily production of 115 million barrels, by 2030, in that year the world will have drawn off 42 billion barrels. Undoubtedly, some of this will be in the form of "unconventional oil" and as this is a rough estimate I shall assume that over time, on average, 100 million barrels are extracted daily, or 36.5 billion barrels each year. In "three centuries", that means 300 x 36.5 = 10,950 billion barrels... or 11 trillion barrels (roughly).
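The arithmetic above can be checked in a few lines of Python, using only the figures quoted in the text:

```python
# Rough check of the consumption arithmetic (figures as quoted in the text).
BARRELS_PER_DAY_2005 = 85e6    # ~85 million barrels/day in 2005
BARRELS_PER_DAY_2030 = 115e6   # ExxonMobil projection for 2030
AVERAGE_DAILY = 100e6          # assumed long-run average used above

annual_2005 = BARRELS_PER_DAY_2005 * 365     # ~31 billion barrels/year
annual_2030 = BARRELS_PER_DAY_2030 * 365     # ~42 billion barrels/year
three_centuries = AVERAGE_DAILY * 365 * 300  # ~11 trillion barrels in all

print(f"{annual_2005:.3g} {annual_2030:.3g} {three_centuries:.3g}")
```

which confirms the 31 and 42 billion barrel annual figures, and the roughly 11 trillion barrels that "three centuries" of production would demand.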
Now, my understanding is that there are 1.2 trillion barrels left in the ground (equal to what has already been used, so we are at the half-way point), which is enough for, say, 32 - 38 years. I posted an article called "Peak Oil - All Bunkum?" recently, which refers to another, optimistic reckoning that there are 3.74 trillion barrels' worth to be had, but that seems to include everything - crude oil from wells, and all manner of synthetic oil. This was based on a report entitled "Why the Peak Oil Theory Falls Down: Myths, Legends, and the Future of Oil Resources," produced by Cambridge Energy Research Associates (CERA), a consulting company based in Cambridge, Massachusetts; the report is only available for $1,000, which I am not prepared to pay.
So, even 3.74 trillion barrels is only enough for 100 years (with a half-way point in 50 years, if projected production levels are met and maintained), and as I have pointed out, producing most of that is going to involve huge efforts to provide the necessary infrastructure for coal and gas liquefaction, bitumen extraction etc., and of course the energy to run it. My feeling is that we will manufacture a lot more oil, mostly from coal, and that we will burn more coal per se for direct heating and as the means to generate ever-increasing amounts of the world's electricity. I am uncomfortable about these very high estimates of what might be produced in terms of oil, simply because they give the appearance that getting it will be easy, along the lines of the kind of oil production we are familiar with, and this is deceptive - or indeed a deception. Probably these different "kinds" of oil should not all be reckoned together on the same energy balance sheet. I doubt it is any coincidence that those with the most to profit from acting as though the "business as usual" scenario can go on for decades or centuries seem to be the most blatant in their denials that peak oil is imminent. Perhaps they will be living protected behind armed defenses when they are proved unequivocally wrong, and the majority of human civilization collapses.
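The reserve-lifetime arithmetic is simple enough to set out explicitly, dividing each quoted reserve figure by the assumed long-run consumption of 36.5 billion barrels a year:

```python
# Lifetime of the quoted reserve figures at ~36.5 billion barrels/year
# (i.e. the 100 million barrels/day long-run average assumed earlier).
ANNUAL_BARRELS = 36.5e9

for label, reserve in [("1.2 trillion (conventional)", 1.2e12),
                       ("3.74 trillion (CERA, incl. unconventional)", 3.74e12)]:
    years = reserve / ANNUAL_BARRELS
    print(f"{label}: {years:.0f} years, half-way point in ~{years / 2:.0f}")
```

This reproduces the figures in the text: about 33 years for the conventional reserve, and about 100 years (half-way point around 50) even on CERA's generous 3.74 trillion.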
Peak oil is about running out of the cheap oil we have become accustomed to, and that certainly is running out, whether or not we gain access to other, more costly-won versions of the commodity in the future. In any event, human societies will find themselves based around the fact of less readily available oil.
Monday, November 27, 2006
Pouring Water on Chinese Coal Liquefaction.
Shell Gas and Power Developments BV and the Shenhua Ningxia Coal Industry company (Shenhua-Ningmei) signed a joint agreement to study coal liquefaction and the technical and commercial feasibility of launching a direct coal liquefaction plant with a daily output of 70,000 barrels (about 10,000 tonnes) of oil products and chemical feedstocks. South African based Sasol has also joined as a collaborator with the Shenhua group, to build two coal-to-liquids plants using Fischer-Tropsch technology developed by and unique to Sasol, which is the world leader in producing fuel from coal. The Fischer-Tropsch process was developed in Germany and kept that country in fuel during the oil blockades of WWII; it has also fuelled South Africa during periods of political sanctions, and remains the principal source of oil in that country. One may conclude that the technology has a clear future in breaking the dependency of individual countries on imported oil, mainly from the Middle East... so long as there is enough coal available as a feedstock.
Coal liquefaction consumes massive quantities of water. Although the restriction to only "big" plants is partly intended to spread coal-to-oil production across the country, many regions of China, especially in the north and northwest, are already extremely short of water. Significant environmental discharges of effluent gases, waste (i.e. contaminated) water and industrial effluent also attend coal liquefaction processes. The profitability of producing oil from coal depends on the prevailing price of crude oil on the world markets, and since this varies year on year, and it takes up to five years to build a coal liquefaction plant, there is an element of risk as to whether the plant will return money immediately or not. However, once we hit the Peak Oil production point, crude oil will become increasingly expensive. It is reckoned (in China at least) that the technology is viable so long as the world price is above about $25 a barrel. Personally, I doubt the price will ever fall to anywhere near that again - it was three times that some time back, and not much less now. Hence, coal liquefaction will be attractive anywhere on economic grounds, even if the environmental picture is less so.
Water as a resource is under pressure in many parts of the world. It is therefore a central issue to estimate whether the water reserve of a region can support any new production processes without detriment to the environment and to the people (and animals) who live there. It seems a twist of nature that many regions potentially well provided with the means for "unconventional" oil production are relatively short of water: there is plenty of coal in parts of Australia, Africa, China, India and North America where supplies of freshwater are limited. As noted, coal liquefaction is water-intensive, partly to provide the hydrogen atoms needed to convert coal (mainly carbon) into hydrocarbons, and partly to run cooling systems for the plants themselves. We hear much, too, about producing oil from the Alberta Oil Sands (the name "tar sands" is more accurate) by cracking the bitumen they contain into oil for use as a fuel. That process relies heavily on gas to "crack" the bitumen into liquid hydrocarbons, but it also uses a lot of water. Indeed, water allocations made by Alberta to oil sands projects amount to 359 million cubic metres per year, which is twice the quantity of water used by the city of Calgary. The whole enterprise threatens the water supply (in terms of quantity and quality) to Saskatchewan and the Northwest Territories through the Mackenzie River system.
Most of the oil sands operations draw their water from the Athabasca River, a tributary of the Mackenzie, and most of that water is not returned to the river. Strip mining of the oil sands uses between 2 and 4.5 cubic metres of water to extract one cubic metre of synthetic crude oil. The water becomes heavily polluted and only 10% goes back into the river, the remainder being stored in enormous holding ponds - among the biggest structures ever constructed by humans on Earth. The oil sands yielded over one million barrels per day in 2005, but it is believed that they may be exhausted by 2050. The story here is similar to conventional crude oil production, in the sense that initially the resource is relatively easy to extract from close to the surface, but the process becomes increasingly demanding as deeper levels are accessed. This is true of coal production too. The Energy Returned on Energy Invested (EROEI) for producing oil from oil sands is currently about 3, which is just about viable so long as there is sufficient gas available for the purpose. The point must come, and long before the resource is exhausted, when the investment of gas, water, environmental clean-up etc. no longer justifies the return.
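As a rough consistency check, the 2 - 4.5 cubic metres of water per cubic metre of synthetic crude can be set against the 359 million cubic metre Alberta allocation quoted above. The barrel-to-cubic-metre conversion (1 barrel = 0.159 m^3) is a standard figure, not from the text:

```python
# Does 2-4.5 m^3 of water per m^3 of synthetic crude square with the
# 359 million m^3/year Alberta allocation?  Assumes the standard
# conversion of 1 barrel = 0.159 m^3.
M3_PER_BARREL = 0.159
DAILY_BARRELS = 1e6   # ~1 million barrels/day from the oil sands (2005)

oil_m3_per_year = DAILY_BARRELS * M3_PER_BARREL * 365   # ~58 million m^3
low, high = 2 * oil_m3_per_year, 4.5 * oil_m3_per_year
print(f"water demand: {low / 1e6:.0f}-{high / 1e6:.0f} million m^3/year "
      f"(allocation: 359 million m^3/year)")
```

The strip-mining figures alone imply roughly 116 - 261 million cubic metres of water a year, which sits comfortably within, and is of the same order as, the 359 million cubic metre allocation - so the numbers quoted are at least mutually consistent.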
It is a case of making hay while the Sun shines, and in the end it may be water that proves to be the limiting resource for energy production.
Saturday, November 25, 2006
Washing Machine Spins Nanoparticle Regulations
Accordingly, the US Environmental Protection Agency (EPA) is under pressure to regulate commercial products containing silver nanoparticles. But it is not yet clear how precisely this 'knee-jerk reaction', as it has been described, will be enforced.
In a "guilty-'til-proven-innocent" regulation by the EPA, any company marketing a product as containing silver nanoparticles to kill bacteria must provide scientific evidence that the particles pose no environmental health risk. A tricky one indeed: how can that really be "proved"? Methods for determining "nanotoxicology" are at only a very early stage of development, mainly because it is difficult to know exactly what to look for. For example, there has been a study of carbon nanotubes (like short little straws made of carbon, a bit like a piece of graphite sheet rolled back on itself) aimed at determining whether they can generate free radicals. Accepting the Free Radical Theory of Disease (an extension of the Free Radical Theory of Ageing), if they were found to produce radicals, that might be some evidence that we should fear them. However, no such evidence was found; moreover, it was concluded that carbon nanotubes actually diminished the yield of reactive oxygen radicals when added to systems known to generate them. Still, we are quite some way from any conclusion that nanoparticles are actually good for you... although they may ultimately be so proven. Who knows? The jury is not so much "out" as not yet elected.
The decision is the result of legal enmeshings concerning the 'Silver Wash' washing machine, marketed by Samsung as containing silver nanoparticles in order to kill bacteria in clothes. Some US water authorities are afraid that discharged nanosilver particles might concentrate in wastewater treatment plants, killing the bacteria which are meant to detoxify the wastewater. That is a good point, in the sense that a broad-spectrum antibiotic kills both the nasties and the good bugs in the digestive tract, with well-known consequences - and such agents, too, end up at a water treatment plant somewhere nearby. In this particular context, nanosilver could be listed among other environmental pesticides, and would accordingly need to be tested under the Federal Insecticide, Fungicide and Rodenticide Act (Fifra). So long as the silver nanoparticles were contained within the washing machine, it could be classified as a 'device', which exempted it from Fifra. However, taking the view that some of the particles could actually "escape", the EPA has now reconsidered this decision. As EPA spokesperson Jennifer Wood put it: 'The release of silver ions in the washing machines is a pesticide, because it is a substance released into the laundry for the purpose of killing pests.' So there!
Although this particular washing machine uses silver ions, which may not constitute nanoparticles, silver nanoparticles are used to kill germs in such products as air-fresheners, shoe liners, socks and food-storage containers. In all probability, these products will now have to be tested under the regulations. Silver nanoparticles are also added to bandages to speed healing; but these and other medical applications are regulated by the US Food and Drug Administration, not the EPA. A legal loophole remains for companies that drop anti-microbial claims from their nanosilver products, since only products marketed as 'anti-microbial' will have to be regulated.
Undoubtedly, the new regulation will justify more research into the toxic effects of nanoparticles, but who will pay for it? Will it be government funded, e.g. through the Research Councils, or will the manufacturers and suppliers of these new technologies bear the burden? I think it most likely that industry will put some money into university labs, say by supporting a few PhD projects, which is a far cheaper option than doing it all in-house at full cost, and the universities are mostly (in the U.K. at least) sufficiently desperate for cash that they will take whatever crumbs might thereby drop their way.
Wednesday, November 22, 2006
Peak Oil - All Bunkum?
A new report concludes that the Peak Oil theory is wrong. I would love to believe this, but unfortunately the report is only available at a cost of $1,000, which I am not prepared to pay for the privilege of reading it. For those more urgently inclined, or with better-lined pockets, it is entitled "Why the Peak Oil Theory Falls Down: Myths, Legends, and the Future of Oil Resources," and is produced by Cambridge Energy Research Associates (CERA), a consulting company based in Cambridge, Massachusetts (not in that tranquil eastern corner of England). However, the following is based on what I have managed to glean from an elementary search of the web. The report seems to say that while petroleum is a finite resource, production beyond its midpoint will follow a protracted, undulating plateau rather than the bell-shaped curve of the Hubbert Peak analysis. It arrives at a figure for world oil reserves of 3.74 trillion barrels, which is three times as much as the 1.2 trillion barrel reserve estimated by Peak Oil enthusiasts. However, as far as I can determine, this is the total of conventional and unconventional oil, which might prove misleadingly optimistic.
What I mean is that this is the entire reserve of oil in the ground - not only the "sweet" light crude, which is relatively easy to refine into fuel, but the heavier fractions that are more difficult to extract in the first place, and which need to be separated from sulphur and other compounds that would prove extremely noxious in the final fuel - plus whatever oil is considered potentially extractable from tar sands, shales and bituminous rocks (even coal?). True, this is about the fifth time it has been said that we are running out of oil, but Peak Oil is not a claim that the oil will run out; it is a claim that the cheap, easily got hydrocarbon resource which has underpinned the growth of the industrial world (including its population of 6.5 billion people, up from around 2 billion early in the twentieth century) will decline beyond a production maximum, with an inevitable and inexorable price hike on what is left. So, the $1,000 barrel may be the marketplace norm one day, by which time the number of gasoline-fuelled cars in use will have plummeted worldwide, along with most of the manufacturing industry that depends on oil as a fuel or a chemical feedstock - which, to some extent, is just about everything. The EROEI (Energy Returned on Energy Invested) is uncertain for these unconventional sources - meaning that they may take so much energy to extract oil from that it is not worth the trouble.
In Jeremy Leggett's book "Half Gone", he refers to "early toppers" and "late toppers": the early toppers think the peak will come soon (around now, and certainly by 2010... only 3 years away), while the late toppers (mostly in the oil industry - or that's what they say; whether they believe it is another matter) think it will not arrive for another 30 years. Since we burn just over 30 billion barrels a year as a human species, one trillion barrels is about 33 years' worth; so the late toppers are banking on there being another "trillion" down there. The truth is that nobody really knows for sure, but the EROEI has fallen from a favourable ratio of around 100 in the early days of oil extraction (the "gushers" of "black gold") to a present value of around 8. I discussed this in a much earlier posting, "You Need Energy to Get Energy - Time is Running Out". A value of 3 is probably the lower limit below which the energy costs are simply too high to render a particular resource viable, and I am pretty certain that many of the unconventional sources will fall short of this. Tar sands, for example, come in at around 1.5 when all the energy costs are counted - mainly the gas and conventional oil used in production, but water too, since the process is an enormously "thirsty" one - and that is before we get around to cleaning up all the consequent pollution, as it is also pretty "dirty"!
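What these EROEI figures mean for the energy actually delivered to society can be made explicit: of every unit of gross output, a fraction 1 - 1/EROEI is genuine gain, the rest having been spent getting it. A minimal sketch, using the ratios quoted above:

```python
# Net energy delivered as a function of EROEI: for every unit of energy
# invested, (EROEI - 1) units are delivered, i.e. a fraction 1 - 1/EROEI
# of the gross output is a genuine gain.
def net_energy_fraction(eroei: float) -> float:
    return 1.0 - 1.0 / eroei

for label, eroei in [("early gushers", 100), ("present average", 8),
                     ("viability floor", 3), ("tar sands (quoted)", 1.5)]:
    print(f"{label}: EROEI {eroei:>5} -> {net_energy_fraction(eroei):.0%} net")
```

The fall from 100 to 8 sounds dramatic, but it only takes the net fraction from 99% to about 88%; the serious collapse happens below 3 (67% net), and at the tar sands figure of 1.5 fully two-thirds of the gross energy is consumed in production - which is why the low end of the EROEI scale matters so much.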
To judge the true reserve of oil we need a full energy costing for each type of resource and each method of recovering oil from it. There are plenty of arguments that Hubbert's original analysis (in 1956) is too simplistic, and that a simple bell-shaped curve cannot faithfully model the complex business of oil production and demand. I don't doubt it, but Hubbert's essential premise was a simple one. He made an estimate of how much oil there was probably in the ground (within limits, and took the most optimistic upper value), and of how much had been used, and hence of how much there was likely to be left. When he predicted the arrival of the U.S. production peak (in 1970, some fourteen years after his prediction), he was spot on. The best estimates I have seen for conventional oil are that we have used about half of it, and so there is the other half - about another trillion barrels - left. Extracting that will prove more difficult, however, and unconventional sources yet more so.
So, when I get to read the report - if it is released more economically into the public domain - more will become clear. However, I have yet to be reassured that the idea of Peak Oil is a mistake. Nor am I one who rubs their hands with glee at the prospect, in some messianic "we are bad people, and due for our comeuppance" ecstasy; frankly, it terrifies me. I like my little life just as it is, but I am fairly certain that the world cannot continue wasting its resources as though there is no tomorrow - that day will come eventually, and when it does we should be prepared for it. As I have noted before, in "You Need Energy to Get Energy - Time is Running Out", we need to begin applying the brakes on the runaway train of fossil fuel consumption while there is still time to put something else in its place.
Monday, November 20, 2006
"Global Warming" Caused by Natural 1,500 Year Cycle?
Accordingly, Singer and Avery would need to garner meticulous evidence to support their claim of a 1,500 year "natural" cooling-warming cycle, and indeed it would seem they have done so. Their supporting documentation encompasses records from ancient Rome, Egypt and China; paintings, some up to 12,000 years old, now housed in museums; tooth enamel from Viking cemeteries; and analyses, using the most modern technology, of ice-cores, seafloor sediments, tree rings, fossilised pollen and cave stalagmites (the ones that grow "up"!).
Singer points out that the first clue to the 1,500 year cycle came only recently, when the first cores were sampled from the Greenland ice-sheet in 1983. The cycle was too long to have been picked up by earlier peoples, who lacked accurate thermometers and precise, written records. The Greenland ice-core samples show the 1,500 year cycle undulating back through a period of 250,000 years; it raises temperatures at the latitude of New York and Paris by 1 - 2 degrees C for centuries at a time, and by even more at the North and South Poles, with a global average of around 0.5 degrees C. In 1987, analysis of the first Antarctic ice-core samples confirmed that the cycle extended further back, to at least 400,000 years, spanning four ice-ages, and that the effect was indeed a global phenomenon. The cycle also shows up in evidence from the undersea sediments of all six oceans, in tree ring samples taken from the northern hemisphere, in the advance and retreat of glaciers both north and south, from Greenland to New Zealand, and in stalagmites taken from every continent, including southern Africa. Pollen samples show repeated complete reorganisations of North America's trees and other plants over the past 14,000 years, which works out at about one such change every 1,650 years. The deepest seabed sediment cores show that the cycle has been operating for at least the past million years.
So, what exactly is the cause? Observations of sunspots made over the last 400 years, along with more recent analyses of carbon and beryllium isotopes, seem to connect the cycle to variations in the Sun's radiant output (energy), as recently detected by satellite measurements. Antarctic ice samples show that there is a close correlation between temperatures and CO2 levels over the past 400,000 years, which is usually taken to imply that increased CO2 levels cause global warming. However, on closer inspection, the studies show that there is actually a lag of around 800 years between the increase in temperature and that in atmospheric CO2 concentrations. So, the heating comes first, and then the CO2 levels rise. This, say Singer and Avery, makes sense because as the oceans warm, they will release some of the CO2 which is dissolved in them into the surrounding air.
The notion of natural cycles runs counter to the currently prevailing beliefs about global warming, i.e. those that place the burden of blame squarely on the shoulders of (selfish, greedy, careless!) humankind. If the warming does indeed come first and releases more CO2, we may still be in for a shock, as the greenhouse legacy of this blanketing gas subsequently heats the Earth further still and compounds any effects of a natural warming trend. No one knows for sure. However, such natural cycles of heating and cooling must surely be taken into account in any serious models of what the world's climate is likely to be 50, 100 or more years hence - predictions which are now being used to determine or justify government policies about the future of the human race.
Friday, November 17, 2006
Porous Carbon: Room for Exaggeration!
Porous carbon is produced by "activating" (heating) wood, and it is the densest types of wood (e.g. teak) that give the highest surface area materials. The procedure always results in a distribution of pore sizes, rather than the single pore dimension found in zeolites (molecular sieves). The latter feature can be exploited to produce types of porous carbon (called carbon molecular sieves) with a very narrow range of pore sizes, by forming an organic polymer within the zeolite structure (which must fit that tight porous constraint), turning it to carbon by heating, and then dissolving (leaching) the zeolite template away, usually with hydrofluoric acid (HF). The most commonly encountered forms of activated carbon have surface areas of around 1,000 or so square metres (m^2) per gram, and some with smaller pores offer internal surfaces amounting to 2,000 m^2/g. I am aware of porous carbons, formed by heating e.g. teak, which contain quite a restricted range of pores in the low micropore range (i.e. under 20 Angstrom units (A), or 2 nanometres (nm)), say under 10 A, and which can reach 3,000 m^2/g; there is one claim of a material with 5,000 m^2/g. However, if a gram and a half of some porous carbon really is enough to equal the MCG (at 18,100 m^2), it must have some very particular properties, to give it an internal surface area of around 12,000 m^2/g!
I shall first propose a highly simplified model of a porous solid. I shall assume that the extended material is made up of a large number of small cubes, each of which contains a spherical pore; the solid is then "constructed" by placing the cubes side by side, so that six cubes fit around each one (one at each of the six faces - left, right, forward, backward, top, bottom). If a hypothetical solid has pores of 3 A (0.3 nm) in diameter, each pore fits inside a cube of 3 x 3 x 3 = 27 A^3 (cubic Angstroms). The radius (r) is half the diameter, i.e. 1.5 A, and so the internal surface area of the pore is obtained from:

4 x pi x r^2 = 4 x pi x (1.5)^2 = 28.27 A^2 (square Angstroms).

Assuming a density for the carbon of 0.6 g/cm^3 (grams per cubic centimetre), we see that 1 gram of the porous carbon would occupy 1/0.6 = 1.67 cm^3.

1 cm = 10^8 A, and so 1 cm^3 = 10^24 A^3; hence 1.67 cm^3 = 1.67 x 10^24 A^3.

Therefore, you could fit 1.67 x 10^24 A^3/27 A^3 = 6.19 x 10^22 cubes into 1 g of porous carbon.

This gives a total surface area of 6.19 x 10^22 x 28.27 A^2 = 1.75 x 10^24 A^2.

1 m = 10^10 A, and 1 m^2 = 10^20 A^2. Hence 1 g of the carbon would contain 1.75 x 10^24 A^2/10^20 A^2 = 1.75 x 10^4 = 17,500 m^2.
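The cube-and-sphere model above reduces to a short function, which can then be applied to any pore size. This is just the same arithmetic restated; the small differences from the hand-worked figures (17,453 vs 17,500) are rounding:

```python
import math

# Cube-and-sphere model of a porous solid: each pore of radius r (in A)
# sits in a cube of side 2r; surface area per gram follows from the
# number of cubes that fit into the volume of 1 g of the solid.
def surface_area_m2_per_g(pore_diameter_A: float,
                          density_g_cm3: float = 0.6) -> float:
    r = pore_diameter_A / 2.0
    area_per_pore = 4 * math.pi * r**2      # A^2 per pore
    cube_volume = (2 * r)**3                # A^3 per cube
    volume_A3 = (1.0 / density_g_cm3) * 1e24  # 1 g of solid, in A^3
    n_cubes = volume_A3 / cube_volume
    return n_cubes * area_per_pore / 1e20   # 1 m^2 = 10^20 A^2

print(f"{surface_area_m2_per_g(3):.0f}")   # ~17,450 m^2/g for 3 A pores
print(f"{surface_area_m2_per_g(5):.0f}")   # ~10,470 m^2/g for 5 A pores
```

Note that the ratio of pore area to cube volume is 4 x pi x r^2/8r^3 = pi/2r, so the surface area per gram scales simply as 1/r: halving the pore size doubles the model's internal area.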
----------------------------------------------------------------------------------------
We can work the calculation similarly for a 5 A pore: the radius r = 2.5 A gives an internal surface area of 4 x pi x (2.5)^2 = 78.54 A^2, located in a cube of 5 x 5 x 5 = 125 A^3.

Again, the volume of 1 g is 1.67 x 10^24 A^3, which could fit 1.67 x 10^24/125 = 1.336 x 10^22 cubes.

Hence, the total internal surface area amounts to: 1.336 x 10^22 x 78.54 = 1.049 x 10^24 A^2 = 10,493 m^2/g.
-----------------------------------------------------------------------------------------
Now, our hypothetical "Australian" carbon would need an internal surface amounting to around 12,000 m^2/g. If the density is also 0.6 g/cm^3 (as assumed above), that corresponds to an internal surface area of 12,000 x 0.6 = 7,200 m^2/cm^3 = 7.2 x 10^23 A^2, which has to be fitted into a volume of 1 cm^3 = 10^24 A^3, i.e. a ratio of 7.2 x 10^23 A^2/10^24 A^3 = 0.72/A.

Our model of the solid is of a number (n) of pores contained (obviously) in n cubes; the geometry gives the pore area = 4 x pi x r^2 and the cube volume = (2r)^3 = 8r^3, and taking their ratio:

4n x pi x r^2/8n x r^3, we find that the term n cancels, leaving a simple geometrical factor:

4 x pi x r^2/8r^3 = 1.57/r. Setting 1.57/r = 0.72/A,

we find r = 1.57 A/0.72 = 2.18 A, and so the pore diameter is 2r = 2 x 2.18 = 4.36 A.

To check the calculation, we can work out from 4 x pi x r^2 that such a pore would have a surface area of 59.72 A^2 and fit into a cube of (4.36)^3 = 82.88 A^3. Then 1 cm^3 contains 10^24 A^3/82.88 A^3 = 1.2 x 10^22 cubes, and so the surface area is 1.2 x 10^22 x 59.72 = 7.2 x 10^23 A^2 = 7,200 m^2, which is our correct starting value.
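The inversion just performed - from a target surface area back to the pore size that would be required - can also be written as a small function, using the geometrical factor pi/2r derived above:

```python
import math

# Inverting the cube-and-sphere model: given a target surface area
# (m^2/g) and a density, what pore diameter would be required?
# From 4*pi*r^2 / (8*r^3) = pi/(2r), the target area-to-volume ratio
# (in A^2 per A^3) equals pi/(2r), so r = pi / (2 * ratio).
def required_pore_diameter_A(target_m2_per_g: float,
                             density_g_cm3: float = 0.6) -> float:
    # convert m^2/g to A^2 per A^3 of solid: x density, x 10^20, / 10^24
    ratio_per_A = target_m2_per_g * density_g_cm3 * 1e20 / 1e24
    r = math.pi / (2.0 * ratio_per_A)
    return 2.0 * r

print(f"{required_pore_diameter_A(12_000):.2f}")  # ~4.36 A, as derived above
```

For the 12,000 m^2/g that the "gram and a half" claim would demand, this returns a pore diameter of about 4.36 A, matching the hand calculation.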
I am not aware of any porous carbon with an area of 12,000 m^2/g, in fact, and so I propose that the figure is nearer 15 grams of porous carbon (one decimal place removed from the gram and a half that was supposed) to equal the area of the MCG. Even if the super-material does exist, which I doubt, and can be synthesised, it is unlikely to be made on a large scale for gold mining, which would use the more common-or-garden variety. 18,100 m^2/15 grams = 1,207 m^2/g, which is reasonable for most carbons I have worked with.
---------------------------------------------------------------------------------------
Just to complete the exercise about porous solids: for 10 A pores, we have r = 5 A, and so the pore has an internal surface area of 314.2 A^2 (from 4 x pi x r^2) in a cube of 10 x 10 x 10 = 1,000 A^3, giving 1 x 10^21 cubes in 1 cm^3. In that case, the total surface area would be 1 x 10^21 x 314.2 = 3.142 x 10^23 A^2 = 3,142 m^2 per cm^3, or around 5,200 m^2/g for a density of 0.6 g/cm^3.

For a pore of 1 cm diameter in a 1 cm^3 cube, with r = 0.5 x 10^8 A, we arrive at a surface area of 3.142 x 10^16 A^2/10^20 A^2 = 3.142 x 10^-4 m^2, i.e. 3.142 cm^2 (just over 5 cm^2/g at our density)! This shows how the surface area drops as the size of the pores increases.
------------------------------------------------------------------------------------------
These are very rough and ready calculations, and the measured surface areas are always a good deal less than this, for the following reasons: (1) there is always a pore size distribution with a substantial fraction of larger pores (meso 20 - 500 A; and macro > 500 A) which reduces the intrinsic internal surface area of the material from that expected if it contained only micropores (< 20 A); (2) simple arguments about the geometry and number of pores tell nothing about the pathways by which molecules are actually absorbed into the porous particle, to which there are always barriers and restrictions, and hence the actual surface covered is less than the total surface area. As an example, zeolite X, where the pores are (almost) all small (10 A) and well defined, the actual adsorption capacity of nitrogen molecules on the surface accords with a value of around 700 m*2/g, rather than a value of around 3,000 m*2/g (assuming a density of 1 g/cm*3); (3) last but certainly not least - the structure of a porous solid is not a neatly packed arrangement of pores within cubes, but more complex, and with substantial inaccessible space occupied by its essential framework.
...this makes the original figure of one and a half grams of carbon needed to match the area of the MCG even less likely, however!
[For comparison: a tennis court has an area of 262 m*2; a football pitch (for international games) around 7,400 m*2; a cricket ground anywhere in the range 15,000 - 18,000 m*2. The MCG arena is reckoned at 18,100 m*2 and the entire Oval cricket ground site covers 24,279 m*2].
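Taking an internal surface area of around 3,000 m*2/g for activated carbon (an assumed round figure, of the order estimated above), the mass of carbon needed to match the MCG arena is a one-line check:

```python
mcg_arena_m2 = 18_100        # MCG arena area, as quoted above
carbon_m2_per_g = 3_000      # assumed internal surface area of activated carbon
grams_needed = mcg_arena_m2 / carbon_m2_per_g
print(grams_needed)          # ~6 g of carbon to match the arena
```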
Wednesday, November 15, 2006
Antarctic Blows Hot and Cold on Climate Change.
Science should always be open to speculation in the light of evidence, otherwise it is not science but self-interest or hysteria. New discoveries about nature will continue to be unravelled, but in my humble opinion, even if there are natural trends (as the geologic record shows) that act to warm the Earth on a roughly 100,000 year cycle, so that climate change is not entirely our fault, it is still madness to continue burning those remaining reserves of oil and natural gas, for no real purpose beyond inertia and self-interest, in a vain attempt to preserve a lifestyle that will prove ultimately untenable. So, CO2 emissions and their consequent heating of the globe notwithstanding, reductions in burning precious and irreplaceable resources of fossil fuels remain the order of the day. We must act now to use them in driving some sustainable long term strategy of living on this world, which in all probability will involve renewables and nuclear power (if it is so implemented that supplies of uranium and thorium can be eked-out to last for hundreds of years).
Antarctica has shown two new and unexpected features. In the first place, although the "floating" northern peninsula is indeed melting as is shown in the media, this will not influence sea-levels directly. Imagine a glass of "something" with three large ice-cubes floating in it, and filled to the brim. When the ice melts, will the liquid overflow the glass? Counter-intuitively, it will not, because the density of ice is only about 90% that of liquid water (hence just 10% of icebergs are visible - above surface - due to the buoyancy factor), and so when it melts the total volume of water stays the same. Put another way, the water displaced by the ice has the same volume as the total ice when it is melted into liquid water. However, the East Antarctic Ice Sheet - a 2 mile thick, 2.7 million square mile prairie of land bigger than Australia - has increased its mass every year between 1992 and 2003 from increased snowfall, according to satellite radar measurements. Since it is normally too cold to snow there (i.e. the air is too dry to carry much water to be precipitated as snow), it is thought that global warming is to blame, presumably because there is therefore more evaporation of water into the air, which falls as snow. This additional snowfall is sufficient to increase the ice-sheet by an extra 45 billion tonnes per year, which is about the same as the amount of water flowing into the oceans from the melting of the Greenland ice cap; an interesting coincidence, which may help to keep sea-levels steady. Sea level is believed (by some) to be rising worldwide by an average of 1.8 millimeters a year due to the expansion of water as the oceans warm, and from the additional outwash from glaciers melting in Greenland, Alaska, tropical highlands and elsewhere in Antarctica. Each millimeter (mm) of increased sea level corresponds to about 350 billion tonnes of water.
The growth of the East Antarctic ice cap is thought enough to slow down sea level rise by around 0.12 mm a year, i.e. about 6% of the total.
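These sea-level figures can be checked with some simple arithmetic, taking the global ocean surface area as roughly 3.6 x 10*14 m*2 (an assumed round figure) and one tonne per cubic metre of water:

```python
OCEAN_AREA_M2 = 3.6e14      # assumed global ocean surface area, m^2

tonnes_per_mm = OCEAN_AREA_M2 * 1e-3       # 1 mm of depth at ~1 tonne per m^3
# ~3.6e11 tonnes, i.e. roughly the 350 billion tonnes per mm quoted above

offset_mm = 45e9 / tonnes_per_mm           # 45 billion tonnes of extra snow/year
fraction = offset_mm / 1.8                 # share of the 1.8 mm/year rise
print(tonnes_per_mm, offset_mm, fraction)  # ~0.12 mm/year, i.e. ~6-7%
```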
Now, is the melting northern/western Antarctic ice or the growing eastern ice-sheet due to global warming, as thought? Here comes the second discovery, namely an active undersea volcano, previously unknown, in the Antarctic Sound, at the northernmost tip of Antarctica (where we usually see the ice melting in torrents). Evidence for the volcano was dredged-up in a study aimed at investigating why a massive ice-shelf known as the Larsen B collapsed and broke-up several years ago. The find corroborated mariners' observations that the seas in this region were "discoloured", which is consistent with an active undersea volcano. Highly sensitive temperature sensors moving continuously across the seabed of the volcano also revealed signs of geothermal heating of seawater, especially near the edges where the most freshly deposited rock was observed. Could undersea heating be a factor in enhancing evaporation of water in the region, and hence snowfall on the Eastern ice-sheet, instead of, or in addition to, global warming?
Monday, November 13, 2006
Green Gold.
Gold cyanidation is also known as the cyanide process or the MacArthur-Forrest Process, and is a method used for extracting gold from poor grade ore ("tailings") by converting it to soluble complex aurocyanide anions. Although the procedure is the one most commonly used for gold extraction, it is attended with controversy on account of the toxicity of cyanide and the perceived potential for contamination by it, since there have been a number of environmental catastrophes involving cyanide, e.g. in Romania where fish stocks in rivers were devastated some years ago. The underlying chemistry was discovered in 1783 by the Swedish chemist Carl Wilhelm Scheele, who was also the discoverer of chlorine, a gas used on a large scale in industry, e.g. to make bleach, but which was also used as the world's first chemical gas-weapon in the trenches of WW1. The overall chemical reaction is called the Elsner reaction, and can be written as:
4Au + 8NaCN + O2 + 2H2O --> 4NaAu(CN)2 + 4 NaOH.
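From the stoichiometry of the Elsner reaction (8 NaCN per 4 Au, i.e. 2 moles of cyanide per mole of gold), the minimum cyanide requirement is easily estimated; a sketch in Python using standard molar masses (real plants use a considerable excess, since cyanide is also consumed by other minerals in the ore):

```python
M_Au = 196.97                      # molar mass of gold, g/mol
M_NaCN = 22.99 + 12.01 + 14.01     # Na + C + N, ~49.0 g/mol

# Elsner reaction: 4Au + 8NaCN + O2 + 2H2O -> 4NaAu(CN)2 + 4NaOH
nacn_per_g_gold = 2 * M_NaCN / M_Au
print(nacn_per_g_gold)             # ~0.5 g of NaCN per gram of gold, as a minimum
```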
The ore is finely ground (comminuted) and is often further concentrated using froth flotation or centrifugal concentration, and the resulting alkaline ore slurry is then mixed with a solution of cyanide anions (obtained by dissolving 250 - 500 parts per million of sodium cyanide or potassium cyanide in water). The negatively charged cyanide anions extract the gold from the ore (oxidised to positively charged gold cations) to form the soluble aurocyanide complex, NaAu(CN)2, as shown above. In general, the finer the gold particles, the more quickly they will dissolve. For instance, a 45 micron gold particle might dissolve in 10 - 13 hours, while a coarser 150 micron particle might take from 20 to 44 hours to dissolve in the same solution. It has been found that the addition of lead nitrate can increase the rate at which the gold is leached from rocks and the quantity recovered, particularly in processing partially oxidized ores. Indeed, oxygen (since it is consumed in the reaction shown above) is a critical factor in the gold cyanidation process. Air or pure oxygen gas can be bubbled through the mineral pulp to increase the dissolved oxygen concentration. Oxygen can also be "added" by adding hydrogen peroxide solution to the pulp.
The gold is then recovered from the "pregnant" solution (as it is called) using a number of different processes, but passage through highly porous carbon is most commonly used. So high is the internal surface area of the material that around six grams of it would equal the area of the Melbourne cricket ground! The carbon contains micropores (pores of molecular dimensions, similar in size or smaller than those in zeolites) to adsorb the gold complex, and somewhere around 8 kilograms of gold can be extracted by a tonne of carbon. The gold can be removed from the carbon by using a strong solution of caustic soda and cyanide. This is known as elution. The gold is then plated out onto steel wool through electrowinning. Resins that are specific for gold can also be used in place of activated carbon, or where selective separation of gold from copper or other dissolved metals is required.
However, cyanide is a highly toxic material, which is why the process is controversial. For example, one teaspoonful of 2 per cent cyanide solution can kill a human adult (although it would probably take several hours to die; it is hydrogen cyanide, the "Prussic Acid" of the Nuremberg Trials and James Bond movies, whose lethal effects are "instantaneous"). The toxic effects on fish occur at far smaller concentrations than this. Indeed, the worst environmental catastrophe caused by mining in the history of the U.S. was at the Summitville mine, where 27 miles of a Colorado river were left "dead" by cyanide poisoning. There have also been disasters in Kyrgyzstan, French Guiana and Romania, where spillage of cyanide in Baia Mare resulted in widescale contamination of the river Tisza. In the United States, the state of Montana has banned gold mining using cyanide, as have several countries elsewhere. Although cyanide is toxic, it is readily broken down when exposed to sunlight in the presence of oxygen (air), although this is little comfort when contaminated waters have leaked into groundwater, or other underground sources of freshwater that are protected from the sun, on dank cloudy days, or in winter, especially in Eastern Europe (Romania), when rivers are largely covered by ice and snow.
In fairness, most of the operations have been conducted without obvious incident, but clearly there is leakage of cyanide, for example from plastic-lined ponds, and the areas surrounding some mines in the U.S. have been found to have elevated levels of cyanide, even decades after they were first processed. The following examples, however, show there is no call for complacency.
A History of Accidents
Ten miners were killed when a disused slime dam at the Harmony mine in South Africa, operated by Randgold, burst its banks and buried a housing complex in cyanide-contaminated mud in February 1994.
Cyanide and heavy metal leaks from the Summitville gold mine killed all aquatic life along a 27 kilometer stretch of the Alamosa river in the San Juan mountains of southwestern Colorado by the time the mine was shut down in December 1992. The total clean-up costs have exceeded US$150 million.
Failure of a leach pad structure at the Gold Quarry mine in Nevada released about a million liters of cyanide-laden wastes into two creeks in 1997.
Over 11,000 fish were killed along an 80 kilometer stretch of the Lynches River by a cyanide spill from the Brewer gold mine in South Carolina in 1992.
On May 20, 1998, a truck transporting cyanide to the Kumtor mine in Kyrgyzstan plunged off a bridge, spilling 1,762 kilograms of sodium cyanide into local surface waters. Local people have reported at least four deaths that they claim resulted from the spill. Hundreds of people also checked into local hospitals complaining of health problems following the spill.
More than 3.2 billion litres of cyanide-laden tailings were released into the Essequibo river in Guyana when a dam collapsed at the Omai gold mine in August 1995. Studies by the Pan American Health Organization have shown that all aquatic life in the four kilometer long creek that runs from the mine to the Essequibo has been killed.
On May 29, 1998 six to seven tons of cyanide-laden tailings spilled into Whitewood Creek in the Black Hills of South Dakota from the Homestake Mine, killing a substantial number of fish.
On the night of January 30th, 2000, spillage of 120 tonnes of cyanide at a gold reprocessing facility near the town of Baia Mare in Romania, resulted in widescale contamination of the rivers Tisza and Danube. 150 tonnes of dead fish were recovered, and the drinking water supplies of around 3 million people were threatened. Claims for compensation in Romania, Hungary and Slovakia remain outstanding.
Saturday, November 11, 2006
"The White Feather."
"The White Feather."
Dedication: To soldiers of the Great War (and all wars).
"In the miserable winter of their war
he put his fist through the panel,
oak gave way first
then off to the asylum.
He was like so many men,
in an age without words
for his case or condition.
Post traumatic stress disorder
we call it now. In his time
shell shock (mad or a coward)
twitched the neighbours
with sons and husbands of their own
still under fire.
When they let him out
he walked for miles,
while the tendons in his arm mended
approvingly, while his nerves
rasped-on unseen.
The sudden allegation - "Traitor!"
switched him back into khaki -
mud and rats and dead friends
he never mentioned;
souls who screamed their names
at dead of night
while he slept on fitful sentry duty
next to a woman who
no longer knew who he was.
And now a scream by day: "Coward!
Go on! (sneering) Take it!"
He didn't look into her hand or eye.
The flesh was sewn with silver wires,
rather than amputate
(he pleaded with the surgeon to save his arm),
restrained politely in a steel brace
and kept hidden beneath an overcoat
slung over the lot.
The silly hat with a faded flower,
well-meant and plain with crooked teeth
and a Salvation Army dress,
became a German soldier,
shouting him down in a language
he didn't recognise,
so he lashed-out in terror and ran
for his life unthinking,
leaving a single white feather and a smashed face,
howling in horror on the ground:
another casualty in the long deep trench of war."
by Chris Rhodes
Author's Comments:
"This very sad tale is true, and is of my grandfather who won the
Military Medal having served in the trenches of WWI, but his nerves
never fully healed from that experience. I pity the poor misguided
woman too, who tried to award him the badge of cowardice."
Friday, November 10, 2006
Shall Coal be Crowned King?
As oil and gas resources are projected to decline any time now, attention is increasingly focussed on coal as the principal fossil energy source in the longer term. It has further been proposed that synthetic oil can be manufactured from coal, e.g. using Fischer-Tropsch technology or other methods of coal liquefaction, as I have described in previous postings ("Gasoline from Coal"; "Coal may lead to 'Limitless Oil'"). How much coal there is actually present in the Earth is anyone's guess, but it is only the extractable coal that is of any significance. For example, 3,000 billion tonnes (3 trillion tonnes) have been discovered off the coast of Norway, but that cannot be readily exhumed from its undersea location. Indeed, I have seen estimates of almost 10 trillion tonnes of coal as a world resource, but 1.2 trillion tonnes (or a bit less) as a reserve, from which 5,540 million tonnes of coal were produced in 2004. This quantity rises year on year according to rising demand, particularly in China and India to fuel a gigantic and unparalleled phase of industrialisation. China puts on-stream a new coal-fired power station every week, each of which is about the equivalent of 4 typical U.K. power stations in its output.
Coal is one of the dirtiest of fuels. I have written previously ("Coal and Dust") on the troubles of the Chinese coal industry, both in terms of the number of miners killed (about 50 times as many as in the U.S. for the production of an equivalent amount of coal) and the worrisome levels of pollution, and the attendant health problems that it brings. This is a consequence of the enormous quantities of ash, and flue gases containing pollutants, notably arsenic compounds and sulphur dioxide. It is possible to clean the emissions before they are released, but this of course adds cost to the whole enterprise. It is not widely realised that almost 50% of the energy used to mine coal in fact comes from oil. As I have pointed out, it takes energy to produce energy, but oil to produce coal is one vital component in the energy balance sheet, especially when coal is spoken of as a precursor to a synthetic substitute for scarce naturally occurring oil.
I shall assume that there are 1.2 trillion tonnes of coal that can be extracted with relative efficacy. This suggests there are 1.2 x 10*12/5.54 x 10*9 = 217 years worth left in the ground, ignoring any growth in demand. Now that is the amount available to satisfy current demands for coal as a fuel, mainly for direct heating and electricity generation. So what about conversion of coal to synthetic oil by various methods of coal liquefaction? The world burns 84 million barrels of oil per day, which x 365 days = 30,660 million barrels in a year. One barrel contains 159 litres (I had worked this out at 163.7 litres from Imperial Measures, but this is the oil industry figure). The density of crude oil varies enormously, increasing with the levels of oxygen and sulphur compounds it contains; the "sweet" light crudes are principally pure hydrocarbons, and have the lowest densities.
However, it is assumed as an oil industry figure that there are 7.3 barrels per tonne of crude oil, which implies a mean density for crude oil of 862 kilograms per cubic metre (for reference, one cubic metre of water weighs 1,000 kilograms). We would therefore estimate the mass of oil consumed worldwide as: 30,660 x 10*6/7.3 = 4,200 million tonnes, which is not far from the B.P. figure for the year 2004 of 3,770 million tonnes. Using Fischer-Tropsch chemistry, 1.5 barrels of synthetic oil are obtained per tonne of coal, which amounts to 1.5/7.3 = 0.205 tonnes, or a 20% yield. However, there are Chinese coal liquefaction plants that claim production of 0.33 tonnes of oil per tonne of coal, and so I shall err on the side of optimism and use this conversion instead. Hence, to produce all of our 3,770 million tonnes of oil synthetically from coal, we would need 3,770 million x 3 = 11,310 million tonnes of coal plus the 5,540 million tonnes currently used, i.e. we would need to roughly treble our coal production to meet this total demand, on top of the fuel required to actually run the processes themselves, whereupon the world coal reserve would run out in a little over 70 years.
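The arithmetic of the last two paragraphs can be gathered into a short Python sketch (all figures are those quoted above; the 3 tonnes of coal per tonne of oil is the optimistic Chinese liquefaction yield):

```python
coal_reserve_t = 1.2e12     # tonnes of extractable coal, as assumed above
coal_use_t = 5.54e9         # tonnes of coal produced in 2004

rp_years = coal_reserve_t / coal_use_t    # ~217 years at flat demand

oil_t = 84e6 * 365 / 7.3                  # ~4,200 million tonnes of oil/year
coal_for_oil_t = 3.77e9 * 3               # ~11,310 million tonnes of coal
years_left = coal_reserve_t / (coal_for_oil_t + coal_use_t)

print(rp_years, oil_t, years_left)        # ~217 years, ~4.2e9 t, ~71 years
```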
Probably this is not at all realistic, and if we used half the currently produced coal (5,540 million tonnes) for oil synthesis = 2,770 million tonnes, that would give 2,770/3 = 923 million tonnes of oil or 24% of current demand. Interestingly, this amounts to about 20 million barrels a day, which is coincidentally the requirements of the U.S. alone, although I doubt the rest of the world would be prepared to give it all to them!
We would then need other sources mainly for electricity generation to substitute for the missing half of the coal production taken from the market to make oil from, maybe nuclear, and some renewable resources.
Now, coal is not evenly distributed, and all such sums depend in reality on actual levels of availability and extraction. The most richly endowed regions in the world are the U.S., former U.S.S.R., Australia, China and India. Once upon a time the U.K. was a major coal producer and exporter, producing anywhere up to 300 million tonnes per year, but this has fallen as oil and natural gas have been adopted. Now that the latter is running out, attention is once again turned to our old friend "coal". However, we have only 220 million tonnes or so left in readily accessible deposits. With the industrial strife of the coal mining industry in the 1980's (and in the 1970's too), and the final closure of many of the mines, it is not a simple matter to just start-up production again. True, the mines that were sealed-shut, spitefully with concrete, by Margaret Thatcher's government could be blasted open again, but meanwhile they have flooded severely, and it is a difficult matter to pump-out so much water and render the workings safe enough that they can be mined once more.
In 2005, we used 62 million tonnes of coal. Of that, 20 million tonnes (around 50:50 from deep mines and from surface mines) were produced in the U.K., while the rest was imported mainly from Europe. Clearly we cannot be self-sufficient in coal, and at best we have 220/62 = 3.55 years worth on these shores!
Even if we turned it all into oil, we'd have 220/3 = 73 million tonnes of oil, which is about one year's worth in total, for transportation and for industry including supply of oil as a chemical feedstock for manufacturing processes. If we provided all of our 62 million tonnes of coal from the home base, that would leave us with a capacity to produce (220 - 62)/3 = 53 million tonnes of oil, which is less even than our present annual demand of 57 million tonnes to fuel transportation. In short, we need to rely on coal imports...
Even at status quo levels of production (20 million tonnes per year), we still only have enough of our own coal to last 220/20 = 11 years.
So, we are in fact worse-off in terms of coal than we are in oil and gas, of both of which the U.K. is now also a net importer. It is proposed there are more (but harder to get) coal reserves in and off Scotland and under the North Sea, perhaps 1,500 million tonnes altogether. So, if we made all our oil from coal and provided that and all other demands for coal from U.K. reserves, we would need 220 + 60 = 280 million tonnes a year, so that's 1,500 million/280 million = 5.4 years worth. Even at this optimistic level of accounting, we are inexorably enmeshed in the European/world coal/oil/gas/uranium "mix" that we depend upon to provide our energy, and will never be self-sufficient in fossil fuels using coal, as some have suggested. There just isn't enough of the stuff left, as we have been digging it up for years, and selling most of it. I suspect we will experience cuts in electricity supply before too long, as the rest of Europe did last week. The nations with clout will be those with ample resources "at home" or accessible to them, and they are the U.S., Russia, China, India, Australia and Eastern Europe, increasingly part of the European Union in consequence of the recent Enlargement policy. The EU, including the U.K., will need to maintain good cooperation with President Putin, since between them, Europe and "Russia" hold around 40% of the Earth's coal!
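The U.K. sums above, collected in one place (figures as quoted in this post; the factor of 3 tonnes of coal per tonne of oil is the optimistic liquefaction yield used earlier):

```python
uk_reserve_t = 220e6      # readily accessible U.K. coal, tonnes
uk_demand_t = 62e6        # U.K. coal use in 2005, tonnes/year
uk_output_t = 20e6        # current U.K. production, tonnes/year

years_at_demand = uk_reserve_t / uk_demand_t   # ~3.5 years if self-sufficient
years_at_output = uk_reserve_t / uk_output_t   # ~11 years at current output
oil_if_all_t = uk_reserve_t / 3                # ~73 million tonnes of oil

# with the harder-to-get 1,500 Mt and 280 Mt/year total demand met at home
years_with_new_reserves = 1500e6 / 280e6       # ~5.4 years
print(years_at_demand, years_at_output, oil_if_all_t, years_with_new_reserves)
```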
Monday, November 06, 2006
Flying on a Higher Plane.
We hear that plane flights are set to treble by 2030. I doubt this very much, since there will be insufficient fuel available with which to fill the tanks of so many planes, but there is no doubt that currently more of us are flying year on year. To aid this trend, a team of researchers has drawn-up a revolutionary aircraft design that reduces fuel requirements by a dramatic 35%. In addition, this aircraft is much quieter when it flies. The SAX-40 has been developed by a consortium between the University of Cambridge and Massachusetts Institute of Technology, a U.K. - U.S. hands across the sea venture, and is a radical aircraft design with favourable aerodynamic properties. As the researchers contend, oil may not remain at the $78 a barrel price it was a few months back (indeed it must eventually go up; but market forces might act to restrain the price for some time), but high fuel costs are likely to be a serious consideration in the future. Hence, fuel-efficiency is a major parameter in the future projections of all airline companies.
The new plane is said to have a tailless wedge-shaped fuselage with two "bat" wings. Innovation costs are attractive too, since in making the aircraft body in the form of a tube, manufacturers can easily build a family of variants of any desired size, using many of the same parts in all cases. Since the engines are to be located under the wings, they are more accessible for maintenance or full scale replacement half way through the intended 30 year lifespan of the aircraft. Indeed, there have been many improvements made in aircraft design during the past 50 years, particularly in terms of lighter materials based on composites and more efficient engines, but honing further improvements along these lines is becoming increasingly difficult, or indeed in improving the basic existing design of an aircraft.
The radical design proposed will undoubtedly usher in further developments, and perhaps greater fuel efficiency will be possible through limiting aerodynamic drag effects. For aircraft manufacturers such as Boeing or Airbus, returns on investment must come quite rapidly. Boeing is already working on developing fuel-cells to power air-conditioning and electric systems in aircraft, which currently run off a plane's engines, and so draw-down their overall efficiency in terms of air-mileage. I am sceptical that this will make much difference in practice, since we are back to the issue of using the fuel itself to generate hydrogen gas to run the cells; I doubt that such a device will make very much difference to overall fuel consumption, whereas an entirely new, better streamlined craft certainly would.
The timescale is the truly crucial factor in getting this innovation (literally) off the ground. The fuel cell technology is not expected to be available for another 15 years, by which time peak oil will have bitten hard into the backbone of the world's energy corpus. There is always risk involved when anything "radical" is launched, and airlines already under pressure from fuel costs etc. are less likely to take such risks, relying instead on the tried and trusted designs that have proved themselves viable previously. However, with increasing concerns over climate change, and economic drivers including airlines having to pay full costs on fuel and "green taxes" on emissions, searching for "holy grails" might be perceived as more worthy and worthwhile. Undoubtedly, the skies will not begin to fill with tubular aircraft any time near the immediate horizon, since when an airline buys a new plane, it must be convinced that it will continue flying for decades in order to earn its keep. Speculation is that even under the best of circumstances, the SAX-40 is unlikely to be running before 2030 - by which time we are to believe there will be three times the number of conventional aircraft in the air? Some part of the growth-edifice must give way long before then, and that is most probably the underpinning supply of aviation fuel. Meanwhile those that can will make hay while the sun shines, which I doubt will be for longer than a decade.
Friday, November 03, 2006
Curbing Car and Plane Use.
The following letter and reply was published in this month's Chemistry World. The text in the [brackets] was edited out, but is of course the crux of the matter.
From Chris Rhodes.
The RSC Policy Bulletin article entitled Growing energy (Issue 4, autumn 2006, p5) notes the commitment to biofuels by the US and UK.
To replace even 5 per cent of the fuel consumed annually in the UK with bioethanol would require turning over around 6300 square kilometres of arable land for the purpose, or 10 per cent of the total arable area of the UK, which would conflict with food production.
The article talks about converting waste products from existing agriculture to ethanol, for example wheat straw. This sounds like a perfect solution. In fact, it would at best provide the equivalent of just 6.5 per cent of the total fuel currently used. At first sight, the figure seems rather feeble, and so it is. There is no way we can produce enough ethanol to match our current level of fuel use, either using biomass waste or without compromising food production. On the other hand, if we move to systems of energy efficiency: living in localised communities, which would cut fuel demand by 90 per cent, then 6.5 per cent of that remaining 10 per cent begins to look significant.
[Otherwise we can neither break our dependency on imported fuels nor meet the government's targets to reduce CO2 emissions].
I am reassured that survival is possible for the UK in terms of intrinsic fuel supplies, but only given a paradigm shift in the way we live our lives.
Details of calculations on energy provision can be found on the Energy Balance website.
C Rhodes CChem FRSC
Reading, UK
Jeff Hardy, Environment, energy & sustainability forum, RSC, replies:
The RSC agrees that energy efficiency is critical in enabling the UK to meet carbon emission reduction targets and to cut fuel demand. This was one of several key messages in the RSC response to the DTI energy review.
However, if the UK is to meet the imminent targets of the renewable transport fuel obligation (5.75 per cent by 2010 and perhaps 10 per cent by 2015) without significant imports of biofuels and with minimum competition for arable land then biofuels must be produced from agricultural and forestry waste. Significant research challenges remain before biofuels from this route are economically competitive.
Wednesday, November 01, 2006
"Big Oil" Mergers Likely: but how much is left?
I have mused previously, that the immediate signs of peak oil are likely to be economic, i.e. prices of goods going up, and it will be drivers of this kind that begin to apply the brakes to demand for oil. It is a messy thought that the countryside and roadsides will become littered with Hummers and indeed all other vehicles that no one can afford to fuel any more. Localised communities may begin to arise around our feet as we are increasingly less able to move further afield. But as I allude, it is the economic drivers that we should watch out for, to gauge how things are going in terms of global fiscal health. Indeed, the international oil giants are in trouble, as oil reserves shrink, taxes and other costs rise, and producing nations (e.g. in the Middle East, Russia and South America) renege on deals which the west particularly are relying upon, or simply nationalise (hang onto) their assets. One solution to these troubles might be massive mergers between companies (or effectively even between nations).
I note that the main holders of oil reserves are Saudi Aramco, PDVSA (Venezuela), Iraqi Oil Ministry and NIOC (Iran) which account for around 700 billion barrels. Interestingly, the major gas holders (forgive the pun) are Gazprom (Russia), Qatar PET and NIOC (Iran), controlling gas reserves to a total of 500 billion barrels of oil equivalent. There is less gas than there is oil, though the quantities are comparable, but these fuels are used for different purposes. Clearly, Iran is very well endowed in both gas and oil, and it will be instructive to see how the economic chess-pieces are deployed on the atlas-board that will decide the new world order.