Mining the Climate Data
- Dec 24, 2014, 7:00 pm GMT (updated Jul 7, 2018, 9:09 pm GMT)
We live in the Information Age. Geophysicists mine seismic data to find new oil and to develop better techniques for enhancing production from old fields. Politicians mine data to raise funds, stir passions and fears, and learn the likely voting patterns of constituents so they can either encourage or discourage them from getting to the polls. One of the most active sectors of today's economy is premised on the notion that it can interpret every nuance of our purchasing and web-browsing habits for the benefit of marketers, so it is confounding that these same, otherwise technically insightful, people can be so collectively blind to the climate record.
Three easy-to-read charts correlating atmospheric CO2 concentrations and temperature over three different time scales indicate where we have been, where we are going, and what we have to do to save the planet from climate catastrophe.
The first of these charts shows how the concentration of CO2 in the atmosphere has increased by about 120 parts per million (ppm) since mankind began burning fossil fuels in earnest at the onset of the industrial revolution 250 years ago.
At that time CO2 concentrations were about 280 ppm, which was already near the maximum of the previous 400,000 years according to NOAA's paleoclimate record from Antarctic ice core data below, which also shows a strong correlation between CO2 levels and atmospheric temperatures.
Temperature change (blue) and carbon dioxide change (red)
This record also shows that the four previous times CO2 levels were near 280 ppm, temperatures were about 2°C warmer than present, which would spell disaster for today's young people, future generations, and nature, according to the report issued a year ago by 18 leading scientists.
The consensus of the IPCC seems to be that 450 ppm by the end of this century will equate to 2°C of warming, but Hansen et al. conclude we need to reduce levels to less than 350 ppm to avoid unacceptable climate effects, while the paleo record suggests that 280 ppm has repeatedly been enough for the planet to reach 2 degrees higher than we are today.
Greenhouse gas concentrations are currently 120 ppm above the previous highs, so we should note with some trepidation that previous decreases of roughly 80 ppm from that level corresponded with temperature declines of about 10°C.
If the CO2/temperature correlation were to hold for a 120 ppm increase above 280 ppm, we would be looking at a 15°C increase at the poles, a 4°C average increase, and dire consequences.
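Those figures follow from a naive linear reading of the ice-core record. The sketch below reproduces the arithmetic; note that the 3.75× polar-amplification ratio is not a published figure but simply the ratio implied by pairing a 15°C polar increase with a 4°C global average:

```python
# Linear extrapolation of the ice-core CO2/temperature correlation.
# Sensitivity implied by the record: an ~80 ppm CO2 swing went with
# an ~10 degree C polar temperature swing.
polar_sensitivity = 10.0 / 80.0   # degrees C per ppm at the poles

excess_ppm = 400 - 280            # today's ~400 ppm minus the pre-industrial 280 ppm
polar_warming = excess_ppm * polar_sensitivity

# Assumed polar amplification: poles warm ~3.75x the global average
# (the ratio implied by the article's 15 C polar / 4 C average pairing).
global_warming = polar_warming / 3.75

print(f"Implied polar warming:  {polar_warming:.1f} C")   # 15.0
print(f"Implied global average: {global_warming:.1f} C")  # 4.0
```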
For a look at what a 400 ppm world has looked like in the past, see 400 ppm World, Part 1: Large Changes Still to Come. (Hint: Ellesmere Island in the Arctic was 18°C warmer, was home to an ancestor of the modern camel, and sea levels were 25 meters higher.)
As shown in the following chart produced by the Scripps Institution of Oceanography, the CO2/Temperature correlation appears to have broken down over the past 15 years.
According to Scripps researchers in a study published a year ago, this is attributed to the cooling of eastern Pacific Ocean waters associated with the Pacific Decadal Oscillation, a 20- to 30-year cycle of warming and cooling of surface waters in the North Pacific Ocean that occurs for as yet unknown reasons.
More recent studies suggest the South Pacific and Atlantic Oceans have also been taking up excess heat, with the common denominators being deep-water heat uptake and the brevity of the underlying condition.
The obvious lesson to be taken from these three records is that we have to move trapped heat into the deep ocean, whereas the conventional wisdom is we either have to stop burning fossil fuels or capture and bury the CO2 that results from the burning of the same.
We are already experiencing the effects of climate change and scientists tell us that these will be with us for 1000 years even if we immediately stop adding CO2 to the atmosphere. That approach, which would correspond with a conversion of all energy systems to wind, solar, nuclear and/or fusion, therefore has less than an ideal outcome even if it could be accomplished in a timely fashion.
SkepticalScience explains that 1 ppm of atmospheric CO2 is the equivalent of 7.81 gigatonnes of CO2 (Gt CO2). To get back to Hansen's 350 ppm would therefore require the sequestration of about 390 Gt CO2, and to get back to 280 ppm, which the long-term record shows would probably still get us to another two-degree increase in temperature, we would have to sequester about 937 Gt CO2.
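The arithmetic behind those sequestration totals is easy to reproduce; this is a sketch using only the 7.81-gigatonnes-per-ppm conversion cited above:

```python
GT_CO2_PER_PPM = 7.81   # Skeptical Science conversion: 1 ppm of atmospheric CO2

def co2_to_sequester(current_ppm: float, target_ppm: float) -> float:
    """Gigatonnes of CO2 that must be removed to fall from current to target ppm."""
    return (current_ppm - target_ppm) * GT_CO2_PER_PPM

print(co2_to_sequester(400, 350))   # ~390 Gt CO2 to reach Hansen's 350 ppm
print(co2_to_sequester(400, 280))   # ~937 Gt CO2 to reach the pre-industrial level
```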
Vaclav Smil has pointed out the impossibility of implementing CCS in a timeframe that would prevent CO2 levels from rising above 450 ppm, let alone reduce them from current levels, but for the sake of argument consider the costs that would have to be incurred.
Jeremy van Loon of Bloomberg puts the price tag for equipping all existing power plants with CCS technology at $17.6 trillion.
For starters, the electricity sector accounts for only 39 percent of U.S. energy-related CO2 emissions, so to sequester all emissions, many of which would be impossible to capture in the first place, would cost about $45 trillion, and this only gets us back to locking in global warming for another 1,000 years. Currently we are adding about 10 gigatonnes of carbon to the atmosphere each year, so to capture and sequester the 50 Gt in excess of 350 ppm, even if it could be done, would cost at least $225 trillion.
I make this to be $4,500/tonne, which is all overhead.
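The cost chain can be laid out explicitly. This is a sketch of the article's own arithmetic; the dollar figures are the estimates quoted in the text, not independent numbers:

```python
power_plant_ccs = 17.6e12     # van Loon: retrofit every existing power plant with CCS
electricity_share = 0.39      # power sector's share of U.S. energy-related CO2

# Scale the power-plant figure up to cover all emission sources.
all_emissions_ccs = power_plant_ccs / electricity_share
print(f"CCS for all sources: ${all_emissions_ccs / 1e12:.0f} trillion")  # ~$45 trillion

excess_tonnes = 50e9          # the article's 50 Gt drawdown target
drawdown_cost = 225e12        # the article's $225 trillion estimate
print(f"Implied cost per tonne: ${drawdown_cost / excess_tonnes:,.0f}")  # $4,500
```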
Smil estimates 7.2 barrels of oil equate to 1 tonne of CO2, and a barrel of oil provides 12 gallons of diesel and 19 gallons of gasoline, so effective CCS would raise the price of these fuels by about $20 per gallon.
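That per-gallon figure follows directly from combining Smil's equivalence with the $4,500-per-tonne overhead; a sketch:

```python
cost_per_tonne_co2 = 4500        # CCS overhead per tonne, from the article
barrels_per_tonne = 7.2          # Smil: barrels of oil per tonne of CO2
fuel_gal_per_barrel = 12 + 19    # diesel + gasoline gallons refined per barrel

fuel_gallons = barrels_per_tonne * fuel_gal_per_barrel   # 223.2 gallons per tonne
surcharge = cost_per_tonne_co2 / fuel_gallons
print(f"CCS surcharge: ${surcharge:.0f} per gallon")     # ~$20
```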
Jesse Jenkins tells us American households, on which the rest of the world is depending to take the lead in the climate battle, appear willing to pay roughly $2–8 per ton to combat climate change. CCS is therefore a nonstarter. The amount of carbon we could sequester with the money we are prepared to spend would make virtually no difference to the climate, and thus the data leave us with only one option: sequestration of surface ocean heat.
A heat pipe is a device that moves damaging heat to a heat sink. The most damaging heat associated with climate change is that accumulating at the surface of the tropical ocean. Heat pipes can move this heat to the safety of the deep ocean and, in the process, produce as much energy as we currently derive from fossil fuels, along with the revenue from its sale.
In effect, we can neutralize the accumulation of 250 years' worth of CO2 while producing all the energy we need and the revenue required to pay for the fix. We can also stop burning fossil fuels and sequester carbon in the process of electrolyzing seawater to produce supergreen hydrogen.
The MIT master's thesis "Assessment of Ocean Thermal Energy Conversion" by Shylesh Muralidharan points to a cost saving for this kind of ocean thermal energy conversion plant of up to 45% over conventional designs, and a capital cost for a 100 MW plant of $2,650/kW. The following table from that paper also shows that even conventional designs have the highest capacity factor of all energy technologies and are more than competitive on a levelized capital cost basis with most, and on par with nuclear and advanced coal with CCS, which do nothing to remedy the ocean heat problem yet are frequently offered as answers to the climate problem.
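For scale, the thesis's headline numbers work out as follows. This is a sketch; treating the 45% figure as a saving relative to a conventional OTEC design's capital cost is my reading and is labeled as an assumption:

```python
capex_per_kw = 2650            # $/kW for the improved 100 MW design (Muralidharan)
plant_capacity_kw = 100_000    # a 100 MW plant

improved_capex = capex_per_kw * plant_capacity_kw
print(f"Improved design capital cost: ${improved_capex / 1e6:.0f} million")  # $265 million

# Assumption: the 45% saving applies to capital cost versus a conventional
# OTEC design, implying a conventional cost of roughly 2650 / (1 - 0.45) $/kW.
conventional_per_kw = capex_per_kw / (1 - 0.45)
print(f"Implied conventional design cost: ${conventional_per_kw:,.0f}/kW")  # ~$4,818/kW
```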
Where the climate has gone and how it got there is in the record, as is the evidence of what has to be done, and it is easy enough to calculate what we can afford to spend on doing it.
What we cannot afford is the status quo or unnecessary diversions up technological blind alleys.
The bonanza that can be derived from mining the climate record is the most significant and socially beneficial legacy we could possibly leave our children.