It’s a funny old world. The simplest things can turn out to be complicated once you start examining and questioning. Which is often why most people avoid talking about issues and ideas and instead concentrate on simpler things. Like themselves.
Just such a question arose earlier this past year, prior to the Canadian federal elections. The Suzuki Elders drafted a memo to federal politicians urging them to consider the long-term effects of reckless resource exploitation on our grandchildren’s future. We were thinking specifically of climate change and actions which lead up to it, e.g. massive additions of carbon to the global atmosphere from Canadian sources such as the tar-sands. But, someone observed, most elected politicians also have children, and grandchildren too in some cases. Why aren’t they equally concerned about these issues and the future?
I actually tried to find out the answer to that. I wrote a letter to my MP (a Progressive Conservative) who has two children in high school and asked him that very question. The response was somewhat underwhelming. He thanked me for my continued support and urged me to contribute to the party coffers.
So let me try an analytical approach. I know two individuals who typify very different environmental attitudes. I am going to examine their stories and see if I can detect any significant contributory factors.
Denzel Smith is a 36-year-old graphics designer, married with two small children, who owns a heavily mortgaged house in Dunbar, Vancouver. He also owns a 10-year-old Toyota, two mountain bikes and a 52-inch television set. He has hiked the West Coast Trail a dozen times, and in summer hauls his wife and kids around the province on camping trips. He is a member of one racquetball club and three organizations which promote environmental conservation and green living. Denzel has attended numerous protest meetings and demonstrations against hot topics like the tar sands, oil pipelines and tanker traffic along the B.C. coast. He identifies with the underlying driving forces behind the current Occupy Movement, but considers the implementation hopelessly misguided and ineffectual.
Just 1,350 km to the east, in Rosedale, Calgary, lives Justin Smith, aged 33. He is a part-owner of an electronics supply store. He too is married, and has two small children. He has a mountain bike and lots of other toys as well, including a Toyota FJ Cruiser, a Kawasaki Ninja 1000 which his wife detests, and a spanking new powerboat which spends eight months of the year behind his garage swathed in a blue tarpaulin. Justin is a member of a winter health club. In summer he rides, boats and jogs, sometimes with his family, sometimes alone. Justin supports oil and gas development in his home province, including the pipelines being proposed to carry tar-sands oil to the U.S. and to Asia via terminals on the B.C. coast. Although he doesn’t do business with the oil industry, he is disdainful of west coast environmental groups who oppose energy developments in Alberta, referring to them as wackos, parasites and socialistic job destroyers.
These two Smiths typify two different attitudes to the environment. Justin regards the natural world as an opportunity to test his mettle – a muddy track to be conquered by four-wheel drive, a lake to be crossed at full throttle, a prairie highway to be covered at the fastest speed possible on two wheels. He sees resource development and extraction in any form as economically imperative, necessary for progress and something which should logically be entrusted to private enterprise. He maintains that the critical bottom line will always point the way to a safe and appropriate scope of development.
Meanwhile, back in Lotus Land, Denzel thinks of his environment as a fabric, something in which he can immerse himself. He uses his bike as an exploration device. He knows every nook and cranny of the Endowment Lands that he rides through. He can identify a few hundred bird species and just about every common tree and native plant he encounters on his hikes. He is content to spend hours sitting on a rock next to a creek staring at everything or nothing in particular while the kids play in the rock pools. He is an urban dweller and a typical user of materials and resources that a modern life-style requires. He has no strong feelings about most developments, he just objects strongly to single-focussed massive exploitation with huge impacts and huge implications for other users and for long-term sustainability.
One thing I forgot to mention about these two Smiths. They are brothers. Both born, raised and schooled in Prince Albert, Saskatchewan, where their parents still live. Both brothers attended the University of Regina, from where Denzel made his way across the Rockies to B.C., while Justin chose the shorter hop to Calgary. The Smith boys see each other once or twice a year, usually at Thanksgiving and usually at their parents’ home. Still one big happy family, although things can get heated if someone steals the last piece of pumpkin pie or mentions things like tar-sands or fracking. So the Smiths are brothers and share the same parents, same upbringing, same schools, same social backgrounds, but they differ totally in their environmental perceptions.
When you search textbooks and web pages to find a basic reason or set of reasons why individuals differ in their fundamental attitude to the environment, you usually encounter the name of the late Lynn White, a professor of history. In a much quoted 1967 essay White famously targeted Christianity as the root cause of environmental degradation because of its core beliefs that humans are fundamentally distinct from the rest of nature, and that nature is present merely to serve human ends. He contrasted this with pagan animism in which all things are deemed to possess, or be associated with, life spirits and this leads to an associated level of moral constraint. White’s often-quoted thesis understandably has caused much ecclesiastical furore and a large number of rebuttals over the years. From my perspective I think the professor may have been a little too cloistered down at UCLA. A tour through Hindu India, Muslim Indonesia or communist China might have convinced him otherwise. In any event, the religious aspect doesn’t figure in my Smithian analysis – the last time either Smith boy ventured near a church was in 1998 when Grandma was laid to rest.
Amongst the published rebutters of White, the name of Lewis Moncrief is most often cited. His proposition is that environmental attitudes have their roots not in theology but in the kind of western culture that has developed over the past few centuries. Two key revolutionary changes laid the foundations for the evolution of modern society – (1) a trend towards more equitable distribution of power and wealth through evolving democratic political structures, and (2) dramatic increases in the production of goods and services through scientific and technological development. As a consequence of industrialization, people moved from the country into metropolitan centres, increased the demand for goods and services, and increased the density of the by-products of human consumption (e.g. pollution, habitat loss, etc.).
I can relate both Smiths to this theory, but at different levels. Denzel’s worldview exemplifies the first part, i.e. more equitable distribution of power and wealth by evolving democratic political structures, although Denzel himself would argue that the trend now seems to be the other way. Justin reflects the second part – increasing the production of goods and services through scientific and technological development.
For Justin, as with so many people today, the end point is what counts. He is focused on the outputs of the industrial process – the cars, the bikes, the cell phones, the toys, the wine, the food. The consumerist credo of the 21st century tells him that’s just great and urges him to buy a few more goodies. Or sell a few more from his business. The processes by which all these products are created, and the by-products of their creation such as wastes and industrial emissions, don’t generally show up on his radar, and those that do are dismissed as ‘collateral damage’ – cool military jargon he picked up from playing Modern Warfare 2 on his Xbox. For Justin, big energy developments such as the tar sands and long-range oil pipelines are triumphs of technological innovation, drivers of employment and the economy at all levels – local, provincial, national and even international for those lucky countries queuing up to buy Canada’s oil.
Denzel’s primary focus, on the other hand, is the hugely complex system which provides all these material benefits. He is all too aware of the vast array of interconnections in the real world. All the components that go into Justin’s cars, bikes and electronics, and all the ingredients needed to make the food and drink he consumes come from somewhere and are themselves part of complex production and extraction processes. The materials all have to go somewhere after they’ve been used, consumed, excreted, trashed or crashed into a tree. The modern industrial world is running out of absorption capacity for all this stuff – the garbage dumps are full, the oceans can’t take any more plastic and effluents, the atmosphere’s carbon load is starting to show up as bad news for the climate. Denzel sees the signs and evidence all around him – he notices such things.
Denzel doesn’t dispute the value of resources or the jobs their extraction and transportation generate. He just doesn’t think the material benefits are worth the massive environmental and social costs. He thinks the whole concept of exporting tar sands oil is illogical anyway. While Canada spends billions in energy conservation and other programmes to try and keep carbon emissions as low as possible, it sells huge quantities of high-carbon oil to countries who burn it and dump more carbon back into the global climate than Canada saves through conservation programmes in the first place.
So when the inevitable question comes from across the Thanksgiving dinner table – “What else you got in mind, dude? You got another way of converting lots of oil, which we have, into dollars and jobs, which we want?” – Denzel quietly helps himself to the last pour of wine in the bottle and replies, “Leave the damned stuff where it is. It’s been lying there for a hundred million years. It will keep until we have better technology, of which you’re so fond, to make more intelligent use of it.”
While neuroscientists are pushing the boundaries of their science and uncovering the highly complicated relationship between neural pathways and behavioural patterns, geneticists and molecular biologists have developed equally spectacular technologies and methods for linking human behaviour to specific genes and genetic patterns. Mark my words – it’s only a matter of time before scientists uncover a Green Gene. Denzel has it and Justin doesn’t. Probably as simple as that.
In an ideal world, inexpensive, reliable, and safe sources of green energy would abound, and we could avoid using energy derived from either nuclear fission or coal burning. But we’re not there yet, and with climate change already affecting life on our planet, most of us believe that we need to move quickly to using clean energy sources to limit the rise in global temperature caused by greenhouse gas emissions.
In a talk on energy and climate entitled “Innovating to Zero!”, Microsoft’s Bill Gates gives a compelling argument for why we need nuclear power in an age of increasing levels of atmospheric CO2. Using a simple equation, he argues that CO2 is a product of the number of people on the planet, the services delivered per person, the energy needed per service, and the amount of CO2 produced by each unit of energy. The first two are heading up and are unlikely to be stopped. The energy needed per service is decreasing, but not fast enough. So that leaves the fourth factor. We must use energy that does not produce greenhouse gases, but we need reliable energy – energy that’s available when the sun doesn’t shine and the wind doesn’t blow. Gates believes that nuclear power offers this promise and should be part of the mix, especially if improved (safer) technology is employed. Energy conservation should be a viable way to transition from dirty to clean energy, but increases in services delivered per person along with a growing population would quickly eat up conservation savings.
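Gates’s equation is the so-called Kaya identity, and its logic can be sketched in a few lines of code. The numbers below are purely illustrative placeholders, not figures from the talk; the point is only that with population and services per person rising, total emissions reach zero only if the carbon intensity of energy reaches zero.

```python
# Gates's equation (the Kaya identity): CO2 = P x (S/P) x (E/S) x (C/E)
# All numbers below are assumed round figures for illustration only.
def total_co2(population, services_per_person, energy_per_service, co2_per_energy):
    """Annual CO2 emissions as the product of the four factors."""
    return population * services_per_person * energy_per_service * co2_per_energy

# Population and services per person both rise; efficiency improves a little.
baseline = total_co2(7e9, 100, 2.0, 0.5)
cleaner = total_co2(8e9, 120, 1.8, 0.0)  # zero-carbon energy drives the whole product to zero
print(baseline, cleaner)
```

However large the first three factors grow, a zero in the fourth factor zeroes the product – which is why Gates focuses on carbon-free energy rather than on efficiency alone.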
Like coal power, nuclear power is economical and does not fluctuate as much as wind or solar power. Unlike coal, it is considered clean in terms of the amount of greenhouse gas emissions produced by the power plant itself, although uranium mining and processing are not without risks and environmental impact. But the public is overly fearful of nuclear power, seeing it as an accident waiting to happen and, when it does, likely to adversely affect millions. Of equal concern, radioactive wastes from power plants accumulate and represent a potential target for terrorists willing to handle the material, although this has not yet occurred. Accidents at nuclear power plants have the potential to be dangerous to the local population and environment, as we’ve recently appreciated with the Fukushima disaster, and once long-lived radioactive elements like cesium-137 and strontium-90 are released, they can contaminate the surrounding land for decades. A case in point: the 30 km exclusion zone surrounding Chernobyl remains empty of people twenty-five years after that disaster.
Fortunately, nuclear power plant “accidents” that spread deadly isotopes are rare, and the planet has suffered only two (avoidable) serious events that rank at the top of the International Nuclear Event Scale. As serious as these events were, there were few immediate deaths. At Chernobyl, the nuclear core of a poorly designed and operated reactor exploded and was cast outside the facility; thirty-two radiation workers died shortly afterwards from radiation exposure. At Fukushima Daiichi, in spite of IAEA concerns, an older reactor was operating without adequate safety precautions to ensure reactor coolant in the event of an earthquake and tsunami. No one has died from acute radiation poisoning at Fukushima. Other than thyroid cancers (which are mitigated by potassium iodide tablets and easily treated), increases in the incidence of other types of cancer have not been conclusively linked to radiation from the Chernobyl accident. Cardis and colleagues estimated that “of all the cancer cases expected to occur in Europe between 1986 and 2065, around 0.01% may be related to radiation from the Chernobyl accident”. Although a tiny percentage, this still represents a large number of excess cancer cases, more than 5000 to date. However, air pollution is estimated to prematurely end the lives of at least 17,000 US citizens per year and up to 850,000 globally. A 2002 analysis by the International Energy Agency concluded that nuclear power ranked much lower than coal in terms of impact on biodiversity, accidents, and health risks, and only ranked higher on risk perception.
When seen in comparison to the risks of deriving energy from burning coal, the evidence that deriving energy from nuclear power is dangerous remains relatively weak. It is the perceived threat that is strong, and this threat recently caused Germany to close eight of its nuclear power plants and to begin phasing out the remaining nine by 2022. Although the intent is to generate energy cleanly, almost half of the energy in Germany currently comes from coal, and it is difficult to believe that this percentage will not rise in the next few decades, thus contributing further to global warming.
Coal-derived power, in addition to being a major contributor to greenhouse gas emissions and acid rain, is hardly safe. Thousands of coal miners die in accidents each year, and the public is susceptible to lung and heart effects from air-borne pollutants. In 2000, the Ontario Medical Association declared air pollution “a public health crisis” and coal-fired power plants the single largest industrial contributors to this crisis, producing carbon dioxide, fine particulates, and carcinogenic heavy metals including mercury. In 2005, the Ontario Medical Association estimated that air pollution costs the province more than six hundred million dollars per year in health care costs, as well as causing the premature deaths of thousands of Ontarians each year. Although of little health consequence, it is worth noting that burning coal produces fly ash that concentrates natural radioactive isotopes in excess of levels produced by nuclear power plants under normal operating conditions. Disposal of toxic coal combustion wastes, orders of magnitude larger in volume than nuclear wastes, has also come under scrutiny.
We constantly accept risks in our lives without giving it much thought. A person who smokes twenty cigarettes a day over their lifetime would shorten their life, on average, by six years. A person currently living 50 km from Fukushima who is exposed to an extra 3 mSv per year over their lifetime (the average background exposure is now greater than 3 mSv per year thanks to medical imaging) would shorten their life by 15 days. What cannot be easily evaluated, and is therefore ignored in these risk assessments, is the psychological trauma to evacuees and to those who fear the consequences of minimal radiation exposure because they do not comprehend the risks. Wild animals, ignorant of continuing radioactive decay, are now thriving in the Chernobyl exclusion zone.
Economic arguments favour the use of coal over nuclear power when waste management and decommissioning are taken into account. Nuclear plants are very expensive to build (and dismantle), although estimated capital costs for advanced coal plants with carbon control and sequestration appear to be on par with costs to build nuclear power plants. The cost to run and maintain coal plants can be higher than for nuclear power plants, in part because of the transportation costs of coal. A major concern with both nuclear and coal power plants is that once the plants are built, they are likely to be around for a long time because the infrastructure is so costly to develop. Public pressure will be needed to ensure that these plants are closed as soon as clean energy sources become available.
In summary, although recent events at Fukushima warn us that safety standards and compliance must be improved, nuclear power plants operating normally produce less greenhouse gas and toxic emissions, less global environmental damage, and fewer health issues than coal-burning power plants. Neither represents a safe, sustainable energy choice, but given a choice between these two, nuclear power comes out on top. According to Walter Keyes, a proponent of nuclear power who has worked as an energy consultant for the Saskatchewan and Federal governments, “If climate change really is the serious global issue that most scientists believe it is, there is a very limited amount of time to fix the problem and we should not be wasting valuable time debating which non GHG (green house gas) generation source is the best – we need them all, desperately!”
1. Bill Gates on Energy: Innovating to Zero! TED talks, February, 2010. http://www.ted.com/talks/bill_gates.html
2. UN Summary of the Chernobyl Forum, Chernobyl’s Legacy: Health, Environmental and Socio-Economic Impacts, IAEA, 2006. http://www.iaea.org/Publications/Booklets/Chernobyl/chernobyl.pdf
3. Cardis E, Krewski D, Boniol M, Drozdovitch V, Darby SC, Gilbert ES, et al. Estimates of the cancer burden in Europe from radioactive fallout from the Chernobyl accident. Int. J. Cancer 119: 1224–1235, 2006.
4. US Environmental Protection Agency, Power plant, mercury and air toxics standards, March, 2011. http://www.epa.gov/airquality/powerplanttoxics/pdfs/overviewfactsheet.pdf
5. World Health Organization. Estimated deaths and DALYs linked to environmental risk factors. http://www.who.int/quantifying_ehimpacts/countryprofilesebd.xls
6. International Energy Agency, Environmental and health impacts of electricity generation, June 2002 (Table 9.9) http://www.ieahydro.org/reports/ST3-020613b.pdf
7. Canadian Medical Association, June 27, 2000. http://www.collectionscanada.gc.ca/eppp-archive/100/201/300/cdn_medical_association/cmaj/cmaj_today/2000/06_27.htm
8. Ontario Medical Association Illness Costs of Air Pollution (ICAP) – Regional Data for 2005. https://www.oma.org/Resources/Documents/d2005IllnessCostsOfAirPollution.pdf
9. McBride JP, Moore RE, Witherspoon JP, Blanco, RE. Radiological impact of airborne effluents of coal and nuclear plants. Science, 202: 1045-1050, 1978.
11. U.S. Nuclear Regulatory Commission, Instruction concerning risks from occupational radiation exposure. Regulatory Guide 8.29, Feb. 1996. http://www.nrc.gov/reading-rm/doc-collections/reg-guides/occupational-health/rg/8-29/08-029.pdf
12. Hinton TG, Alexakhin R, Balonov M, Gentner N, Hendry J, Prister B, Strand P, Woodhead D. Radiation-induced effects on plants and animals: Findings of the United Nations Chernobyl Forum. Health Physics 93: 427-440, 2007.
13. US Department of Energy/Energy Information Administration, Levelized cost of new generation resources in the annual energy outlook 2011. http://www.eia.gov/oiaf/aeo/electricity_generation.html
14. Howell, G and Keyes W, Green (renewable) energy versus nuclear energy. Part five of an eight part written debate regarding nuclear power generation. Mile Zero News and Banner Post, March 17, 2010. http://www.computare.org/Support%20documents/Guests/MZN%20Nuclear%20Debate/5%20of%208%20Green%20Energy%20Howell-Keyes.pdf
by Stan Hirst
Just one month ago the federal general elections in Canada put the Conservative Party firmly in power. Responses to the election outcome amongst environmental groups and environmentally concerned individuals across the country ranged from disappointment to dismay, with most reactions decidedly negative.
The reason is not hard to discern. Conservatives in Canada are perceived as having an awful environmental track record. For the recent election, the Conservative Party’s election platform contained not one word about the environment. By contrast, the opposition Liberal Party promised to create clean energy jobs, invest in clean energy and energy efficiency, create a cap-and-trade system for reductions in greenhouse gas emissions, and protect Canada’s air, oceans, waterways, forests and Arctic resources. The New Democratic Party’s platform proposed a shift away from fossil-fuel dependence, underlined the compatibility of environmental health and economic growth, and promised to develop green energy industries.
The future doesn’t bode much better. The background documents for the June 2011 Conservative National Convention, at which future policy was set, contained just one short statement on an environmental topic, i.e. “we believe that an effective international emissions reduction regime on climate change must be truly global and must include binding targets for all the world’s major emitters, including China and the United States”.
Is this disconnect between conservatism and environmental consciousness in Canada typical of all conservatives or conservative governments? Consider the following statements from south of the 49th parallel.
- I do not intend that our natural resources should be exploited by the few against the interests of the many.
- The only trouble with capitalism is capitalists – they are too damn greedy.
- As we peer into society’s future, we – you and I, and our government – must avoid the impulse to live only for today, plundering, for our own ease and convenience, the precious resources of tomorrow. We cannot mortgage the material assets of our grandchildren without risking the loss also of their political and spiritual heritage.
- The basic causes of our environmental troubles are complex and deeply embedded. They include: our past tendency to emphasize quantitative growth at the expense of qualitative growth; the failure of our economy to provide full accounting for the social costs of environmental pollution; the failure to take environmental factors into account as a normal and necessary part of our planning and decision making; the inadequacy of our institutions for dealing with problems that cut across traditional political boundaries; our dependence on conveniences, without regard for their impact on the environment; and more fundamentally, our failure to perceive the environment as a totality and to understand and to recognize the fundamental interdependence of all its parts, including man himself.
- We are now facing hard choices in our energy policy. Future generations — my children and grandchildren, along with yours — will have to live with the decisions we make today. And so it is time for us to make some tough and — hopefully — smart choices regarding our energy use and production before it is too late.
All penned and uttered by democrats and green-tinged radicals, right? Wrong. All spoken by hard-core Republican conservatives – Theodore Roosevelt, Herbert Hoover, Dwight Eisenhower, Richard Nixon and John McCain.
Or maybe the disconnect between conservatism and environmental consciousness in Canada now typifies the attitudes of mainstream right-wing parties struggling to deal with 21st century environments? A quick check around the globe suggests that this isn’t true either.
The British Tories, the party of Disraeli, Churchill and Thatcher, are today in a coalition with the Liberal Democrats. The Conservative 2011 programme declares that Britain needs to protect the environment for future generations, make the economy more environmentally sustainable, and that much more needs to be done to support the farming industry, protect biodiversity and encourage sustainable food production.
The governing German party, the Christian Democratic Union, declares in its manifesto that its policies are based on the Christian view of Man and his responsibility before God. It goes on to state that the objective of an ecological and social market economy, as the party sees it, is to achieve a synthesis of economy, social justice and ecology. Amongst a host of economic and policy actions cited as a basis for this synthesis, the CDU includes the need for ecological elements in tax legislation, environmental levies, compensation schemes, certification and liability regulations, and the cutting-edge concept, at least by current Canadian standards, of rewarding environmentally sensitive actions by using market incentives and linking costs to environmental damage to establish ecologically realistic prices.
Flipping to the other side of the globe, we find the most conservative party in Australia, the rural-based National Party of Australia, outlining a 2011 environmental platform that supports targets for greenhouse gas emissions, proposes direct action plans to reduce emissions through soil carbon sequestration and the use of bio-char, revegetation of marginal land, clean coal technology, carbon capture and the use of algae, encourages public participation in voluntary carbon markets involving individuals, communities, agriculture and business, and promises strong state support for non-petroleum-based fuels. Pretty radical stuff all around.
So why are Canadian conservatives not up there with the rest of the conservative world in addressing urgent environmental issues?
Conservatives believe in personal responsibility, limited government, free markets, individual liberty, traditional values and a strong national defence. Thus, conservative policies generally emphasize empowerment of the individual to solve problems. By contrast, those of more liberal persuasion believe in government action to achieve equal opportunity, equality for all, alleviation of social ills and the protection of civil liberties and individual and human rights. So liberal policies generally emphasize the need for the government to solve problems.
Looking at energy through this type of filter, we might see that Canadian conservatives consider fossil fuels to be good sources of energy (which they are of course in Canada) and, since they are abundant, their exploitation should be promoted and increased both on land and at sea. Increased domestic production by large corporations would lead to lower domestic prices plus huge incomes from the two biggest gulpers of energy on the planet – the U.S. to our south and the ever-burgeoning Chinese economy just across the Pacific. Canadian conservatives might feel that wind, solar and biomass will never provide comparable levels of plentiful, affordable and, above all, profitable sources of power. The opposing liberal viewpoints – that oil is a diminishing resource, that other sources of energy must be explored, that government must produce a national plan for all energy resources and must subsidize alternative energy research and production – don’t play well in Canada because fossil fuels generally are not diminishing resources. They may be getting more difficult and expensive to recover, but that is just part of the ongoing and traditional challenge for private enterprise.
Looking at climate change through the same filter would surely lead a market-conscious conservative to conclude the following: since global warming is caused primarily by increased production of carbon dioxide through the burning of fossil fuels, which Canada produces and burns in prolific quantities, combating climate change needs to involve realistic pricing of fossil fuel extraction and use through carbon taxation and firm regulation, reduction in fossil fuel use through a plethora of measures to increase energy supply from renewable sources, and a shift in consciousness towards regarding the earth as an ecosystem, not as a supply depot. All of this is missing from the current conservative platform; the party sees it all as just raising taxes, increasing prices, losing jobs and impinging on individual freedoms. Indeed, a prevalent conservative approach to dealing with climate change is to deny that it is happening at all.
Climate change presents a very difficult problem for Canadian conservatives. The root cause of the problem is burgeoning greenhouse gas emissions from fossil fuel use around the globe. Once upon a time the major contributors of greenhouse gases to the global climate were North America and western Europe. Now they’re increasingly being put out by the Asian industrial powers, and Canada contributes to their impacts by selling them oil and coal. And while the causes of climate change are global, the impacts – storms, droughts, rising sea levels, disappearing glaciers, changing weather patterns – will be felt globally as well. The items in the classic conservative toolkit – personal responsibility, limited government, free markets, individual liberty, traditional values and empowerment of the individual to solve problems – have not thus far dealt well with the root causes of climate change. Maybe they’ll deal better with the consequences.
by Stan Hirst
I have meticulously calculated, on the back of an envelope, that there are at least a gazillion trees in British Columbia. And, using another envelope, I figured that if one excludes trees growing in national parks, old-growth forests, recreation areas, parks and other areas where you don’t want guys in plaid shirts and yellow hardhats hanging out with chain saws, then there would still be a humongous number of trees available for harvesting (on a sustainable basis, of course).
I dug out my chemistry notes from a half century ago and found a neat diagram which showed that if you started with a box marked “cellulose” on one side of the page, and then drew lots of squiggly lines and arrows in all directions, you could wind up with another box named “ethanol” on the other side of the page. Now, for those of you who didn’t take chemistry fifty years ago, I can tell you that trees contain lots of cellulose, in fact cellulose makes up about half the mass of the average tree. And, for those of you who maybe don’t get out much, I can divulge that ethanol makes passable motor fuel.
So therefore, using my last envelope, I deduced that if British Columbia set to work making ethanol from wood cellulose, we would end up with enough home-grown fuel to run all our trucks, SUVs, cars and scooters, and still have enough left over to send over to Alberta.
Now, you might ask, why on earth would we want to do that? Well, for a number of reasons, some sound and some less so. One major reason is to replace or reduce the amount of fossil fuels used by motor vehicles and thereby cut back on carbon dioxide emissions, which in turn will lead to a reduction in the overall carbon loading in the atmosphere.
Carbon dioxide (CO2) is one of the exhaust products produced when gasoline is burned as a fuel to power a motor. Burning just 1 litre of regular gasoline produces 2.4 kilograms of CO2 (most of the weight of CO2 comes from the oxygen it contains, not the carbon). CO2 is also produced in large quantities from the combustion of coal, oil and natural gas by power stations, industry and homes. We’ve been spewing the stuff into the atmosphere since the industrial revolution, with the result that the global atmosphere now contains in excess of 390 parts per million CO2, an increase of 40% since the 1800s. CO2 is a potent greenhouse gas: it absorbs a significant part of the long-wave radiation beamed upward from the earth’s surface, then radiates part of that energy back downwards to add to the surface warming effect. More CO2 in the atmosphere thus means more return of heat. As a result, the mean global temperature has risen by 0.8°C over the past century, giving rise to a multitude of climatic changes, many of which may well be irreversible.
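For anyone who wants to check that 2.4 kg figure on something more reliable than an envelope, here is the arithmetic as a short Python script. The density and carbon-content values are my own round assumptions, not measured numbers; the 44/12 factor is simply the ratio of the molecular weight of CO2 to that of carbon.

```python
# Back-of-envelope check on the "2.4 kg CO2 per litre of gasoline" figure.
# Assumed round numbers (mine, not from any official source):
GASOLINE_DENSITY_KG_PER_L = 0.74   # typical gasoline density, kg per litre
CARBON_MASS_FRACTION = 0.87        # roughly 87% of gasoline's mass is carbon
CO2_PER_C = 44.0 / 12.0            # molar masses: CO2 = 44 g/mol, C = 12 g/mol

carbon_kg = GASOLINE_DENSITY_KG_PER_L * CARBON_MASS_FRACTION
co2_kg = carbon_kg * CO2_PER_C     # each kg of carbon becomes 44/12 kg of CO2
print(f"CO2 per litre of gasoline: {co2_kg:.2f} kg")
```

The result lands within a hair of the 2.4 kg quoted above, which is reassuring for the rest of the envelope.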
There are many ways to reduce CO2 outputs from vehicles. The best way, and you don’t need an envelope to figure this one out, is simply not to use the vehicle at all. That’s not a favoured choice for most of us in the 21st century. The next best way is to use vehicles less or use fewer vehicles by switching to mass transit or pedal power. Other ways are to use really, really fuel efficient vehicles or even vehicles which don’t need petroleum-based fuels, e.g. electric cars. And yet another approach is to switch from petroleum-based fuels to biofuels such as ethanol made from starch, dextrose, cellulose and other green plant-derived feedstocks.
This confuses a lot of folks. Ethanol, like any carbon-based flammable substance, also produces CO2 when burned. A litre of ethanol, when burned, produces about 1.5 kg of CO2, roughly a third less than the output from the combustion of the same volume of gasoline. But ethanol’s energy content is also about 40% lower than that of regular fossil-fuel gasoline, so one needs to burn more of it to get the same power from a motor, which cancels out most of that per-litre advantage. By one estimate, using ethanol as a motor fuel (blended in with regular gasoline) produces 54% more CO2 per kilometre driven than conventional gasoline alone.
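The per-kilometre comparison boils down to CO2 per unit of energy delivered. Here is a Python sketch, again using my own assumed round numbers for densities and energy contents; straight stoichiometry (ethanol, C2H5OH, is about 52% carbon by mass) gives a somewhat lower per-litre figure for ethanol than the rounded values often quoted.

```python
# Tailpipe CO2 per unit of energy for gasoline versus ethanol.
# All numbers below are my own assumed round figures, for illustration.
GASOLINE = {"co2_kg_per_l": 2.36, "energy_mj_per_l": 34.2}

# Ethanol C2H5OH: 24/46 of its mass is carbon; density ~0.79 kg/L.
ethanol_co2 = 0.79 * (24.0 / 46.0) * (44.0 / 12.0)   # kg CO2 per litre
ETHANOL = {"co2_kg_per_l": ethanol_co2, "energy_mj_per_l": 21.1}

for name, fuel in [("gasoline", GASOLINE), ("ethanol", ETHANOL)]:
    per_mj = fuel["co2_kg_per_l"] / fuel["energy_mj_per_l"]
    print(f"{name}: {per_mj * 1000:.0f} g CO2 per MJ")
```

At the tailpipe, then, the two fuels deliver roughly the same CO2 per kilometre. The real difference lies in where the carbon came from in the first place, which is the point taken up next.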
So then, why even think of using ethanol as motor fuel at all? One good reason (which is often overlooked in the ethanol versus gasoline debate) is that the carbon released as CO2 when ethanol fuel is burned is recent carbon which was in the atmosphere (as CO2) just a few growing seasons before the ethanol was distilled from corn, sugar cane or whatever plant feedstock was used. And that released CO2 will be drawn back out of the atmosphere again when the newly replanted crops of corn, sugar cane and the like are actively growing and fixing carbon through photosynthesis. Contrast this with the carbon released from fossil fuels, which is old. Actually very old – fossil fuels were formed by compaction and anaerobic decomposition of buried dead organisms millions of years ago. This means that CO2 derived from the combustion of petroleum and other fossil fuels is added to the existing carbon loading of the atmosphere, hence the steady rise in global atmospheric CO2 over the last two centuries.
Ethanol production in the U.S.A. has increased exponentially during the past decade, from 6.2 billion litres in 2000 to about 50 billion litres (13.2 billion U.S. gallons) in 2010. By comparison, Canada’s ethanol production now exceeds 2 billion litres annually, virtually all of it from prairie-based corn and cereal crops. Enerkem operates a small ethanol plant at Westbury, Quebec, using wood waste (used telephone poles) as feedstock.
The proportion of ethanol in commercially available gasoline continues to hover around 10%, which is the safe maximum proportion for engines built for regular petroleum-based fuels. Flex-fuel vehicles can use up to 85% ethanol in their fuel and are slowly increasing in availability across North America. Ethanol production from corn has worked so well because it is chemically easy to convert the starch in mashed corn into dextrose and thence to ethanol by fermentation.
Many factors drive the recent increase in ethanol production in the U.S., including corn crop subsidization schemes, powerful farming lobbies in Washington, and strategic concerns over oil imports. An increasingly difficult issue for the fuel ethanol industry is the rise in food prices, brought about by land competition between fuel and food production. The Food & Agriculture Organization of the United Nations estimates that about 40% of recent global food price increases are due to land competition between food farming and the ethanol industry in the U.S. and elsewhere. The rest of the increase has been brought about by economic factors, droughts and increasing human populations.
Which, cunningly, brings me back to the subject of trees in British Columbia, which are a vast potential source of ethanol fuel. Not necessarily the whole tree. Waste produced during normal tree harvesting (tree tops, branches, bark) makes up 25-35% of the overall tree volume, and the proportion of waste is increasing as the number of trees damaged or killed by the Pine Beetle epidemic increases. The annual wood waste production in Canada now exceeds 35 million tonnes, of which over 26 million tonnes are produced in British Columbia.
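Time for one more envelope, this one scripted. Starting from the 26 million tonnes of B.C. wood waste quoted above, and using conversion yields that are loudly-labelled assumptions of mine rather than industry figures, the ethanol potential works out like this:

```python
# One more envelope: roughly how much ethanol could B.C.'s wood waste yield?
# Every yield figure below is an assumption for illustration, not an industry number.
waste_tonnes = 26e6          # annual B.C. wood waste, from the text
cellulose_fraction = 0.5     # about half of wood mass is cellulose
glucose_per_cellulose = 1.11 # hydrolysis adds water: each 162 g unit gives 180 g glucose
ethanol_per_glucose = 0.51   # theoretical fermentation yield, kg ethanol per kg glucose
process_efficiency = 0.75    # assumed practical recovery versus theoretical
ethanol_density = 0.789      # kg per litre

ethanol_kg = (waste_tonnes * 1000 * cellulose_fraction
              * glucose_per_cellulose * ethanol_per_glucose * process_efficiency)
litres = ethanol_kg / ethanol_density
print(f"~{litres / 1e9:.1f} billion litres of ethanol per year")
```

Seven billion litres a year is envelope-grade arithmetic, not a business plan, but it shows why the feedstock is worth taking seriously.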
The main constituent of wood, lignocellulose, is composed of three organic components – cellulose, hemicellulose and lignin. The cellulose part is a polysaccharide, i.e. a very long molecular chain of dextrose molecules, which can be converted into ethanol. But the cellulose has first to be separated from the other two components. Industrial plants typically use a variety of chemical treatments involving acids, ammonia, sulphites and/or solvents, as well as physical treatments involving steam or ozone to separate cellulose from lignin in wood-derived feedstocks.
Thanks to a few million years of biological evolution, cellulose is a very durable substance. Cellulose in discarded cotton in landfills (mainly from “disposable” diapers) has been found to remain intact for more than 20 years. Present-day industrial processes use two basic approaches to break cellulose down into its dextrose components. One is chemical hydrolysis, i.e. attacking the cellulose with dilute acid under conditions of high heat and high pressure. The second is enzymatic hydrolysis. Enzymes are proteins produced by living organisms which catalyze (i.e. assist) chemical reactions. The prospect of deriving huge quantities of chemical energy in the form of ethanol fuel from cellulose has spurred technological innovation on an impressive scale within the past decade, and diverse approaches are being developed. Many technology companies are combining chemical and enzymatic treatments to make the cellulose breakdown process faster and more efficient. There are now upwards of 25 companies developing cellulosic ethanol plants in the U.S. In Canada, Iogen Corporation has built a demonstration facility near Ottawa to produce ethanol from the cellulose in wheat, barley and oat straw, and Lignol has established a Cellulosic Ethanol Development Centre in Vancouver, consisting of a pilot plant and an enzyme development laboratory.
Would the widespread and increased use of ethanol (and other similar biofuels such as biodiesel made from plant oils) as motor fuel make a significant environmental difference? The only valid way of comparing ethanol and fossil-derived fuels is in terms of their life-cycles, i.e. their full energy and carbon balances. This means calculating and comparing the energy and carbon contents of commercially-produced ethanol and oil-derived gasoline from the very start of the manufacturing process all the way through to the final stage when they are burned as fuels and release their energy and carbon. For fossil fuels this is a well-to-wheel analysis, i.e. starting at the oil well where the crude oil is tapped and ending at the gas pump which supplies the motor vehicle with fuel. For ethanol, the process starts with the harvesting and processing of the green plants which are used as feedstock and ends at the gas pump. In both cases, all the energy used by the production processes is included in the analysis, including the energy costs of the refineries, transportation, farming the crops and crop fertilization. The analyses also include the energy values of useful by-products from fuel synthesis such as co-generated electricity and feed for livestock.
The results of these types of analyses show that ethanol production and use as a fuel can indeed reduce overall carbon emissions, but it depends on the way it is produced and used. Overall vehicle CO2 emissions may be hardly impacted at all if the fuel refineries use fossil fuels such as coal or oil as a power source and ethanol added to gasoline does not exceed 10% (the present standard for gasoline in B.C.). On the other hand, total emissions can be cut by more than 50% if refineries are powered by natural gas or, better yet, a waste product such as wood chips, and if ethanol contribution to motor fuel is hiked to 85%.
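The blend arithmetic behind those two statements can be sketched in a few lines. The relative life-cycle factors below are illustrative assumptions of mine (the analyses give only the qualitative ranking), and the energy shares reflect ethanol’s lower energy density per litre.

```python
# Illustrative life-cycle arithmetic for ethanol-gasoline blends.
# The relative factors below are my own assumptions for illustration only.
ETHANOL_FACTOR = {   # life-cycle CO2 relative to gasoline (=1.0), per unit of energy
    "coal-powered refinery": 0.95,
    "natural-gas refinery": 0.60,
    "wood-waste refinery": 0.30,
}

def blend_emissions(ethanol_energy_share, refinery):
    """Life-cycle CO2 of a blend, relative to pure gasoline, by energy share."""
    f = ETHANOL_FACTOR[refinery]
    return ethanol_energy_share * f + (1 - ethanol_energy_share) * 1.0

# E10 is only ~7% of the blend's energy; E85 is ~78% (ethanol carries less
# energy per litre, so energy shares sit below the volume percentages).
print(f"E10, coal-powered refinery: {blend_emissions(0.07, 'coal-powered refinery'):.3f}")
print(f"E85, wood-waste refinery:  {blend_emissions(0.78, 'wood-waste refinery'):.3f}")
```

The first case barely dents the total, while the second cuts it by more than half, which is the pattern the life-cycle studies report.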
What life-cycle analysis does not do, at present, is include the costs of dealing with responses and adaptations to climate change, or decide how much of those costs are chargeable to the accounts of fossil fuel versus ethanol use.
The playing field in the market is not yet level for biofuels. Research and development funding for ethanol development in Canada from 2006 through 2008 was $300 million annually. By comparison, the Alberta Oil Sands currently receive tax subsidies in excess of $1 billion annually. The refinery cost of ethanol is close to 55¢ per litre, but if the price is adjusted for energy content, then ethanol energy costs roughly the same as fossil fuel gasoline energy to produce, i.e. about 90¢ per litre. But there is lots of wood feedstock available, and ethanol fermentation technology is moving ahead by leaps and bounds, so future ethanol production costs are likely to be trimmed. Fossil fuel production, on the other hand, may face significant future cost increases as supplies of easy oil become ever more difficult to retrieve and refine.
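The energy adjustment in that cost comparison is just the 40% energy penalty applied to the refinery price:

```python
# Energy-adjusted cost of ethanol, as described in the paragraph above.
ethanol_cost_per_l = 0.55   # refinery cost, dollars per litre (from the text)
energy_ratio = 0.60         # ethanol holds roughly 60% of gasoline's energy per litre

cost_per_gasoline_equiv_litre = ethanol_cost_per_l / energy_ratio
print(f"${cost_per_gasoline_equiv_litre:.2f} per gasoline-equivalent litre")
```

Which is how 55¢ of ethanol becomes roughly 90¢ of gasoline-equivalent energy.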
What is the future of ethanol production from wood and its acceptance into the marketplace? I think I need another envelope.
First point – the global climate is changing. Not many people dispute that any more. The mean global temperature has risen by 0.8°C over the past century, and the ten warmest years on record have all occurred since 1998. Within the past century many significant climate changes have been measured and reported, including increases in the frequency of heat waves in the U.S., an increasing proportion of precipitation coming in the form of intense, flood-inducing events, an increase in tropical cyclone intensity in the Atlantic Ocean, Caribbean, and Gulf of Mexico, a huge decrease in the seasonal extent of Arctic sea ice, and a big jump in the rate at which glaciers are melting.
The rates of change seem to be accelerating and most of the profound secondary changes are negative. Dr James Hansen, the NASA scientist who first drew international attention to the impending climate disaster, testified way back in 1988 that Earth had entered a long-term warming trend. Today the effects of global warming on the extremes of the global water cycle – stronger droughts and forest fires on the one hand, and heavier rains and floods on the other – have become more evident in Australia, Europe, North America, Africa and Asia.
Second point – the causal factors of climate change are now very well known. Earth is surrounded by a relatively thin layer of greenhouse gases – water vapour, carbon dioxide (CO2), methane and nitrous oxide – which act as a thermal blanket. About half the incoming solar radiation passes through the atmosphere to the Earth’s surface, where some is absorbed and the remainder reflected. Substantial amounts of the absorbed energy are re-radiated outward as infrared heat, much of which is captured by the greenhouse gases and partly re-radiated back downward, warming the lower atmosphere and the surface.
Third point – humanity has drastically changed global climatic dynamics by adding huge amounts of CO2, methane, nitrous oxide and chlorofluorocarbons to the atmosphere. Activities such as deforestation, land use changes and the burning of fossil fuels have increased atmospheric CO2 by a third since the Industrial Revolution began. Decomposition of wastes in landfills, burgeoning agriculture, especially rice cultivation, and huge populations of burping and manure-producing domestic livestock have boosted the amounts of methane in the atmosphere by a factor of three since the industrial revolution. Methane is twenty times more active than CO2 in atmospheric heat retention.
The atmospheric concentration of CO2 measured at the Mauna Loa Observatory in Hawaii is a good indicator of where we are now globally in respect of atmospheric change. Back in 1959 when the data collection programme was initiated by the National Oceanic and Atmospheric Administration (NOAA) the CO2 level was measured at 316 parts per million (ppm) and the annual increase was less than 1 ppm. Today the level is over 392 ppm and the annual increases are 2.2 ppm and getting larger all the time.
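Even a bone-headedly simple linear extrapolation of those Mauna Loa numbers is sobering, and since the growth rate is itself increasing, treat this as a lower bound:

```python
# Where the Mauna Loa figures point if the current trend merely continues.
# Linear extrapolation from the numbers in the text; the real growth rate is
# accelerating, so these projections are, if anything, on the low side.
level_now = 392.0   # ppm, from the text (circa 2011)
growth = 2.2        # ppm per year, from the text

for year in (2050, 2100):
    projected = level_now + growth * (year - 2011)
    print(f"{year}: ~{projected:.0f} ppm")
```

Even without any acceleration, the straight line carries us far beyond any level the 350 ppm advocates would consider safe.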
James Hansen and his climate scientist colleagues concluded that we have either reached, or are very close to, a set of climate “tipping points”. That means that climatic changes are now at a point where the feedbacks from changes spur even larger and more rapid further changes. Hansen cites Arctic sea ice as a good example of this. Global warming has initiated faster sea ice melt and has exposed darker ocean surfaces that absorb more sunlight which leads to more melting of ice. As a result, and without any additional greenhouse gases, the Arctic could soon be ice-free in the summer. The western Antarctic and Greenland ice sheets are vulnerable to even small additional warming – once disintegration gets well under way it will become unstoppable.
Pause for reality check – not only is climatic change a reality, it is progressing at an accelerating rate, the negative consequences are getting greater, and the likelihood of us managing to slow or reverse the negative trends is getting smaller.
Fourth point – James Hansen and his fellow climate scientists looked at the atmospheric CO2 levels, then at the changes in climate which were occurring, and came up with the recommendation that a CO2 level of 350 ppm (last recorded back in 1987) was pretty much the upper allowable limit if massive climate-related adverse effects were to be avoided. The number 350 has a certain appealing ring to it, and has been widely adopted by environmental organizations such as Bill McKibben’s 350.org as a universal target for citizen and government action on carbon emissions. The protagonists are quite aware that the present global atmospheric CO2 level has already overshot that target by more than 40 ppm, but they argue, convincingly, that a reversal is absolutely essential to safeguard our long-term global future.
Fifth point – and now we’re at the crux of the problem. How on Earth, or anywhere else for that matter, do we get anywhere close to reducing the rate at which atmospheric CO2 increases in future, never mind actually reversing the trend towards 350 ppm?
We think of Earth’s carbon reservoirs as being great fields of coal and petroleum compounds, which are more or less stable until we dig them up and burn them. But the globe’s biggest carbon reservoirs are in the atmosphere, the ocean, living ecosystems and soils, and are highly dynamic. They all exchange CO2 with the atmosphere, they both absorb it (oceans) and assimilate it (ecosystems), and they release it (oceans) or respire it (ecosystems). The critical point is that anthropogenic carbon emitted into the atmosphere is not destroyed but adds to the stockpile and is redistributed among the other carbon reservoirs. The turnover times range from years or decades (living plants) to millennia (the deep sea, soil). The bottom line is that any carbon released into the atmosphere is going to be around for a long, long time. Up to 1000 years in fact.
Sixth point – so how do we get from our present scene of 390 ppm CO2 in the atmosphere and impending climate doom to something closer to 350 ppm and a more stable climate scenario? Straight answer – we cannot. We simply don’t have that option.
Seventh point – the absolutely best case scenario for reduction of CO2 emissions to the atmosphere would be an immediate halt to all activities leading to anthropogenic carbon emissions. Park all motor vehicles, no more home heating, no coal-fired power plants, no burning of natural gas, no aircraft flying overhead, shoot and bury 90% of all domestic livestock. Just shut down all of human civilization. No more anthropogenic carbon emissions. Would this sacrifice bring the CO2 level down in a hurry?
Dr Susan Solomon and her colleagues at NOAA, with the help of their sophisticated computer models, have addressed that very question. They ran a coupled climate–carbon cycle model with components representing the dynamic ocean, the atmospheric energy–moisture interaction, and interactive sub-models of the marine and terrestrial carbon cycles. The model reveals, sadly for us, that climate change is largely irreversible for 1000 years after all carbon emissions cease. The drop in radiative forcing of atmospheric CO2 (i.e. the extent to which CO2 causes atmospheric warming) is largely compensated by slower loss of heat to the oceans, so atmospheric temperatures do not drop significantly for at least 1,000 years. The natural interactive processes between the atmosphere, ocean and ecosystems would carry on: atmospheric CO2 concentration would eventually drop back to 350 ppm by about 2060 and then flatten out near 300 ppm for the rest of the 1000 years.
Eighth point – I haven’t noticed any great urge on our part to go and huddle in caves and gnaw on pine nuts and raw fish (no wood-burning allowed) to make this scenario work, so what is more likely?
Global carbon emissions from fossil fuel use were 6.2 billion tonnes back in 1990 when global CO2 was near 355 ppm. The 2010 estimate is 8.5 billion tonnes. That’s a 38% increase over the levels used to formulate the Kyoto Agreement. The annual growth rate of emissions derived from fossil fuels is now about 3.5%, an almost four-fold increase from the 0.9% per year for the 1990-1999 period. Carbon emissions from land-use change (i.e. mainly deforestation) in 2007 (in just that one year) were estimated at 1.5 billion tonnes of carbon. The biggest increase in emissions has taken place in developing countries, largely China and India, while emissions from developed countries have been growing more slowly. The largest regional shift has been that China passed the U.S. in 2006 to become the largest CO2 emitter, and India will soon overtake Russia to become the third largest. Currently, more than half of global emissions come from less developed countries, yet developing countries with 80% of the world’s population still account for only 20% of the cumulative emissions since 1751. There is nowhere for these rates to go, other than up.
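A 3.5% annual growth rate sounds modest until you compound it. A quick calculation of the doubling time, and of where 8.5 billion tonnes a year ends up by mid-century if the rate holds:

```python
import math

# Compounding the ~3.5% annual growth in fossil-fuel carbon emissions.
rate = 0.035              # annual growth rate, from the text
current = 8.5             # billion tonnes of carbon per year (2010 estimate)

# Years for emissions to double at this rate: solve (1 + rate)^t = 2.
doubling_years = math.log(2) / math.log(1 + rate)
emissions_2050 = current * (1 + rate) ** (2050 - 2010)

print(f"doubling time: ~{doubling_years:.0f} years")
print(f"2050 emissions at this rate: ~{emissions_2050:.0f} billion tonnes/yr")
```

A doubling every two decades, and roughly a four-fold increase by 2050, which is why the scenario modellers keep landing at the top of their charts.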
When the Intergovernmental Panel on Climate Change produced their Fourth Assessment Report in 2007, they diplomatically tried to hedge their bets. They churned out 40 different emissions scenarios encompassing the full range of uncertainties related to future carbon emissions, demographic, social and economic inputs, and possible future technological developments. The model predictions were correspondingly wide, ranging from “best” to “worst” in terms of atmospheric CO2 levels and changes in the associated climatic driving forces. It has now become apparent that the actual emissions growth rate for 2000-2007 exceeded even the highest growth rates forecast in those scenarios for 2000-2010.
Ninth point – so the most likely future outcomes (by the end of the century) are those at the top end of the scale output by the computer models (diagram above). That is to say, our grandchildren will be looking at CO2 levels above 900 ppm, mean global temperature rises of 5 or 6 degrees C over what they are today, and an average sea level rise above 0.5 metres. Plus all the storms, cyclones, droughts, floods, vanishing shorelines, water wars and famines that might creep in along the way.
The end – CO2 concentrations in the atmosphere and future temperatures are just numbers, and pretty much the only things that computer models can output. We will have to estimate the extent of global human misery by ourselves.
The other elders may drive me from the village with brooms and pitchforks when they read my confession. But the truth must out. I am, alas, a sceptic.
I am sceptical, as well as skeptical, that my beloved Earth is going to self-destruct on 31 December 2012. I think it’s more likely the Mayans ran out of wild fig bark on which they were drawing their calendars. I am sceptical that I am by nature diplomatic, charming and easygoing because Jupiter was hanging out with Venus in the Fourth House of the night sky right about the time I came into the world seventy-odd years ago. I am sceptical that the people responsible for the multi-billion dollar homeopathic remedy business have never learned to spell the words p-l-a-c-e-b-o and g-u-l-l-i-b-i-l-i-t-y. And all this scepticism flies in the teeth of the billions of people worldwide who buy into this stuff.
We sceptics are in good company. Albert Einstein was one. In 1933 he famously stated that black holes do not and cannot exist. He couldn’t see one and couldn’t find the rationale for them in his famous equations. Today his successors have no such problems and not only think they have identified nearly 30 black hole candidates in the Milky Way galaxy but are now getting the proof that the holes behave in the relativistic way that Einstein’s theories predict.
But I’m concerned that we genuine sceptics are being given a bad name by all these so-called climate change and global warming sceptics out there.
We need to address a few issues to sort out these guys in the black hats. Firstly, what exactly is a sceptic? What is climate? And what is climate change and what does it entail?
The Oxford English Dictionary defines a sceptic as one who maintains a doubting attitude with reference to some particular question or statement. Michael Shermer, the entertaining editor of Skeptic magazine, enlarges the concept thus: “Modern skepticism is embodied in the scientific method that involves gathering data to formulate and test naturalistic explanations for natural phenomena. All facts in science are provisional and subject to challenge, and therefore skepticism is a method leading to provisional conclusions. The key to skepticism is to continuously and vigorously apply the methods of science to navigate the treacherous straits between ‘know nothing’ skepticism and ‘anything goes’ credulity.”
And what is ‘climate’ and how does it differ from ‘weather’?
Weather is the state of the atmosphere at any given moment to the extent that it is hot or cold, wet or dry, calm or stormy, clear or cloudy. The way the concept is used in daily life refers to day-to-day temperature and precipitation activity. By contrast climate is the term for the average atmospheric conditions over longer periods of time. The difference between the two creates major confusion for many. “How the heck can it be global warming when we’re having record snowfalls in eastern Canada?”
Which leads us to the obvious next question – what is the evidence for climate change?
Lots of prestigious institutions keep honest meteorological data and report their findings. At the national level, Environment Canada reports that the national average temperature for 2010 was 3.0°C above normal, which makes it the warmest year on record since nationwide records began in 1948. The previous warmest year was 1998, 2.5°C above normal. Four Canadian climate regions (Arctic Tundra, Arctic Mountains and Fiords, North-eastern Forest and Atlantic Canada) experienced their warmest year on record in 2010, and for six other climate regions the year was amongst 10 warmest recorded. Southern Alberta and Saskatchewan were the only parts of the country with close to normal temperatures. Environment Canada’s national temperature departures table shows that of the ten warmest years, four have occurred within the last decade, and 13 of the last 20 years are listed among the 20 warmest.
At the international level, the Climatic Research Unit of the University of East Anglia has global land and marine surface temperature data dating back to 1850. The Unit reports that the years 2003, 2005 and 2010 have been the warmest on record. The mean global temperature has risen by 0.8°C over the past century. The World Meteorological Organization reports that the ten warmest years on record have all occurred since 1998.
The U.S. Environmental Protection Agency has carefully summarized all the salient indicators of climate change occurring within the past century. These include:
- heat waves – the frequency of heat waves in the U.S. has risen steadily since 1970, and the area within the U.S. experiencing heat waves has increased;
- average precipitation has increased since 1901 at an average rate of more than 6 percent per century in the U.S. and nearly 2 percent per century worldwide;
- heavy precipitation – in recent years, a higher percentage of precipitation in the U.S. has come in the form of intense single-day events; eight of the top 10 years for extreme one-day precipitation events have occurred since 1990;
- tropical cyclone intensity in the Atlantic Ocean, Caribbean, and Gulf of Mexico has risen noticeably over the past 20 years; six of the 10 most active hurricane seasons have occurred since the mid-1990s; this increase is closely related to variations in sea surface temperature in the tropical Atlantic;
- Arctic sea ice – September 2007 had the lowest ice coverage of any year on record, followed by 2008 and 2009; the extent of Arctic sea ice in 2009 was 24 percent below the 1979 to 2000 historical average;
- glaciers around the world have generally shrunk since the 1960s, and the rate at which glaciers are melting has accelerated over the last decade; overall, glaciers worldwide have lost more than 8000 km3 of water since 1960;
- lakes in the northern U.S. are freezing later and thawing earlier than they did in the 1800s and early 1900s; the length of time that lakes stay frozen has decreased at an average rate of one to two days per decade;
- snow cover over North America has generally decreased since 1972 (although there has been much year-to-year variability); snow covered an average of 8 million km2 of North America during the years 2000 to 2008, compared with 8.8 million km2 during the 1970s.
So we honest sceptics have no issue with the evidence for global warming. It’s incontrovertible. Not even Sarah Palin could refudiate it.
What about the evidence for anthropogenic inputs to global climate change? In other words, to what extent are human activities, specifically the emission of carbon dioxide, methane and other greenhouse gases, responsible for the global warming observed to date?
Total global greenhouse gas emissions (expressed as carbon dioxide equivalents) are nearing 30 billion metric tonnes per year. As a result, mean global atmospheric carbon dioxide concentration has gone from about 280 parts per million in pre-industrial times to more than 380 parts per million today. Earlier CO2 data were collected from ice cores in eastern Antarctica and have been the subject of dispute by so-called climate sceptics, but the modern-day data come from state-of-the-art instrumentation on Mauna Loa in Hawaii and are incontestable. From 1990 to 2008 the radiative forcing of all the greenhouse gases in the Earth’s atmosphere increased by about 26 percent, with the rise in carbon dioxide concentrations accounting for approximately 80 percent of this increase.
It turns out that atmospheric CO2 is not homogeneous. Some of its carbon is carbon-12, the rest carbon-13 (one more neutron per atom than carbon-12). Green plants prefer carbon-12 in their photosynthetic reactions. When fossil fuels, which are derived from ancient plants, are burned, their carbon-12-enriched carbon is released into the atmosphere. Over time the continuous carbon-12 emissions change the atmospheric ratio of carbon-13 to carbon-12, and this ratio can be measured in corals and sea sponges. So not only have background levels of CO2 increased over the past century, they are directly linked to fossil fuel burning. And we honest sceptics are still cool with the concept.
Next question – is the extra anthropogenically-derived CO2 responsible for the observed warming trend? The so-called ‘greenhouse’ effect of CO2 is well-known, and can easily be measured in a laboratory. But it has also been measured globally over the past 30 years by satellite-mounted infrared sensors and found to be significant. Moreover, the amounts of global atmospheric downward long wave radiation over land surfaces measured from 1973 to 2008 have been examined and found to be significant in contributing to the global greenhouse effect.
The U.S. Environmental Protection Agency’s summary includes some biological indicators of long-term climate change in the U.S.:
- the average length of the growing season in the lower 48 states has increased by about two weeks since the beginning of the 20th century; a particularly large and steady increase having occurred over the last 30 years; the observed changes reflect earlier spring warming as well as later arrival of fall frosts, and the length of the growing season has increased more rapidly in the west than in the east.
- plant hardiness zones have shifted northward since 1990, reflecting higher winter temperatures in most parts of the country; large portions of several states have warmed by at least one hardiness zone;
- leaf and bloom dates of lilacs and honeysuckles in the lower 48 states are now a few days earlier than they were in the early 1900s;
- bird wintering ranges have shifted northward by an average of 56 km since 1966, with a few species shifting by several hundred kilometres; many bird species have moved their wintering grounds farther from the coast, consistent with rising inland temperatures.
So there you have it. Take all the scientific evidence available and it would be difficult indeed not to concur with the 97 out of 100 climate experts who think that humans are indeed causing global warming.
So, if the evidence satisfies the honest sceptics amongst us, i.e. those who take the time to seek out and evaluate the evidence and try their level best to come to honest and defensible conclusions, why then is there a substantial body of opinion which holds countervailing views, i.e. that there is no warming or climate change (it’s all just natural variation), or that there is change but we ain’t responsible (it’s Mother Nature’s fault)?
That will be the subject of future postings from the Elders. It opens up the opportunity for some innovative taxonomy of climate change personalities, but I’ll leave the naming to others!
by Stan Hirst
“Stop the tar sands!” says my fellow Elder. Then he thumps the table. “We should make that the key objective of the Suzuki Elders.”
It’s easy to see why thinking people get upset over the tar sands, or Alberta Oil Sands as they are more safely referred to east of the Rockies. More than 600 km² of boreal forest (roughly the same area as greater Vancouver) have already been cleared, mined or otherwise very significantly disturbed. One-fifth of Alberta’s land area is currently under lease for further such bitumen mining and extraction.
The gunk-like bitumen makes up only 10% of the excavated tar sand, so something like two tons of sand must be processed to get one barrel of oil. About 40% of the bitumen resource is beyond the reach of conventional excavation, so pressurized steam injection is needed to force the stuff to the surface. The necessary steam generation chews up 34 m³ of natural gas to produce one barrel of bitumen from in situ projects and about 20 m³ in the case of integrated projects, so daily use of purchased gas in the oil sands amounts to something like 21 million m³. Average emissions for oil sands extraction and upgrading (per barrel) are 3.2 to 4.5 times those for conventional crude oil.
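The arithmetic here is easy enough to check for yourself. A minimal back-of-envelope sketch in Python: the per-barrel figures are the ones quoted above, but the daily production split between in situ and integrated projects is my hypothetical illustration, not an industry figure.

```python
# Back-of-envelope check of the oil sands figures quoted above.
# Per-barrel gas use comes from the text; the daily production split
# is a HYPOTHETICAL illustration, not a published industry number.

BARREL_LITRES = 159            # one barrel is about 159 litres
BITUMEN_DENSITY = 1.0          # kg/L, roughly (bitumen is near water density)
BITUMEN_FRACTION = 0.10        # bitumen is ~10% of the excavated tar sand

# Sand that must be excavated per barrel of bitumen:
bitumen_tonnes = BARREL_LITRES * BITUMEN_DENSITY / 1000   # ~0.16 t per barrel
sand_tonnes = bitumen_tonnes / BITUMEN_FRACTION           # ~1.6 t per barrel
print(f"sand per barrel: ~{sand_tonnes:.1f} t")

# Gas use per barrel (m³, from the text):
GAS_IN_SITU = 34
GAS_INTEGRATED = 20

# Hypothetical daily production split (barrels/day):
in_situ_bbl = 400_000
integrated_bbl = 380_000
daily_gas = GAS_IN_SITU * in_situ_bbl + GAS_INTEGRATED * integrated_bbl
print(f"daily gas: ~{daily_gas / 1e6:.0f} million m³")
```

The 1.6 tonnes of sand assumes perfect recovery; allowing for losses brings it up to the “two tons” quoted, and the hypothetical production split lands the gas figure in the neighbourhood of the 21 million m³ per day cited above.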
Water requirements for steam and other uses range from 2 to 4.5 m³ of water per cubic metre of synthetic crude oil extracted in a mining operation. The companies in the oil sands are licensed to withdraw 650 million m³ of water from the Athabasca River annually (equal to seven times the annual water needs of a city the size of Edmonton). Oil sands operations currently recycle and reuse about ¾ of the water extracted from the Athabasca River.
An average of 1.5 barrels of a processed mix of water, sand, silt, clay, contaminants and unrecovered hydrocarbons is generated for every barrel of bitumen produced, and 200 million litres of this sludge is dumped into tailings ponds every day. The area of the ponds is currently in excess of 130 km² (about the total area of Richmond, B.C.) with a projected increase to 310 km² by 2040. Tailings pond water is acutely toxic to aquatic organisms and mammals and contains substances such as naphthenic acids, polycyclic aromatic hydrocarbons, phenolic compounds, ammonia, mercury and other trace metals, some of which are toxic to humans while others are known carcinogens.
Unhappy situation? The future is scarier. China badly needs oil to keep its huge economy powering forward and has now invested $4.7 billion in Syncrude. China will presumably want its share of the crude oil at the end of a pipeline in Kitimat B.C.
Are the oil sands sustainable in the long term? Yes, says the Canadian Association of Petroleum Producers. The oil sands are the 2nd largest crude oil reserve in the world, and supply the U.S. with 20% of its crude oil. This proportion can only increase as Saudi Arabia and the Middle East run out of easy oil and Venezuela and the rest of the producers get ever shirtier with our southern neighbours. World surplus oil production capacity will disappear in the next five years, and the global shortfall by 2015 could reach 10 million barrels per day. Biomass is not a substitute for oil in most sectors because of low photosynthetic efficiency (Brazil, the world leader in biomass energy production, burns up just 1/3 barrel of ethanol but 4½ barrels of oil per person annually). Solar and other energy sources are a long way from replacing oil as the main driver of the transportation sector. Current mining and extraction operations affect less than ½ percent of the total oil sand area, so there is a lot more gunk out there to be dug up and processed.
Sustainable? No, say the greens. The current annual CO2 emissions from all this mining and refining are something like 40 million tonnes, currently 5% of the Canadian total. This level of CO2 output will obviously increase if plans for oil sands expansion are implemented. To keep emissions down to levels consistent with Canada’s greenhouse gas reduction targets, very expensive and untested proposals for carbon capture and storage will need to be implemented. The tailings ponds stand accused of being leaky, to the tune of 11 million litres of contaminated water per day. Toxic and carcinogenic compounds from the tailings and emissions are contaminating surrounding waters and lands and are suspected of causing cancer in local communities. Wildlife, especially waterfowl, are heavily impacted by oil sand operations and tailings disposal. By 2020 the projected water use in the oil sands will be an estimated 45 m³/s, nearly half the Athabasca River’s low winter flow, a level recorded in eight of the years since 1980 and in every year since 1999. The Peace-Athabasca Delta downstream of the oil sands is already exhibiting negative effects of declining water supply from climate change and the impacts of the upstream Bennett Dam in B.C.
Naturally, none of this is acceptable. The evidence mounts daily that current oil sand operations are simply pushing the envelope too far, too close to the allowable and acceptable limits of the natural and human ecosystem – too many emissions, too much danger to downstream human communities, too many ecological impacts, too great a drain on dwindling water resources. So what we should be doing is insisting, absolutely, that whatever they produce be churned out with full consideration of the ecosystem and the local communities, and with full reckoning of the costs thereof.
Let’s be honest – the oil people up in Fort McMurray are not munching tons of sand just to annoy a few of us down here. They are simply supplying a commodity which is in huge demand by society. They didn’t create the demand, they’re just good at providing what society wants, i.e. cheap fossil fuel to burn in gridlocked Escalades standing on the Santa Monica freeway.
Stop the tar sands? Not likely. The current value of the plant alone exceeds $90 billion. Billions of dollars accrue to tax revenues from the oil sands every year, and 60% goes into federal coffers. The Alberta coffers currently take in almost $2 billion annually from royalties. This money is the source of many federal and provincial programs and services in infrastructure, health and education. One in every 15 Albertans works for the energy industry. Take a stroll around Calgary and check out the fine recreational facilities and art museums funded by Big Oil. You had better go tell those folks you want to shut them down, fellow Elder, because I sure ain’t.
The long-term solution to this quandary is not to storm the bastions or the tailings ponds or whatever. It’s much duller, more difficult and highly contentious (sort of Elder-ish). In other words, it’s a matter of economics and politics. Fossil fuels are simply too cheap to discourage the present profligate consumption and the associated high rate of oil production from sources such as the oil sands. Moreover, production from the oil sands is heavily subsidized by the taxpayer. The full costs of the negative consequences of production, especially the ones difficult to quantify (e.g. higher cancer rates in local people), are never costed into the production equation. So two approaches immediately present themselves: make oil energy costs totally realistic through elimination of subsidies, and level the playing field through the imposition of carbon taxes. B.C. has already demonstrated that the latter is not necessarily politically unacceptable.
The objective, fellow Elders, is not to stop the tar sands, it’s to make them redundant.
Now, the next job is to convince the other ten million Elders of this…
I discovered a set of data that says Canadians are a happy lot. A 2006 world-wide poll by the prestigious Gallup organization puts Canada joint 3rd in the world in terms of life satisfaction, just 0.5 out of 10 index points behind the leading and supremely happy Ticos of Costa Rica.
Some years back a Dutch sociologist, Ruut Veenhoven, noted the analogy between ‘healthy years’ (a statistic much used by public health specialists) and ‘happy years’ and figured he could use life satisfaction statistics to give an index of a country’s happiness. He multiplied the satisfaction index (expressed as a fraction of its 10-point scale) by the average life expectancy for a particular country and came up with a number which he joyfully termed ‘happy life years’ or HLY. Doing this calculation for Canada we find ourselves brimming over with 64 HLY for each (average) Canadian, just a tad ahead of the Europeans (62.2 HLY) and Americans (61.2 HLY) and well ahead of the average South American (50.3 HLY).
But then along comes the UK’s New Economics Foundation to totally rain on our parade. They divide the HLY for each country by the ecological footprint for that country (measured in ‘global hectares’) and end up with something they call the Happy Planet Index or HPI. The HPI is a cleverly designed metric which relates human well-being and happiness to the planetary cost of that well-being in terms of resource extraction and imposition upon nature. Canada’s HPI is just 39.4, which puts us at a miserable 89th in the world (out of 143). Costa Rica again heads the list with an HPI of 76.1. An astonishing 10 of the top 11 countries on the HPI list are South and Central American, with economies only about one-eighth the size of Canada’s (measured on a per capita basis).
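Both indices are simple enough to reconstruct, with two caveats. The satisfaction score has to be taken as a fraction of its 10-point scale before multiplying by life expectancy (otherwise Canada’s 64 HLY doesn’t come out), and the NEF’s published HPI applies further scaling and adjustments, so the raw ratio in this sketch reproduces the country rankings but not the published index values. The satisfaction scores, life expectancies and footprints below are illustrative round numbers.

```python
# Reconstruction of 'happy life years' (HLY) and the raw happiness-per-
# hectare ratio behind the Happy Planet Index. Input numbers are
# illustrative; NEF's published HPI applies additional scaling, so only
# the ORDERING of countries here matches the published index.

def happy_life_years(satisfaction_0_to_10, life_expectancy):
    """Veenhoven's HLY: satisfaction (as a fraction of 10) x life expectancy."""
    return (satisfaction_0_to_10 / 10.0) * life_expectancy

countries = {
    #              (satisfaction, life expectancy, footprint in gha/person)
    "Costa Rica": (8.5, 78.5, 2.3),
    "Canada":     (8.0, 80.0, 7.1),
    "U.S.A.":     (7.9, 77.5, 9.4),
}

for name, (sat, le, fp) in countries.items():
    hly = happy_life_years(sat, le)
    ratio = hly / fp   # happy life years bought per global hectare used
    print(f"{name:10s}  HLY = {hly:5.1f}   HLY per gha = {ratio:5.1f}")
```

Canada’s 8.0 satisfaction and 80-year life expectancy give exactly 64 HLY, and dividing by footprint puts Costa Rica comfortably on top, Canada well down the list, and the U.S.A. lower still, just as the published rankings show.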
The ecological footprint concept was developed in the late 1980s by Dr. Mathis Wackernagel in collaboration with Professor Bill Rees of the University of British Columbia. Wackernagel has gone on to develop and extend its practical use as a tool for measuring and assessing global sustainability. It is a measure of humanity’s demand on the biosphere in terms of the area of biologically productive land and sea required to provide the resources we use and to absorb our waste. In 2005 the total global ecological footprint was computed to be 17.5 billion global hectares, a global hectare being a world-averaged unit area for producing resources and absorbing wastes.
Canada’s ecological footprint computes out at somewhere from 7.1 to 7.6 global hectares per capita, depending on the data and methods used to compute it. This compares to our South American neighbours, who achieve their respective happiness levels on 1 to 2.5 global hectares per capita. A more telling comparison is that if every country on the planet lived at the same level of ecological impact as our Latino friends, then the Earth could just about support us all indefinitely. If, on the other hand, they all guzzled fossil fuels and spewed out wastes the way North Americans do, we would need another three or four Earths to make ends meet.
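The “how many Earths” figure follows from dividing a country’s per-capita footprint by each person’s fair share of the planet’s biocapacity. A quick sketch, with one assumption of mine: the fair-share figure of roughly 2.1 global hectares per person (a commonly quoted mid-2000s estimate) is not a number from the text.

```python
# How many Earths would be needed if everyone lived at a given footprint?
# FAIR_SHARE_GHA is an ASSUMED global biocapacity per person (~2.1 gha
# around 2005); the per-capita footprints are the ones quoted above.

FAIR_SHARE_GHA = 2.1

def earths_needed(footprint_gha_per_capita):
    """Planets required if everyone lived at this per-capita footprint."""
    return footprint_gha_per_capita / FAIR_SHARE_GHA

for label, fp in [("South America (low)", 1.0),
                  ("South America (high)", 2.5),
                  ("Canada", 7.4)]:
    print(f"{label:22s} {earths_needed(fp):.1f} Earths")
```

The South American footprints come out at about one Earth or less (“the Earth could just about support us all”), while Canada’s 7.4 gha per person works out to roughly three and a half planets, consistent with the “three or four Earths” above.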
But wait a moment, say the cynics, especially those in Alberta. Canada’s economy is worth $1.5 trillion, compared to $30 billion for a smallish South American country like Costa Rica. As hewers of wood, drawers of water and rampant extractors of resources from a huge, freezing cold country, you can’t expect us not to use resources like oil at a higher rate than much smaller economies which have warm climates and stacks of visiting tourists riding bicycles. Can you?
Well no, statistics from countries with widely disparate climates and natural resource bases are obviously not directly comparable. But consider this. A major component of the ecological footprint is the so-called carbon footprint (also measured in global hectares per person) which represents the biocapacity needed to absorb CO2 emissions from fossil-fuel use and land disturbance, other than the portion absorbed by the oceans. The average South American needs about 0.5 global hectares to absorb the CO2 emitted as a result of his driving, heating, burning, cooking and industrial processes. The average European needs 2.5 of the same global hectares. Canada, by comparison, needs 3.6 and the U.S.A. a whopping 6.4.
Does Canada’s size and cold climate account for the big differences between us and the Europeans and the South Americans? Ecological footprint data from the Global Footprint Network show that Canada’s per capita usage of land for urban areas, grazing and crops is not hugely different to that used by Europeans and South Americans, in fact we use less cropland per capita than the Europeans. What does stand out is the large contribution made to the North American footprint by use of forests [Canada’s forests cover more than four million km2, about 40% of the land area].
Canada’s very high fossil fuel consumption shows up in World Bank data which places Canada 11th in the world, with about half the annual fuel use per capita of Qatar, the United Arab Emirates, Bahrain and other choice locales with dirt cheap oil and refrigerated swimming pools. We have the same level of per capita fossil fuel use as Saudi Arabia. Each year Canadians use up more than six metric tons of fossil fuels (expressed as the oil equivalent) for each man, woman and child in the land. In so doing, we belch out 17 metric tons of CO2 per capita, which puts us globally in 11th place again.
Do we really need to use that much oil, gas and coal to maintain a happy lifestyle? It’s not a simple question, and has as much to do with lifestyles and consumerism as it does with resource use and distribution and the country’s climate. But one thing stands out. If you use large amounts of fossil fuel energy, you are going to pay for it. And that payment comes out of your pocket or from the bank in the form of a credit loan. The Certified General Accountants Association of Canada has compiled a detailed report on Canadian household debt and finds Canadian households at the top of the world list in terms of their debt-to-financial-assets ratio, currently 10.1%. The U.S. trundles behind us at 7.2%, as does Europe (median 2.9%). And those friendly Latinos? Their debt is just 0.4% of household assets.
The evidence of human influences on climate change has become increasingly clear and compelling over the last several decades. There is now convincing evidence that human activities such as electricity production and transportation are adding to the concentrations of greenhouse gases that are already naturally present in the atmosphere. These heat-trapping gases are now at record-high levels in the atmosphere compared with the recent and distant past.
The U.S. Environmental Protection Agency has recently published Climate Change Indicators in the United States to help concerned readers interpret a set of important indicators of climate change. The report presents 24 indicators, each describing trends related in some way to the causes and effects of climate change. The indicators focus primarily on the United States, but in some cases global trends are presented to provide context or a basis for comparison. The following is a brief summary of the report’s contents.
Global Greenhouse Gas Emissions. Worldwide, emissions of greenhouse gases from human activities increased by 26 percent from 1990 to 2005. Emissions of carbon dioxide, which account for nearly three-fourths of the total, increased by 31 percent over this period. The majority of the world’s emissions are associated with energy use.
Atmospheric Concentrations of Greenhouse Gases. Concentrations of carbon dioxide and other greenhouse gases in the atmosphere have risen substantially since the beginning of the industrial era. Almost all of this increase is attributable to human activities. Historical measurements show that the current levels of many greenhouse gases are higher than any seen in thousands of years, even after accounting for natural fluctuations.
Climate Forcing. From 1990 to 2008, the radiative forcing of all the greenhouse gases in the Earth’s atmosphere increased by about 26 percent. The rise in carbon dioxide concentrations accounts for approximately 80 percent of this increase. Radiative forcing is a way to measure how substances such as greenhouse gases affect the amount of energy that is absorbed by the atmosphere – an increase in radiative forcing leads to warming while a decrease in forcing produces cooling.
Weather and Climate
U.S. and Global Temperature. Average temperatures have risen across the lower 48 states since 1901, with an increased rate of warming over the past 30 years. Parts of the North, the West, and Alaska have seen temperatures increase the most. Seven of the top 10 warmest years on record for the lower 48 states have occurred since 1990, and the last 10 five-year periods have been the warmest five-year periods on record. Average global temperatures show a similar trend, and 2000–2009 was the warmest decade on record worldwide.
Heat Waves. The frequency of heat waves in the United States decreased in the 1960s and 1970s, but has risen steadily since then. The percentage of the United States experiencing heat waves has also increased. The most severe heat waves in U.S. history remain those that occurred during the “Dust Bowl” in the 1930s, although average temperatures have increased since then.
Drought. Over the period from 2001 through 2009, roughly 30 to 60 percent of the U.S. land area experienced drought conditions at any given time. However, the data for this indicator have not been collected for long enough to determine whether droughts are increasing or decreasing over time.
U.S. and Global Precipitation. Average precipitation has increased in the United States and worldwide. Since 1901, precipitation has increased at an average rate of more than 6 percent per century in the lower 48 states and nearly 2 percent per century worldwide. However, shifting weather patterns have caused certain areas, such as Hawaii and parts of the Southwest, to experience less precipitation than they used to.
Heavy Precipitation. In recent years, a higher percentage of precipitation in the United States has come in the form of intense single-day events. Eight of the top 10 years for extreme one-day precipitation events have occurred since 1990. The occurrence of abnormally high annual precipitation totals has also increased.
Tropical Cyclone Intensity. The intensity of tropical storms in the Atlantic Ocean, Caribbean, and Gulf of Mexico did not exhibit a strong long-term trend for much of the 20th century, but has risen noticeably over the past 20 years. Six of the 10 most active hurricane seasons have occurred since the mid-1990s. This increase is closely related to variations in sea surface temperature in the tropical Atlantic.
Ocean Heat. Several studies have shown that the amount of heat stored in the ocean has increased substantially since the 1950s. Ocean heat content not only determines sea surface temperature, but it also affects sea level and currents.
Sea Surface Temperature. The surface temperature of the world’s oceans increased over the 20th century. Even with some year-to-year variation, the overall increase is statistically significant, and sea surface temperatures have been higher during the past three decades than at any other time since large-scale measurement began in the late 1800s.
Sea Level. When averaged over all the world’s oceans, sea level has increased at a rate of roughly six-tenths of an inch per decade since 1870. The rate of increase has accelerated in recent years to more than an inch per decade. Changes in sea level relative to the height of the land vary widely because the land itself moves. Along the U.S. coastline, sea level has risen the most relative to the land along the Mid-Atlantic coast and parts of the Gulf Coast. Sea level has decreased relative to the land in parts of Alaska and the Northwest.
Ocean Acidity. The ocean has become more acidic over the past 20 years, and studies suggest that the ocean is substantially more acidic now than it was a few centuries ago. Rising acidity is associated with increased levels of carbon dioxide dissolved in the water. Changes in acidity can affect sensitive organisms such as corals.
Snow & Ice
Arctic Sea Ice. Part of the Arctic Ocean stays frozen year-round. The area covered by ice is typically smallest in September, after the summer melting season. September 2007 had the least ice of any year on record, followed by 2008 and 2009. The extent of Arctic sea ice in 2009 was 24 percent below the 1979 to 2000 historical average.
Glaciers. Glaciers in the United States and around the world have generally shrunk since the 1960s, and the rate at which glaciers are melting appears to have accelerated over the last decade. Overall, glaciers worldwide have lost more than 2,000 cubic miles of water since 1960, which has contributed to the observed rise in sea level.
Lake Ice. Lakes in the northern United States generally appear to be freezing later and thawing earlier than they did in the 1800s and early 1900s. The length of time that lakes stay frozen has decreased at an average rate of one to two days per decade.
Snow Cover. The portion of North America covered by snow has generally decreased since 1972, although there has been much year-to-year variability. Snow covered an average of 3.18 million square miles of North America during the years 2000 to 2008, compared with 3.43 million square miles during the 1970s.
Snowpack. Between 1950 and 2000, the depth of snow on the ground in early spring decreased at most measurement sites in the western United States and Canada. Spring snowpack declined by more than 75 percent in some areas, but increased in a few others.
Society & Ecosystems
Heat-Related Deaths. Over the past three decades, more than 6,000 deaths across the United States were caused by heat-related illnesses such as heat stroke. However, considerable year-to-year variability makes it difficult to determine long-term trends.
Length of Growing Season. The average length of the growing season in the lower 48 states has increased by about two weeks since the beginning of the 20th century. A particularly large and steady increase has occurred over the last 30 years. The observed changes reflect earlier spring warming as well as later arrival of fall frosts. The length of the growing season has increased more rapidly in the West than in the East.
Plant Hardiness Zones. Winter low temperatures are a major factor in determining which plants can survive in a particular area. Plant hardiness zones have shifted noticeably northward since 1990, reflecting higher winter temperatures in most parts of the country. Large portions of several states have warmed by at least one hardiness zone.
Leaf and Bloom Dates. The timing of natural events such as leaf growth and flower blooms is influenced by climate change. Observations of lilacs and honeysuckles in the lower 48 states indicate that leaf growth is now occurring a few days earlier than it did in the early 1900s. Lilacs and honeysuckles are also blooming slightly earlier than in the past, but it is difficult to determine whether this change is statistically meaningful.
Bird Wintering Ranges. Some birds shift their range or alter their migration habits to adapt to changes in temperature or other environmental conditions. Long-term studies have found that bird species in North America have shifted their wintering grounds northward by an average of 35 miles since 1966, with a few species shifting by several hundred miles. On average, bird species have also moved their wintering grounds farther from the coast, consistent with rising inland temperatures.