Aldyen Donnelly: Ethanol versus biodiesel

With regard to biofuels, I have gone on record for years that once we fully consider the impacts of ethanol—not just from corn, although corn ethanol is the worst of the ethanol options—it will become socially unacceptable to deal in ethanol at all. Biodiesel is a different story.

Biodiesel from palm oil should be a "no". But biodiesel from recycled waste oils and virgin canola and soy is okay—the biodiesel does not displace the food value of the canola and soy, because it uses the oil after it is separated from the protein in the beans, so the food value can still go to the food market. Part of the reason for this difference is that we know how to sustainably harvest canola and soy using minimum tillage and multiple crop rotation practices.

However, we still do not know how to harvest corn at commercial scale using minimum impact methods.

All things considered, you really want to skip the alternative "alcohols" or sugar-based fuels and stick with oil. There are many reasons, including the above. Another is that oil-based fuels can incorporate large recycled content, while you cannot put recycled content into alcohol-based fuels. Vegetable-based oils can be blended with mineral-based oils at any point in the supply chain. Not true for alcohol/sugar-based biofuels.

More importantly, the diesel engine and power train remain a better platform for plug-in electric hybrid technology development than the gasoline engine platform. So we should be thinking oil, clean diesel, and diesel engine platforms for innovation.

In the longer term, I think production of biodiesel from algae is likely to prove the big breakthrough—and I put that within 5 years, not 20. Algae-biodiesel reactors will be built at diesel-oriented oil refineries. The biodiesel reactor uses waste heat and CO2 from the diesel refinery in the algae-to-biodiesel production process, and the biodiesel is run through the refinery’s hydrocracker to ensure that it will perform in cold climates just like petroleum-based diesel. Then we can put 50% biodiesel blends in EXISTING TRUCKS, let alone new diesel electric hybrids—conventional gasoline combustion engines can tolerate no more than 7.7% ethanol, gross.

The best existing diesel-electric hybrid (made by Opel/Vauxhall) uses the diesel engine only to recharge the batteries—not for direct power—and has a range of over 740 km and much lower GHGs than the newest prototype gasoline hybrid.

It makes NO sense to dedicate resources to develop gasoline additives. We should be thinking about how to shift the gasoline market to clean diesel and diesel electric hybrid engines and power train technology.

One key to getting there is to get the sulphur content in diesel fuels down to 10 parts per million from the existing regulated 15 parts per million. This makes inexpensive catalytic converters and fine particulate traps workable on diesel-powered vehicles, so that we get air pollution reductions at the same time we get GHG reductions at relatively low incremental fuel and vehicle costs.

The issue for cellulose-based ethanol is the diesel fuel that has to be consumed to move the cellulose feedstocks to processing plants. Another difference between biodiesel and ethanol is that biodiesel plants can be designed to hit economies of scale at relatively small plant sizes, so we can locate the plants close to the feedstock supplies. Ethanol production usually requires very high production volumes to hit economies of scale, so we have to ship biomass feedstocks large distances to the plants.

Having said that, I think there might be one (only one) BC-based ethanol production technology that could hit good numbers with small-scale plants. Whether or not I am correct about the one BC example, small-scale economies are what we have to be looking for in all biomass-based fuel alternatives. It simply costs way too much (in dollars and fuel) to ship waste wood, which is only 50% carbon at best, long distances to make transportation fuels.

Probably one of the largest problems is that some decision-makers still do not know the difference between ethanol (what I call "alcohol" or "sugar-based" fuels, to real scientists’ disdain) and biodiesel ("oil"). I tried to encourage then-Natural Resources Minister Lunn to differentiate between the two some years ago. In that regard, I think I failed.

Posted in Aldyen Donnelly

Aldyen Donnelly: Denmark’s overlooked District Heating network

I recently entered into a dialogue with a Canadian business journalist about lessons Canada can learn from European energy and environmental market regulations.

My principal point is that while many Canadian academics and environmentalists misrepresent, often quite unintentionally, the impacts of post-1990 European energy tax and cap and trade measures, they appear to have missed the VERY important positive story of District Heating in Europe. I think this is largely because the lion’s share of European District Heating Infrastructure was fully developed by 1985 and has nothing to do with post-Rio/UNFCCC decision-making.  

One of the key resulting failures is that we are now planning to shut down and decommission coal-fired power plants that are rarely retired in Europe. In Europe, the same plants are worked through three phases rather than simply shut down. In phase one, they make only electricity. In phase two they co-generate steam (which, in Europe, is usually condensed into hot water to supply a District Heating network) as well as producing electricity. In phase three (after 45+ years of operation), they cut back their electricity production and continue to make steam to supply District Heating.

An aged coal-fired power generating unit might operate at an efficiency rate as low as 30%—meaning the energy value of its power output equates to 30% of the energy value of the fuel fed into the generating plant. Fossil and/or biofuels are combusted to make steam, which is then pushed through turbines or other processes to make electricity.

The average GHGs per MWh of electricity output is usually in the 1.1 to 1.3 tCO2e/MWh net output range. But the aged boilers often make steam/hot water at 85% efficiency.

On average, 10,000 pounds of steam converts to hot water that displaces about 1 MWh-equivalent of electricity or natural gas demand. When we take away the step of converting the steam into electricity, and directly use the steam or condensed steam to supply district heating, the GHGs per MWh of useful heating value delivered to customers can range anywhere between 0.4 and 0.8 tCO2e per displaced MWh-equivalent of electricity or natural gas demand.
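
For readers who want to check the arithmetic, here is a minimal sketch of the efficiency comparison above. The 30% and 85% efficiencies and the output ranges come from the preceding paragraphs; the coal emission factor of 0.34 tCO2e per MWh of fuel energy is my assumption, inserted only to make the numbers concrete.

```python
# Back-of-envelope check of the electricity-versus-district-heat arithmetic.
# ASSUMPTION: coal combustion at ~0.34 tCO2e per MWh of fuel energy.
# The 30% (aged generating unit) and 85% (aged boiler) efficiencies are
# from the text above.

COAL_TCO2E_PER_MWH_FUEL = 0.34  # assumed, not from the post


def output_intensity(fuel_intensity: float, efficiency: float) -> float:
    """tCO2e per MWh of useful output at a given conversion efficiency."""
    return fuel_intensity / efficiency


electricity = output_intensity(COAL_TCO2E_PER_MWH_FUEL, 0.30)
district_heat = output_intensity(COAL_TCO2E_PER_MWH_FUEL, 0.85)

print(f"electricity:   {electricity:.2f} tCO2e/MWh")    # ~1.13, in the 1.1-1.3 range
print(f"district heat: {district_heat:.2f} tCO2e/MWh")  # ~0.40, low end of 0.4-0.8
```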

Using aged boilers to supply heat into the district heating network can be a very cost-effective transition in a GHG reduction strategy. The key is the distance of the boilers from the homes, commercial offices and industry that can use the steam/hot water.

Denmark has been able to develop hot water recovery, transmission and recycling systems at a retail cost to Danish consumers in the CAD$0.06 to CAD$0.10 per displaced kWh range. This is significantly less than the CAD$0.46/kWh average price for electricity that is currently charged Danish households.

The Danes have found that they can pay off capital and operating costs in a network that carries hot water as far as 55 kilometers from steam/heat sources to retail customers. Steam is more efficient than hot water over short distances; hot water is more efficient than steam over longer distances.

Note that the Danish District Heating network was almost fully developed by 1985, when Danish retail (taxes included) power prices were still below CAD$0.12/kWh.

The reporter was on his way to Copenhagen and decided, as a result of our conversations, to set up meetings with administrators and operators of the Danish District Heating network, which currently supplies hot water that displaces 60% of the nation’s previous electricity and natural gas demand for space and water heating.

Posted in Aldyen Donnelly

The ozone hole did it

(Jan 10, 2010) Climate change is real and man-made, explains University of Waterloo professor Qin-Bin Lu, author of a new study published this week in the peer-reviewed journal, Physics Reports.

Posted in Global Cooling

Egyptian eyeliner may have warded off disease

(Jan. 8, 2010) Clearly, ancient Egyptians didn’t get the memo about lead poisoning. Their eye makeup was full of the stuff. Although today we know that lead can cause brain damage and miscarriages, the Egyptians believed that lead-based cosmetics protected against eye diseases. Now, new research suggests that they may have been on to something.

Posted in Hormesis

Aldyen Donnelly: Key data for electricity life cycle emissions

I use the UNFCCC GHG data for sector and national GHG estimates. Download the CRF reports for the countries you are interested in to get the best GHG stats for the power sector. Download the CRF zip file here.

To see direct GHGs from electricity production, as well as upstream fossil fuel mining and refining emissions, expand the .xls file for the year(s) you are interested in and go to Table1.A(a)s1. This is where you get the best electricity sector, coal mining and petroleum product refining GHG statistics.
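
If you want to script the extraction rather than click through the workbooks, something like the following works. This is a sketch only: the file names are hypothetical placeholders, and the row layout of Table1.A(a)s1 varies by country and submission year, so inspect the sheet before hard-coding anything.

```python
# Sketch: unzip a CRF submission and load Table1.A(a)s1 with pandas.
# File names below are hypothetical placeholders; CRF naming varies
# by country and submission year.
import zipfile

import pandas as pd

with zipfile.ZipFile("can-crf-submission.zip") as zf:  # placeholder name
    zf.extractall("crf")

# Read the sheet raw (header=None) and eyeball it to find the rows for
# public electricity and heat production, coal mining and refining.
table = pd.read_excel("crf/CAN_2006.xls",              # placeholder name
                      sheet_name="Table1.A(a)s1",
                      header=None)
print(table.head(40))
```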

To see nation-specific GHG emission factors for well-head/mine-mouth and processing GHGs for liquid and gaseous fuels, not including crude extraction or finished fuel transport emissions but including heavy oil upgrading GHGs (which are blended into the refining stats), look at Table1.A(a)s2 and Table1.A(b). 

So far, you have looked at energy consumption GHGs only and have not yet seen crude fuel extraction GHG estimates. Section 1.AA.2.F in Table1.A(a)s2 gives you the upstream fugitive GHG factor for each of the fossil fuels. Then Table1.A(b) gives you an estimate of the combined extraction through processing/refining (including fugitive) GHG factors for domestically produced fossil fuels. The GHG factors in Table 1.A(b) usually differ from one nation to the next. But it is going to be rare that the upstream GHG factor for imports is actually the same as it is for domestic production. 

The Table 1.A(b) GHG factor is usually a pretty good estimate of the upstream emissions factor for the domestic production of the country you are looking at, and is not appropriately applied to imports. But it is currently the case that both the Canadian and US CRF reports use a common North American GHG factor for natural gas. This factor therefore understates the upstream GHGs associated with Canadian natural gas consumption while it overstates the US upstream GHG factor. So if you use the CRF factor to estimate the full fuel cycle GHGs for Canadian electricity produced from natural gas, you will end up with an underestimate. For most analyses, however, this factor will serve you well enough.
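
Putting the combustion and upstream factors together is simple arithmetic. The sketch below shows the shape of the calculation for gas-fired electricity; all three input numbers are placeholders to be replaced with the values you read out of Table1.A(a)s2 and Table1.A(b).

```python
# Full-fuel-cycle GHGs for gas-fired electricity: combustion plus upstream.
# All three inputs are illustrative placeholders, NOT values from the CRF.
combustion_tco2e_per_tj = 56.0    # placeholder: gas combustion factor
upstream_tco2e_per_tj = 8.0       # placeholder: extraction through processing
fuel_tj_per_mwh = 0.0072          # placeholder: ~50%-efficient gas plant
                                  # (1 MWh = 0.0036 TJ; 0.0036 / 0.50)

full_cycle = (combustion_tco2e_per_tj + upstream_tco2e_per_tj) * fuel_tj_per_mwh
print(f"{full_cycle:.2f} tCO2e/MWh, full fuel cycle")  # ~0.46 with these inputs
```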
 
For a little snapshot of the differences between Canadian and US natural gas CO2 (not GHG) factors, go here. Select "Indicators", "CO2 emissions" then (1) CO2 from the consumption of natural gas and (2) CO2 from the flaring of natural gas. Then select "Natural Gas", then (1) consumption and (2) production.

They divide consumption emissions by consumption volumes, and flaring emissions by production volumes. You will see that this dataset puts the Canadian CO2 emission rate per unit of dry gas consumed at 1.5% higher than the US rate, and Canadian CO2 emissions from flaring per unit of dry natural gas produced at 122% of the CO2 flaring discharge rate for US domestic gas production. This partially reflects the fact that Canadian natural gas reserves in the west have higher sulphur contents than typical US gas reserves, hence the higher flaring CO2 rate. And this US EIA dataset does not attempt to account for fugitive losses other than from flaring.
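
The construction of those two rates is worth writing out, because the denominators differ: consumption emissions divide by consumption volumes, but flaring emissions divide by production volumes. The numbers below are placeholders, to be filled in from the EIA tables.

```python
# Implied CO2 rates from the EIA tables. All four inputs are placeholders.
co2_from_consumption_mmt = 180.0   # placeholder: CO2 from gas consumption
gas_consumed_bcf = 3_300.0         # placeholder: dry gas consumed
co2_from_flaring_mmt = 2.0         # placeholder: CO2 from flaring
gas_produced_bcf = 6_400.0         # placeholder: dry gas produced

# Note the different denominators, as described above.
consumption_rate = co2_from_consumption_mmt / gas_consumed_bcf
flaring_rate = co2_from_flaring_mmt / gas_produced_bcf
print(consumption_rate, flaring_rate)
```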

Beyond the limited information you can find at the UNFCCC National Reports website, no one’s estimates are very good for developing nations.

But in the UNFCCC CRF reports, the estimates for electricity consumption (in TJ) cover all electricity consumption in some national reports, and only fossil- and biomass-fuelled electricity consumption in others (no nuclear or hydro). So while this is the best place to get good electricity sector GHG estimates, you can’t rely on the CRF reports for the total electricity demand or production part of the equation.

The best estimates of total electricity output, consumption, imports and exports, by fuel of origin and including renewables, hydro and nuclear, are here, under "Electricity" and "Renewables". Use the gross heat content tables in these sections to convert GWh and quadrillion Btu to TJ or whatever common energy output reporting format you wish to use.
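
The two purely physical conversions are fixed, so you only need the EIA gross heat content tables for the fuel-specific factors. A minimal helper:

```python
# Physical unit conversions (standard values):
TJ_PER_GWH = 3.6        # 1 GWh = 3.6 TJ by definition
TJ_PER_QUAD = 1.055e6   # 1 quadrillion Btu is about 1.055e6 TJ


def gwh_to_tj(gwh: float) -> float:
    return gwh * TJ_PER_GWH


def quads_to_tj(quads: float) -> float:
    return quads * TJ_PER_QUAD


print(gwh_to_tj(600_000))  # e.g. 600,000 GWh -> 2,160,000 TJ
```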

The production, consumption and trade estimates at this US EIA website are simply reprinted International Energy Agency (IEA) data, and easier to access and use than the IEA data. But the GHG data at the US EIA site is unreliable (CO2 only for some nations, more or all GHGs for others). The easiest way to produce pretty good GHG/MWh-equivalent numbers is to put the GHG estimates from the UNFCCC CRF reports together with the electricity production, consumption and/or trade data from the US EIA website.
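
The combination itself is a one-line division once the units line up. Both inputs in the sketch are placeholders; use the CRF sector GHGs and the EIA generation figure for the same country and year.

```python
# Sector GHG intensity: CRF emissions over EIA generation.
# Both inputs are placeholders for illustration only.
crf_power_sector_ghgs_kt = 120_000.0  # placeholder: ktCO2e from Table1.A(a)s1
eia_generation_gwh = 600_000.0        # placeholder: GWh from the EIA tables

tco2e = crf_power_sector_ghgs_kt * 1_000.0  # kt -> t
mwh = eia_generation_gwh * 1_000.0          # GWh -> MWh
print(f"{tco2e / mwh:.2f} tCO2e/MWh")       # 0.20 with these placeholders
```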

You may or may not want to see what it would mean to adjust national GHG liabilities to reflect electricity trade, as the US proposed climate change legislation dictates. Note that the US climate change bills oblige states to book GHGs arising from electricity demand in the state of final consumption, not the state in which generation occurs. This US-dictated GHG reporting procedure potentially delivers huge trade advantages to the US at the expense of Canadian clean electricity exports. We have not yet properly discussed this issue in Canada, at least as far as I know.

The US EIA version of the electricity trade data does not tell you electricity country of origin or destination for exports. If you need to get more into the implications of booking power generation attributes to the importing state/national GHG inventory, and want to assign net GHGs arising from electricity trade to the importing nation, you might want to buy some data directly from the IEA, here. The free data (last year reported: 2008) does not tell you enough. But the IEA emissions reports cover only CO2 and not all GHGs, so while you can use the IEA electricity production and trade numbers with confidence, you still want to use the UNFCCC CRF GHG estimates where they are available.

I find the IEA data too expensive. So for Europe, I use Eurostat production, import and export data (which is consistent with IEA and OECD data), which you can find here: go to "Energy Statistics – quantities (nrg_quant)" under "Database". Here you can find electricity imports and exports by country of origin and destination for European nations’ electricity consumption. Again, use the UNFCCC GHG data to establish national sector GHG baselines; do not use the mixed-up CO2/GHG data in the Eurostat database for GHG/MWh baseline national estimates.

For Canada/US electricity trade, you can rely on the National Energy Board electricity trade reports, here. Here you can see trade between provinces and states. If you want to figure out what GHG factor to assign to trade broken down by province and state:

  • for provincial electricity sector GHG factors, go to Environment Canada’s national GHG inventory (go here for 2006 and earlier numbers, or email a request for the most recent—2007—report) and go down the Table of Contents to click on ANNEX 9: Electricity Intensity Tables.
  • for US electricity sector GHG factors by state, go here, and do an "all programs" state-level emission search under the "unit level emissions" option.

There are still some significant problems with the Environment Canada estimates, but these are still the best official estimates you will get for now. The US numbers are much more reliable. This fact alone is a significant source of tariff risk for Canada/US electricity trade.

If you do not care about imports and exports by country/state of origin and destination, another good source with a different production breakdown is the set of production, import and export estimates from the free IEA tables here (go here, click on "excel" or "archives" after "electricity" under "Related Surveys").

For the most up-to-date comparable data on national energy and environmental taxes and policies, including current energy and environmental tax rates and total government revenues, go to the OECD Economic Instruments database, here.

But remember that it is rarely the case that tax increases are fully passed through as retail price increases on each fuel that government might target with taxes. For example, fuel suppliers might respond to an increase in taxes on gasoline with a reduction in the wholesale price of gasoline and a commensurate increase in the wholesale price of (tax exempt) propane or ethanol. To see clearly how rarely point of production or consumption taxes/tax increases translate into equivalent retail (tax included) prices/price increases, go back to the Eurostat website, here, click on "database", then "environment and energy", then "energy", then "energy statistics – prices" then compare prices with and without all taxes, with and without VAT. And compare rates for residential customers to industrial customers.
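
A toy version of the gasoline/propane example makes the point concrete. All numbers are invented: the supplier rebalances wholesale prices so the pump price of the taxed fuel rises by less than the tax, while an exempt fuel quietly picks up the difference.

```python
# Invented numbers illustrating incomplete tax pass-through.
gasoline_wholesale = 0.90  # $/L before the new tax
propane_wholesale = 0.60   # $/L, tax exempt in this example
new_gasoline_tax = 0.10    # $/L

# Supplier response: cut gasoline wholesale 6 cents, raise propane 6 cents.
rebalance = 0.06
gasoline_retail = (gasoline_wholesale - rebalance) + new_gasoline_tax
propane_retail = propane_wholesale + rebalance

print(gasoline_retail)  # 0.94: the pump price rises 4 cents, not 10
print(propane_retail)   # 0.66: exempt customers absorb the rest
```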

Note that energy vendors will shift cap and trade compliance cost burdens in the same way they have always manipulated wholesale fuel prices to move around carbon tax burdens. Ultimately, integrated energy companies will lay off the lion’s share of any new Canadian tax or cap and trade regulation compliance costs on domestic "captive" customers (where there is no risk of customer flight), who will pay a disproportionate share of either cost burden relative to "footloose" industrial customers and export markets. This is the primary reason I oppose carbon/CO2 taxes as a climate change mitigation measure and favour product standards (with credit trading and banking) over quota-based cap and trade.

Posted in Aldyen Donnelly

Low Dose Radiation Evades Cancer Cells’ Protective ‘Radar’

ScienceDaily
October 6, 2004

A new study shows that lower doses of radiation elude a damage detection “radar” in DNA and actually kill more cancer cells than high-dose radiation. With these findings, scientists believe they can design therapy to dismantle this “radar” sensor allowing more radiation to evade detection and destroy even greater numbers of cancer cells.

Researchers at the Johns Hopkins Kimmel Cancer Center tested the low-dose radiation strategy on cultured prostate and colon cancer cell lines and found that it killed up to twice as many cells as high-dose radiation. The extra lethality of the low-dose regimen was found to result from suppression of a protein called ATM, which works like a radar to detect DNA damage and begin repair.

Theodore DeWeese, M.D., who led the study, speculates that cells hit with small amounts of radiation fail to switch on the ATM radar, which prevents an error-prone repair process. DeWeese will present his evidence at the annual meeting of the American Society for Therapeutic Radiology and Oncology on October 5 in Atlanta. “DNA repair is not foolproof – it can lead to mistakes or mutations that are passed down to other generations of cells,” explains DeWeese, chairman of the Department of Radiation Oncology and Molecular Radiation Sciences at Johns Hopkins. “A dead cell is better than a mutant cell, so if the damage is mild, cells die instead of risking repair.”

Higher doses of radiation cause extreme DNA damage and widespread cell death, so the ATM damage sensor is activated to preserve as many cells as possible, protecting, ironically, the cancer cells targeted for destruction by the radiation. While the low-dose regimen works in cultured cells, it has not proved successful in humans. This has led to an effort by Hopkins scientists to study ways to use viruses that can deliver ATM-blocking drugs to the cells. Tests in animals are expected to begin soon.

In the current study, colon and prostate cancer cell lines were treated with either high levels of radiation or small amounts spread over many days. Low-level radiation is approximately 10 times more powerful than normal exposure, while high doses are 1,000 times stronger. Approximately 35 percent of colon cancer cells survived low-dose radiation, compared to 60 percent of those receiving high-dose radiation. In prostate cancer cell lines, half of the cells survived low-dose radiation, while 65 percent survived higher doses. In the low-dose group, ATM activation was reduced by 40 to 50 percent.

The researchers proved ATM inactivation was the culprit, since low-dose irradiated cells fared better after ATM was reactivated with chloroquine, best known as a treatment for malaria.

“Tricking cancer cells into ignoring the damage signals that appear on its radar could succeed in making radiation more effective in wiping out the disease,” says DeWeese.

This research was funded by the National Cancer Institute.

Research participants from Johns Hopkins include Spencer Collis, Julie Schwaninger, Alfred Ntambi, Thomas Keller, Larry Dillehay, and William Nelson.

Posted in Radiation

Aldyen Donnelly: Carbon taxes and what they mean for Canada

Please note that the wide-ranging exemptions that were embedded in France’s carbon tax proposal have also always existed, and continue to exist, in Swedish, Danish, Norwegian and Dutch carbon/CO2 tax laws. Germany, on the other hand, has never had a carbon or CO2 tax. None of these other nations has tax fairness laws in place comparable to the provisions in the French constitution.

Generally, GHGs arising from the combustion of fossil fuels in power production, petroleum refineries, aluminum smelters, cement plants and industrial chemical plants are, and have always been, carbon/CO2 tax exempt under most European tax systems. Refineries, aluminum smelters, cement and chemical plants have also received free CO2 allowance allocations equal to forecast "business as usual" GHGs through 2012. This is true even for, say, BP, whose European ETS-covered operations realized a 24% growth in GHG emissions between 2005 and the end of 2008.

It remains curious to me that Canadian carbon/CO2 tax proponents often refer to European carbon/CO2 tax models as "successful" but have never proposed or discussed the range of tax exemptions that typify the EU tax models. Of course, the rationale for the power sector and industrial carbon/CO2 tax exemptions is the prevention of job losses.

The important questions that Canadian CO2 tax and cap and trade proponents have avoided addressing to date include:

  • If Canada imposed CO2 taxes on Canadian sectors that are CO2 tax exempt in Europe and elsewhere, how does the government of Canada protect Canadian export market shares from lower cost (untaxed), more GHG-intensive European petroleum, aluminum, cement and chemical exports?
  • If Canada shorts the free allocation of Canadian GHG quota to Canadian sectors that receive free GHG/CO2 quota allocations equal to their "business as usual" GHG forecasts, how does the government of Canada protect Canadian export market shares from lower cost (uncapped), more GHG-intensive European petroleum, aluminum, cement and chemical exports?
  • European, Japanese and US governments have reserved the right to impose GHG tariffs on imports from Canada if Canada fails to cut national GHGs 20% below 2006 levels by 2020. If Canada gives these sectors the GHG/CO2 "pass" that they currently receive under the European ETS and, in some sectors, under the proposed US Senate and House climate change bills, how does Canada achieve our national GHG reduction target?

The Answer?

Product Standards. Canada’s ultra low sulphur diesel regulation is a product standard. The government neither sets prices, nor selects the technologies the market will employ to comply with new product standards. 

A product standard regulates carbon content or supply chain GHG emissions at the first point of distribution of the regulated products. Canada should implement a series of GHG or carbon product standards that:

  • Are at least as stringent as the European CO2 "benchmarks" for benchmarked sectors. The EU has developed emission benchmarks to determine best practices for "trade sensitive" sectors. At this time, EU member states propose to freely allocate CO2 quota up to the benchmark emission intensity rates for EU ETS covered facilities for the post-2012 control periods. The benchmarked sectors are: aluminum, cement, ceramics, chemicals, glass, gypsum, iron and steel, iron ore, lime, mineral wool, non-ferrous metals except aluminum, pulp and paper, and oil refineries.
  • Could reasonably ensure Canadian achievement of the 20% reduction target from 2006 levels by 2020. Most Canadian facilities in the EU benchmarked sectors discharge lower GHGs (absolutely and per unit of output) than the current draft EU benchmarks. Canada could contemplate initial product standards for these sectors that are technically more stringent than the EU benchmarks.
  • For the electricity and biofuels sectors, combine the attributes of the existing US Renewable Fuel Standard and proposed US Renewable Electricity and Electricity Efficiency Standards into a single, more flexible Canadian Renewable Energy Standard.

With these product standards in place and our ability to prove at the WTO, under NAFTA and in US courts that our product standards are comparable, in terms of environmental outcomes, to the combination of regulations and GHG/CO2 quota allocations that are in place or proposed in Europe and the US, Canada should be able to successfully challenge the protectionist elements of the EU and US tax and cap and trade regimes.

Our ability to win any such trade dispute will rely, first, on our implementation of a Canadian facility-level emission reporting regulation that WTO, NAFTA and US courts would agree is "comparable" to title 40 of the US Code of Federal Regulations, part 75. This does not mean we have to implement reporting regulations that are identical to the highly invasive and administratively costly US emission reporting regulations. But we do need to implement reporting regulations that are significantly more stringent than those that are in place federally or proposed by any Canadian province to date.

As long as Canada fails to implement "comparable" (under WTO’s definition of "comparable", not the US definition) facility level reporting regulations, the EU, Japan and the US will maintain that Canada’s national GHG inventory claims cannot be verified. The WTO has in the past and will in the future uphold EU, Japanese and US GHG tariffs on our exports as long as the US can successfully make that case.

No investor likes to commit to a sector that can reasonably be forecast to be embroiled in a trade dispute in the near or medium term. Therefore, Canada must consider implementing new emission (pollutants and GHGs, not just GHG) reporting regulations as well as the key product standards as soon as possible, with the objective of having the regulations fully developed before the US Congress votes on a US climate change bill. 

Among the product standards, the Renewable Energy Standard is top priority.  Canada should implement the Canadian Renewable Energy Standard as soon as possible to demonstrate that we have equivalent-to-existing-US-and-EU renewable energy and fuel standards in place. Our goal should be to put this standard in place, by Order in Council, along with the new facility-level GHG reporting rules, between March 31 and July 31, 2010.

Canada should completely develop the other product standards (starting with reference to the EU benchmark studies and refining these standards in consultation with Canadian industry) including completing public consultation, by the July 31 deadline.

The other product standards can include clauses stipulating that these standards will not come into full effect unless/until the government of Canada finds that the US has implemented "comparable" standards. 

This approach positions Canada to provide the US with early warning that any highly protectionist US GHG quota allocation and trading scheme will be challenged by Canada, and also to communicate our view that product standards are more efficient than quota-based supply management for energy, building product and food markets. Canada’s move should cause a new debate to emerge in the US—which would be in Canada’s interest. 

This approach increases the odds that both Canada and the US will drop quota allocation and trading and shift to product standards, creating a more positive market signal.

Please note that credit trading and banking is a key element in all existing and proposed EU, US and Japanese product standards. It is not necessary to create, auction or allocate emission quotas to spawn a vibrant and disciplined secondary market for environmental attributes. Each of our product standards should allow for over-compliance credit banking and trading, and credit trading across sectoral boundaries.
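
For concreteness, here is the over-compliance credit arithmetic under a product standard, with invented numbers: a facility that beats the standard intensity banks (or sells) the difference times its output.

```python
# Toy over-compliance credit calculation under a GHG product standard.
# All three inputs are invented for illustration.
standard_intensity = 0.80  # tCO2e allowed per tonne of product
actual_intensity = 0.72    # facility's achieved intensity
output_tonnes = 500_000.0  # product shipped this compliance period

credits = (standard_intensity - actual_intensity) * output_tonnes
print(f"{credits:,.0f} tCO2e of bankable/tradable credits")  # 40,000
```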

Beyond these regulations, Canada also has to incorporate a new Class 27 Capital Cost Allowance regulation in Schedule II of the Corporate Income Tax Act. The new Class 27 regulation would focus on providing incentives for EXISTING plant operators to invest in GHG and pollution reduction measures.

One year depreciation (same year "expensing of capital expenditures", in US tax lingo) is standard for investments that measure, control or reduce regulated emissions in the UK, EU and US at this time. 

Canada cannot attract the necessary new capital investment unless our tax act at least matches the one year tax deferral that is already available to investors in the UK, Europe and the US. The government of Canada should consider upping the ante by allowing entities that qualify for accelerated depreciation under the new Class 27 to bank (but not trade) CCA credits for up to 10 years.
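
The value of one-year expensing is a straightforward present-value comparison. The sketch below uses invented rates (tax rate, CCA rate, discount rate) purely to show the shape of the deferral advantage; it ignores refinements like the half-year rule.

```python
# Present value of tax deductions: one-year expensing vs. declining balance.
# All rates are invented for illustration; the half-year rule is ignored.
capex = 10_000_000.0  # capital cost of the qualifying equipment
tax_rate = 0.27       # assumed corporate tax rate
cca_rate = 0.30       # assumed declining-balance CCA rate
discount = 0.08       # assumed discount rate

pv_expensed = capex * tax_rate  # full deduction in year one

pv_declining = sum(
    capex * cca_rate * (1 - cca_rate) ** (year - 1) * tax_rate
    / (1 + discount) ** year
    for year in range(1, 41)
)

print(f"expensing advantage: ${pv_expensed - pv_declining:,.0f} present value")
```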

I am not recommending that Class 27 credits become flow-through credits. I think the credits should not be severable from the equipment installations that qualified for accelerated depreciation. But the credit banking provision puts pressure on new technology adopters to become profitable in their Canadian operations—if they are not already—within 10 years.

I had hoped to draft at least two strawdog product standards and a new draft Class 27 regulation for illustration purposes by today, but have been unable to do so. I do anticipate, however, that I will be able to complete the two samples by the end of next week for consideration and criticism.

Special Note to Provinces About Green Bonds: Don’t Do It

Only the federal government can implement the Class 27 recommendation. Either the provinces or the federal government can implement the key product standards, but I do recommend these as federal initiatives. Given the potential for federal product standards, provinces will all consider additional measures to increase regional economic development opportunities.

Some provinces are considering other measures, such as issuing green bonds to raise financing for new green technology projects. With respect, I wish to recommend an alternative to that proposal.

Given current provincial deficit levels and the likelihood that interest rates will increase in the near term, provinces should not consider raising net debt (other than to retire existing, higher interest rate-bearing debt).

Under US federal tax law, interest income earned by private entities that lend capital in the form of debt to electric utilities and other key infrastructure projects is corporate income tax exempt, within some limits. I strongly recommend that provinces consider amending provincial tax law to incorporate income tax exemptions for interest income earned from loans to projects that meet certain environmental criteria. This approach should achieve the same objective as "green bonds", but mobilize private sector capital markets without increasing government deficit and debt levels.

Posted in Aldyen Donnelly

Wikipedia meets its own climategate

Tom Bethell
American Spectator
December 30, 2009

Jimmy Wales, the founder of Wikipedia, had an article in yesterday’s Wall Street Journal drawing attention to the rise of “online hostility” and the “degeneration of online civility.” He (and coauthor Andrea Weckerle) suggested ways in which we can “prevent the worst among us from silencing the best among us.”

I agree with just about everything that they say. But there is one problem that Mr. Wales does not go near. That is the use of Wikipedia itself to inflame the political debate by permitting activists to rewrite the contributions of others. All by itself, that surely is a contributor to online incivility.

The issue that I am particularly thinking about is “climate change” — or global warming as it was once called (until the globe stopped warming, about a decade ago). Recently the Financial Post in Canada published an article by Lawrence Solomon, with this remarkable headline:

How Wikipedia’s green doctor rewrote 5,428 climate articles.

Solomon draws attention to the online labors of one William M. Connolley, a Green Party activist and software engineer in Britain. Starting in February 2003, Connolley set to work on the Wikipedia site.  I continue with a two-paragraph direct quote from Mr. Solomon’s article:

[Connolley] rewrote Wikipedia’s articles on global warming, on the greenhouse effect, on the instrumental temperature record, on the urban heat island, on climate models, on global cooling. On Feb. 14, he began to erase the Little Ice Age; on Aug. 11, the Medieval Warm Period. In October, he turned his attention to the hockey stick graph. He rewrote articles on the politics of global warming and on the scientists who were skeptical of the band [of climatologist activists]. Richard Lindzen and Fred Singer, two of the world’s most distinguished climate scientists, were among his early targets, followed by others that the band [of activists] especially hated, such as Willie Soon and Sallie Baliunas of the Harvard-Smithsonian Center for Astrophysics, authorities on the Medieval Warm Period.

All told, Connolley created or rewrote 5,428 unique Wikipedia articles. His control over Wikipedia was greater still, however, through the role he obtained at Wikipedia as a website administrator, which allowed him to act with virtual impunity. When Connolley didn’t like the subject of a certain article, he removed it — more than 500 articles of various descriptions disappeared at his hand. When he disapproved of the arguments that others were making, he often had them barred — over 2,000 Wikipedia contributors who ran afoul of him found themselves blocked from making further contributions. Acolytes whose writing conformed to Connolley’s global warming views, in contrast, were rewarded with Wikipedia’s blessings. In these ways, Connolley turned Wikipedia into the missionary wing of the global warming movement.

Online replies to this article included the following, appearing about 24 hours after Solomon’s article went on line:

Recently, the Wikipedia Arbitration Committee determined that “William M. Connolley has, on a number of occasions misused his administrator tools by acting while involved” and, as a consequence, “William M. Connolley’s administrative privileges are revoked.”

[Link: en.wikipedia.org/…/Abd-William_M._Connolley]

But three days later, on December 23, a follow-up article by Solomon said this:

How do Connolley and his co-conspirators exercise control? Take Wikipedia’s page for Medieval Warm Period, as an example. In the three days following my column’s appearance, this page alone was changed some 50 times in battles between Connolley’s crew and those who want a fair presentation of history.

So he is still at it, apparently. Connolley has for years been involved with a website called RealClimate.org. It broadcasts the views of a group of warmist ideologues, otherwise known as “working climate scientists.”  (Among them is Penn State’s Michael Mann, the inventor of the “hockey stick.”) My guess is that even if Connolley’s Wiki privileges have been revoked, his RealClimate allies continue to labor on his behalf.

The interesting paragraph below comes from Connolley’s own Wiki entry, and I suppose was written by him:

His work was also the subject of hearings by Wikipedia’s arbitration committee after a complaint was filed claiming that Connolley was pushing his own point of view in an article by removing material with opposing viewpoints. A “humiliating one-revert-a-day” editing restriction was imposed on Connolley, and he told The New Yorker that Wikipedia “gives no privilege to those who know what they’re talking about.” The restriction was later revoked, and Connolley served as a Wikipedia administrator from January 2006. [The New Yorker article was by Stacy Schiff, July 31, 2006]

It is not surprising that Connolley should think that he knows what he is talking about and that he should be “privileged.” The question is: How does Wikipedia decide between him and his allies and those who say that Connolley et al. do not know what they are talking about?

One is tempted to reply: By looking at the science. But here is an important and little-noted point. The scientific problem posed by measuring manmade global warming, if such warming really exists, is huge. There is no more complex field of science. That is because so many areas of expertise are involved — everything from the temperature effects of oceans and of cloud cover, to the study of ice cores, to the spacing of tree rings, to the proper placement of thermometers. (How many should there be in Siberia? How close should they be to New York City? And so on.)

Faced with the complexity of the way these variables interact — and I could have mentioned half a dozen more — the true scientist, at least initially, finds it difficult to be certain about the outcome. Politicians, or politicized scientists, then seized their opportunity.  Ideologues like Connolley and politicians like Al Gore filled the vacuum. Armed with world-saving missionary zeal, they milked the prestige of science to suit their own political advantage.

In so complex a field, the skeptics needed time to recover their more detached sense of what is really going on with the weather. So the warmists enjoyed a head start thanks to their political zeal and their lack of scrupulosity. Now they have come close to persuading politicians all over the Western world that we must change the way we live or sink beneath the waves.

But with the leaked emails known as Climategate more people are beginning to see that deception, not science, has been their principal weapon. And we see also that Wikipedia has lent itself to that deception.

The political exploitation of science has gone on for some time — discrediting nuclear power in addition to the use of oil and coal has been just one of its several goals. One unintended consequence, as Fred Singer said recently, is that the public may begin to disbelieve everything that begins “science says.” In the present climate, that might be healthy, but in the long run it would not work to America’s or the world’s advantage.

A footnote: Mr. Wales may be interested to know that the responses to Solomon’s article were quite civil, surprisingly so given the shocking nature of his charges. Here are two. I particularly commend the second:

[From an academic] “I will not accept any references from Wikipedia in any paper I review from here on out until this is resolved.”

“I see that a banner ad is appearing on most Wikipedia pages asking for ‘donations’…. I think I’ll contribute to more worthwhile charities.”

For myself, I shall continue to investigate this issue over the next few days, and I hope to post a follow-up next week.

Read the full article here.

Posted in Climate Change, Energy Probe News, The Deniers

Special Report – Global Warming

Tulsa Beacon

December 24, 2009

While the United Nations, President Obama and former Vice President Al Gore look for billions of dollars to fight “global warming,” scientists all over the world are debunking the theory as “bad science” and political manipulation.

U.S. Secretary of State Hillary Clinton said last week in a United Nations climate summit in Copenhagen that America should lead the way in spending $100 billion to fight climate change (or global warming).

Lawrence Solomon has written a book, “The Deniers: The World-Renowned Scientists Who Stood Up Against Global Warming Hysteria, Political Persecution and Fraud.” (Buy the book here.)

In the book, Solomon chronicles the opposition to global warming and the science that contradicts the views of Obama, Gore and the United Nations.

Dr. Edward J. Wegman is director of the Center for Computational Statistics at George Mason University, chair of the National Academy of Sciences Committee on Applied and Theoretical Statistics and a board member of the American Statistical Association.

The U.S. House Energy and Commerce Committee asked him to assess the “hockey stick graph,” a key global warming tool created by Michael Mann of The University of Massachusetts.

The hockey stick graph shows that for most of the past one thousand years, temperatures in the Northern Hemisphere were cooling – until 1900. Then temperatures began to rise through the 1990s as mankind began to industrialize and use more hydrocarbons for energy. Mann claimed that the 20th Century was the warmest of the past millennium, the 1990s was the warmest decade and 1998 was the warmest year.

In 1990, the Intergovernmental Panel on Climate Change published a graph that showed a warming period in the Middle Ages (14th Century) that led to the “Little Ice Age,” from which we have been emerging since the early 1700s.

Mann’s hockey stick graph eliminated the Medieval Warming Period, which made the 20th Century warming look more dramatic.

According to Solomon, “More than any other single piece of evidence, (the hockey stick) made global warming a serious popular and political issue.”

The United Nations IPCC 2001 report on global warming made the hockey stick graph the centerpiece on its report to international policymakers.

A Canadian mining scientist named Stephen McIntyre examined the hockey stick data and found that the 1600s, not the 1900s, were the hottest century. That didn’t necessarily disprove global warming, but it showed that Mann’s graph was an enormously effective prop. Professor Ross McKitrick, an economist at The University of Guelph, has joined McIntyre in calling the hockey stick graph a “phony.”

The United Nations decided that Mann’s credentials were more impressive than McIntyre’s.

Then, at the request of Congress, Wegman assembled a panel of expert statisticians, including the board of the American Statistical Association.

The panel repudiated Mann’s hockey stick graph and vindicated McIntyre and McKitrick.

“Our committee believes that the assessments that the decade of the 1990s was the hottest decade in a millennium and that 1998 was the hottest year in a millennium cannot be supported,” Wegman wrote.

When measuring temperature for one thousand years, statistics become important. Only recently did the world have accurate thermometers scattered all across the globe continuously measuring temperature.

Global warming theorists have to rely on sporadic and discontinuous data such as tree rings, ice cores, lake and ocean sediment and other less reliable sources.

Wegman’s report showed that tree rings are not reliable to support the hockey stick graph.

So, why did so many rush to accept Mann’s graph?

Wegman said the peer process doesn’t work because there is “too much consensus.” Mann’s reviewers all came from a very tight-knit paleoclimate community that highly respected Mann.

Solomon writes that “the hockey stick experience has convinced Wegman that much of climate science should be taken with a grain of salt, since so many studies have been peer reviewed by reviewers unqualified in statistics.”

The IPCC dropped the hockey stick graph from its 2007 report – an indication of its lack of credibility.

Is the Earth warmer?

That is the key question for those who hold to global warming.

Dr. Vincent Gray has a Ph.D. in physical chemistry from The University of Cambridge. He has published more than 100 scientific papers and wrote The Greenhouse Delusion: A Critique of “Climate Change 2001.”

He has called the IPCC (United Nations) process a “swindle.”

Gray is one of the “2,500 top scientists” from around the world that the IPCC cites as backing its reports. He wrote 1,900 comments on the final draft of a recent IPCC report.

“Right from the beginning, I have had difficulty with this procedure,” Gray wrote. “Penetrating questions often ended without any answer. Comments on the IPCC drafts were rejected without explanation and attempts to pursue the matter were frustrated indefinitely.”

His conclusion was that the information in the IPCC reports was “unsound.”

For example, Gray says that temperature stations are not properly sited. Ninety percent are on land, while 70 percent of the Earth’s surface is covered by the oceans. Temperature stations are disproportionately located near cities, which are heat sources.

Geophysicist Syun-Ichi Akasofu, a founding director of the International Arctic Research Center of The University of Alaska, in 1964 discovered the origin of storms in the aurora borealis. He has published more than 550 professional journal articles.

Akasofu says the Earth slowly warmed about one half of one degree Celsius during the 20th Century. That also happened over the course of the 18th and 19th centuries. The rate of warming has been fairly consistent since the Little Ice Age, which ended in 1900.

“The Earth may still be recovering from the Little Ice Age,” Akasofu said. If that is true, there is no need to blame greenhouse gases for warming in the 20th Century.

In other words, if there has been slight (one half degree) global warming, it is part of a cycle and not man-made. Sea ice in the Norwegian Sea has been continuously receding since 1800 due to the North Atlantic Oscillation – a natural phenomenon.

Many glaciers advanced during the Little Ice Age and have been receding ever since, Akasofu states.

What is the impact of CO2 (carbon dioxide)?

The warmists believe that human activity and industrialization have increased the level of greenhouse gases in the atmosphere to the point of raising the Earth’s temperature – and thus leading to eventual climatic catastrophe.

Most scientists agree that carbon dioxide absorbs space-bound infrared radiation and leads to warming and increased evaporation at the Earth’s surface.

The question is how much of an impact it truly has.

Former Vice President Al Gore in his movie, An Inconvenient Truth, states that a rise in CO2 always brings about a rise in temperature.

But arctic data show a large rise and fall in temperature between 1920 and the early 1970s, while the global average shows little change. A second major fluctuation happened in 1975. Neither of these fluctuations correlates with changes in CO2 levels.

In other words, these normal fluctuations show a natural cycle that diminishes the greenhouse effect.

“Indeed, there is so far no definitive proof that most of the present warming is due to the greenhouse effect,” Akasofu said.

Arctic temperatures showed a cooling trend from 1940 to 1975 even though CO2 levels began to rise in 1940.

Akasofu said you cannot know the exact effect of CO2 without subtracting natural causes that are hard to measure.

The IPCC’s own climate change models show carbon dioxide to be irrelevant.

Dr. Tom Segalstad is head of the Geological Museum at The University of Oslo and was a former expert reviewer for the IPCC.

Cars are a source of CO2 emissions and so are people (as they exhale). The oceans absorb and release CO2. In fact, most scientists believe that CO2 can’t stay in the atmosphere for more than five years because it is absorbed in the oceans.

CO2 in the atmosphere and the oceans reach a stable balance when the oceans contain about 50 times more than the atmosphere.

Segalstad said that based on isotope mass balance calculations, if CO2 in the atmosphere had a lifetime of 50 to 200 years (as claimed by the United Nations), the atmosphere would have half of its current CO2 mass. That’s nonsense. The IPCC counters with a “missing sink” model that claims half of the CO2 is “hiding somewhere.”

Looking for the missing sink in the biosphere, carbon cycle modeling shows deforestation must have added a lot of CO2 to the atmosphere. Instead of finding a missing sink, the models find another CO2 source.

An error of “about three giga-tonnes of carbon annually not explained by a model would normally lead to complete rejection of the model and its hypotheses,” Segalstad said.

“It is all a fiction.”

Astrophysicist Nir Shaviv of Israel had logically concluded that increases in carbon dioxide and other gases led to the greenhouse effect. He thought greenhouse gases increased in the 20th Century due to human activity. Nothing else explained it, he thought.

Shaviv has changed his mind.

“…after carefully digging into the evidence, I realized that things are far more complicated than the story sold to us by many climate scientists or the stories regurgitated by the media,” Shaviv said.

Shaviv believes that CO2 plays only a subordinate role – a circumstantial one at best.

Man’s role is so uncertain that the Earth may have been cooling, Shaviv said. We understand the role of CO2 but we don’t understand other factors.

It’s like the man who lost his keys in the dark but looks for them a block away under a street light because the light there is better, Shaviv said.

“Solar activity can explain a large part of the 20th Century global warming,” Shaviv said. He thinks natural solar processes account for 80 percent of the warming.

He favors curbing the use of fossil fuels – not to prevent global warming but to lessen pollution.

Lawrence Solomon is the author of The Deniers. He is a columnist with National Post (Toronto), Executive Director at Energy Probe and a self-described environmentalist. The book is published by Richard Vigilante Books (www.richardvigilantebooks.com).

Read the original story here. 

Posted in Climate Change, Energy Probe News, The Deniers