
A scary strategic problem - no oil

No oil indeed:

http://www.theglobeandmail.com/report-on-business/commentary/the-mythical-assertion-of-fossil-fuel-scarcity/article1431898/#

Neil Reynolds
The mythical assertion of fossil fuel scarcity
We have at least enough to run till 2050 at a minimum, more probably through 2080 and perhaps through 2100

Published on Friday, Jan. 15, 2010 12:00AM EST

Last updated on Saturday, Jan. 16, 2010 3:49AM EST

Born to an English coal miner's family in 1930, energy economist Peter Odell came by his lifelong interest in carbon fuels naturally. At 80, as professor emeritus of international energy studies at Erasmus University in Rotterdam, he's still interested.

Writing this month in the European Energy Review, a Netherlands-based trade magazine, Prof. Odell uses his first post-Copenhagen podium to assure people that the world will never use up its global endowment of crude oil - and that we'll consume at least twice as much of it in the 21st century as we did in the 20th.

As befits an academic who has studied oil economics for decades, Prof. Odell is an acknowledged (though often controversial and occasionally eccentric) global authority on the industry. He calculates that 1.5 trillion barrels of oil have been added to the world's proven reserves since 1971 - the year in which the U.S. hit "Hubbert's Peak" - but only 800 billion barrels have been consumed. He insists that carbon emissions have caused no significant harm so far and are very unlikely, given technological advances, to do so in the future.

He calculates that China's increase in emissions every day exceeds Denmark's celebrated reductions for an entire year. He remains sympathetic (but not committed) to the notion, advanced by Russian and Ukrainian theorists, that oil is a renewable, self-perpetuating resource. He thinks China will one day buy Exxon, and Russia will one day buy Shell. He predicts that the best days for North Sea oil are still ahead (with 30 billion barrels left to go).

In his most recent comments, he sets out to de-mythologize the popular but dubious belief that there is an inherent scarcity in the world's carbon fuel resources. In fact, Prof. Odell says, there is no need to forswear carbon fuels any time in the foreseeable future - which, by his reckoning, takes us safely through 2050 at a minimum, more probably through 2080 and perhaps through 2100.

He concedes that production of conventional oil will peak around 2050 but insists that the cause will be a global preference for natural gas, not a scarcity of oil. Even in 2100, he says, oil will supply 20 per cent of the world's energy, natural gas another 20 per cent.

"The oft-heard notion that we are 'about to run out of fossil fuels' is quite simply a myth," he writes. "Nor is it true to say that hydrocarbon production is about to 'peak' any time soon. At least for the first half of the 21st century carbon energy demand limitations will bring no more than modest pressure to bear on the eminently plentiful and generally profitable-to-produce flows of coal, oil and natural gas that are available."

His argument continues: "To begin with, the world's presently known coal reserves of some 6,300 gigatons are equal to a nominal close-to-1,000 years' supply. ... Total [coal] use over the [next] 100 years will be of the order of about 700 gigatons ... constituting about 11 per cent of the commodity's resource base." (One gigaton is the measure of an explosive force equal to one billion tons of TNT.) (Interpolation: a gigaton here means one billion tons of mass, not explosive yield.)

As for conventional oil, he says, annual production will rise slowly in the next generation - to about 4.5 gigatons in 2030. Current known reserves of recoverable oil now exceed 200 gigatons.

By conservative calculations, Prof. Odell says, non-conventional oil production (oil sands, shale) will increase rapidly throughout the century, reaching five gigatons a year by 2080. Total production in the entire century will reach 265 gigatons, he writes.

Over the 21st century as a whole, he says, 1,660 gigatons (of oil-equivalent energy) will be produced and used - compared with a cumulative total in the 20th century of 500 gigatons. In other words, the world will use three times as much energy in the 21st century as it did in the 20th. This threefold increase will primarily reflect "the bountiful nature of the world's endowment of carbon fuels."
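(Interpolation: the arithmetic checks out. Here is a quick back-of-envelope check of Odell's figures in Python; every number is the article's own.)

    # Sanity check of Odell's figures as quoted in the article.
    coal_reserves_gt = 6300       # known coal reserves, gigatons
    coal_use_century_gt = 700     # projected coal use over the next 100 years
    share = coal_use_century_gt / coal_reserves_gt
    print(f"coal used this century: {share:.0%} of the resource base")   # ~11%

    energy_21st_gtoe = 1660       # projected 21st-century production, Gt oil equiv.
    energy_20th_gtoe = 500        # cumulative 20th-century total
    print(f"21st vs 20th century: {energy_21st_gtoe / energy_20th_gtoe:.1f}x")  # ~3.3x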

Any significant reduction of carbon emissions in this century, Prof. Odell says, is highly improbable, a conclusion anticipated (he argues) by the Kyoto Protocol. Kyoto required reductions in carbon emissions from a 1990 base - when 3.5 gigatons of oil-equivalent carbon energy was consumed. Instead, by 2005, 4.7 gigatons was consumed. "In marked contrast to this 1.2 gigaton ... rise in carbon energy use," he says, "use of renewables increased by less than 0.2 gigatons oil equivalent. Of this ... 83 per cent was accounted for by nuclear power - a pseudo-renewable energy source."

As for emissions, Prof. Odell warns that the biggest risk ahead arises from the abrupt closing down of oil production or gas production infrastructure. Without prudent management of drill sites and pipelines, a kind of wildfire release of CO2 could occur, releasing vast quantities of carbon into the atmosphere. He rates this risk at a tick above zero.

Oh yes. Prof. Odell believes production costs for oil will run between $10 (U.S.) and $40 a barrel (inflation adjusted) through the next 25 years - meaning that the quoted price in the years ahead should "remain modest" at roughly $50 a barrel. He could be wrong, of course. But he could be right. In retrospect, optimists often are.
 
And to think the McGuinty government in Ontario considers wind turbines the great white hope:

http://www.popsci.com/technology/article/2010-01/wind-turbines-leave-clouds-and-energy-inefficiency-their-wake

Wind Turbines Leave Clouds and Energy Inefficiency in Their Wake
Downstream wind turbines can lose up to 30 percent of their power
By Jeremy Hsu
Posted 01.22.2010 at 4:13 pm

Turbine Contrails: Clouds form in the wake of the front row of wind turbines at the Horns Rev offshore wind farm near Denmark. (Photo: Aeolus)

Clouds stream in the wake of wind turbines arrayed at the Horns Rev offshore wind farm in this stunning photo. But David MacKay, a physicist at the University of Cambridge in the UK, sees the image as illustrating the common problem of back-row wind turbines losing power relative to the front row.

Downstream wind turbines may lose 20 percent or even 30 percent of their power compared to their fellows in front, according to a study on wake effects at Horns Rev that MacKay highlights on his blog. The paper also emphasizes that different wind directions make it practically impossible to gauge an overall "steady state" for large wind farms, unless researchers can sample wind speeds and directions at multiple points throughout the array.
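(Interpolation: for a feel for where figures like 20 to 30 per cent come from, here is a minimal sketch of the classic Jensen (Park) single-wake model. This is not the model used in the Horns Rev study; the wind speed, thrust coefficient, wake-decay constant and turbine geometry below are illustrative assumptions.)

    import math

    # Jensen (Park) single-wake model: wind speed at distance x behind a
    # turbine of rotor radius R, thrust coefficient Ct, wake-decay constant k:
    #   v(x) = U * (1 - (1 - sqrt(1 - Ct)) / (1 + k * x / R)**2)
    U, Ct, k, R = 9.0, 0.8, 0.05, 40.0   # assumed, illustrative values
    x = 7 * (2 * R)                      # 7 rotor diameters downstream

    deficit = (1 - math.sqrt(1 - Ct)) / (1 + k * x / R) ** 2
    v = U * (1 - deficit)
    loss = 1 - (v / U) ** 3              # turbine power scales with v cubed
    print(f"wake wind speed: {v:.1f} m/s, power loss: {loss:.0%}")   # ~47%
    # A turbine sitting dead in a single wake loses far more than the farm
    # average; averaging over wind directions -- the measurement problem the
    # study flags -- brings overall losses down toward the quoted 20-30%.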

This shows that wind energy may represent a highly visible form of alternative energy, but certainly not one without its quirks and controversies. Still, better technology can squeeze more juice out of each gust, and cooperative energy-sharing efforts can help offset the fickle nature of wind power.

Readers seeking more info on the energy revolution might also look at MacKay's book, Sustainable Energy -- Without the Hot Air. The work has received rave reviews from the likes of Science magazine and The Economist, and it's available for free digital reading here.

[via David MacKay and Dong Energy]
 
If oil is unavailable for import, then the United States can turn to its massive coal deposits for liquid fuel. This new technique promises to be more cost-effective than previous attempts to turn coal into liquid fuel. (The writer belongs to the Church of Man-Made Global Warming™, which explains the constant references to carbon dioxide.):

http://www.technologyreview.com/energy/24405/?nlid=2689

Cleaner Jet Fuel from Coal

A new process could allow Air Force jets to run exclusively on domestically produced biomass and coal.
By Kevin Bullis

The Air Force is testing a jet fuel made from coal and plant biomass that could replace petroleum-based fuel and emit less carbon dioxide than conventional jet fuels. The fuel is made with a process developed by Accelergy, based in Houston, using technology licensed from ExxonMobil Research and Engineering Company and the Energy and Environmental Research Center at the University of North Dakota.

Other recently tested experimental biofuels for jets have required that the aircraft still use at least 50 percent petroleum-based product to meet performance requirements, particularly for the most advanced military jets. But the Accelergy process produces fuels that closely resemble petroleum-based fuels, making it possible to do away with petroleum altogether. Because of this, the new process could help the Air Force meet its goal of using domestic, lower-carbon fuels for half of its fuel needs by 2016. Although the first products will be jet fuels, the process can also be adapted to produce gasoline and diesel.

The fuel has passed through an initial round of testing, including lab-scale engine tests, and is on track to be flight-tested in 18 months, says Rocco Fiato, vice president of business development at Accelergy.

Turning coal into liquid fuels is nothing new, but such processes have been inefficient and produced large amounts of CO2 emissions. Accelergy's approach is different because it uses "direct liquefaction," which is similar to the process used to refine petroleum. It involves treating the coal with hydrogen in the presence of a catalyst. Conventional technology for converting coal to liquid fuels breaks the coal down into synthesis gas, which is mostly carbon monoxide with a little bit of hydrogen; the hydrogen and carbon are then recombined to produce liquid hydrocarbons, a process that releases carbon dioxide. Because the Accelergy process skips the need to gasify all of the coal--which consumes a lot of energy--before recombining the hydrogen and carbon, it's more efficient and produces less carbon dioxide. "We don't destroy the molecule in coal. Instead we massage it, inject hydrogen into it, and rearrange it to form the desired hydrocarbons," says Timothy Vail, Accelergy's president and CEO.

The hydrogen for Accelergy's process comes from two sources--coal and biomass. Accelergy gasifies a portion of the coal they use--about 25 percent of it--as well as cellulosic biomass, from sources such as plant stems and seed husks, to produce syngas. The company then treats the syngas with steam. In this reaction, carbon monoxide reacts with water to form hydrogen and carbon dioxide. Using biomass reduces the net carbon-dioxide emissions, since the biomass absorbed CO2 from the atmosphere as the original plants grew.

The technology also uses biomass in another way. The company processes seed crops, such as soybeans or camelina, which contain large amounts of oil. After extracting that oil (which leaves behind cellulosic materials that are gasified), the oil is processed to remove oxygen atoms, forming long chains of straight hydrocarbon molecules. These are then treated to convert the straight molecules into branched molecules that remain liquid at lower temperatures, making them useful in jet fuel.

The use of biomass reduces net carbon dioxide emissions, but so does the fact that direct liquefaction is more efficient than conventional gasification, says Daniel Cicero, the technology manager for hydrogen and syngas at the U.S. Department of Energy's National Energy Technology Laboratory (NETL), in Morgantown, WV. In gasification, only about 45 percent of the energy in the coal is transferred to the fuel produced. Accelergy claims efficiencies as high as 65 percent using direct liquefaction. Yields of fuel are also higher. Gasification methods produce about 2 to 2.5 barrels of fuel per ton of coal. Direct liquefaction produces over three barrels per ton of coal, and adding the biomass brings the total to four barrels per ton of coal.

All told, Fiato says, gasifying coal to produce liquid fuel produces 0.8 tons of carbon dioxide per barrel of fuel, while Accelergy's process produces only 0.125 tons of CO2 per barrel. That makes it competitive with petroleum refining, especially the refining of heavier forms of petroleum. (The fuels produce about the same amount of carbon dioxide when they're burned.)
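(Interpolation: putting the article's yield and emissions figures side by side. All numbers are as quoted above; the gasification midpoint is my rounding.)

    # Yields and CO2 intensity, using only the figures quoted in the article.
    processes = {
        # name: (barrels of fuel per ton of coal, tons of CO2 per barrel)
        "conventional gasification":     (2.25, 0.8),    # "two to 2.5 barrels"
        "direct liquefaction":           (3.0, 0.125),
        "direct liquefaction + biomass": (4.0, 0.125),
    }
    for name, (bbl, co2) in processes.items():
        print(f"{name}: {bbl} bbl/ton, {bbl * co2:.2f} t CO2 per ton of coal")
    # The CO2-per-barrel gap alone is a factor of 6.4 in Accelergy's favour.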

In addition to reducing carbon emissions compared to conventional coal to liquids technology, a key advantage of the process is the ability to make high-quality jet fuels. The direct liquefaction of coal produces cycloalkanes, looped molecules that have high energy density, giving airplanes greater range. They are also stable at high temperatures, allowing them to be used in advanced aircraft.

One drawback to the process is that it costs more than refining petroleum. Indeed, Cicero says that an NETL study of coal and biomass to liquid fuels technology suggests it would not be competitive until petroleum prices stay above $86 to $93 a barrel. (The study was based on conventional gasification processes.) He says that supplying fuel to the Air Force could sustain one or two small Accelergy plants, but to move beyond this would require a price on carbon-dioxide emissions of about $35 a ton.

Copyright Technology Review 2010.
 
Distributed small scale nuclear power, faster, cheaper and more reliable:

http://pajamasmedia.com/blog/when-it-comes-to-nuclear-power-companies-should-think-small/

When It Comes to Nuclear Power, Companies Should Think Small

Posted by Will Collier on January 30, 2010

A few months ago, the Nuclear Regulatory Commission granted permission [1] for initial site work to begin on new nuclear reactors in the United States for the first time since the 1970s. Georgia Power, a subsidiary of the gigantic Southern Company, plans to build the two new reactors at its Vogtle nuclear plant, near Augusta.

At first glance, I was all in favor of new nuclear construction. Among other reasons, it’s high time we stopped determining energy policy on the basis of a bad Jane Fonda movie. But as a Georgia Power customer — who’s already on the hook [2] for part of the bill for the new facilities — I’m scratching my head a bit over both that price tag, and over the rationale for going back to the old model of massive, complex, and hugely expensive power plants.

The planned Vogtle upgrade is estimated to cost around $14 billion, and each reactor will produce around 1250 megawatts of electricity (MWe). The new reactors will be added to two existing units which were completed during the 1980s.

The cost of those two original units, estimated at the time to be around $660 million, skyrocketed to nearly $9 billion in the wake of the post-Three Mile Island regulatory blizzard. That jump in costs, which was typical for the industry, effectively ended new nuclear plant construction for a generation.

As time passed and '70s anti-nuclear hysteria ebbed, power companies around the country petitioned the NRC for permission to build new reactors. Some 16 applications have been filed [3] since 2007, with more anticipated.

All the current NRC applications have one thing in common: they’re for large-scale power plants, technically improved but functionally not dissimilar from the reactors of the 1970s. Today, Jane Fonda is a punchline, Real People is long since off the air, and disco is blessedly still dead, but the big electric companies remain stuck in the ‘70s as far as their strategic planning is concerned.

While political conservatives generally look favorably upon nuclear energy, the economics remain daunting. In a now-famous paper for Reason [4], Jerry Taylor of Cato said nuclear power “is to the Right what solar is to the Left: Religious devotion in practice, a wonderful technology in theory, but an economic white elephant in fact.” Taylor referenced industry studies showing nuclear electricity costing four to five times as much per kilowatt hour as coal or gas plants, and noted the massive subsidies and loan guarantees handed out to power companies as undermining the cost rationale for nuclear power.

All of which makes me wonder, again: this is the 21st century — why are we looking at huge, multibillion-dollar facilities in the first place? It’s not like other options don’t exist.

Take for instance the Hyperion Power Module [5], or HPM. Developed at, and then spun off from, the Los Alamos National Laboratory, Hyperion is marketing the diametric opposite of the power companies’ massive and complex facilities. Hyperion’s reactor is a relatively tiny device, about the size of a dinky Smart Car [6].

Unlike large-scale plants requiring 24/7 monitoring by a small army of engineers and technicians, an HPM contains no moving parts, and is intended to operate for years with no human interaction to speak of. Hyperion reactors are actually intended to be buried underground during their service lives, with no hands-on maintenance at all between refueling cycles, which occur every 7-10 years.

Of course, a single Hyperion unit is hardly the equivalent of a Westinghouse AP1000 reactor, two of which are planned for the Vogtle facility. One HPM generates only 25 MWe, while a massive AP1000 churns out an appropriately massive 1250 MWe or so.

But nobody ever said you have to buy just one. If we assume that a single new AP1000 costs about $7 billion for 1250 MWe (which is not entirely fair as “sticker prices” go, since the $14 billion estimate for the Vogtle plant upgrade includes financing costs as well as actual production), that works out to about $5.6 million per MWe.

A single HPM currently lists for $50 million [7] (and I should note here that this is already twice the price Hyperion promised [8] in its initial 2008 press releases). At 25 MWe per unit, we’re looking at $2 million per MWe, a little more than a third of the unit price of power from an AP1000.

Hyperion says its reactors aren’t intended to replace large-scale generation plants, but the engineer in me wonders, why not? HPMs are built on an assembly line, and Hyperion already has over 100 orders for them. Picking up my calculator again, I figure that in order to equal the output of one AP1000 reactor, I’d need to buy 50 HPMs.

At $50 million per unit (how about a bulk discount?), that would cost $2.5 billion. Now, I don’t have that kind of cash lying around myself, but you don’t need to be an accountant to see that $2.5 billion is a lot less than $7 billion. And that doesn’t count the untold millions I’d have to spend on the aforementioned army of maintainers for the AP1000 — although either way, you’d need a sizable team of regular power plant workers to maintain the actual power turbines.
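(Interpolation: the column's arithmetic, in sketch form. Both price tags are as quoted above; nothing here is vendor data.)

    # The column's back-of-envelope comparison.
    ap1000_cost, ap1000_mwe = 7_000_000_000, 1250   # ~$7B "sticker", 1250 MWe
    hpm_cost, hpm_mwe = 50_000_000, 25              # $50M list price, 25 MWe

    print(f"AP1000: ${ap1000_cost / ap1000_mwe / 1e6:.1f}M per MWe")   # ~$5.6M
    print(f"HPM:    ${hpm_cost / hpm_mwe / 1e6:.1f}M per MWe")         # ~$2.0M

    units = ap1000_mwe // hpm_mwe                   # 50 HPMs match one AP1000
    print(f"{units} HPMs: ${units * hpm_cost / 1e9:.1f}B vs $7B for one AP1000")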

I’m sure that these simple, back-of-the-envelope numbers don’t reflect anything like every detail of big vs. small in nuclear power, but a Hyperion or similar small-scale reactor would have to get a heck of a lot more expensive to cost as much as big, traditional plants.

There would also be other benefits, in that you wouldn’t have to locate the entire power apparatus out in the middle of nowhere. Hyperion-style reactors can’t melt down, and are designed to be buried in small plots. Why not use that easy portability to distribute your power plants all over the place? Put a couple near your city’s main hospital, a couple more in your industrial zone, with single units scattered around the suburbs and residential cores, and you’ve got a redundant system that’s far less susceptible to, say, blackouts during bad weather, as opposed to running power across hundreds of miles of transmission lines.

So, Georgia Power, Nuclear Regulatory Commission, et al — why aren’t you thinking small?



URLs in this post:

[1] granted permission: http://southerncompany.mediaroom.com/index.php?s=43&item=1947

[2] already on the hook: http://savannahnow.com/node/712724

[3] 16 applications have been filed: http://www.eia.doe.gov/cneaf/nuclear/page/nuc_reactors/reactorcom.html

[4] In a now-famous paper for Reason: http://www.cato.org/pub_display.php?pub_id=9740

[5] Hyperion Power Module: http://www.hyperionpowergeneration.com/index.html

[6] Smart Car: http://www.smartusa.com/smart-car-fortwo.aspx

[7] currently lists for $50 million: http://www.hyperionpowergeneration.com/about_invest.html

[8] the price Hyperion promised: http://gizmodo.com/5083522/backyard-nuclear-reactors-now-in-production-cost-25-million-each
 
While I can't claim to fully understand this, it seems to be a means of using biological processes to split CO2 and H2O into CO and H2 in a low-energy, low-temperature version of the Fischer–Tropsch process. The link includes a short video which explains(?) this further:

http://www.carbonsciences.com/01/technology.html

CO2-to-Fuel Technology

Carbon Sciences is developing a breakthrough technology to recycle CO2 emissions into fuels such as gasoline, diesel fuel and jet fuel. Innovating at the intersection of chemical engineering and bio-engineering disciplines, we are developing a highly scalable biocatalytic process to meet the fuel needs of the world.

The fuels we use today, such as gasoline and jet fuel, are made up of chains of hydrogen and carbon atoms aptly called hydrocarbons. In general, the greater the number of carbon atoms there are in a hydrocarbon molecule, the greater the energy content of that fuel. For example, gasoline has hydrocarbons with 7 to 10 carbon atoms and jet fuel has 10 to 16 carbon atoms. Hydrocarbons are naturally occurring in fuel sources such as petroleum and natural gas. To create fuel, hydrogen and carbon atoms must be bonded together to create hydrocarbon molecules. These molecules can then be used as basic building blocks to produce various gaseous and liquid fuels.

Due to their high reactivity, carbon atoms do not usually exist in a pure form, but as parts of other molecules. CO2 is one of the most prevalent and basic sources of carbon atoms. Unfortunately, it is also one of the most stable molecules. This means that it may require a great deal of energy to break apart CO2 and extract carbon atoms for making new hydrocarbons. This high energy requirement has made CO2-to-fuel recycling technologies uneconomical in the past. However, Carbon Sciences is developing a proprietary process that requires significantly less energy than other approaches that have been tried. Also, with the global demand for fuel and the price of oil projected to rise continuously in the foreseeable future, the economics have changed in favor of certain innovative lower-energy approaches, such as Carbon Sciences' breakthrough technology.

Breakthrough Biocatalytic Process

Some of the known approaches for CO2 to fuel recycling include (1) direct photolysis which uses intense light energy to break off the oxygen atoms in CO2, and (2) chemically reacting carbon dioxide gas (CO2) with hydrogen gas (H2) to create methane or methanol. Both of these conventional engineering approaches require immense energy due to high pressure and high temperature chemical processes. For certain applications such as military and space, the high cost of these technologies may be justifiable. However, we do not believe these approaches will be economically viable in creating transportation fuels for global consumption.
 
By innovating at the intersection of chemical engineering and bio-engineering, we have discovered a low energy and highly scalable process to recycle large quantities of CO2 into gaseous and liquid fuels using organic biocatalysts. The key to our CO2-to-Fuel approach lies in a proprietary multi-step biocatalytic process. Instead of using expensive inorganic catalysts, such as zinc, gold or zeolite, with traditional high energy catalytic chemical processes, our process uses inexpensive, renewable biomolecules to catalyze certain chemical reactions required to transform CO2 and water (H2O) into fuel molecules. Of greatest significance, our process occurs at low temperature and low pressure, thereby requiring far less energy than other approaches.

The energy-efficient biocatalytic processes we are exploiting in our technology actually occur in certain micro-organisms, where carbon atoms, extracted from CO2, and hydrogen atoms, extracted from H2O, are combined to create hydrocarbon molecules. Our breakthrough technology allows these processes to operate on a very large industrial scale through advanced nano-engineering of the biocatalysts and highly efficient process design.

Highly Scalable CO2-to-Fuel Recycling Plant
 
The Carbon Sciences CO2-to-Fuel technology includes a complete plant level process that takes CO2 from a large emitter, such as a power plant, and produces usable fuels as the output.

The complete process includes the following major components:

1. CO2 Flue Gas Processor - Purification of the CO2 stream to remove heavy particulates.

2. Biocatalyst Unit - Regeneration of biocatalysts for the CO2 recycling process.

3. Biocatalytic Reactor Matrix - The primary and largest part of the plant, where mass quantities of biocatalysts work in a matrix of liquid reaction chambers, performing the multi-stage breakdown of CO2 and its transformation into basic gaseous and liquid hydrocarbons. These reactors are inexpensive low-temperature, low-pressure vessels. The number of reactors determines the size and output capacity of the plant.

4. Filtration - The liquid solutions are filtered through membrane units to extract liquid fuels. Gaseous fuels are extracted through condensers.

5. Conversion and Polishing - The output of the Filtration stage contains lower hydrocarbon fuels. These hydrocarbons can then be further processed into higher fuels such as gasoline, diesel fuel and jet fuel.

The Carbon Sciences CO2-to-Fuel process can be configured to produce a variety of hydrocarbon fuels by customizing the Conversion and Polishing stage and biocatalytic formulation.
 
Here comes the sun:

http://www.technologyreview.com/energy/24521/?ref=rss&a=f

Efficient Solar Cells from Cheaper Materials
IBM researchers have greatly increased the performance of a novel thin film solar cell.
By Kevin Bullis

Researchers at IBM have increased the efficiency of a novel type of solar cell made largely from cheap and abundant materials by over 40 percent. According to an article published this week in the journal Advanced Materials, the new efficiency is 9.6 percent, up from the previous record of 6.7 percent for this type of solar cell, and near the level needed for commercial solar panels. The IBM solar cells also have the advantage of being made with an inexpensive ink-based process.

The new solar cells convert light into electricity using a semiconductor material made of copper, zinc, tin, and sulfur--all abundant elements--as well as the relatively rare element selenium (CZTS). Reaching near-commercial efficiency levels is a "breakthrough for this technology," says Matthew Beard, a senior scientist at the National Renewable Energy Laboratory, who was not involved with the work.

The IBM solar cells could be an alternative to existing "thin film" solar cells. Thin film solar cells use materials that are particularly good at absorbing light. The leading thin film manufacturer uses a material that includes the rare element tellurium. Daniel Kammen, director of the Renewable and Appropriate Energy Laboratory at the University of California, Berkeley, says the presence of tellurium could limit the total electricity such cells could produce because of its rarity. While total worldwide electricity demand will likely reach dozens of terawatts (trillions of watts) in the coming decades, thin film solar cells will likely be limited to producing about 0.3 terawatts, according to a study he published last year. In contrast, the new cells from IBM could produce an order of magnitude more power. (Interpolation: that means we could see a total of 3 TW when the sun is shining using this technology on every available surface. Solar really is a niche market.)
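(Interpolation continued: the quoted figures are easy to verify; the 3 TW above is just the article's "order of magnitude" applied to Kammen's 0.3 TW ceiling.)

    # Checking the quoted efficiency and capacity figures.
    old_eff, new_eff = 6.7, 9.6              # per cent, CZTS cell efficiencies
    gain = (new_eff - old_eff) / old_eff
    print(f"relative gain: {gain:.0%}")      # ~43%, i.e. "over 40 percent"

    telluride_limit_tw = 0.3                 # Kammen's thin-film ceiling
    czts_limit_tw = 10 * telluride_limit_tw  # "an order of magnitude more"
    print(f"CZTS ceiling: ~{czts_limit_tw:.0f} TW vs demand of dozens of TW")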

The new cells could also have advantages over cells made of copper, indium, gallium, and selenium (CIGS), which are just starting to come to market. That's because the indium and gallium in these cells are expensive, and while the selenium used in the IBM cell is rarer than indium or gallium, its cost is a tenth of either.

A new ink-based manufacturing process solves some of the key challenges to making efficient CZTS cells. A common approach to making any type of high-quality solar material is to dissolve a precursor substance in a solvent. This isn't possible with the CZTS cells because the zinc compounds required in the new cells aren't soluble. To get around this, the researchers used a combination of dissolved materials and suspended particles, creating a slurry-like ink that can be spread over a surface and then heat-treated to produce the final material. The particles prevent the material from cracking and peeling as the solvent evaporates.

The IBM researchers are also investigating ways to improve the efficiency of the new solar cells, with the goal of reaching about 12 percent in the laboratory--high enough to give manufacturers confidence that they could be mass produced and still have efficiency levels of around 10 percent, says David Mitzi, at IBM Research, who led the work. Beard recommends targeting 15 percent efficiency in the lab, and Mitzi says this should be possible by improving other parts of the solar cell besides the main CZTS material, or by doping the semiconductor with other trace elements (which is easy with the ink-based process).

What's more, commercial cells will likely use different materials for conducting electrons. The experimental cells used indium tin oxide, which is limited by the availability of indium. But Mitzi says several other conductors could work as well.

One key next step is to completely replace the selenium in the solar cells with sulfur. For the record-efficiency cell, the researchers replaced half of the selenium used in a previous experimental cell. If all of the selenium could be replaced, the cells could, in theory, supply all of the electricity needs of the world. (Provided there are suitable means for storing and redistributing power for use at night or on cloudy days.)

The new type of solar cell will have several competitors, Beard says. For example, non-crystalline silicon is cheaper to make than crystalline silicon, and the efficiency of the resulting cells is improving. Researchers are also finding ways to use less expensive grades of crystalline silicon, and large-scale production has decreased the overall cost of producing such cells, making it difficult for new solar materials to gain a foothold.

Copyright Technology Review 2010.
 
More coal to oil conversion:

http://cnews.canoe.ca/CNEWS/Environment/2010/02/22/pf-12980701.html

Scientists find way to make cheap gas from coal
By QMI AGENCY

It could be a boon for the Canadian prairies.

Researchers in Texas say they have found a way of cutting the cost of producing gasoline by two thirds, taking advantage of the lowest grade of coal available - one that is abundant beneath the Canadian prairies.

A new refining process being perfected at the University of Texas at Arlington can turn the low-cost lignite coal, also known as brown coal, into oil at a fraction of the cost of importing crude oil from abroad.

“We're improving the cost every day,” Rick Billo, the school's dean of engineering, told a local television station.

“We started off some time ago at an uneconomical $17,000 a barrel. Today, we're at a cost of $28.84 a barrel.”

As the price of crude oil continues to skyrocket – now hovering near $80 a barrel – being able to produce a barrel of oil at less than half of that price is an attractive proposition, especially for Canadian producers.

According to the Coal Association of Canada, there are major deposits of lignite coal in Southern Saskatchewan, Alberta and Manitoba, though only the Saskatchewan deposits are currently being mined.

Lignite was the source of up to 70% of Saskatchewan's electricity last year.

The University of Texas hopes to license its technology in the next few months and start building the first micro-refineries to produce the cheaper oil within the next year.

Germany, Russia and the U.S. are currently the world's leading producers of lignite coal.
 
Among the more interesting things I read about the oil issue a few years ago were two articles, one from the 1960s and one from the 1920s.

Both announced that the world would run out of oil within the next 10 years.

It seems people have been predicting both the end of oil and global warming for the last 100 years.

It's a big planet, we'll find more.

Who knows maybe we'll "make" fossil fuels in the next 100 years.
 
More on super-efficient engines. The thing to note about this technology is that it may be available to retrofit to existing engines, which has interesting downstream effects. Imagine if every military vehicle had this sort of fuel injection system; the projected gain is a 25% increase in fuel economy for diesel engines. That works out to needing about 20% less fuel in the AOR for the same mileage, which equals a further reduction in the amount of truck traffic needed to carry the fuel, which translates to additional savings (fewer convoys, fewer armoured or patrol vehicles on the road as escorts, etc.).
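For the record, here is the economy-to-consumption conversion, since the two are easy to mix up. The 25% figure is the article's; the rest is arithmetic.

    # A 25% gain in fuel economy cuts fuel burned by 20%, not 25%,
    # because consumption scales as 1/economy.
    economy_gain = 0.25
    fuel_saved = 1 - 1 / (1 + economy_gain)
    print(f"fuel saved for the same mileage: {fuel_saved:.0%}")   # -> 20%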

Well worth a follow up:

http://nextbigfuture.com/2010/03/transonic-supercritical-fuel-injection.html

Transonic Supercritical Fuel Injection Could Improve Gasoline Engines by 50-75 Percent

Transonic Combustion, based in Camarillo, CA, has developed a gasoline fuel injection system that can improve the efficiency of gasoline engines by 50 to 75 percent, beating the fuel economy of hybrid vehicles. A test vehicle the size and weight of a Toyota Prius (but without hybrid propulsion) showed 64 miles per gallon for highway driving. The company says the system can work with existing engines, and costs about as much as existing high-end fuel injection.

Transonic Combustion uses supercritical-state fuel to radically shift the technological benefits of the automotive internal combustion engine. This technology was featured at the ARPA-E Innovation Summit and has DOE funding.

    TSCi Fuel Injection achieves lean combustion and super efficiency by running gasoline, diesel, and advanced bio-renewable fuels on modern diesel engine architectures. Supercritical fluids have unusual physical properties that Transonic is harnessing for internal combustion engine efficiency. Supercritical fuel injection facilitates short ignition delay and fast combustion, precisely controls the combustion that minimizes crevice burn and partial combustion near the cylinder walls, and prevents droplet diffusion burn. Our engine control software facilitates extremely fast combustion, enabled by advanced microprocessing technology. Our injection system can also be supplemented by advanced thermal management, exhaust gas recovery, electronic valves, and advanced combustion chamber geometries.


Fuel-efficiency improvements of 50% or more for automotive engines (relative to the spark-ignition engines dominating U.S. roads today) and 25% or more for heavy-duty truck engines (relative to today’s diesel truck engines), enabled by advanced combustion technologies, are possible in the next 10 to 15 years.



Our fuel system efficiently supports engine operation over the full range of conditions – from stoichiometric air-to-fuel ratios at full power to lean 80:1 air-to-fuel ratios at cruise, with engine-out NOx at just 50% of comparable standard engines. Our real-time programmable control of combustion heat release results in dramatically increased efficiency.

Along with operating on gasoline, our technology can efficiently utilize fuels based on their chemical heat capacity independent of octane or cetane ratings. Thus, economical, highly functional mixtures of renewable plant products can be utilized which are not practical in either conventional spark or compression ignition engines. In dynamometer testing on current engine architectures, our technology has successfully run on gasoline, diesel, biodiesel, heptane, ethanol, and vegetable oil. Recently our engineers achieved seamless operation alternating between several different fuels on one of our customer’s engines in our Camarillo test facilities.


Supercritical Fuel Injection

Automotive Engineering International Feature - Supercritical fuel injection and combustion

    Recent work by Mike Cheiky, a physicist and serial inventor/entrepreneur, is focusing on raising not only the fuel mixture’s pressure but also its temperature.

    Cheiky's aim, in fact, is to generate a little-known, intermediate state of matter—a so-called supercritical (SC) fluid—which he and his co-workers at Camarillo, CA-based Transonic Combustion believe could markedly increase the fuel efficiency of next-generation power plants while reducing their exhaust emissions.

    Transonic’s proprietary TSCi fuel-injection systems do not produce fuel droplets as conventional fuel delivery units do, according to Mike Rocke, Vice President of Marketing and Business Development. The supercritical condition of the fuel injected into a cylinder by a TSCi system means that the fuel mixes rapidly with the intake air which enables better control of the location and timing of the combustion process.

    The novel SC injection systems, which Rocke calls “almost drop-in” units, include “a GDI-type” common-rail system that incorporates a metal-oxide catalyst that breaks fuel molecules down into simpler hydrocarbon chains, and a precision, high-speed (piezoelectric) injector whose resistance-heated pin places the fuel in a supercritical state as it enters the cylinder.

    Company engineers have doubled the fuel efficiency numbers in dynamometer tests of gas engines fitted with the company’s prototype SC fuel-injection systems, Rocke said. A modified gasoline engine installed in a 3200-lb (1451-kg) test vehicle, for example, is getting 98 mpg (41.6 km/L) when running at a steady 50 mph (80 km/h) in the lab.

    The 48-employee firm is finalizing a development engine for a test fleet of 10 to 100 vehicles, while trying to find a partner with whom to manufacture and market TSCi systems by 2014.

    “A supercritical fluid is basically a fourth state of matter that’s part way between a gas and liquid,” said Michael Frick, Vice President for Engineering. A substance goes supercritical when it is heated beyond a certain thermodynamic critical point so that it refuses to liquefy no matter how much pressure is applied.

    SC fluids have unique properties. For a start, their density is midway between those of a liquid and gas, about half to 60% that of the liquid. On the other hand, they also feature the molecular diffusion rates of a gas and so can dissolve substances that are usually tough to place in solution.

    To minimize friction losses, the Transonic engineers have steadily reduced the compression of their test engines to between 20:1 and 16:1, with the possibility of 13:1 for gasoline engines.
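(Interpolation: to make "supercritical" concrete, here is a minimal sketch using standard reference critical constants for n-heptane, one of the fuels Transonic reports running. The constants are textbook values and the injector conditions are hypothetical; none of this is Transonic data.)

    # Critical constants for n-heptane (standard reference values,
    # roughly Tc = 540 K, Pc = 2.74 MPa) -- not Transonic numbers.
    T_CRIT_K, P_CRIT_MPA = 540.2, 2.74

    def fluid_state(temp_k, pressure_mpa):
        """Rough state label for n-heptane at the given conditions."""
        if temp_k > T_CRIT_K and pressure_mpa > P_CRIT_MPA:
            return "supercritical: will not liquefy at any pressure"
        if temp_k > T_CRIT_K:
            return "gas above the critical temperature"
        return "subcritical liquid or vapour, depending on pressure"

    print(fluid_state(600.0, 5.0))   # hypothetical hot, high-pressure injector
    print(fluid_state(350.0, 0.1))   # roughly ambient conditions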


Patents

Thus far 3 patents (#7444230, #7546826, #7657363) have been issued to Transonic from the U.S. Patent and Trademark Office related to our technology, with another 14 patents pending.

Patent 7444230 - Fuel injector having algorithm controlled look-ahead timing for injector

    The present invention provides an injector-ignition fuel injection system for an internal combustion engine, comprising an ECU controlling a heated catalyzed fuel injector for heating and catalyzing a next fuel charge, wherein the ECU uses a one firing cycle look-ahead algorithm for controlling...


Application number: 12/464,790 - INJECTOR-IGNITION FOR AN INTERNAL COMBUSTION ENGINE

    The present invention provides a heated catalyzed fuel injector that dispenses fuel substantially exclusively during the power stroke of an internal combustion engine, wherein ignition occurs in a fast burn zone at high fuel density such that a leading surface of the fuel is completely burned...

 
CANDU reactors are old technology and seem to have only one use left: recycling old nuclear fuel. (More modern designs do not use the CANDU's heavy-water moderator, and some designs don't use water at all):

http://nextbigfuture.com/2010/03/china-loading-used-light-water-nuclear.html

China Loading Used Light Water Nuclear Reactor Fuel Into CANDU Heavy Water Reactor


The first re-use of nuclear fuel in a Candu reactor has started at Qinshan nuclear power plant in China.

    Over the next six months, another 24 of the 'natural uranium equivalent' (NUE) bundles will be used in two of the reactor's fuel channels. If successful over a one-year trial, this practice could help China get more energy from its imported uranium and reduce stocks of highly-radioactive used nuclear fuel at the same time.

    To make this first batch of NUE fuel, Qinshan managers collaborated with AECL, the Nuclear Power Institute of China and China North Nuclear Fuel Corporation. Fuel that had previously been used was processed to recover unspent uranium and this was mixed with some depleted uranium to achieve a mix with the same overall characteristics as natural uranium. Technical challenges in doing this included the highly-radioactive nature of the used fuel and achieving the right blend of depleted uranium and the recovered stocks still enriched up to around 1.6%.

    A report late in 2009 suggested that China should build another two Candu reactors as part of a used fuel management strategy.

    A program in South Korea has pursued similar goals for some time. Dupic (Direct Use of PWR fuel in Candu) envisages the used fuel pellets from PWR fuel being broken up, heated to drive off radioactive fission products and then reformed for use in Candu fuel. Using Candu reactors in a similar way is also under investigation in Ukraine.

Atomic Energy of Canada needs this to work if its reactors, and the company itself, are to have any meaningful future. The only other business for the makers of CANDU would be supporting the legacy fleet of reactors until they are decommissioned. Re-using light-water reactor fuel would give AECL a unique capability and a niche service for extending uranium supplies, at least until better deep-burn reactors offer other ways to stretch the fuel.
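The blending arithmetic behind "natural uranium equivalent" is simple mixing. The 1.6% figure is from the article and natural uranium's 0.711% U-235 content is a standard value; the depleted-uranium assay below is an assumed typical tails figure.

    # Blend fraction x solving: x*recovered + (1 - x)*depleted = natural
    recovered = 1.6     # % U-235 in uranium recovered from used PWR fuel (article)
    depleted = 0.25     # % U-235 in depleted uranium (assumed typical tails assay)
    natural = 0.711     # % U-235 in natural uranium (standard value)

    x = (natural - depleted) / (recovered - depleted)
    print(f"blend: {x:.0%} recovered, {1 - x:.0%} depleted uranium")   # ~34% / 66%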
 
Yet more ways to make synthetic fuels from biomass:

http://www.technologyreview.com/energy/24891/?nlid=2851&a=f

From Biomass to Chemicals in One Step
A startup's catalytic process converts biomass directly into components of gasoline.
By Katherine Bourzac

An early-stage company spun out of the University of Massachusetts, Amherst, plans to commercialize a catalytic process for converting cellulosic biomass into five of the chemicals found in gasoline. These chemicals are also used to make industrial polymers and solvents. Anellotech, which is seeking venture funding, plans to build a pilot plant next year.

Anellotech's reactors perform a process called "catalytic pyrolysis," which converts three of the structural molecules found in plants--two forms of cellulose and the woody molecule lignin--into fuels. Ground-up biomass is fed into a high-temperature reactor and blended with a catalyst. The heat causes the cellulose, lignin, and other molecules in the biomass to chemically decompose through a process called pyrolysis; a catalyst helps control the chemical reactions, turning cellulose and lignin into a mix of carbon-ring-based molecules: benzene, toluene, and xylenes.

The global market for this group of chemicals is $80 billion a year and growing at a rate of 4 percent a year, says Anellotech CEO David Sudolsky. "We're targeting to compete with oil priced at $60 a barrel, assuming no tax credits or subsidies," he says. The company's founder, George Huber, says his catalytic pyrolysis process can create 50 gallons of the chemicals per metric ton of wood or other biomass, with a yield of 40 percent. The other products of the reaction include coke, used to fuel the reactor.

"The advantage of pyrolysis is that it uses whole biomass," says John Regalbuto, an advisor to the Catalysis and Biocatalysis Program at the National Science Foundation. On average, lignin accounts for 40 percent of the energy stored in whole biomass. But because it can't be converted into sugars the way cellulose can, lignin can't be used as a feedstock for fermentation processes such as those used by some biofuels companies to convert sugarcane into fuels.

Pyrolysis is also different from gasification, another process for using whole biomass. Gasification results in a mixture of carbon monoxide and hydrogen called syngas, which can then be used to make fuel. Pyrolysis, by contrast, turns biomass into liquid fuels in a single step. And while gasification can only be done economically at a very large scale, says Regalbuto, catalytic pyrolysis could be done at smaller refineries distributed near the supply of biomass.

Pyrolysis is an efficient way to use biomass, but it's difficult to control the products of the reaction, and it's difficult to get high yields. The keys to Anellotech's process, says Huber, are a specially tailored catalyst and a reactor that allows good control over reaction conditions. Huber's group at UMass, where he is a professor of chemical engineering, was the first to develop a catalytic process for converting biomass directly into gasoline, and Anellotech's processes are based on this work.

So far, Huber has developed two generations of a reactor in the lab. In tests, the group starts with sawdust waste from a local mill. The ground-up biomass is fed into a fluidized bed reactor. Inside, a powdered solid catalyst swirls around in a mixture of gas heated to about 600 ºC. When wood enters the chamber, it rapidly breaks down, or pyrolyzes, into small unstable hydrocarbon molecules that diffuse into the pores of the catalyst particles. Inside the catalyst, the molecules are reformed to create a mixture of aromatic chemicals. The reaction process takes just under two minutes.

The company would not disclose details about the catalyst, but Huber says one of its most important properties is the size of its pores. "If the pores are too big, they get clogged with coke, and if they're too small, the reactants can't fit in," says Huber. The company's catalyst is a porous silicon and aluminum structure based on ZSM-5, a zeolite catalyst developed by Mobil Oil in 1975 and widely used in the petroleum refining industry. Sudolsky says that it can be made cheaply by contractors. Anellotech's reactors are very similar to those used to refine petroleum, but they are designed for rapid heat transfer and fluid dynamics that get the reactants into the catalyst before they turn into coke.

Stefan Czernik, a senior scientist at the National Renewable Energy Laboratory's National Bioenergy Center in Golden, CO, cautions that the process has so far only been demonstrated on a small scale, and the complexity of these reactors could mean a long road ahead for scaling them up. "It is not easy to replicate at a large scale the relationship between the chemical reaction and heat transfer as it's done in the laboratory," he says.

After demonstrating the process at a pilot plant next year, Anellotech hopes to partner with a chemical company to build a commercial scale facility in 2014. Sudolsky says the company will either license the catalytic pyrolysis process to other companies or build plants distributed near biomass sources, since transporting biomass is not economically viable.

Copyright Technology Review 2010.
 
Another approach:

http://nextbigfuture.com/2010/04/dense-plasma-physics-update-great-month.html

Dense Plasma Physics Update - A Great Month for Focus Fusion

Lawrenceville Plasma Physics reports good progress in March, 2010.

At the beginning of March, good shots (those without pre-firing and with pinches) were a bit under 50% of the shots we fired. Since mid-month, we have increased that to 90% good shots. The two time-of-flight neutron detectors have produced more evidence that we are already duplicating the high ion energies achieved with higher currents in the Texas experiments. In our best shots, ion energies were measured in the range of 40-60 keV (the equivalent of 0.4-0.6 billion degrees K). The electron beam carried about 0.5 kJ of energy and the plasmoid held about 1 kJ of energy, nearly half that stored in the magnetic field of the device. So, this is evidence that a substantial part of the total energy available is being concentrated in the plasmoids and transferred to the beams.

    We found that the control shots (with the magnetic coil turned off) were producing more neutrons (up to about 10 times more) than the control shots at the beginning of our testing. It turns out the steel flanges that attach the vacuum chamber to the inner lower bus plate, and the bus plate itself, were both becoming permanently magnetized. This provides additional (though unintended) evidence that the predicted angular momentum effect is working. In the future, we may find it necessary to replace the flanges and bus plate with ones made from non-magnetic alloys, but that will have to wait for now.

    On March 18, Lerner gave an invited presentation on the DPF to an audience of physicists and engineers at Princeton Plasma Physics Laboratory, the nation's largest fusion lab. The Princeton physicists responded with interest and some friendly questions. The atmosphere was one of collaboration, not competition.

    Finally, we received enough investment money to carry us through the end of summer, with additional funding pledged. This means we are almost halfway to our goal of raising $900K in this capital drive.

Lawrenceville Plasma Physics had eight objectives for its two-year research program. This work seems to show good progress on four of the eight objectives.

Advancing dense plasma focus fusion to roughly break-even energy would enable a radical advance toward fusion spaceplanes and rockets.

If Lawrenceville Plasma Physics (LPP) achieves full success, then a Focus Fusion reactor would produce electricity very differently. The energy from fusion reactions is released mainly in the form of a high-energy, pulsed beam of helium nuclei. Since the nuclei are electrically charged, this beam is already an electric current. All that is needed is to capture this electrical energy in an electric circuit. This can be done by allowing the pulsed beam to generate electric currents in a series of coils as it passes through them. This is much the same way that a transformer works, stepping electric power down from the high voltage of a transmission line to the low voltage used in homes and factories. It is also like a particle accelerator run in reverse. Such an electrical transformation can be highly efficient, probably around 70%. What is most important is that it is exceedingly cheap and compact. The steam turbines and electrical generators are eliminated. A 5 MW Focus Fusion reactor may cost around $300,000 and produce electricity for 1/10th of a cent per kWh, about one-fiftieth of current electricity costs. Fuel costs will be negligible because a 5 MW plant will require only five pounds of fuel per year. [About 40 million kWh per year from a 5 MWe plant, and 5 MWe is equal to 6705 horsepower]
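(Interpolation: unpacking the bracketed numbers above. These are LPP's claimed targets, not demonstrated performance, and the ten-year service life is my assumption.)

    # LPP's claimed targets, unpacked.
    power_mw = 5.0
    capacity_factor = 0.91          # implied by "about 40 million kWh per year"
    kwh_per_year = power_mw * 1000 * 8760 * capacity_factor
    print(f"annual output: {kwh_per_year / 1e6:.0f} million kWh")   # ~40 million

    capital = 300_000               # claimed reactor cost, USD
    life_years = 10                 # assumed service life
    cents_per_kwh = capital / (life_years * kwh_per_year) * 100
    print(f"capital cost per kWh: {cents_per_kwh:.2f} cents")
    # ~0.08 cents/kWh, in line with the "1/10th of a cent" claim

    print(f"5 MW = {5e6 / 745.7:.0f} hp")   # ~6705 horsepower, as noted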
 
Here, reproduced from the Globe and Mail under the Fair Dealing provisions (§29) of the Copyright Act, is more on the topic:

http://www.theglobeandmail.com/news/opinions/you-can-turn-off-the-lights-or-collect-solar-energy-in-space/article1520485/
You can turn off the lights – or collect solar energy in space
Strategic prize: Space-based satellites can tap ‘an inexhaustible reservoir' of clean, renewable energy by 2050 or earlier

Neil Reynolds

Thursday, Apr. 01, 2010
Anyone can do their part for the planet – as millions of people did for an hour last month by turning off the lights. The trick is to do it without resorting to darkness.

For the moment, Japan leads the way with its ambitious program to collect solar energy in space, convert it into electromagnetic microwaves and deliver it wirelessly to precise locations on Earth. This transmission technology will do to terrestrial power lines what cellphones did to telephone poles. Funded in part by a consortium of 16 corporations (led by Mitsubishi Electric), Japan expects its prototype space-based power station to provide electricity to 300,000 Tokyo homes by 2030.

In the end, though, the United States won't be far behind – and, for competitive reasons, probably will surpass Japan in the pursuit of space-based solar power. Ostensibly at least, Tokyo lacks the military motivation of Washington – although, as a resources-bereft country, Japan must ensure its energy supply from somewhere else simply to survive.

For its part, the U.S. Defence Department's National Security Space Office (NSSO) adopted space-based energy as a strategic priority in 2007. President Barack Obama's 2010 budget, which essentially cut lunar adventures to fund economy-class spaceships, can be interpreted as a prerequisite investment in space-based energy: A power station in space, 36,000 kilometres or more above Earth, will require 120 launches (of maintenance crews) a year.

With its unclassified assessment of space-based solar power, the NSSO remains an accessible source of information on the relevant science and technology. For a bureaucratic organization in a military hierarchy, the NSSO compiled its report in a uniquely collaborative way – at no cost. The agency simply created an access-controlled website and invited the world's leading scientists to participate – and 170 did. The NSSO report reflects the scientific consensus.

The strategic prize, the NSSO concludes, is obvious: Space-based satellites can economically tap “an inexhaustible strategic reservoir” of clean, renewable energy by 2050 or earlier.

The military importance, it notes, is also obvious: “For the [Department of Defence] specifically, beamed energy from space … has the potential to be a disruptive game-changer on the battlefield.” With wireless technology, space-based solar power could deliver electricity across an entire theatre of war – right down to the individual soldier. It could dramatically reduce the chance of international conflict arising from energy shortages, and it could provide on-demand energy for humanitarian purposes in disaster zones. In short, the NSSO says, it could enable the U.S. military “to remain relevant” for the 21st century.

“The basic idea is very straightforward,” the NSSO says. “Place very large solar arrays into an intensely sunlit Earth orbit. Collect gigawatts of electrical energy and electromagnetically beam them to Earth.” The electricity could be delivered to either conventional electrical grids or directly to consumers. It could also be used to manufacture synthetic hydrocarbons.

Spread an array of solar collectors over a single square kilometre, the NSSO says, and you can collect a supply of energy – every year – “equal to the energy contained in all of the known recoverable conventional oil reserves on Earth today.”

This amount of energy “indicates that there is enormous [energy] potential for … the nations who construct and possess an SBSP capability.” One of the countries that has expressed its interest in acquiring such a capability, the NSSO says, is Canada.

Although complicated, the delivery of space-based energy would not be much more heroic than “the construction of a large modern aircraft carrier, a skyscraper or a large hydroelectric dam.” A single solar-power satellite would be 15 times the size of the International Space Station (344 metric tonnes). In comparison, the Great Pyramid at Giza has a mass of 5.9 million metric tons.

Although the space beam would require a sizable target on Earth, this receiver would be based in a desert – perhaps in South Dakota or sub-Saharan Africa. With an abundant supply of energy, these desert zones could be transformed into lush agricultural land. (The NSSO compares the intensity of the space beam to the heat thrown off by a campfire.)

The NSSO expresses considerable curiosity as to why environmentalists appear obsessed with much more difficult terrestrial energy sources that can't be as efficiently or as cleanly produced as space-based power – which, it says, would produce (on a “lifecycle” basis) one-60th of the carbon emitted by fossil fuels.

You would think that environmentalists would be thrilled to join forces with the Pentagon. As Thomas Edison put it in 1931: “I'd put my money on the sun and on solar energy.”


The business of transferring electrical power as ’beamed power’ is old hat. Canadians were doing it, the hard way, from earth to 'space,' 25+ years ago with a test vehicle called SHARP (Stationary High Altitude Research Project).
 
From last week:

Globe and Mail link


Researchers at the University of Texas at Arlington (UTA) announced last month that they have developed a clean way to turn the cheapest kind of coal - lignite, common in Texas - into synthetic crude. "We go from that [lignite coal] to this really nice liquid," Brian Dennis, a member of the research team, said in describing the synthetic crude that can be refined into gasoline.

Assuming that these Texas folk are correct, this advance in technology could represent a historic moment in energy production - for Canada as well as for the United States. Canada has huge reserves of lignite coal in Manitoba, Alberta and Saskatchewan (which already gets 70 per cent of its electricity from this common coal) - not to mention in Nova Scotia.

The Texas researchers, who worked on the project for about 18 months, expect the cost to drop further. "We're improving the cost every day. We started off some time ago at an uneconomical $17,000 a barrel. Today, we're at ... $28.84 a barrel," Rick Billo, UTA's dean of engineering, told an Austin television reporter.

Texas lignite coal sells for $18 a tonne. The coal conversion technology uses one tonne of coal to produce 1.5 barrels of crude oil. One barrel of crude produces 42 U.S. gallons of gasoline. In other words, $18 worth of coal yields 63 gallons of gasoline: about 29 cents a gallon.


(...)
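The quoted arithmetic, spelled out. Note this is the coal feedstock cost only; the $28.84-a-barrel conversion cost, plus refining, transport and taxes (which the replies below raise), all come on top.

    # Feedstock arithmetic for lignite-derived gasoline, as quoted.
    coal_price = 18.0       # USD per tonne of Texas lignite
    bbl_per_tonne = 1.5     # barrels of crude per tonne of coal
    gal_per_bbl = 42        # US gallons of gasoline per barrel

    gallons = bbl_per_tonne * gal_per_bbl               # 63 gallons per tonne
    print(f"coal cost per gallon: {coal_price / gallons * 100:.0f} cents")  # ~29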
 
Cougar Daddy and E.R.

you're both taking me back to 1970 and science fiction.

E.R. - Did you ever read Larry Niven?

And Cougar Daddy - the Club of Rome's Limits to Growth was every bit as entertaining even though it was spoonfed to us the same way Al Gore's Inconvenient Truth was.

The same logical fallacy in both cases.  If you continue doing what you're doing dire things will happen.  The ONLY solution is to do as I tell you.....

One imagination vs 6,000,000,000 imaginations (more when you include those that have died and those that have been born since 1972).

I just can't get overly scared about much of anything anymore .... except perhaps sheep and shepherds.

 
Am I not looking at this right or can someone not do math?

Texas lignite coal sells for $18 a tonne. The coal conversion technology uses one tonne of coal to produce 1.5 barrels of crude oil. One barrel of crude produces 42 U.S. gallons of gasoline. In other words, $18 worth of coal yields 63 gallons of gasoline: about 29 cents a gallon.

forget it.....I can't read.... ::)
 
Factor in transportation to the refinery, the costs of refining, storage and transport of the finished product, and unionized labour, and then what is the cost?
 
George Wallace said:
Factor in transportation to the refinery, the costs of refining, storage and transport of the finished product, and unionized labour, and then what is the cost?

and carbon footprint....
 
Then we let the Provincial governments bring in their HST/PST/GST and other environmental taxes and such......  >:D
 