
A scary strategic problem - no oil

Of course it would be the Japanese who beat everyone (again) to the market:

http://gas2.org/2010/10/20/mazda2-subcompact-to-be-worlds-most-fuel-efficient-car/

Mazda2 Subcompact To Be World’s Most Fuel Efficient Car?
October 20, 2010 in Cars

I have a soft spot in my heart for Mazda. For one, they made one of the best sports cars of the 1980s, the RX-7, and they did it their own way via a rotary engine. Mazda has also been integral to Ford's renewed success, as the Blue Oval's mid-size lineup (Fusion/Milan/MKZ) is based on the Mazda6's architecture. Ford has recently divested itself of its stake in Mazda, though perhaps at the wrong time.

Reuters is reporting that the Mazda2 subcompact, which goes on sale in Japan next year and will eventually make its way to America, could get gas mileage of around 70 mpg. That would make it the most efficient gas-only car in the world.

I have seen the Mazda2 up close, and it certainly is a cute car. Mazda, which has no hybrid engine systems of its own, has taken to vastly improving its line of gas and diesel engines to compete with hybrids. If these rumors are true, it is not only competing but blowing the competition out of the water. A 70 mpg gas-only car would outdo every hybrid on the planet. Of course, it depends on which continent's standard the mpg is calculated under, as Japan, America, and Europe all use different test procedures.

The Mazda2 goes on sale in Japan in 2011, and will be priced well below any hybrid, which makes sense since it will use less “exotic” technology. It will offer a choice of petrol or diesel engines from the new SKY lineup, which promises improved fuel mileage and performance by increasing the compression ratio and squeezing the most energy out of every squirt of gas. Mazda is making a name for itself apart from the herd by ignoring hybrids and improving the internal combustion engine, though how long this tactic can work to their benefit, I don’t know. You’ve got to admire their gusto for trying to be different though, and it may pay massive dividends down the road.

Oh, and as for Ford, Mazda claims they still have a strategic partnership. We will see how that pans out as well.
 
The Mazda2 is already here....I saw a number of them on the Mazda lot in Kitchener a couple of weekends ago on the way to my brother-in-law's place. 
 
Cdn Blackshirt said:
The Mazda2 is already here....I saw a number of them on the Mazda lot in Kitchener a couple of weekends ago on the way to my brother-in-law's place.

Did you notice a sticker price?
 
Sorry....didn't get that close.

Was driving by and just caught them out of the corner of my eye and thought "Hey I know what those are!"

That being said, I just searched for and found the dealership online and here's the Mazda2 Pricing page:

http://kieswettermazda.com/New_Vehicles/2011-Mazda2-V_Price.php


Cheers, Matthew.  :salute:
 
More on the Mazda 2 engine.

Perhaps a business model for them would be to sell engines built along these lines as drop-in replacements for buses, trucks and high-mileage vehicles like taxis.

http://www.technologyreview.com/printer_friendly_article.aspx?id=26613&channel=energy&section=

70 mpg, without a Hybrid

A new Mazda model debuting in Japan gets its high fuel economy from an improved gas engine and a lightweight design.
By Kevin Bullis

Next year, Mazda will sell a car in Japan that gets 70.5 miles per gallon (mpg), or 30 kilometers per liter. The fuel economy rating won't be nearly this good in the United States because of differing requirements, but even so, the car will likely use about as little fuel as a hybrid such as the Toyota Prius--without that car's added costs for its electric motor and batteries.
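
A quick unit-conversion sketch, added for context and not part of the original article, shows that the Japanese rating of 30 kilometres per litre is the same figure as the 70.5 mpg quoted above; the conversion factors are the standard ones.

```python
# Conversion sketch (added for context, not from the article): relate the
# Japanese rating of 30 km per litre to US miles per gallon.

KM_PER_MILE = 1.609344
LITRES_PER_US_GALLON = 3.785411784

def km_per_litre_to_mpg(km_per_litre: float) -> float:
    """Convert fuel economy from km/L to US miles per gallon."""
    return km_per_litre * LITRES_PER_US_GALLON / KM_PER_MILE

print(f"30 km/L = {km_per_litre_to_mpg(30):.1f} mpg")  # ~70.5 mpg
```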

The Mazda, a subcompact called the Demio in Japan and the Mazda 2 elsewhere, will include a package of changes that improves fuel economy by about 30 percent over the current model. These include a more efficient engine and transmission, and a lighter body and suspension. The Mazda 2, and a range of new cars from other automakers that have been engineered to meet more stringent fuel economy standards, demonstrate what some experts have been saying for some time--internal combustion-powered cars are far from outdated. Indeed, improvements to gas-powered cars can reduce worldwide fuel consumption more quickly than introducing hybrids or electric vehicles, because variations on traditional engines tend to be less expensive and can be quickly implemented on more cars.

"We've been making engines for 100 years, and we keep figuring out how to make improvements in them. We will continue to figure out further improvements," says Greg Johnson, the manager of Ford's North American powerpacks. "For another 50 years, if not more, the internal combustion engine will be the primary driver." This week, Ford announced changes to its Focus model that improve its fuel economy by about 17 percent, to an estimated 40 mpg.

Mazda says the biggest source of improvement for the Mazda 2 is a new engine that compresses the fuel-air mixture in the engine far more than conventional gasoline engines do. Ordinarily, gas engines have about a 10-to-1 compression ratio. Mazda increased this to 14 to 1, a level typically seen only in diesel engines. Increasing compression has long been known to increase efficiency, but compressing the fuel-air mixture too much causes it to ignite prematurely--before the spark sets it off--a phenomenon called knocking. That decreases performance and can damage the engine. Mazda has introduced innovations to avoid knocking.
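
As a rough illustration of why the higher ratio matters, the textbook ideal Otto-cycle relation ties efficiency directly to compression ratio. This is an idealization added for context, not Mazda's own engineering analysis, and the assumed gas constant is a typical value; real-world gains are much smaller once knock limits, heat loss and friction are counted.

```python
# Textbook idealization, added for context only (not Mazda's analysis):
# ideal Otto-cycle efficiency is eta = 1 - r**(1 - gamma), so raising the
# compression ratio r from 10:1 to 14:1 raises the theoretical ceiling.
# Real engines see far smaller gains because of knock, heat loss and friction.

GAMMA = 1.3  # assumed effective ratio of specific heats for a fuel-air mixture

def otto_efficiency(r: float, gamma: float = GAMMA) -> float:
    """Ideal Otto-cycle thermal efficiency for compression ratio r."""
    return 1.0 - r ** (1.0 - gamma)

for r in (10.0, 14.0):
    print(f"{r:.0f}:1 compression -> ideal efficiency {otto_efficiency(r):.1%}")
# 10:1 -> ~49.9%, 14:1 -> ~54.7% in this idealized model
```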

As a number of automakers, including Ford, are doing, Mazda has introduced direct injection--which involves spraying fuel directly into the engine's combustion chamber rather than into an adjacent port. Doing this cools the chamber, which helps prevent premature ignition. Mazda also modified the exhaust system--increasing the length and changing the shape of the exhaust pipes to allow more exhaust gas to escape after combustion. Removing these hot gases also keeps the temperature down, but it has the drawback of interfering with emissions controls. That required other changes in the engine, including modifying ignition timing and the shape of the pistons.

Mazda also found that above a certain compression ratio, some of the bonds in gasoline molecules begin to break, generating heat. These reactions increase the total amount of energy released from the gasoline, improving efficiency, the company says. To take advantage of this phenomenon, the engineers set the ignition timing to occur after these bonds start to break.

Just as important for improving fuel economy were a new transmission and a redesign of the frame to use less steel, or to use lighter, high-tensile steel. Mazda also says it redesigned the suspension system to make it lighter without sacrificing performance. Mazda has also announced a diesel engine that could be about 20 percent more efficient than the new gasoline one.

The 70.5 mpg rating the car received in Japan isn't a clear indication of what Mazda 2's rating will be in the United States, which has different test procedures, safety requirements, and emissions requirements. The current version of the Mazda 2 was rated at 54 mpg in Japan, but only 35 mpg (for the manual transmission version) in the United States. Michael Omotoso, manager of the power train forecasting group at J.D. Power and Associates, estimates that the new car could be rated between 50 and 60 mpg in the U.S., giving it a chance to eclipse the 51 mpg rating of the Prius (which gets 48 mpg on the highway).

Mazda will introduce the new engine and transmission in a number of vehicles next year, although it has not announced the specific models, or when the new Mazda 2 will be available in the United States. The new engine and transmission will be introduced in the United States next year in a larger car that will get about 43 mpg.

Although the new Mazdas avoid the costly motor, power electronics, and battery pack required in a hybrid, the improvements will likely add to the cost of the cars. Volkswagen recently introduced an 83-mpg diesel vehicle that wasn't successful because of the high costs of achieving these fuel economy levels, Omotoso says. Mazda hasn't announced prices yet. "I would think they learned a lesson from Volkswagen," he says.

Copyright Technology Review 2010.
 
Thucydides said:
How not to save oil:

http://pajamasmedia.com/blog/the-gm-volt-fascism-strikes-the-auto-industry/


1 - PJ Media is generally known as the hard core of the Tea Party movement et al., who are least concerned with factual accuracy and obsessed with the perils of big government.

2 - The label of "fascism" to a car demeans the victims of fascism and really makes rational discussion impossible.

3 - This isn't about a more efficient car engine, it's about an economy (or series of economies) spending huge sums to ensure that the current consumption of hydrocarbons can be continued rather than investing that money in finding new ways to use more available forms of energy.

Take, for example, the Iraq War. Whether or not it was fought for direct access to oil reserves is up for debate, but it is very difficult to deny that it had a lot to do with regional politics in a major oil-producing region. We do have rough estimates that the war will end up costing about $3 trillion US - a fraction of which could have rebuilt the US, and therefore the North American transport infrastructure (much in the same way that the Depression brought on the major highway systems that made the hydrocarbon economy possible).

Essentially, it's a matter of financing investment or consumption.
 
jhk87 said:
(much in the same way that the Depression brought on the major highway systems that made the hydrocarbon economy possible)

The Interstate Highway System was authorized by the Federal-Aid Highway Act of 1956 – popularly known as the National Interstate and Defense Highways Act of 1956 – on June 29.
 
Total expenditures on WPA [Works Progress Administration, part of the New Deal] projects through June 1941 totaled approximately $11.4 billion. Over $4 billion was spent on highway, road, and street projects; more than $1 billion on public buildings, including the iconic Dock Street Theatre in Charleston, the Griffith Observatory in Los Angeles, and the Timberline Lodge on Oregon's Mt. Hood; more than $1 billion on publicly owned or operated utilities; and another $1 billion on welfare projects, including sewing projects for women, the distribution of surplus commodities and school lunch projects. One construction project was the Merritt Parkway in Connecticut, the bridges of which were each designed as architecturally unique.
 
Nuclear shipping?

http://nextbigfuture.com/2010/11/nuclear-commercial-ships-on-specific.html#more

Nuclear commercial ships on specific trade routes will be here sooner than expected

"We will see nuclear ships on specific trade routes sooner than many people currently anticipate," said Lloyd's Register CEO Richard Sadler. The organisation has been an independent service provider to the shipping industry for 250 years. There is the potential for market-based measures for controlling carbon dioxide emissions, while the entry into force of strict International Maritime Organisation controls in 2020 provides a firm deadline against which the industry can weigh the benefits of a range of technology enhancements and fuel options. But with no clear technological fix to lower emissions using traditional diesel or LPG fuels, nuclear energy is emerging as a practical option.

    In response to its members' interest in nuclear propulsion Lloyd's Register has recently rewritten its 'rules' for nuclear ships, which concern the integration of a reactor certified by a land-based regulator with the rest of the ship. A draft of the rules was put before Lloyd's technical committee two weeks ago and this represents a further step towards an international regulatory regime to ensure worldwide safety in a potential nuclear shipping sector.

The new program of joint research (marine and energy consultants BMT Group and Enterprises Shipping and Trading have joined with start-up small reactor firm Hyperion and Lloyd's Register to "investigate the practical maritime applications for small modular reactors") is meant to produce "a concept tanker ship design based on conventional and modular concepts," said Lloyd's. It noted that "Special attention will be paid to analysis of a vessel's lifecycle cost as well as to hull-form designs and structural layout, including grounding and collision protection."

Nuclear power looked set for a maritime role in the 1960s thanks to early vessels like the Savannah and Otto Hahn, although in the end the Savannah worked for only ten years and the Otto Hahn was repowered with diesel engines after nine years. The Japanese-built Mutsu operated from 1970 until 1992 but none of these ships was a commercial success.

Nuclear options

Cruise liner

A luxury liner has the power demand curve of a town, including peaks at morning and evening mealtimes. Conceivably a 100 MWe nuclear power system could take the baseload role with smaller diesels for peak load and back-up.
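
A minimal sketch of that baseload-plus-peaking split follows. Only the 100 MWe nuclear capacity comes from the article; the hourly demand profile is invented purely for illustration.

```python
# Minimal sketch of the baseload-plus-peaking idea above. The 100 MWe nuclear
# capacity is the article's figure; the hourly demand profile is invented,
# with peaks at morning and evening mealtimes.

NUCLEAR_CAPACITY_MWE = 100.0

# Hypothetical hourly demand (MWe) over one day for a large liner.
hourly_demand_mwe = [60]*6 + [90, 120, 110] + [80]*8 + [100, 130, 120] + [70]*4

nuclear_mwh = sum(min(d, NUCLEAR_CAPACITY_MWE) for d in hourly_demand_mwe)
diesel_mwh = sum(max(d - NUCLEAR_CAPACITY_MWE, 0.0) for d in hourly_demand_mwe)
total_mwh = nuclear_mwh + diesel_mwh

print(f"nuclear baseload supplies {nuclear_mwh / total_mwh:.0%} of daily energy")
print(f"diesel peakers supply {diesel_mwh / total_mwh:.0%} of daily energy")
```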

Bulk carrier

Transporters moving large cargoes like raw materials on point-to-point routes could run much faster with the extra power and low emissions from a nuclear reactor. A frequent service could be run by fewer vessels, mitigating the extra capital cost.

Supertug

Existing conventionally powered vessels could attach to a nuclear-powered tug for emissions-free passage across oceans.

What about the ports?

Nuclear powered vessels could be the subject of controversy and this would seem to make a nuclear cruise liner concept difficult due to passenger and port acceptance. However, a point-to-point cargo service would need only agreement from two states and the supertug could remain in international water. Another idea is to create a large nuclear vessel with a conventionally powered detachable section to take cargo to port.

There remain about 200 small reactors at sea in military fleets, but this technology cannot easily be transferred to the civil sector because civil ships would be required to use low-enriched uranium (LEU); high-enriched uranium allows more compact reactor designs with weight and controllability benefits.

The reactor of the Hyperion system uses LEU and measures about 1.5 metres by 2.5 metres. It would produce about 70 MWt - enough for about 25 MWe for propulsion. Its 'battery' design simplifies refuelling to a swap-out operation every 8-10 years with the possibility of managed lease arrangements similar to aircraft engines.
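
As a sanity check on those figures (my arithmetic, not the article's), 70 MW thermal yielding about 25 MW electric implies a heat-to-electricity conversion efficiency of roughly 36 percent, which is in the normal range for a steam plant.

```python
# Sanity check of the figures quoted above (added arithmetic, not the article's):
# 70 MW thermal producing about 25 MW electric implies a conversion efficiency
# of roughly 36%, typical of a steam cycle.

thermal_mw = 70.0
electric_mw = 25.0
print(f"implied heat-to-electricity efficiency: {electric_mw / thermal_mw:.0%}")  # ~36%
```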

However, incorporation of any reactor in a ship would require extensive radiation shielding and consideration of impact protection. A step change in crew training would be required, and there is a strong case for crews to be supplied by reactor vendors.
 
No oil indeed:

http://www.nytimes.com/2010/12/28/science/28tierney.html?_r=2&ref=science

Economic Optimism? Yes, I’ll Take That Bet
By JOHN TIERNEY
Published: December 27, 2010

Five years ago, Matthew R. Simmons and I bet $5,000. It was a wager about the future of energy supplies — a Malthusian pessimist versus a Cornucopian optimist — and now the day of reckoning is nigh: Jan. 1, 2011.

The bet was occasioned by a cover article in August 2005 in The New York Times Magazine titled “The Breaking Point.” It featured predictions of soaring oil prices from Mr. Simmons, who was a member of the Council on Foreign Relations, the head of a Houston investment bank specializing in the energy industry, and the author of “Twilight in the Desert: The Coming Saudi Oil Shock and the World Economy.”

I called Mr. Simmons to discuss a bet. To his credit — and unlike some other Malthusians — he was eager to back his predictions with cash. He expected the price of oil, then about $65 a barrel, to more than triple in the next five years, even after adjusting for inflation. He offered to bet $5,000 that the average price of oil over the course of 2010 would be at least $200 a barrel in 2005 dollars.

I took him up on it, not because I knew much about Saudi oil production or the other “peak oil” arguments that global production was headed downward. I was just following a rule learned from a mentor and a friend, the economist Julian L. Simon.

As the leader of the Cornucopians, the optimists who believed there would always be abundant supplies of energy and other resources, Julian figured that betting was the best way to make his argument. Optimism, he found, didn’t make for cover stories and front-page headlines.

No matter how many cheery long-term statistics he produced, he couldn’t get as much attention as the gloomy Malthusians like Paul Ehrlich, the best-selling ecologist. Their forecasts of energy crises and resource shortages seemed not only newsier but also more intuitively correct. In a finite world with a growing population, wasn’t it logical to expect resources to become scarcer and more expensive?

As an alternative to arguing, Julian offered to bet that the price of any natural resource chosen by a Malthusian wouldn’t rise in the future. Dr. Ehrlich accepted and formed a consortium with two colleagues at Berkeley, John P. Holdren and John Harte, who were supposed to be experts in natural resources. In 1980, they picked five metals and bet that the prices would rise during the next 10 years.

By 1990, the prices were lower, and the Malthusians paid up, although they didn’t seem to suffer any professional consequences. Dr. Ehrlich and Dr. Holdren both won MacArthur “genius awards” (Julian never did). Dr. Holdren went on to lead the American Association for the Advancement of Science, and today he serves as President Obama’s science adviser.

Julian, who died in 1998, never managed to persuade Dr. Ehrlich or Dr. Holdren or other prominent doomsayers to take his bets again.

When I found a new bettor in 2005, the first person I told was Julian's widow, Rita Simon, a public affairs professor at American University. She was so happy to see Julian's tradition continue that she wanted to share the bet with me, so we each ended up putting $2,500 against Mr. Simmons's $5,000.

Just as Mr. Simmons predicted, oil prices did soar well beyond $65. With the global economy booming in the summer of 2008, the price of a barrel of oil reached $145. American foreign-policy experts called for policies to secure access to this increasingly scarce resource; environmentalists advocated crash programs to reduce dependence on fossil fuels; companies producing power from wind and other alternative energies rushed to expand capacity.

When the global recession hit in the fall of 2008, the price plummeted below $50, but at the end of that year Mr. Simmons was quoted in The Baltimore Sun sounding confident. When Jay Hancock, a Sun financial columnist, asked if he was having any second thoughts about the wager, Mr. Simmons replied: “God, no. We bet on the average price in 2010. That’s an eternity from now.”

The past year the price has rebounded, but the average for 2010 has been just under $80, which is the equivalent of about $71 in 2005 dollars — a little higher than the $65 at the time of our bet, but far below the $200 threshold set by Mr. Simmons.
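
For readers curious how the column gets from just under $80 down to roughly $71, here is a minimal deflation sketch; the price-level ratio is inferred from the article's own two numbers rather than taken from an official CPI series.

```python
# Sketch of the inflation adjustment in the column. The ~1.12 deflator is
# inferred from the article's own figures ($80 nominal in 2010 ~ $71 in 2005
# dollars); it is not taken from an official CPI series.

nominal_2010_price = 80.0
price_level_2010_over_2005 = 1.12  # assumed, implied by the article's numbers

real_price_2005_dollars = nominal_2010_price / price_level_2010_over_2005
print(f"${nominal_2010_price:.0f} in 2010 dollars ~ "
      f"${real_price_2005_dollars:.0f} in 2005 dollars")  # ~$71, far below the $200 threshold
```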

What lesson do we draw from this? I’d hoped to let Mr. Simmons give his view, but I’m very sorry to report that he died in August, at the age of 67. The colleagues handling his affairs reviewed the numbers last week and declared that Mr. Simmons’s $5,000 should be awarded to me and to Rita Simon on Jan. 1, but Mr. Simmons still had his defenders.

One of his friends and fellow peak-oil theorists, Steve Andrews, said that while Mr. Simmons had made “a bet too far,” he was still correct in foreseeing more expensive oil. “The era of cheap oil has ended,” Mr. Andrews said, and predicted problems ahead as production levels off.

It’s true that the real price of oil is slightly higher now than it was in 2005, and it’s always possible that oil prices will spike again in the future. But the overall energy situation today looks a lot like a Cornucopian feast, as my colleagues Matt Wald and Cliff Krauss have recently reported. Giant new oil fields have been discovered off the coasts of Africa and Brazil. The new oil sands projects in Canada now supply more oil to the United States than Saudi Arabia does. Oil production in the United States increased last year, and the Department of Energy projects further increases over the next two decades.

The really good news is the discovery of vast quantities of natural gas. It’s now selling for less than half of what it was five years ago. There’s so much available that the Energy Department is predicting low prices for gas and electricity for the next quarter-century. Lobbyists for wind farms, once again, have been telling Washington that the “sustainable energy” industry can’t sustain itself without further subsidies.

As gas replaces dirtier fossil fuels, the rise in greenhouse gas emissions will be tempered, according to the Department of Energy. It projects that no new coal power plants will be built, and that the level of carbon dioxide emissions in the United States will remain below the rate of 2005 for the next 15 years even if no new restrictions are imposed.

Maybe something unexpected will change these happy trends, but for now I’d say that Julian Simon’s advice remains as good as ever. You can always make news with doomsday predictions, but you can usually make money betting against them.
 
Another cold fusion announcement. Well, they promise to demonstrate a large power output, so there is either a very bold bunch staking everything on this or the most elaborate fraud ever...

http://nextbigfuture.com/2011/01/multi-kilowatt-nickel-hydrogen-cold.html#more

There will be 10 kilowatt nickel hydrogen cold fusion demonstration on January 15 in Italy and peer reviewed papers

The Journal Of Nuclear Physics (Peer Reviewed online journal) is announcing:

On Saturday, January 15th, Sergio Focardi and Andrea Rossi will hold an online press conference to present the 10 kilowatt module reactor: the 1 MW plant now under construction is made up of 100 such modules.

The press conference will start at 10 a.m. Italian Time.

It is a public demonstration of a significant level of power. For comparison, the Nissan Leaf electric car has an 80 kilowatt electric motor.

Here is the Italian press release. il Resto del Carlino is an Italian local newspaper based in Bologna, and is one of the oldest newspapers in Italy. Circulation 165,000.

Here is the Google Translate version of the Italian press release

Here is an earlier Rossi-Focardi paper describing their experiments and what they believe is nickel being fused with hydrogen into a copper isotope


A process (international patent publication N. WO 2009/125444 A1) capable of producing large amounts of energy by a nuclear fusion process between nickel and hydrogen, occurring below 1000 K, is described.

Hydrogen/Nickel cold fusion probable mechanism

The Focardi-Rossi approach considers this shielding a basic requirement for surpassing the Coulomb barrier between the hydrogen nuclei (protons) and the nickel lattice nuclei, resulting in the release of energy, which is a fact, through a series of exothermic nuclear processes leading to transmutations, decays, etc.

The reasoning presented in this note is based on elementary considerations of

· The hydrogen atom (Bohr) in its fundamental energy state
· The Heisenberg uncertainty principle
· The high speed of nuclear reactions (10^-20 sec)

The hydrogen atom (Bohr) in its fundamental state, in the absence of energy perturbations, remains indefinitely in its stationary state shown below. This is due to the in-phase wave (de Broglie), which follows the “circular” path of its single orbiting electron. The wave length and radius of the “circular” path are determined by the fundamental energy state of this atom.

When hydrogen atoms come in contact with the metal (Ni), they abandon their stationary state as they deposit their electrons in the conductivity band of the metal, and due to their greatly reduced volume, compared to that of their atom, the hydrogen nuclei (naked protons) readily diffuse into the defects of the nickel crystalline structure as well as in tetrahedral or octahedral void spaces of the crystal lattice.

It should be underlined that, in addition to the deposited hydrogen electrons, the nickel mass also contains the electrons of the chemical potential of the metal. Jointly these electrons constitute the conductivity electronic cloud, distributed in energy bands (Fermi), and quasi free to move throughout the metallic mass.

It is conceivable that, for a very short time period (e.g. 10^-18 sec), a series of neutral mini atoms of hydrogen could be formed, in an unstable state, of various size and energy level, distributed within the Fermi band, which is enlarged due to the very short time (Heisenberg).

The neutral mini-atoms of high energy and very short wave length – which is in phase with the “cyclic” orbit (de Broglie) – are statistically captured by the nickel nuclei of the crystal structure with the speed of nuclear reactions (10^-20 sec).

For these mini-atoms to fuse with the nickel nuclei, apart from their neutral character for surpassing the Coulomb barrier, they must have dimensions smaller than 10^-14 m, where nuclear cohesion forces, of high intensity but very short range, are predominant. It is assumed that only a percentage of such atoms satisfy this condition (de Broglie).

The above considerations are based only on an intuitive approach, and I trust this phenomenon could be tackled in a systematic and integrated way through the “theory of time-dependent perturbations” by employing the appropriate Hamiltonian.

The mechanism proposed by Focardi – Rossi, verified by mass spectroscopy data, which predicts transmutation of a nickel nucleus to an unstable copper nucleus (isotope), remains in principle valid. The difference is that inside the unstable copper nucleus, produced from the fusion of a hydrogen mini-atom with a nickel nucleus, is trapped the mini-atom electron (β-), which in my opinion undergoes in-situ annihilation, with the predicted (Focardi-Rossi) decay β+ of the new copper nucleus.

The β+ and β- annihilation (interaction of matter and anti-matter) would lead to the emission of a high energy photon, γ, (Einstein) from the nucleus of the now stable copper isotope and a neutrino to conserve the lepton number. However, based on the principle of conservation of momentum, as a result of the backlash of this nucleus, the photon energy γ is divided into kinetic energy of this nucleus of large mass (heat) and a photon of low frequency.

Furthermore, it should be noted that the system does not exhibit the Mössbauer phenomenon for two reasons:

1. The copper nucleus is not part of the nickel crystal structure and behaves as an isolated atom in quasi gaseous state
2. Copper, as a chemical element, does not exhibit the Mössbauer phenomenon.

In conclusion, it should be underlined that the copper nucleus thermal perturbation, as a result of its mechanical backlash (heat), is transferred to its encompassing nickel lattice and propagated, by in-phase phonons (G. Preparata), through the entire nano-crystal. This could explain why in cold fusion the released energy is mainly in the form of heat and the produced (low) γ radiation can be easily shielded.

Further Reading

Is the Rossi energy amplifier the first pico-chemical reactor?

The nuclear signatures that can be expected when contacting hydrogen with nickel were derived from thermal results recently obtained (Rossi energy amplifier), using the type of reaction paths proposed as the explanation of the energy produced. The consequences of proton or neutron capture have been studied. It was shown that these consequences are not in line with the experimental observations. A novel tentative explanation is thus described. Should this explanation be true, it is proposed to call pico-chemistry the novel field thus opened.

Nuclear signatures to be expected from Rossi energy amplifier

Strong nuclear signatures are expected from the Rossi energy amplifier and it is hoped that this note can help evidence them.

It is of interest to note that a mechanism is proposed that strongly suppresses the gamma emission during the run (it is the same mechanism that creates very low energy neutrons, subsequently captured by the nickel). This does not suppress the emission after shut-down, which should be observed, together with the transmutations described above.
 
www.xconomy.com/boston/2010/09/14/joule-gets-biofuel-bacteria-patent/

Fair Dealings and all that:

Joule Gets Biofuel Bacteria Patent
Gregory T. Huang 9/14/10
Cambridge, MA-based Joule Unlimited, a biofuels technology company, announced today it has been granted a U.S. patent for an engineered bacterium that produces liquid hydrocarbon fuels from sunlight and carbon dioxide. The company says it is the first to patent a direct, single-step, continuous process (based on photosynthesis) for producing hydrocarbon fuels without using raw material feedstocks like sugar or corn, or other intermediate steps. Joule says this process could be made cheap enough, and could be employed at large enough scale, to help replace fossil fuels. The news was first reported by the New York Times; a CNET report also has some useful context. Other companies in the biofuels technology sector include Amyris Biotechnologies, Aurora Algae, Bio Architecture Lab, LS9, Sapphire Energy, Synthetic Genomics, and Targeted Growth. Joule says it will begin pilot production of diesel fuel by the end of this year.

Gregory T. Huang is Xconomy's National IT Editor and the Editor of Xconomy Boston. You can e-mail him at gthuang@xconomy.com, call him at 617-252-7323, or follow him at twitter.com/gthuang.

If what Joule is claiming is even remotely true, it will change everything. By definition, this would be a carbon-neutral process.
 
Thucydides said:
Another cold fusion announcement. Well, they promise to demonstrate a large power output, so there is either a very bold bunch staking everything on this or the most elaborate fraud ever...

http://nextbigfuture.com/2011/01/multi-kilowatt-nickel-hydrogen-cold.html#more

Sorry to say the "Hydrogen/Nickel cold fusion probable mechanism" reference is just a pile of gibberish.
 
While we all wait for a plain English explanation of the "Cold Fusion" demonstration, rest assured that the market is providing incentives to uncover oil in lots of new places:

http://nextbigfuture.com/2011/01/estimates-of-north-dakotas-oil.html

Estimates of North Dakota's Bakken Oil and oil formations around the world like the Bakken

1. Harold Hamm, chairman and chief executive officer of Continental Resources Inc., said the formations in North Dakota and Montana hold about 20 billion barrels of recoverable crude, or about five times the amount previously estimated by federal geologists. The formations also hold the natural gas equivalent of 4 billion barrels of oil.

This is a follow-up to a prior article about the Continental Resources estimate of the recoverable oil from the Bakken formation.

The U.S. Geological Survey released a study in 2008 that estimated that up to 4.3 billion barrels of oil can be recovered in the Bakken. USGS geologist Rich Pollastro said the agency hasn't seen enough data to amend its estimate.

"We think our numbers are fine," Pollastro said Thursday. "We don't see anything at this point that would radically change them."

A state study released after the USGS study found a near identical assessment as the federal report. The state has since bumped its estimate to about 11 billion barrels of oil, based on drilling success and current production rates.

Ed Murphy, the state geologist and director of the Geological Survey, said Continental's new estimate is possible.

"We know the Bakken is going up but we think (Continental's) estimate might be on the high end of what we would potentially come up with," Murphy said.

"The technology continues to improve," Hamm said.

Hamm called his company's assessment "believable" and said it could mean production of 1 million barrels daily by 2020. He told bankers that would make North Dakota "one of the 13 or 14 largest producing countries — not just state."

Hamm ranked as the 44th-richest American last year, with a net worth of nearly $6 billion, by Forbes magazine estimates

2. How Many "Bakkens" Will Be Found?
The Arthur Creek shale formation in the Southern Georgina (Australia) is very similar to the Bakken but with about 5 times the thickness. All 18 exploratory wells drilled so far have shown oil. Australian geologists certainly seem to think they have found another Bakken. In another 15 to 18 months we will know if this is indeed true.

It is not Australia, however, but France where the greatest industry anticipation and activity is building in the search for the next Bakken: in the well-known Paris basin.

The Paris basin (current production of less than 15,000 bpd of conventional oil) covers the northern half of France and extends into neighboring countries. It is a vintage oil and gas basin. Over 2,000 wells have been drilled and 52 fields discovered. It has extensive oil and gas shale deposits. The three oil shale formations are the Lower Lias, Amaltheus and Schistes Carton. The current focus of excitement is the Lower Lias, with an estimated oil-in-place resource base of a few billion to tens of billions of barrels. The estimates (guesses) for the oil in place in the other two formations are much higher.

Toreador Resources asserts that an estimated 100 billion barrels of oil have been generated from source rocks in the Paris basin, of which 30 billion are in the Lower Lias.

In addition to the Paris Basin, Hess thinks it has found a basin analogous to the Bakken in China.

There are potentially huge shale gas discoveries in Argentina, Quebec, Poland, India, the UK, off the coast of Israel, in China, and in British Columbia.
 
And if that drilling for oil thing isn't working:

http://powerandcontrol.blogspot.com/2011/01/biofuel-breakthrough.html

Biofuel Breakthrough?

I have just been notified by my friends at Talk Polywell of a breakthrough in the biological generation of liquid fuels. The Globe and Mail reports on the breakthrough (although my friends at Talk Polywell think the report is garbled by a not entirely science-literate reporter).

In September, a privately held and highly secretive U.S. biotech company named Joule Unlimited received a patent for “a proprietary organism” – a genetically adapted E. coli bacterium – that feeds solely on carbon dioxide and excretes liquid hydrocarbons: diesel fuel, jet fuel and gasoline. This breakthrough technology, the company says, will deliver renewable supplies of liquid fossil fuel almost anywhere on Earth, in essentially unlimited quantity and at an energy-cost equivalent of $30 (U.S.) a barrel of crude oil. It will deliver, the company says, “fossil fuels on demand.”
Not only that. They can tailor the organisms to produce specific fuels using only CO2, water (fresh or salt), and sunlight.

Joule says it now has “a library” of fossil-fuel organisms at work in its Massachusetts labs, each engineered to produce a different fuel. It has “proven the process,” and has produced ethanol (for example) at a rate equivalent to 10,000 U.S. gallons an acre a year. It anticipates that this yield could hit 25,000 gallons an acre a year when scaled for commercial production, equivalent to roughly 800 barrels of crude an acre a year.

By way of comparison, Cornell University’s David Pimentel, an authority on ethanol, says that one acre of corn produces less than half as much energy, equivalent to only 328 barrels. If a few hundred barrels of crude sounds modest, recall that millions of acres of prime U.S. farmland are now used to make corn ethanol.

So is this reputable or just a bunch of scammers?
Joule acknowledges its reluctance to fully explain its “solar converter.” CEO Bill Sims told Biofuels Digest, an online biofuels news service, that secrecy has been essential for competitive reasons. “Some time soon,” he said, “what we are doing will become clear.” Although astonishing in its assertions, Joule gains credibility from its co-founder: George Church, the Harvard Medical School geneticist who helped initiate the Human Genome Project in 1984.

Well how about a look at what Biofuels Digest has to say.
In Massachusetts, Joule Unlimited has won a second key patent for its genetically modified cyanobacteria that directly convert sunlight and carbon dioxide into n-alkanes and other diesel fuel molecules. The patent is the first awarded for a bacterium that makes fuel directly from water, sunlight and CO2, as opposed to organisms that make fuels from sugar or other cellulosic biomass, such as those engineered by LS9, Amyris or Solazyme.

As reported previously in the Digest, Joule is using a genetically modified form of cyanobacteria. Two weeks ago, Joule received its first key patent for “methods and compositions for modifying photoautotrophic organisms as hosts, such that the organisms efficiently convert carbon dioxide and light into n-alkanes."

Those reporting the death of the US as a world power may have been somewhat premature. Joule is reported to be building a prototype plant in Leander, Tex. At this stage of course nothing is certain. It will probably take a couple of years to prove this out and get the "bugs" out of the system. And probably a couple of decades to scale up the idea until the production becomes a significant fraction of US liquid fuel use. Time will tell.

Here is another possible approach:

Green Algae Strategy: End Oil Imports And Engineer Sustainable Food And Fuel

Cross Posted at Classical Values
 
Another "wonder" engine design. I am a bit sceptical about the design as it has been described, and high compression can be achieved through the use of supechargers, turbochargers, comprex pressure wave devices and other external devices, as well as internally through Diesel technology. Wait and see

http://www.technologyreview.com/printer_friendly_article.aspx?id=27124

Automakers Show Interest in an Unusual Engine Design
The Scuderi engine could substantially improve fuel consumption by storing compressed air.
By Kevin Bullis

An engine development company called the Scuderi Group recently announced progress in its effort to build an engine that can reduce fuel consumption by 25 to 36 percent compared to a conventional design. Such an improvement would be roughly equal to a 50 percent increase in fuel economy.
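
The two percentages describe the same fact seen from opposite ends: fuel economy (miles per gallon) is the reciprocal of fuel consumption (gallons per mile). A small arithmetic check, added for clarity, makes the relationship explicit.

```python
# Why a 25-36% cut in fuel consumption is "roughly equal to a 50% increase in
# fuel economy": economy (miles per gallon) is the reciprocal of consumption
# (gallons per mile), so economy_gain = 1 / (1 - consumption_cut) - 1.

def economy_gain(consumption_cut: float) -> float:
    """Fractional increase in mpg implied by a fractional cut in fuel use."""
    return 1.0 / (1.0 - consumption_cut) - 1.0

for cut in (0.25, 0.33, 0.36):
    print(f"{cut:.0%} less fuel -> {economy_gain(cut):.0%} better fuel economy")
# 25% -> ~33%, 33% -> ~49%, 36% -> ~56%
```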

Sal Scuderi, president of the Scuderi Group, which has raised $65 million since it was founded in 2002, says that nine major automotive companies have signed nondisclosure agreements that allow them access to detailed data about the engine. Scuderi says he is hopeful that at least one of the automakers will sign a licensing deal before the year is over. Historically, major automakers have been reluctant to license engine technology because they prefer to develop the engines themselves as the core technology of their products. But as pressure mounts to meet new fuel-economy regulations, automakers have become more interested in looking at outside technology.

Although Scuderi has built a prototype engine to demonstrate the basic design, the fuel savings figures are based not on the performance of the prototype but on computer simulations that compare the Scuderi engine to the conventional engine in a 2004 Chevrolet Cavalier, a vehicle for which extensive simulation data is publicly available, Scuderi says. Since 2004, automakers have introduced significant improvements to engines, but these generally improve fuel economy in the range of something like 20 percent, compared to the approximately 50 percent improvement the Scuderi simulations show.

There's a big difference, however, between simulation results and data from engines in actual vehicles, says Larry Rinek, a senior consultant with Frost and Sullivan, an analyst firm. "So far things are looking encouraging—but will they really meet the lofty claims?" he says. Automakers should wait to see data from an actual engine installed in a vehicle before they license the technology, he says.

A conventional engine uses a four-stroke cycle: air is pulled into the chamber, the air is compressed, fuel is added and a spark ignites the mixture, and finally the combustion gases are forced out of the cylinder. In the Scuderi engine, known as a split-cycle engine, these functions are divided between two adjacent cylinders. One cylinder draws in air and compresses it. The compressed air moves through a tube into a second cylinder, where fuel is added and combustion occurs.

Splitting these functions gives engineers flexibility in how they design and control the engine. In the case of the Scuderi engine, there are two main changes from what happens in a conventional internal-combustion engine. The first is a change to when combustion occurs as the piston moves up and down in the cylinder. The second is the addition of a compressed-air storage tank.

In most gasoline engines, combustion occurs as the piston approaches the top of the cylinder. In the Scuderi engine, it occurs after the piston starts moving down again. The advantage is that the position of the piston gives it better leverage on the crankshaft, which allows the car to accelerate more efficiently at low engine speeds, saving fuel. The challenge is that, as the piston moves down, the volume inside the combustion chamber rapidly increases and the pressure drops, making it difficult to build up enough pressure from combustion to drive the piston and move the car.

The split-cycle design, however, allows for extremely fast combustion—three to four times faster than in conventional engines, Scuderi says—which increases pressure far faster than the volume expansion decreases it. He says that fast combustion is enabled by creating very high pressure air in the compression cylinder, and then releasing it into the combustion chamber at high velocities.

Having a separate air-compression cylinder makes it easy to divert compressed air into a storage tank, which can have a number of advantages.  For one thing, it's a way to address one problem with gasoline engines: they're particularly inefficient at low loads, such as when a car is cruising at moderate speeds along a level road. Under such conditions, the air intake in a conventional engine is partly closed to limit the amount of air that comes into the engine—"it's like sucking air in through a straw," Scuderi says, which makes the engine work harder.

In the new engine design, rather than shutting down air flow, the air intake is kept wide open, "taking big gulps of air," he says.  The air that's not needed for combustion is stored in the air tank. Once the tank is full, the compression piston stops compressing air. It's allowed to move up and down freely, without any significant load being put on the engine, which saves fuel. The air tank then feeds compressed air into the combustion chamber.

The air tank also provides a way to capture some of the energy from slowing down the car. As the car slows, the wheels drive the compression cylinder, filling up the air tank. The compressed air is then used for combustion as needed.

It is still far from clear whether the design can be a commercial success. Even if the simulation results translate into actual engine performance in a car, the engine may not prove to be easy and affordable to manufacture, Rinek says, especially with equipment in existing factories. The design will also have to compete with many other up-and-coming engine designs. Scuderi says the first application of the engine might not be in cars, but instead as a power generator, especially in applications where having compressed air on hand can be useful. For example, construction sites can require electricity for power saws and compressed air for nail guns.
 
Fusion energy has been 20 years away for the last 50 years, maybe these alternative approaches will finally yield success:

http://nextbigfuture.com/2011/01/magnetized-target-fusion.html#more

Magnetized Target Fusion

Discovery Magazine provides an update of the work at Los Alamos National Laboratory on magnetized target fusion

Wurden thinks his team has a shot at beating ITER to the break-even finish line, but only if he can scrounge up a little more cash. “We can’t do it with the funding we have now,” he says. “The Department of Energy sponsors all the magnetic fusion research in the country. Alternate projects like ours are at best about 10 percent of the budget, maybe $20 million divided among 10 universities and a couple of national labs.”

The Los Alamos work is similar to the work of General Fusion which we have covered extensively. We have also discussed the Los Alamos work before.

Even if NIF beats canned fusion to break-even, Wurden thinks his approach will be more practical in the long run. NIF’s lasers currently fire just two or three times a day. It takes 30 minutes just to position the fuel capsule. A commercial laser fusion reactor might have to fire about 15 times a second.

For a reality check on Shiva Star, I spoke with Jaeyoung Park, an experimental physicist who has taken leave from Los Alamos to join a small team in Santa Fe that is pursuing its own fusion research. His biggest concern is that Wurden may not be able to contain the deuterium plasma long enough. “It’s very difficult to squeeze the plasma uniformly—the squeezing has to be fast and furious,” Park says. “And heat losses might make it impossible for the plasma to achieve the high temperatures needed for fusion. But Glen is planning some significant experiments, and even if the first ones fail, the results should tell us something important.”

Wurden acknowledges those problems and brings up another for good measure. “How do you control millions of amps of current at thousands of volts?” he asks. “The switches we use are fancy things that work under high voltage. We can switch high currents maybe 20 times a day.” But a working fusion reactor based on Shiva Star would need to handle such currents once every 10 seconds.

Canada's General Fusion is also working on Magnetized Target Fusion, but with a different approach.

General Fusion’s magnetized target fusion reactor will incorporate a multipurpose liquid-metal lining to produce tritium, protect equipment from damage, and extract the heat that generates energy. The company hopes to achieve break-even by 2013.
 
Good news/bad news. The good news is these techniques may radically reduce the need for Americans to import oil. The bad news is we sell the Americans a lot of oil...

http://news.yahoo.com/s/ap/20110209/ap_on_re_us/us_shale_oil_3

New drilling method opens vast oil fields in US
By JONATHAN FAHEY, AP Energy Writer – Wed Feb 9, 3:20 pm ET

A new drilling technique is opening up vast fields of previously out-of-reach oil in the western United States, helping reverse a two-decade decline in domestic production of crude.

Companies are investing billions of dollars to get at oil deposits scattered across North Dakota, Colorado, Texas and California. By 2015, oil executives and analysts say, the new fields could yield as much as 2 million barrels of oil a day — more than the entire Gulf of Mexico produces now.

This new drilling is expected to raise U.S. production by at least 20 percent over the next five years. And within 10 years, it could help reduce oil imports by more than half, advancing a goal that has long eluded policymakers.

"That's a significant contribution to energy security," says Ed Morse, head of commodities research at Credit Suisse.
Oil engineers are applying what critics say is an environmentally questionable method developed in recent years to tap natural gas trapped in underground shale. They drill down and horizontally into the rock, then pump water, sand and chemicals into the hole to crack the shale and allow gas to flow up.

Because oil molecules are sticky and larger than gas molecules, engineers thought the process wouldn't work to squeeze oil out fast enough to make it economical. But drillers learned how to increase the number of cracks in the rock and use different chemicals to free up oil at low cost.

"We've completely transformed the natural gas industry, and I wouldn't be surprised if we transform the oil business in the next few years too," says Aubrey McClendon, chief executive of Chesapeake Energy, which is using the technique.

Petroleum engineers first used the method in 2007 to unlock oil from a 25,000-square-mile formation under North Dakota and Montana known as the Bakken. Production there rose 50 percent in just the past year, to 458,000 barrels a day, according to Bentek Energy, an energy analysis firm.

It was first thought that the Bakken was unique. Then drillers tapped oil in a shale formation under South Texas called the Eagle Ford. Drilling permits in the region grew 11-fold last year.

Now newer fields are showing promise, including the Niobrara, which stretches under Wyoming, Colorado, Nebraska and Kansas; the Leonard, in New Mexico and Texas; and the Monterey, in California.

"It's only been fleshed out over the last 12 months just how consequential this can be," says Mark Papa, chief executive of EOG Resources, the company that first used horizontal drilling to tap shale oil. "And there will be several additional plays that will come about in the next 12 to 18 months. We're not done yet."

Environmentalists fear that fluids or wastewater from the process, called hydraulic fracturing, could pollute drinking water supplies. The Environmental Protection Agency is now studying its safety in shale drilling. The agency studied use of the process in shallower drilling operations in 2004 and found that it was safe.

In the Bakken formation, production is rising so fast there is no space in pipelines to bring the oil to market. Instead, it is being transported to refineries by rail and truck. Drilling companies have had to erect camps to house workers.

Unemployment in North Dakota has fallen to the lowest level in the nation, 3.8 percent — less than half the national rate of 9 percent. The influx of mostly male workers to the region has left local men lamenting a lack of women. Convenience stores are struggling to keep shelves stocked with food.

The Bakken and the Eagle Ford are each expected to ultimately produce 4 billion barrels of oil. That would make them the fifth- and sixth-biggest oil fields ever discovered in the United States. The top four are Prudhoe Bay in Alaska, Spraberry Trend in West Texas, the East Texas Oilfield and the Kuparuk Field in Alaska.

The fields are attracting billions of dollars of investment from foreign oil giants like Royal Dutch Shell, BP and Norway's Statoil, and also from the smaller U.S. drillers who developed the new techniques like Chesapeake, EOG Resources and Occidental Petroleum.

Last month China's state-owned oil company CNOOC agreed to pay Chesapeake $570 million for a one-third stake in a drilling project in the Niobrara. This followed a $1 billion deal in October between the two companies on a project in the Eagle Ford.

With oil prices high and natural-gas prices low, profit margins from producing oil from shale are much higher than for gas. Also, drilling for shale oil is not dependent on high oil prices. Papa says this oil is cheaper to tap than the oil in the deep waters of the Gulf of Mexico or in Canada's oil sands.

The country's shale oil resources aren't nearly as big as the country's shale gas resources. Drillers have unlocked decades' worth of natural gas, an abundance of supply that may keep prices low for years. U.S. shale oil on the other hand will only supply one to two percent of world consumption by 2015, not nearly enough to affect prices.

Still, a surge in production last year from the Bakken helped U.S. oil production grow for the second year in a row, after 23 years of decline. This during a year when drilling in the Gulf of Mexico, the nation's biggest oil-producing region, was halted after the BP oil spill.

U.S. oil production climbed steadily through most of the last century and reached a peak of 9.6 million barrels per day in 1970. The decline since was slowed by new production in Alaska in the 1980s and in the Gulf of Mexico more recently. But by 2008, production had fallen to 5 million barrels per day.

Within five years, analysts and executives predict, the newly unlocked fields are expected to produce 1 million to 2 million barrels of oil per day, enough to boost U.S. production 20 percent to 40 percent. The U.S. Energy Information Administration estimates production will grow a more modest 500,000 barrels per day.

By 2020, oil imports could be slashed by as much as 60 percent, according to Credit Suisse's Morse, who is counting on Gulf oil production to rise and on U.S. gasoline demand to fall.

At today's oil prices of roughly $90 per barrel, slashing imports that much would save the U.S. $175 billion a year. Last year, when oil averaged $78 per barrel, the U.S. sent $260 billion overseas for crude, accounting for nearly half the country's $500 billion trade deficit.
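
A quick back-of-envelope check of that savings figure: $175 billion a year at roughly $90 a barrel works out to about 5.3 million barrels a day of avoided imports, consistent with a cut of roughly 60 percent. The 9 million barrel-a-day import baseline used below is an assumption for illustration, not a number from the article.

```python
# Back-of-envelope check of the savings figure above. The ~9 million bbl/day
# import baseline is an assumption for illustration, not from the article.

price_per_barrel = 90.0    # dollars, from the article
annual_savings = 175e9     # dollars per year, from the article

avoided_bbl_per_day = annual_savings / price_per_barrel / 365.0
print(f"avoided imports: {avoided_bbl_per_day / 1e6:.1f} million barrels/day")  # ~5.3

assumed_import_baseline = 9e6  # bbl/day, assumed
print(f"share of assumed baseline: {avoided_bbl_per_day / assumed_import_baseline:.0%}")  # ~59%
```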

"We have redefined how to look for oil and gas," says Rehan Rashid, an analyst at FBR Capital Markets. "The implications are major for the nation."
 
Thucydides said:
Good news/bad news. The good news is these techniques may radically reduce the need for Americans to import oil. The bad news is we sell the Americans a lot of oil...

http://news.yahoo.com/s/ap/20110209/ap_on_re_us/us_shale_oil_3


There is not, really, any bad news. The Americans may well be able to satisfy more of their domestic demand from their own, domestic sources but global demand for oil continues to rise, meaning that there will still be plenty of people who want to buy our oil at whatever price the market will bear.
 
Further to Thucydides' post, the Canadian connection.

There is a TV program coming up down here about the North Dakota boom and the complications for the local residents.

http://www.ctv.ca/generic/generated/static/business/article1898055.html

Drilling technology sparks new oil boom

SHAWN McCARTHY

Gary Williams recalls the last time the oil industry showed up in his tiny town of Waskada, Man. Crews punched holes in the prairie ground, then disappeared as suddenly as they arrived when those holes came up empty.

But that was 30 years ago. This time, it’s different. Armed with new drilling technology and eager to reap the rewards of oil’s high prices, companies are tapping complex geological formations, and the crude is flowing, adding Manitoba to Canada’s list of significant oil-producing provinces.

“It’s just a huge boost for the economy in the area,” said Mr. Williams, the town’s mayor. “We were sending our young people to Alberta for the last 10 years and now the trend is reversing and we’re seeing a lot of Alberta people here and some of our people are coming back.”

The oil-drilling boom promises what one company executive calls a “quiet revolution” in the industry. It could reduce the U.S. appetite for imported oil – including, potentially, from the oil sands. And the technological breakthrough could put the brakes on future price increases by bringing new, relatively low-cost supplies to the market – not just in North America but around the world.

Waskada, population 225 and just a few kilometres away from the U.S. border, is on the northern fringe of the prolific Bakken field, a booming unconventional oil play that could soon make North Dakota the second-largest oil-producing state after Texas. The rapid development of the Bakken – which is now producing 350,000 barrels a day – signals a dramatic new chapter in the North American oil industry, where conventional, onshore production was recently considered to be in terminal decline.

As energy companies turned away from low-priced gas, onshore oil production in the United States began reversing a 30-year decline last year. Some analysts project so-called tight-oil plays could contribute two million barrels a day of production by the middle of the decade – nearly as much as current oil sands production.

“It could potentially be a real game changer,” said Peter Tertzakian, chief energy economist at Calgary-based ARC Financial Corp.

“Peak oil in North America is likely not to be peak” given $90 per barrel prices and new technology that makes it easier to recover oil, he said.

Drill crews are being deployed across Western Canada and the United States, tapping new formations or, in many cases, reworking old ones that were first brought on stream in their grandfathers’ time.

Oil companies are adapting the same advanced drilling techniques that created the boom in shale gas: horizontal drilling and multistage hydraulic fracturing that allow them to break open the rocks at various points and capture the hydrocarbons trapped within.

The other key factor in the tight-oil boom is a high oil price, as North American crude is trading around $90 (U.S.) a barrel and international grades, near $100.

“High oil prices are definitely driving this thing,” said Stephen Sonnenberg, a leading geologist at the Colorado School of Mines. “Gas prices are suppressed, oil prices are quite high and everybody is really excited about these tight oil plays.”

Growing U.S. oil production would not have the same deflationary impact on prices that the shale gas boom has had – natural gas is a North American commodity and more sensitive to continental factors, while oil prices are set on global markets.

Development of tight-oil projects will reduce the United States’ reliance on crude oil imports from the Middle East and other OPEC sources as well as Canada, meaning producers in those countries will have to look to other markets to sell their oil. Canadian companies are already attempting to increase exports in the face of stagnant American demand; rising U.S. production will put even greater pressure on them to find new markets in Asia. And it could even delay investments in more costly and challenging Arctic fields, particularly if the companies use new drilling technology to boost production around the globe.

But the tight-oil boom is also reviving the fortunes of Canadian independents who expect to squeeze considerably more oil from formations that, until very recently, were viewed as nearly played out. And it is creating another revenue stream for Alberta and Saskatchewan, and to a lesser degree, Manitoba and British Columbia.

Still, there are major challenges to achieving the much-touted production potential. As with shale gas, there are doubts over whether the production volumes can be sustained, given how quickly output from individual wells declines.

As well, the U.S. Environmental Protection Agency is reviewing the use of hydraulic fracturing in the gas industry, and the tight-oil development may well be constrained by regulators. The EPA is addressing widespread fears about the impact on local drinking water resources from the hydraulic fracturing – in which chemical-laced water is shot into rock to pry open cracks and let the hydrocarbons flow.

Companies also need to marshal an army of drilling crews and equipment to develop the fields, and will require massive investment in new pipelines to get the crude to market.

But the boom is already in full swing.

The Waskada field is tiny compared with the Bakken. Still, drilling crews have invaded the thinly populated border area, and Manitoba Energy Minister Dave Chomiak says the province could soon be producing about 50,000 barrels a day of crude, though he admits his more cautious officials forecast 40,000.

Among the handful of companies active around Waskada is Calgary-based Penn West Exploration Ltd. It plans to spend up to $175-million to drill 100 wells in the area, part of a $1-billion capital plan that is focused on tight-oil plays across Western Canada.

Penn West chief executive officer Bill Andrew said it is still too early to know how much production can be squeezed out of the rocks using the horizontal drilling and multistage hydraulic fracturing techniques that have transformed the gas industry.

“It’s not being appreciated in Canada because we have a view that it is all about the oil sands or all about shale gas,” he said. “But the quiet revolution is in tight oil.”

He compares the potential growth pattern to the early days of development in the Western Canadian sedimentary basin. Saskatchewan’s Shaunavon field was discovered in the 1950s but took decades to develop. In the past few years, though, there have been over 200 wells drilled and more than 10 million barrels of oil produced.

“And the big story is, it is not even close to being drilled to its potential,” he said. “On all these fields, we’re still doing the front-end, early-stage delineation. … It seems like the [drilling] application is adaptable – it’s adaptable to multi zones, multi areas, multi jurisdictions.”

In Canada, analysts are still trying to come to grips with the potential for the new drilling technology to boost production from previously conventional plays.

But in the United States, there are some early forecasts. Cambridge Energy Research Associates issued a forecast late last month suggesting tight-oil production could reach two million barrels a day by 2016.

Analysts from Wood Mackenzie Ltd. are somewhat more cautious – forecasting U.S. tight-oil production of 1.6 million barrels a day by 2015, and growing from there.

In addition to the Bakken, companies are targeting Texas’s Eagle Ford play, which produces both gas and oil, the Colorado-centred Niobrara, and several others in California, Texas and Oklahoma.

Oil drilling has soared. The number of crews in the United States drilling for oil hit 818 last week, a 23-year high and an 83-per-cent increase from early February, 2010.

Wood Mackenzie analyst Matthew Jurecky said the big tight-oil projects are attracting significant investment capital, including acquisitions by foreign multinationals.

Unlike shale gas, which can be uneconomic at low prices, the tight-oil plays are relatively inexpensive to develop, compared with the oil sands or the ultra-deep water wells. Mr. Jurecky said the leading projects are economical at oil prices below $50 (U.S.) a barrel. At $90, companies expect very attractive rates of return.

“In plays like the Niobrara, expectations are high and money has been put in place for large-scale development there,” Mr. Jurecky said in an interview. “As well, there’s been lots of M&A [mergers and acquisitions] capital, suggesting a high degree of confidence.”

With low gas prices, many natural gas producers – including Canadian companies like Encana Corp. and Talisman Energy Inc. – are shifting their targets to the “wet gas” zones of the Marcellus, Eagle Ford and other shale gas fields. Natural gas liquids – which are counted in U.S. oil production figures – contain many of the components of crude oil but have only 60 to 70 per cent of the heat value.
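Because NGLs carry only 60 to 70 per cent of crude's heat value, counting them barrel-for-barrel in oil production figures overstates the energy actually being added. A minimal Python sketch of the energy-basis adjustment (the production split below is purely illustrative, not a number from the article):

[code]
# Convert a mixed liquids stream (crude + NGL) into crude-oil-equivalent
# barrels on an energy basis, using the 60-70% heat-value range above.
def oil_equivalent(crude_bbl, ngl_bbl, ngl_heat_ratio=0.65):
    """Crude-oil-equivalent barrels for a mixed liquids stream.

    ngl_heat_ratio is the fraction of crude's heat value in a barrel of
    NGL; the article quotes 60-70 per cent, so 0.65 is a midpoint guess.
    """
    return crude_bbl + ngl_bbl * ngl_heat_ratio

# Illustrative only: 250,000 bbl/day of "liquids", of which a
# hypothetical 100,000 bbl/day is NGL rather than crude.
print(oil_equivalent(150_000, 100_000))        # 215,000 bbl/day equivalent
print(oil_equivalent(150_000, 100_000, 0.60))  # 210,000 at the low end
[/code]

The point is simply that headline "liquids" numbers, like the targets mentioned below, are not directly comparable to crude output on an energy basis.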

One of the leaders in the tight-oil boom is Chesapeake Energy Corp., the Oklahoma City-based company that was a prime mover in the development of shale gas.

Chesapeake is shifting its focus away from gas to projects that produce oil or natural gas liquids (NGL). The company expects to increase its liquids production from 49,000 barrels a day currently to 250,000 barrels a day by 2015, which would make it one of the top five producers in the United States.

And foreign oil companies have taken note. The Chinese state-owned oil company Chinese National Offshore Oil Company (CNOOC) has bought a one-third interest in Chesapeake’s acreage in the Eagle Ford and Niobrara for $3.5-billion (U.S.).

Mr. Jurecky said the current investment will be the tip of the iceberg if the tight-oil plays prove as prolific and lucrative as many believe they will be.


 