
A scary strategic problem - no oil

Thucydides said:
There are some direct conversion schemes that are theoretically possible with nuclear fusion, but depend on using exotic reactions like 3He+3He or p+11B, which are far more difficult to initiate than D+D, and may not be technically possible for years to come.

Let alone the fact that helium-3's rare enough that the only schemes I've seen in fifteen years of reading about it involve mining it in outer space. It's cool, but...
 
Thucydides said:
High energy density allows you to extract a lot of usable energy even when (as in a car) up to 66% of the energy is flowing out the tailpipe and radiator. Low energy density means you need a ton of batteries to go 40 miles.

Even with nuclear fusion, much of the energy will go to "waste heat" turning water to steam; any thermal energy plant can only get a maximum of about 40% of the available energy regardless of the heat source (cow dung or nuclear fusion). There are some direct conversion schemes that are theoretically possible with nuclear fusion, but depend on using exotic reactions like 3He+3He or p+11B, which are far more difficult to initiate than D+D, and may not be technically possible for years to come.

I think we can both agree that hydrocarbon energy density is ludicrously low compared to nuclear fusion.  Even if you had to distribute that energy via some sort of cell system, it would still prove to be a more sustainable approach.  That said, it's gonna be a bitch to find a cycle of reactions that simultaneously has low Z and high reaction cross sections, and it would probably need multiple chains.  Not an easy combo.
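For a rough sense of the gap, here is a back-of-envelope comparison of specific energies (the figures are standard reference values, not anything from this thread, and the battery number is just a typical lithium-ion pack):

    # Rough specific energies (energy per unit mass of fuel), standard reference values.
    MJ_PER_KG = {
        "lithium-ion battery": 0.7,     # ~200 Wh/kg
        "gasoline":            45.0,    # lower heating value
        "D-T fusion fuel":     3.4e8,   # 17.6 MeV per D+T reaction over ~5 amu of fuel
    }

    for fuel, e in MJ_PER_KG.items():
        print(f"{fuel:22s} {e:12.3g} MJ/kg  ({e / 45.0:10.3g} x gasoline)")

    # Even after throwing away ~60% of the heat in a thermal cycle, fusion fuel
    # still carries roughly seven orders of magnitude more usable energy per
    # kilogram than a hydrocarbon.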

What will be interesting to see is if those high efficiency nanomaterials that directly convert radiation into electricity ever live up to their hype.  Couple that with a clean gamma source and we're sitting pretty.
 
Here's something that would make gas stations look a bit more seedy:

Forget gas, batteries — pee is new power source
http://www.msnbc.msn.com/id/31805166/

Scientists can create cheap hydrogen from urine for use in fuel cells
By Eric Bland

updated 5:34 p.m. ET, Wed., July 8, 2009
Urine-powered cars, homes and personal electronic devices could be available in six months with new technology developed by scientists from Ohio University.

Using a nickel-based electrode, the scientists can create large amounts of cheap hydrogen from urine that could be burned or used in fuel cells. "One cow can provide enough energy to supply hot water for 19 houses," said Gerardine Botte, a professor at Ohio University developing the technology. "Soldiers in the field could carry their own fuel."

Pee power is based on hydrogen, the most common element in the universe but one that has resisted efforts to produce, store, transport and use economically.

Storing pure hydrogen gas requires high pressure and low temperature. New nanomaterials with high surface areas can adsorb hydrogen, but have yet to be produced on a commercial scale.

Chemically binding hydrogen to other elements, like oxygen to create water, makes it easier to store and transport, but releasing the hydrogen when it's needed usually requires financially prohibitive amounts of electricity.

By attaching hydrogen to another element, nitrogen, Botte and her colleagues realized that they can store hydrogen without the exotic environmental conditions, and then release it with less electricity, 0.037 Volts instead of the 1.23 Volts needed for water.

One molecule of urea, a major component of urine, contains four atoms of hydrogen bonded to two atoms of nitrogen. Stick a special nickel electrode into a pool of urine, apply an electrical current, and hydrogen gas is released.
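To put the voltage figures above in perspective, Faraday's law sets the minimum electrical energy needed per kilogram of hydrogen at a given cell voltage. This sketch just plugs in the numbers quoted in the article (1.23 V for water and the article's 0.037 V figure for urea); it says nothing about the actual efficiency of Botte's cell:

    # Minimum electrical energy to liberate hydrogen at a given cell voltage.
    # Faraday's law: two electrons are transferred per H2 molecule
    # (true for both water and urea electrolysis).
    F = 96485.0          # Faraday constant, coulombs per mole of electrons
    M_H2 = 2.016e-3      # kg per mole of H2

    def kwh_per_kg_h2(cell_volts):
        joules_per_mol = 2 * F * cell_volts     # 2 e- per H2
        return joules_per_mol / M_H2 / 3.6e6    # J/kg -> kWh/kg

    for label, volts in [("water electrolysis", 1.23),
                         ("urea electrolysis (article figure)", 0.037)]:
        print(f"{label:35s} {volts:5.3f} V  ->  {kwh_per_kg_h2(volts):6.2f} kWh per kg H2")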

Botte's current prototype measures 3x3x1 inch and can produce up to 500 milliwatts of power. However, Botte and her colleagues are actively trying to commercialize several larger versions of the technology.

A fuel cell, urine-powered vehicle could theoretically travel 90 miles per gallon. A refrigerator-sized unit could produce one kilowatt of energy for about $5,000, although this price is a rough estimate, says Botte.

"The waste products from say a chicken farm could be used to produce the energy needed to run the farm," said John Stickney, a chemist and professor at the University of Georgia.

 
-... and what produces urine? BEER!!

- Whoever in the past said that we couldn't drink our way out of a problem...

- Full Disclosure: I no longer drink.
 
TCBF said:
-... and what produces urine? BEER!!

- Whoever in the past said that we couldn't drink our way out of a problem...

- Full Disclosure: I no longer drink.

Yer in a world of hurt then.  It'll take you longer to produce a tank of ______________












Then again, you could be a real good buddy and be the Designated Driver on Friday to your friends in the Mess and ask for payment in _______________ in lieu of cash....  >:D
 
Working the ultimate energy source for the biosphere, photosynthesis:

http://nextbigfuture.com/2009/07/project-to-re-engineer-photosynthesis.html

Project to Re-engineer Photosynthesis in Rice

An ambitious project to re-engineer photosynthesis in rice, led by the International Rice Research Institute (IRRI) through a global consortium of scientists, has received a grant of US$11 million over 3 years from the Bill & Melinda Gates Foundation. As a result of research being conducted by this group, rice plants that can produce 50% more grain using less fertilizer and less water are a step closer to reality.

Currently, more than a billion people worldwide live on less than a dollar a day and nearly one billion live in hunger. Over the next 50 years, the population of the world will increase by about 50% and water scarcity will grow. About half of the world’s population consumes rice as a staple cereal, so boosting its productivity is crucial to achieving long-term food security.

Photosynthesis, the process by which plants use solar energy to capture carbon dioxide and convert it into the carbohydrates required for growth, is not the same for all plants. Some species, including rice, have a mode of photosynthesis (known as C3) in which the capture of carbon dioxide is relatively inefficient. Other plants, such as maize and sorghum, have evolved a much more efficient form of photosynthesis known as C4.

According to IRRI scientist and project leader Dr. John Sheehy, in tropical climates the efficiency of solar energy conversion of crops using the so-called C4 photosynthesis is about 50% higher than that of C3 crops. Given the demands from an increasing population, combined with less available land and water, adequate future supplies of rice will need to come in large part through substantial yield boosts and more efficient use of crop inputs.

“Converting the photosynthesis of rice from the less-efficient C3 form to the C4 form would increase yields by 50%,” said Dr. Sheehy, adding that C4 rice would also use water twice as efficiently. In developing tropical countries, where billions of poor people rely on rice as their staple food, “The benefits of such an improvement in the face of increasing world population, increasing food prices, and decreasing natural resources would be immense,” he added.
 
Improving the efficiency of the refining process will increase the amount of usable product out of the same amount of input:

http://nextbigfuture.com/2009/07/rive-technology-working-to-increase-oil.html

Rive Technology Working to Increase Oil Refining Efficiency 7-9% by 2011

Holey catalyst: Rive Technology is designing a zeolite catalyst with pores larger than those found in conventional zeolites, which are widely used in petroleum and petrochemical production. The larger pores allow the catalysts to handle a wide range of compounds. Credit: Rive Technology

Rive Technology will help refiners increase production of transportation fuels and process less desirable crudes with its innovative catalyst technology. Mesopores (>4 nanometers) in zeolite enable larger molecules to be cracked. Petroleum refiners would obtain a higher yield of desirable products such as gasoline, diesel fuel, and propylene, and less of undesirable products like heavy cycle oil and coke.

"By the end of the year, we hope to have hit upon the optimum mix of these things," says Dougherty. "We hope to be in commercial refineries in the second half of 2011." The plan is to license the recipe to commercial manufacturers of petroleum catalysts, such as BASF or W.R. Grace.

Rive’s proprietary catalyst – RiveCat – is focused on the most important conversion process in the refinery – fluid catalytic cracking (FCC). The FCC process converts or “cracks” the long-chain hydrocarbons found in crude oil into smaller, more valuable molecules such as those that comprise transportation fuels.

RiveCat is more accessible to the bulky hydrocarbon molecules found in FCC feedstock, allowing more of the feedstock to get “cracked”, especially when processing low quality crudes. As a result, refiners produce a more valuable slate of products from a barrel of crude and increase throughput in the refinery, leading to higher profit margins. Refiners are also able to purchase cheaper, lower quality crudes and process them economically.

Refiners can utilize RiveCat without significant capital investment or changes in operating conditions, allowing them to immediately improve refining yields and profits.


MIT Technology Review has details.

Andrew Dougherty, vice president of operations at Rive, says that the catalyst could increase the proportion of petroleum processed by as much as 7 to 9 percent.

The company's technology is based on zeolites--tiny pore-studded particles made of a mix of aluminum, oxygen, and silicon that are a mainstay of the petroleum and petrochemical industries. Heated and mixed in with crude petroleum, zeolites act as a catalyst, breaking apart the complex hydrocarbon molecules of crude into simpler hydrocarbons that make gasoline, diesel, kerosene, and other desirable products in the process known as fluid catalytic cracking. By making zeolites with pores larger than those in conventional ones, Rive hopes to create catalysts that handle a higher proportion of hydrocarbons.

Typically, the openings of pores in zeolites are less than a nanometer wide, which limits the range of hydrocarbon that can get into the porous catalysts. But Javier Garcia Martinez, a cofounder of Rive and now a professor at the University of Alicante, in Spain, came up with a way to control the size of the openings while working as a postdoctoral fellow at MIT's Nanostructured Materials Research Laboratory. He mixes the constituents of the zeolites in an alkaline solution, then adds a surfactant--a soaplike liquid. The surfactant makes bubbles, and the zeolites form around the bubbles. Then he burns away the surfactant, leaving behind zeolites with openings two to five nanometers wide--big enough to let in larger hydrocarbon molecules. By varying the chemistry of the surfactant, Garcia Martinez can control the size of the pore openings.

Dougherty also sees Rive's zeolites being used in hydrocracking, a refining technique that employs high-pressure hydrogen to create a low-sulfur diesel. Hydrocracking is a small market, but with the U.S. Environmental Protection Agency trying to reduce sulfur emissions, it's a growing one, he says. With its ability to choose pore size, the company might also make catalysts for processing tar sands, which contain extremely dense petroleum. Further down the road, the material might also be used to process biofuels, according to the company.
 
Another biofuel scheme. Once again, we need to see how this actually scales up:

http://www.technologyreview.com/business/23073/

A Biofuel Process to Replace All Fossil Fuels

A startup unveils a high-yield process for making fuel from carbon dioxide and sunlight.
By Kevin Bullis

A startup based in Cambridge, MA--Joule Biotechnologies--today revealed details of a process that it says can make 20,000 gallons of biofuel per acre per year. If this yield proves realistic, it could make it practical to replace all fossil fuels used for transportation with biofuels. The company also claims that the fuel can be sold for prices competitive with fossil fuels.

Joule Biotechnologies grows genetically engineered microorganisms in specially designed photobioreactors. The microorganisms use energy from the sun to convert carbon dioxide and water into ethanol or hydrocarbon fuels (such as diesel or components of gasoline). The organisms excrete the fuel, which can then be collected using conventional chemical-separation technologies.

If the new process, which has been demonstrated in the laboratory, works as well on a large scale as Joule Biotechnologies expects, it would be a marked change for the biofuel industry. Conventional, corn-grain-based biofuels can supply only a small fraction of the United States' fuel because of the amount of land, water, and energy needed to grow the grain. But the new process, because of its high yields, could supply all of the country's transportation fuel from an area the size of the Texas panhandle. "We think this is the first company that's had a real solution to the concept of energy independence," says Bill Sims, CEO and president of Joule Biotechnologies. "And it's ready comparatively soon."
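The Texas panhandle comparison is easy to sanity-check. Assuming roughly 180 billion gallons of gasoline and diesel burned per year in the US (a ballpark figure, not from the article), the land area implied by a 20,000 gallon-per-acre yield works out like this:

    # Sanity check on the land-area claim.  US transportation fuel demand
    # below is an assumed ballpark figure, not from the article.
    YIELD_GAL_PER_ACRE_YR = 20_000        # Joule's claimed yield
    US_TRANSPORT_FUEL_GAL_YR = 180e9      # assumed: ~180 billion gal/yr gasoline + diesel
    ACRES_PER_SQ_MILE = 640

    acres_needed = US_TRANSPORT_FUEL_GAL_YR / YIELD_GAL_PER_ACRE_YR
    sq_miles = acres_needed / ACRES_PER_SQ_MILE

    print(f"Acres needed:        {acres_needed:,.0f}")     # ~9 million acres
    print(f"Square miles needed: {sq_miles:,.0f}")         # ~14,000 sq mi

    # The Texas panhandle covers roughly 26,000 square miles, so the article's
    # comparison holds if the 20,000 gal/acre yield is ever achieved at scale.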

The company plans to build a pilot-scale plant in the southwestern U.S. early next year, and it expects to produce ethanol on a commercial scale by the end of 2010. Large-scale demonstration of hydrocarbon-fuels production would follow in 2011.

So far, the company has raised "substantially less than $50 million," Sims says, from Flagship Ventures and other investors, including company employees. The firm is about to start a new round of financing to scale up the technology.

The new approach would also be a big improvement over cellulose-based biofuels. Cellulosic materials, such as grass and wood chips, could yield far more fuel per acre than corn, and recent studies suggest these fuel sources could replace about one-third of the fossil fuels currently used for transportation in the United States. But replacing all fossil fuels with cellulose-based biofuels could be a stretch, requiring improved growing practices and a vast improvement in fuel economy.

Algae-based biofuels come closest to Joule's technology, with potential yields of 2,000 to 6,000 gallons per acre; even so, the new process would represent an order of magnitude improvement. What's more, for the best current algae fuels technologies to be competitive with fossil fuels, crude oil would have to cost over $800 a barrel, says Philip Pienkos, a researcher at the National Renewable Energy Laboratory in Golden, CO. Joule claims that its process will be competitive with crude oil at $50 a barrel. In recent weeks, oil has sold for $60 to $70 a barrel.

Joule's process seems very similar to approaches that make biofuels using algae, although the company says it is not using algae. The company's microorganisms can be grown inside transparent reactors, where they're circulated to ensure that they all get exposed to sunlight, and they are fed concentrated carbon dioxide--which can come from a power plant, for example--and other nutrients. (The company's bioreactor is a flat panel with an area about the size of a sheet of plywood.) While algae typically produce oils that have to be refined into fuels, Joule's microorganisms produce fuel directly--either ethanol or hydrocarbons. And while oil is harvested from algae by collecting and processing the organisms, Joule's organisms excrete the fuel continuously, which could make harvesting the fuel cheaper.

David Berry, one of the company's founders and a board member, says the organism they use was selected and modified to work well in a bioreactor, and the bioreactor was designed with the specific organism in mind. He adds that the company carefully considered issues such as the organism's response to heat, and the reactor was built to keep the heat within bearable limits. Overheating has been a problem with bioreactors in the past.

The company will likely face many challenges as it attempts to scale up its process. Other companies, such as Green Fuels, have failed to produce biofuels economically in bioreactors because of the high cost of the reactors compared to the amount of fuel produced. Another challenge is keeping the microorganisms producing fuel at a steady rate. Algae populations can bloom and grow so quickly that they outrun the supply of nutrients or sunlight, leading to a collapse of the population, says Jim Barber of Barber Associates, who was formerly CEO of Metabolix, which produces chemicals from renewable resources. "You get a burst and then they all die off," he says.

Joule Biotechnologies will also face stiff competition. It is not the only company developing photosynthetic organisms that excrete fuel. Synthetic Genomics, which recently announced a research partnership with ExxonMobil, has developed organisms that excrete fuel, as has Algenol, which recently announced a partnership with Dow.



 
Like I said near the beginning of this thread, the end of oil does not mean the end of energy, simply that economic pressures will result in a changeover to something new. In the 1500's, for example, England was rapidly approaching "Peak Forest" as demand for wood rapidly outstripped the supply of forests. (For Elizabethan Englishmen, the regeneration time of a forest was equal to about two or three generations, so forests were effectively non-renewable.) England turned to coal to replace charcoal and wood as heating fuels.

In the mid 1800's, the United States was experiencing "Peak Whale", as demand for whale oil outstripped the supply of whales, but Americans turned to fossil fuels to replace whale oil.

Looking over the thread, I notice that new processes and devices that use current energy sources more efficiently are being spurred by market forces, new supplies of traditional oil are coming on stream in response to higher prices, and alternatives are becoming viable as demand increases (new processes for oil sands, shale oil, bio oils and methane hydrates will provide hydrocarbon energy for centuries to come). New technologies are also making nuclear fission power more affordable and bringing nuclear fusion energy closer to feasibility at long last. Almost none of these factors are driven by governments or bureaucracies: the market for SUVs was effectively killed in one month in the summer of 2008 when oil rose past $147/bbl, not by government regulations or CAFE standards, which take a decade or more to flow through the economy due to capital turnover (i.e., old cars stay on the road despite new standards).

Maybe what we really need to ask is "what will life be like with vast amounts of cheap energy?"

http://nextbigfuture.com/2009/08/mr-fusion-scenario-what-if-there-is.html

Mr Fusion Scenario : What if there is cheap and abundant Nuclear Fusion Power ?
What if Nuclear Fusion Power became cheap and abundant ?

Note: several technologies that could work out for providing commercial nuclear fusion would not lead to cheap and abundant nuclear fusion. They would have power that is about the same price as current 3rd generation nuclear fission. The regular ITER project is such a system. For low cost and more availability, there needs to be factory mass produced nuclear fusion generators. There are designs for factory mass produced deep burn (burn most of the fuel) nuclear fission which could be cheaper than many forms of nuclear fusion. Cheap nuclear power needs to be as common as small planes. Production volumes need to be a few thousand per year or more.

How it could happen and how cheap could the energy be?

1. Lawrenceville Plasma Physics succeeds as they have described. A Focus Fusion reactor would produce electricity very differently. The energy from fusion reactions is released mainly in the form of a high-energy, pulsed beam of helium nuclei. Since the nuclei are electrically charged, this beam is already an electric current. All that is needed is to capture this electric energy into an electric circuit. This can be done by allowing the pulsed beam to generate electric currents in a series of coils as it passes through them. This is much the same way that a transformer works, stepping electric power down from the high voltage of a transmission line to the low voltage used in homes and factories. It is also like a particle accelerator run in reverse. Such an electrical transformation can be highly efficient, probably around 70%. What is most important is that it is exceedingly cheap and compact. The steam turbines and electrical generators are eliminated. A 5 MW Focus Fusion reactor may cost around $300,000 and produce electricity for 1/10th of a cent per kWh. This is fifty times less than current electric costs. Fuel costs will be negligible because a 5 MW plant will require only five pounds of fuel per year. [About 40 million kWh per year from a 5 MWe plant, and 5 MWe is equal to 6705 horsepower; a rough cross-check of these cost figures appears after the list of scenarios below.]

2. Inertial electrostatic confinement (bussard/IEC) fusion is targeting commercialization at 2-5 cents per kWh.

However, many people can make the simple fusor technology which is being scaled up. Material and components costs go up, but future manufacturing capability (nanofactories) and superconductor technology could make the full commercial scale IEC fusion reactors cheap. A 100 MWe reactor for $6 million would be comparable to the Focus Fusion reactor scenario. The hobbyist nature of the simple fusor suggests that even though the high power systems would involve a lot more safety issues and costs, reasonably skilled and dedicated teams of engineers would be able to replicate any IEC fusor success.

3. DARPA had a funded project for Chip-Scale High Energy Atomic Beams.

Develop 0.5 MeV [mega electron-volt] proton beams and collide onto microscale B-11 target with a fusion Q (energy ratio) > 20, possibly leading to self-sustained fusion. (Interpolation; this means a fully functional fusion powerplant capable of powering a car or light truck would be about the size of a deck of cards or pack of cigarettes including associated systems.)

There is progress towards a 1 meter long 10 GeV particle accelerator using plasma wakefield technology.

If the distance and power scaling were linear, then a 1 millimeter long system would generate 10 MeV particles. You would then need to work out miniaturizing the laser system and the targets. Laser technology is advancing quickly, and better targets could be made with advancing nanotechnology.
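As flagged in scenario 1 above, the Focus Fusion cost claim can be roughly cross-checked by amortizing the quoted $300,000 capital cost over the plant's output. The 20-year life and 90% capacity factor below are my assumptions, not Lawrenceville's:

    # Does $300,000 of capital spread over the plant's lifetime output land
    # near the claimed 0.1 cent/kWh?  Plant life and capacity factor are assumptions.
    CAPITAL_COST_USD = 300_000
    POWER_MW = 5.0
    CAPACITY_FACTOR = 0.9          # assumed
    PLANT_LIFE_YEARS = 20          # assumed

    kwh_per_year = POWER_MW * 1000 * 8760 * CAPACITY_FACTOR
    lifetime_kwh = kwh_per_year * PLANT_LIFE_YEARS
    cents_per_kwh = CAPITAL_COST_USD / lifetime_kwh * 100

    print(f"Annual output:      {kwh_per_year / 1e6:.1f} million kWh")   # ~39 million, close to the quoted 40
    print(f"Capital cost alone: {cents_per_kwh:.3f} cents/kWh")          # ~0.04 cents, leaving room for operations and fuel
    print(f"5 MW in horsepower: {POWER_MW * 1e6 / 745.7:.0f} hp")        # ~6705 hp, matching the bracketed note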

Cheap and Abundant Access to Space

IEC fusion at the 2 cents per kWh level would provide $27/kg single-stage access to orbit.
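For context on that $27/kg figure, the loss-free energy needed to put a kilogram into low Earth orbit costs only a few tens of cents of electricity at 2 cents/kWh; the rest of the $27 presumably covers propellant production and vehicle inefficiencies. A rough floor, assuming a 300 km orbit:

    # Ideal (loss-free) energy cost to low Earth orbit at a given electricity
    # price.  This is a floor, not a prediction; orbit altitude is assumed.
    ORBITAL_VELOCITY_M_S = 7800.0     # approximate LEO orbital speed
    ALTITUDE_M = 300e3                # assumed 300 km orbit
    G = 9.81
    ELECTRICITY_USD_PER_KWH = 0.02    # the 2 cents/kWh IEC fusion target

    kinetic = 0.5 * ORBITAL_VELOCITY_M_S ** 2     # J per kg
    potential = G * ALTITUDE_M                    # crude constant-g approximation
    kwh_per_kg = (kinetic + potential) / 3.6e6

    print(f"Ideal energy to LEO: {kwh_per_kg:.1f} kWh/kg")                                   # ~9.3 kWh/kg
    print(f"Ideal energy cost:   ${kwh_per_kg * ELECTRICITY_USD_PER_KWH:.2f}/kg")            # ~$0.19/kg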

This kind of single stage to orbit ship would still cost $2-5 billion. High availability of cheap graphene, carbon nanotube or diamondoid or nanofactory capability would greatly reduce the costs and simplify the production of the spaceship because of superior materials and manufacturing.

Easy access to space with a lot of high powered ships and equipment means easy space mining.

One NASA report estimates that the mineral wealth of the asteroids in the asteroid belt might exceed $100 billion for each of the six billion people on Earth.

Fully developing the capabilities of nuclear fusion and nanofactories and accessing these resources in the solar system is the end of scarcity scenario.

Also, use "mundane singularity" technologies like cement-jet printing of buildings.

Nuclear Bombs and Weapons would be Easy

If you have a nuclear fusion generator then you can generate a lot of neutrons. With a lot of neutrons you can transmute uranium isotopes.

Non-electric uses for nuclear fusion.

If you have fusion powered transportation around the solar system, then you can make all kinds of kinetic energy weapons, i.e. bombarding things with accelerated asteroids.

So What Would Be Safe ?

Live in the cheap mobile fusion spaceships.

Have ones big enough for a few tens of thousands of people or move around in fleets.

Use metamaterials (invisibility) or at least alter the albedo (space camouflage) to make them harder to spot. (The solar system is a big place; we are still spotting objects bigger than Pluto at about the distance of Pluto.)

Initially the hard to spot spaceships would be like nuclear missile submarines now, your deterrent force, but eventually a large fraction of the population would be mobile in the solar system for commerce and for safety. There would also be less strategic purpose in going after those people who were still on Earth.

In the long range scenario with nanofactories and cheap fusion, you could not just manufacture big ships with rotating sections for gravity and carrying plenty of supplies, but you would also have the manufacturing capability and resources to make decoy/redundant ships/colonies. The fully capable redundant ships would also be places to move to if for some reason some of the primary ships were damaged.
 
Using ultracapacitors to replace or supplement batteries:

http://www.technologyreview.com/energy/23289/

Ultracaps Could Boost Hybrid Efficiency
Recent studies point to the potential of ultracapacitors to augment conventional batteries.
By Kevin Bullis

Energy storage devices called ultracapacitors could lower the cost of the battery packs in plug-in hybrid vehicles by hundreds or even thousands of dollars by cutting the size of the packs in half, according to estimates by researchers at Argonne National Laboratory in Argonne, IL. Ultracapacitors could also dramatically improve the efficiency of another class of hybrid vehicle that uses small electric motors, called microhybrids, according to a recent study from the University of California, Davis.

The use of ultracapacitors in hybrids isn't a new idea. But the falling cost of making these devices and improvements to the electronics needed to regulate their power output and coordinate their interaction with batteries could soon make them more practical, says Theodore Bohn, a researcher at Argonne's Advanced Powertrain Research Facility.

Although batteries have improved significantly in recent years, the cost of making them is the main reason why hybrids cost thousands of dollars more than conventional vehicles. This is especially true of plug-in hybrids, which rely on large battery packs to supply all or most of the power during short trips. Battery packs are expensive in part because they degrade over time and, to compensate for this, automakers oversize them to ensure that they can provide enough power even after 10 years of use in a vehicle.

Ultracapacitors offer a way to extend the life of a hybrid vehicle's power source, reducing the need to oversize its battery packs. Unlike batteries, ultracapacitors don't rely on chemical reactions to store energy, and they don't degrade significantly over the life of a car, even when they are charged and discharged in very intense bursts that can damage batteries. The drawback is that they store much less energy than batteries--typically, an order of magnitude less. If, however, ultracapacitors were paired with batteries, they could protect batteries from intense bursts of power, Bohn says, such as those needed for acceleration, thereby extending the life of the batteries. Ultracapacitors could also ensure that the car can accelerate just as well at the end of its life as at the beginning.

Reducing the size of a vehicle's battery pack by 25 percent could save about $2,500, Bohn estimates. The ultracapacitors and electronics needed to coordinate them with the batteries could cost between $500 and $1,000, resulting in hundreds of dollars of net savings.
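Bohn's figures imply a battery pack costing on the order of $10,000, and the net saving falls straight out of the arithmetic. A minimal sketch using only the numbers quoted above:

    # Net savings from downsizing a plug-in hybrid battery pack and adding
    # ultracapacitors, using the figures quoted in the article.
    PACK_REDUCTION = 0.25
    SAVINGS_FROM_REDUCTION = 2_500          # dollars, per Bohn

    implied_pack_cost = SAVINGS_FROM_REDUCTION / PACK_REDUCTION
    print(f"Implied full pack cost: ${implied_pack_cost:,.0f}")          # ~$10,000

    for ultracap_system_cost in (500, 1_000):
        net = SAVINGS_FROM_REDUCTION - ultracap_system_cost
        print(f"Ultracap system ${ultracap_system_cost:>5,}: net savings ${net:,}")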

Ultracapacitors would also make it possible to redesign batteries to hold more energy. There is typically a tradeoff between how fast batteries can be charged and discharged and how much total energy they can store. That's true in part because designing a battery to discharge quickly requires using very thin electrodes stacked in many layers. Each layer must be separated by supporting materials that take up space in the battery but don't store any energy. The more layers used, the more supporting materials are needed and the less energy can be stored in the battery. Paired with ultracapacitors, batteries wouldn't need to deliver bursts of power and so could be made with just a few layers of very thick electrodes, reducing the amount of supporting material needed. That could make it possible to store twice as much energy in the same space, Bohn says.

Ultracapacitors could also be useful in a very different type of hybrid vehicle called a microhybrid, says Andrew Burke, a research engineer at the Institute of Transportation Studies at UC Davis. As designed today, these vehicles use small electric motors and batteries to augment a gasoline engine, allowing the engine to switch off every time the car comes to a stop and restart when the driver hits the accelerator. A microhybrid's batteries can also capture a small part of the energy that is typically wasted as heat during braking. Because ultracapacitors can quickly charge and discharge without being damaged, it would be possible to design microhybrids to make much greater use of an electric motor, providing short bursts of power whenever needed for acceleration. They could also collect more energy from braking. According to computer simulations performed by Burke, such a system would improve the efficiency of a conventional engine by 40 percent during city driving. Conventional microhybrids only improve efficiency by 10 to 20 percent.

In both plug-in hybrids and microhybrids, ultracapacitors would offer improved cold weather performance, since they do not rely on chemical reactions that slow down in the cold. "In very cold weather, you have to heat the battery, or you can't drive very fast--you'd have very low acceleration," Bohn says. In contrast, ultracapacitors could provide fast acceleration even in cold temperatures.

Mark Verbrugge, director of the materials and processes lab at GM, says that of the two uses for ultracapacitors, it will be easier to use them in microhybrids. In this case, he says, ultracapacitors would simply replace batteries, since they store enough energy to augment the gasoline engine without the help of batteries. In plug-in hybrids, which require much more energy, ultracapacitors would need to be paired with batteries, and this would require complex electronics to coordinate between the two energy storage devices. "By and large, you never want to add parts to a car," he says. "You want the simplest system possible" so that there are fewer things to go wrong.

For ultracapacitors to be practical in microhybrids, Verbrugge says, the cost of making them has to decrease by about half, which may be possible because many parts of the manufacturing process for large ultracapacitors aren't yet automated. But to justify the added complexity in plug-in hybrids, he says, the entire system would have to cost significantly less than using batteries alone.

The researchers at Argonne have already taken steps toward proving that ultracapacitors can provide these savings, having shown that they reduce the heat stress placed on batteries by a third. They are continuing to test ultracapacitors to demonstrate that they can make batteries last longer, which would allow automakers to use smaller batteries and save money.

Copyright Technology Review 2009.
 
Alberta's heavy oil and oil sands are about to get lots of competition from new light oil production:

http://nextbigfuture.com/2009/09/iraqs-rumaila-oil-field-could-double.html

Iraq's Rumaila Oil Field Could Double Iraq Oil Production and a Big Oil Find in the Gulf of Mexico

Business Week reports on the BP (British Petroleum) project to modernize the Rumaila oil field to nearly double its production and restore Iraq's power in OPEC.

A lot of the underproduction of "easy oil reserves" is in Iraq and Nigeria. Brazil, Russia, China, the United States, Kazakhstan and Canada have key non-OPEC oil megaprojects over the next 5 years. Saudi Arabia continues to develop large fields, but OPEC production is held back as part of the control of oil prices. New oil production technologies (like THAI/CAPRI and electrothermal stimulation) are key to unlocking vast oilsand reserves, and further improvement of multistage wells is needed for the economic development of the Bakken oilfields.

    Rumaila is a monster, producing 960,000 barrels per day now—nearly half of Iraq's current output. The winners, BP (BP) and China's CNPC, plan to bring the field to plateau production of 2.85 million barrels per day within six years. That would make it one of the most prolific fields in the world. However, the companies may have deliberately made high estimates so as to try to win the contracts.

    BP also thinks it understands Rumaila well, having originally discovered the field in the 1950s and having worked on it with the Iraqis during the past five years. BP also thinks Rumaila closely resembles the giant Samatlor field in western Siberia, which it has successfully managed through its TNK-BP Russia subsidiary. Finally, through CNPC the partners will have access to a Chinese supply chain to bring in the low-cost equipment needed, including onshore drilling rigs. An Iraqi state company will have a 25% stake, while BP and CNPC will share a 75-25 split of the rest.

    The top production targets bid by the international oil community on the six Iraqi oil fields on offer add up to 8.2 million barrels per day. If achieved, that level of output would put Iraq in a rarefied league with Saudi Arabia as a major oil exporter. Potential is one thing, of course, and actual production is another.

2. BP announced today a giant oil discovery at its Tiber Prospect in the deepwater Gulf of Mexico.

    The well, located in Keathley Canyon block 102, approximately 250 miles (400 kilometres) south east of Houston, is in 4,132 feet (1,259 metres) of water. The Tiber well was drilled to a total depth of approximately 35,055 feet (10,685 metres) making it one of the deepest wells ever drilled by the oil and gas industry.

    BP Plc, Europe’s second-largest oil company, announced a “giant” discovery at the Tiber Prospect in the Gulf of Mexico that may contain more than 3 billion barrels, after drilling the world’s deepest exploration well.

    The latest discovery will help BP, already the biggest producer in the Gulf of Mexico, boost output in the region by 50 percent to 600,000 barrels of oil equivalent a day after 2020.

FURTHER READING
Oil megaprojects list at wikipedia.

The Rumaila project is not included on the oil megaprojects list at this time.

Vankor Field in Russia came online August, 2009 and is to produce 60 thousand barrels per day in 2009 and 220 thousand barrels per day in 2010 and 315 thousand barrels per day in 2011.

Brazil, Russia, China, United States, Kazakhstan and Canada have key oil megaprojects that are non-OPEC over the next 5 years.
 
And even more oil. At this rate we should be seeing oil priced around $0.50/litre

http://nextbigfuture.com/2009/09/evidence-three-forks-formation-is.html

Evidence Three Forks formation is Separate from Bakken Which Would Mean a Lot more Oil

Testing done in the Bakken shale area found a "stacked play," meaning one oil formation is on top of another, which could allow more oil to be recovered at a lower cost in a smaller area with less environmental damage, said Continental Resources chief executive Harold Hamm. Hamm said the testing showed two distinct formations. He said the Three Forks well initially fetched 140 barrels daily. The Bakken well fetched about 1,200. State officials said in July that production results from 103 wells in the Three Forks-Sanish formation show some wells recovering more than 800 barrels a day, considered "as good or better" than some in the Bakken, where the record is thought to be more than 4,000 barrels a day.

    State geologist Ed Murphy called Continental's findings interesting but said more wells are needed before researchers know for sure the characteristics and potential of the Three Forks formation.

    The company's tests and other promising results from Three Forks wells have fueled speculation that the formation could add billions of barrels to government reserve estimates.

    Continental, which is marking 20 years in North Dakota, also trademarked the process of drilling multiple wells from one pad, the area cleared for drilling machinery. It plans to drill two wells into the Bakken and another two into the Three Forks from one pad, which means the well site's footprint will be cut from 20 acres to six acres, Hamm said.

    The company estimates its ECO-Pad process will cut drilling and well completion costs, which run as high as $7 million in the Bakken, by about 10 percent. The process could be in place by the end of the year.

The company also plans to use a single drill rig that can be moved to different sites on a pad, which will require only one road and fewer power lines, pipelines and other infrastructure, he said.

Seeking Alpha has the transcript of the August 2009 conference call for Continental Resources.

    This test was very important to us and I believe we did (inaudible) is stacking two laterals and established not even with unrealistically tight spacing the Middle Bakken and Three Forks/Sanish reservoirs are separate and need be developed individually. Consequently in terms of testing we have seen what we effectively need to see. So given the extensive number of wells that we and others have completed across playing both zones, as I said earlier, Continental is now transitioning into the developmental mode with a staggered drilling pattern that we will use to harvest the two reservoirs.

    The most effective way to drain these two tanks so to speak is to drill north south oriented Middle Bakken well and then step over to about 660 feet east or west and drill Three Forks/Sanish well in the same orientation and then step over another 660 feet and drill the next Middle Bakken well working your way out across play. We think this development plan dovetails very well with the ECO-pad concept that the NDIC approved this last week. Continental has developed an innovative new approach for drilling multiple wells around the same old drilling pad specifically the two Middle Bakken and two Three Forks/Sanish wells per ECO-pad.

    The key advantages we think are very apparent. We drilled four wells from one ECO-pad minimizing the environmental impact. One ECO-pad will have about 70% less space as the surface footprint area than four conventional drilling pads. Instead of four pads, basically we use about 5 acres each up there for (inaudible) drilling platform and therefore we will be drilling four wells sequentially from a single 6-acre ECO-pad.

    The NDIC granted ECO-pads an exemption from setback requirements on section [ph] property lines. We'll be drilling fence to fence from 1280 acreage spacing unit to the next, instead of leaving about 1100 feet or more untouched rock between these two 1200 acre space units. So we will be utilizing all the reservoirs within our space unit.
 
Attempting to take energy from literally "nothing". Of course, tampering with the fundamental structure of the Universe and space-time might not be such a good idea in the long run.....

http://www.scientificamerican.com/article.cfm?id=darpa-casimir-effect-research

Research in a Vacuum: DARPA Tries to Tap Elusive Casimir Effect for Breakthrough Technology
DARPA mainly hopes that research on this quantum quirk can produce futuristic microdevices

By Adam Marcus


Named for a Dutch physicist, the Casimir effect governs interactions of matter with the energy that is present in a vacuum. Success in harnessing this force could someday help researchers develop low-friction ballistics and even levitating objects that defy gravity. For now, the U.S. Defense Department's Defense Advanced Research Projects Agency (DARPA) has launched a two-year, $10-million project encouraging scientists to work on ways to manipulate this quirk of quantum electrodynamics.

Vacuums generally are thought to be voids, but Hendrik Casimir believed these pockets of nothing do indeed contain fluctuations of electromagnetic waves. He suggested, in work done in the 1940s with fellow Dutch physicist Dirk Polder, that two metal plates held apart in a vacuum could trap the waves, creating vacuum energy that, depending on the situation, could attract or repel the plates. As the boundaries of a region of vacuum move, the variation in vacuum energy (also called zero-point energy) leads to the Casimir effect. Recent research done at Harvard University, Vrije University Amsterdam and elsewhere has proved Casimir correct—and given some experimental underpinning to DARPA's request for research proposals.

Investigators from five institutions—Harvard, Yale University, the University of California, Riverside, and two national labs, Argonne and Los Alamos—received funding. DARPA will assess the groups' progress in early 2011 to see if any practical applications might emerge from the research. "If the program delivers, there's a good chance for a follow-on program to apply" the research, says Thomas Kenny, the DARPA physicist in charge of the initiative.

Program documents on the DARPA Web site state the goal of the Casimir Effect Enhancement program "is to develop new methods to control and manipulate attractive and repulsive forces at surfaces based on engineering of the Casimir force. One could leverage this ability to control phenomena such as adhesion in nanodevices, drag on vehicles, and many other interactions of interest to the [Defense Department]."

Nanoscale design is the most likely place to start and is also the arena where levitation could emerge. Materials scientists working to build tiny machines called microelectromechanical systems (MEMS) struggle with surface interactions, called van der Waals forces, that can make nanomaterials sticky to the point of permanent adhesion, a phenomenon known as "stiction". To defeat stiction, many MEMS devices are coated with Teflon or similar low-friction substances or are studded with tiny springs that keep the surfaces apart. Materials that did not require such fixes could make nanotechnology more reliable. Such materials could skirt another problem posed by adhesion: Because surface stickiness at the nanoscale is much greater than it is for larger objects, MEMS designers resort to making their devices relatively stiff. That reduces adhesion (stiff structures do not readily bend against each other), but it reduces flexibility and increases power demands.

Under certain conditions, manipulating the Casimir effect could create repellent forces between nanoscale surfaces. Hong Tang and his colleagues at Yale School of Engineering & Applied Science sold DARPA on their proposal to assess Casimir forces between minuscule silicon crystals, like those that make up computer chips. "Then we're going to engineer the structure of the surface of the silicon device to get some unusual Casimir forces to produce repulsion," he says. In theory, he adds, that could mean building a device capable of levitation.

Such claims emit a strong scent of fantasy, but researchers say incremental successes could open the door to significant breakthroughs in key areas of nanotechnology, and perhaps larger structures. "What I can contribute is to understand the role of the Casimir force in real working devices, such as microwave switches, MEMS oscillators and gyroscopes, that normally are made of silicon crystals, not perfect metals," Tang says.

The request for proposals closed in September. The project received "a lot of interest," Kenny says. "I was surprised at the creativity of the proposals, and at the practicality," he adds, although he declined to reveal how many teams submitted proposals. "It wasn't pure theory. There were real designs that looked buildable, and the physics looked well understood."

Still, the Casimir project was a "hard sell" for DARPA administrators, Kenny acknowledges. "It's very fundamental, very risky, and even speculative on the physics side," he says. "Convincing the agency management that the timing was right was difficult, especially given the number of programs that must compete for money within the agency."

DARPA managers certainly would be satisfied if the Casimir project produced anything tangible, because earlier attempts had failed. Between 1996 and 2003, for example, NASA had a program to explore what it calls Breakthrough Propulsion Physics to build spacecraft capable of traveling at speeds faster than light (299,790 kilometers per second). One way to do that is by harnessing the Casimir force in a vacuum and using the energy to power a propulsion system. The program closed with this epitaph on its Web site: "No breakthroughs appear imminent."

One of many problems with breakthrough propulsion based on the Casimir force is that whereas zero-point energy may be theoretically infinite, it is not necessarily limitless in practice—or even minutely accessible. "It's not so much that these look like really good energy schemes so much as they are clever ways of broaching some really hard questions and testing them," says Marc Millis, the NASA physicist who oversaw the propulsion program.

The DARPA program faces several formidable obstacles, as well, cautions Jeremy Munday, a physicist at California Institute of Technology who studies the Casimir effect. For starters, simply measuring the Casimir force is difficult enough. These experiments take many years to complete, adds Munday, who recently published a paper in Nature (Scientific American is part of the Nature Publishing Group) describing his own research. What's more, he says, although several groups have measured the Casimir force, only a few have been able to modify it significantly. Still, Munday adds, the exploratory nature of the program means its goals and expectations are "quite reasonable."

Tang is pragmatic about his efforts, given the unlikelihood that Casimir force will ever provide much energy to harness. "The force is really small," he says. "After all, a vacuum is a vacuum."

Yet sometimes the best science can hope for is baby steps. "To come up with anything that can lead to a viable energy conversion or a viable force producing effect, we're not anywhere close," Millis says. "But then, of course, you don't make progress unless you try."
 
More oil from Sask and North Dakota:

http://nextbigfuture.com/2009/11/north-dakota-oil-production-projected.html

North Dakota Oil Production Projected to be 350,000 Barrels of Oil Per Day in 2010

Forbes reports that North Dakota's oil production is expected to approach 350,000 barrels per day next year, an increase of more than 50 percent, because of a major pipeline expansion and the anticipated startup of a shipping terminal near Stanley (SXE) that will be able to haul 60,000 barrels a day by rail to refineries near Cushing, Okla.

The latest statistics for North Dakota oil production are for Sept, 2009. They report 238,003 barrels of oil per day. Production has been increasing by 5,000 to 10,000 barrels of oil per day each month. If this trend is continuing then November, 2009 production would be 248,000-258,000 barrels of oil per day and would be in the range of 255,000-265,000 barrels of oil per day at the end of 2009.

Bakken oil production (Sask, ND, Montana) would be in the 500,000 barrel of oil per day range in 2011-2012 and onwards.

The 500,000 bpd is over three times what was coming from the Bakken two years ago, and double the estimate used when asking whether the Bakken could move the needle for US production.

465,000 bpd from Montana and ND would be 14 million barrels of oil per month.

US production of oil is 162 million barrels per month.

So over 8% of US oil production.

The oil production technology for the Bakken is still improving and they are talking about possibly getting to 30% of the oil in place. 400 billion barrels of oil in place. That would be 120 billion barrels. So the 6-8 billion barrels of reserves talk is a snapshot.
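A quick check of the arithmetic in the last few paragraphs (the 30.4 days per month is my rounding; the other figures are from the post):

    # Verify the production-share and oil-in-place arithmetic above.
    BAKKEN_BPD = 465_000
    US_BBL_PER_MONTH = 162e6          # figure quoted in the post
    DAYS_PER_MONTH = 30.4             # assumed average

    bakken_per_month = BAKKEN_BPD * DAYS_PER_MONTH
    share = bakken_per_month / US_BBL_PER_MONTH

    print(f"Bakken output:   {bakken_per_month / 1e6:.1f} million bbl/month")   # ~14.1
    print(f"Share of US oil: {share:.1%}")                                      # ~8.7%, i.e. "over 8%"

    # Oil-in-place claim: 30% recovery of 400 billion barrels in place
    print(f"Recoverable at 30%: {0.30 * 400e9 / 1e9:.0f} billion bbl")          # 120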

167 billion barrels of oil in-place in the North Dakota portion of the Bakken and not including Three Forks Sanish oil.

It also combines with Gulf of Mexico oil for the USA.

Eventually, offshore drilling in California could be allowed (if oil problems became more serious), along with currently off-limits Alaska oil.

Saskatchewan is producing 65,000 barrels of oil per day from its part of the Bakken.

    Analysts have calculated that Bakken plays will break even with oil at about $30 (U.S.) a barrel. That calculation is part of the reason why valuations in the area have been high. Crescent Point, for example, paid $142,643 (Canadian) per producing barrel of oil equivalent for TriAxon – double the average for Canadian energy transactions this year.

    Saskatchewan oil companies have yet to find a way to wring more than about 20 per cent of the oil from the ground. New techniques are promising – underground water injections, for example, could boost recovery rates to over 30 per cent – but the Saskatchewan plays retain technological risk. Crescent Point, for example, has told investors it has the potential to more than double its reserves – and the risk that new ground won't be as productive.

    North Dakota's current production now exceeds 238,000 barrels a day, which ranks the state behind only Texas, Alaska and California. The state's output supplies about 2 percent of the nation's domestic crude oil output.

    If oil prices stay above $60 a barrel and contemplated oil transportation projects become reality, the state could be producing 400,000 barrels of oil daily within five years, said Lynn Helms, director of the state Department of Mineral Resources.

Canadian Business reports: Billionaire oilman Harold Hamm believes North Dakota's oil reserves are double the federal government's estimates.

He said the U.S. Geological Survey's estimate of 4.2 billion barrels of oil in the Bakken shale formation could be "100 percent off."

    Hamm is the chairman and chief executive officer of Continental Resources Inc., an independent oil and gas company based in Enid, Okla. His company was one of the first to tap the Bakken formation in North Dakota's oil patch 20 years ago.

    The Bakken formation encompasses some 25,000 square miles in North Dakota, Montana, Saskatchewan and Manitoba. About two-thirds of the acreage is in western North Dakota.

    Hamm also said he believes domestic reserves are growing, and not just in North Dakota.

    More expansion being planned by the pipeline companies Enbridge Inc. ( ENB ) and Kinder Morgan Energy Partners (EPL ) LP would allow another boost that could put the state's daily production at 400,000 barrels, Helms said.

Recovery Rate and Well Differences Between Saskatchewan and North Dakota Bakken

From Bakken Discussion Group

    A typical Bakken section is generally recognized by third party reserve evaluators as containing approximately 4.0 mmbbls of original-oil-in-place, with proved plus probable reserve recovery estimated at 12.5%. PetroBakken's internal assessments, based on ongoing strong production performance combined with increased well density and frac intensity, are ultimately expected to increase reserve recovery to up to 22.5%. PetroBakken will control 440 net sections of land; with an estimated ultimate recovery factor of 22.5%, the potential recoverable resource could approach 400 mmbbls.

Recovery rate estimates in North Dakota have been about 1-2%.

    The increased recovery rate seems to be based upon "using multi-leg horizontal drilling technology that reduces inter-well distance between horizontal legs from 400 metres to 200 metres". Currently in North Dakota the smallest horizontal drilling unit is 640 acres. If the ND oil companies downsize to the point where there is a horizontal leg every 200 meters, then that would result in something like 7 wells per 640 acres. Reducing interwell distance could allow recovery rates of 22.5% vs. the roughly 10% currently being advertised. The economics of one well that costs $5 million and recovers 10% is a lot different from the economics of 7 wells costing $35 million and recovering 22.5%.

    The bakken is much closer to the surface in Canada such that it is cheaper to drill each one of those wells.

    The Canadian bakken is a different animal, with decent intergranular porosity and permeability, more like a conventional reservoir and much shallower.
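The spacing trade-off described in the quoted discussion above can also be read as cost per recovered barrel. A rough sketch, assuming the 10% and 22.5% recovery factors both apply to the same 4.0 million barrel section (a simplification on my part):

    # Rough per-barrel drilling cost under the two development patterns
    # discussed above.  Assumes both recovery factors apply to the same
    # 4.0 million barrel original-oil-in-place section.
    OOIP_BBL = 4.0e6

    scenarios = [
        ("1 well,  10% recovery",   1, 5e6, 0.10),
        ("7 wells, 22.5% recovery", 7, 5e6, 0.225),
    ]

    for label, wells, cost_per_well, recovery in scenarios:
        recovered = OOIP_BBL * recovery
        total_cost = wells * cost_per_well
        print(f"{label}: {recovered / 1e3:,.0f} thousand bbl "
              f"for ${total_cost / 1e6:.0f}M  ->  ${total_cost / recovered:.1f}/bbl")

    # More than twice the oil per section, but at roughly triple the drilling
    # cost per barrel -- which is why the break-even oil price matters.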
 
Destroying energy rather than creating it:

http://www.minyanville.com/articles/corn-ethanol-biofuels/index/a/24400

Overhyped Products: Corn Ethanol
Scott Reeves  Sep 28, 2009 8:40 am 
   
There’s just one problem with corn-based ethanol: It takes 29% more fossil energy to produce a gallon of ethanol than the ethanol releases when burned as fuel.

The disparity between energy input and output makes ethanol the triumph of politics over logic.
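Taking the article's 29% figure at face value, the energy return on energy invested is below break-even. A one-line restatement (the 29% number itself is contested in the literature; this just shows what it implies):

    # Energy returned per unit of fossil energy invested, using the article's
    # claim that inputs exceed outputs by 29%.
    fossil_energy_in = 1.29    # units of fossil energy per unit of ethanol energy out
    energy_out = 1.0

    eroi = energy_out / fossil_energy_in
    print(f"Energy return on energy invested: {eroi:.2f}")   # ~0.78, i.e. a net energy loss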

Uncle Sam mandates the use of ethanol as a fuel additive and pushes it as an alternative to imported oil. Politicians of both parties have long promoted ethanol as a way to reduce the nation’s dependence on imported oil.

At a public forum in 2007, President Bush made the standard case for ethanol:

“First of all, I'm guilty on promoting ethanol. And the reason is, is because I think it's in our interests to diversify away from oil. And the reason why it's -- I know that's hard for a Texan to say. But the reason why we've got to diversify away from oil is that we end up with dependency on oil from certain parts of the world where people don't particularly like us...           
And so, I promoted ethanol, and still believe it's important for the future.”

Last March, California Democratic House Speaker Nancy Pelosi said she supported increasing the ethanol-to-gasoline blend rate to 15% from 10% in an effort to reduce dependence on oil imports. “It seems to me we should be able to do that,” Pelosi told reporters after addressing the National Farmers Union in Washington.

What seemed like a foolproof business plan fell flat with investors, who did the math and concluded that corn-based ethanol makes no long-term sense.

VeraSun and Pacific Ethanol (PEIX) have been pounded. Cascade Investment, a firm owned by Microsoft (MSFT) chairman Bill Gates, sold its 21% stake in Pacific Ethanol in 2007.

Ethanol isn’t fancy or magical. It’s an alcohol produced by a distilling process similar to that used to make hard liquor. Blending ethanol with gasoline allows oil companies to boost octane more cheaply than additional refining.

Despite the hype, ethanol doesn’t produce a net energy gain because corn production requires large amounts of fertilizer, herbicides, and pesticides. The manufacture and application of these chemicals consumes large amounts of energy. The corn must be harvested and hauled to production plants to be distilled into alcohol, which requires more energy. Then the ethanol must be distributed to users by rail and truck. After all that, it’s time to think about the air pollution and wastewater created by ethanol production plus the potential problem of chemical-laden runoff from the cornfields.

Increasing acreage devoted to corn won’t tip the balance in ethanol’s favor because the new land is likely to be less productive than land already cultivated, increasing the cost of production -- especially fertilization. The use of additional energy needed to make marginal land productive would be so great that a study by the Massachusetts Institute of Technology concludes that ethanol production expansion would boost greenhouse gas emissions above current levels.

Using alcohol as a fuel isn’t new. Nikolaus Otto, the German inventor best known for developing the internal combustion engine, used ethanol as the fuel for one of his engines in 1876.

What’s new is the unintended consequence of a federal energy program. The Clean Air Act of 1990, designed to reduce air pollution by replacing MTBE with ethanol, instead shovels money to favored companies such as Archer Daniels Midland (ADM), a diversified agricultural company.

“The Archer Daniels Midland Corporation has been the most prominent recipient of corporate welfare in recent US history,” James Bovard wrote in a report for the Cato Institute, a libertarian think tank based in Washington, DC. “ADM [has] lavishly fertilized both political parties with millions of dollars in handouts and in return [has] reaped billion-dollar windfalls from taxpayers and consumers.”

Ethanol made from cellulose, the fibrous material found in plants, contains less energy than fuel derived from corn. If forest or grassland is cleared to plant crops used to make ethanol, it’s usually done by burning off existing vegetation. This releases large amounts of carbon dioxide.

Some say the problem could be resolved, at least in part, by using agricultural waste as the feedstock for ethanol or by growing grass on marginal land that won’t support commercial crops. But that will require new technology because only sugars and seeds can now be distilled efficiently into alcohol. Chevron (CVX) is working with major universities in an effort to develop plants that make better feedstock for cellulosic ethanol and to improve processing methods.

Oil now provides about 40% of the world’s total energy and from 2000 to 2007, the developing world accounted for 85% of the growth in world demand, the Wall Street Journal reports. Oil will be increasingly important in China and India. This means money will continue to flow to some unsavory characters and manic price swings will persist. Last year, the price of a barrel of oil ranged from $147.27 in July to $32.40 in December. Such fluctuations make it difficult to plan and invest in alternative fuels.

Ethanol supporters say subsidies are needed to level the playing field. But US oil subsidies total about $1 billion a year, roughly one-sixth to one-eighth of what ethanol receives.

For now, politics trumps the market. In March 2008, the US Energy Information Administration estimated that US ethanol production capacity was 7.2 billion gallons per year with an additional 6.2 billion gallons of capacity under construction.

In 2007, the US consumed 6.8 billion gallons of ethanol and 500 million gallons of biodiesel. The Energy Independence and Security Act of 2007 expanded the Renewable Fuels Standard to require that 36 billion gallons of ethanol and other biofuels be blended into gasoline, diesel, and jet fuel by 2022.

So, don’t expect an outbreak of rationality in Washington anytime soon -- especially as long as the Iowa caucus comes first in the presidential nomination process and farm states can swing the election or determine which party holds the majority in Congress.
 
The ultimate in personal empowerment: www.nextbigfuture.com/2009/12/compact-proton-beam-accelerators-and.html

The article on today's page talks about compact fusion generators. Given the nature of what they are proposing (microscopic fusion reactors engraved on a chip), I could see a realized device with thousands of parallel units, control systems and so on being about the size of a laptop, and evolving to iPhone size from there (remember, I'm talking about the complete device with all the associated systems; the actual fusion reactions would take place in an area the size of a laptop CPU or a cell-phone SIM card).

Mr. Fusion would not work on old banana peels, but other than having to find a supply of boron fuel and a compact ion source to get started, something like this would keep you pretty self-sufficient in energy regardless of what you do. Seeing that energy use is a key indicator of national and personal wealth, being able to access huge amounts of electrical energy in cheap, portable and compact form would boost GDP and personal income by an order of magnitude.
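For scale, here is a rough energy-density calculation of my own (not from the article): each p + 11B reaction releases about 8.7 MeV, and the comparison against gasoline at roughly 46 MJ/kg is just there to anchor the numbers.

    MEV_TO_J = 1.602e-13          # joules per MeV
    AMU_TO_KG = 1.661e-27         # kilograms per atomic mass unit

    q_value_mev = 8.7             # energy released per p + 11B -> 3 alpha reaction
    reactant_mass_amu = 12.0      # ~1 u (proton) + ~11 u (boron-11)

    joules_per_kg = q_value_mev * MEV_TO_J / (reactant_mass_amu * AMU_TO_KG)
    gasoline_j_per_kg = 46e6      # assumed, for comparison only

    print(f"p-11B fuel: ~{joules_per_kg:.1e} J/kg")
    print(f"vs gasoline: ~{gasoline_j_per_kg:.0e} J/kg "
          f"({joules_per_kg / gasoline_j_per_kg:,.0f}x)")

That works out to tens of terajoules per kilogram of fuel, which is why even a briefcase-sized device could matter.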

 
More Canadian oil plays in Alberta and Saskatchewan. Smart investors might look at these plays, especially if "green" fanatics are trying to shut down the Alberta tar sands:

http://nextbigfuture.com/2009/12/cardium-and-viking-oil-plays.html

Cardium and Viking Oil Plays

Multistage horizontal drilling is opening previously ignored sections of the Cardium Pembina oil reserve.

Pembina, with an estimated 7.8 billion barrels of original oil in place, is Canada's largest conventional onshore oilfield. Despite extensive secondary recovery through waterfloods, less than 1.4 billion barrels has been produced. The scale of the remaining prize continues to draw plenty of interest.
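Using the article's own figures, the arithmetic on that remaining prize is simple (a quick sketch, nothing more):

    ooip_bbl = 7.8e9        # original oil in place, from the article
    produced_bbl = 1.4e9    # cumulative production to date, from the article

    recovery_factor = produced_bbl / ooip_bbl
    remaining_bbl = ooip_bbl - produced_bbl
    print(f"Recovery so far: ~{recovery_factor:.0%}")
    print(f"Still in the ground: ~{remaining_bbl / 1e9:.1f} billion barrels")

That is roughly an 18 percent recovery factor, leaving well over 6 billion barrels in place.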

The new Cardium oil play in Alberta is rapidly approaching the stature of Saskatchewan’s famous Bakken play.

    Both the Bakken and the Cardium are “tight” or “unconventional” plays, where the oil is hosted in tight rock rather than in the more porous sand formations typical of conventional plays.

    They were well known but uneconomic zones until a few years ago, when advancements in horizontal drilling and fracking technologies allowed them to be exploited. The Bakken is ranked by most Canadian analysts as the most profitable oil play in the country now, with the Cardium as #2.

    With the Cardium in particular, there is very little geological risk. It has been drilled through thousands of times to get to the oil in the more porous, productive zone below it. The market loves these low-risk plays because they are very “repeatable” – each new well is likely to produce much like the one before it.

    Finally, these new technologies are continually improving the economics of these formations. Four years on, companies are still increasing production from Bakken wells and increasing the overall amount of oil recovered. The Cardium is a younger play, only a year old, and as management teams tweak the way they drill and frack these wells, it may one day get even closer to Bakken economics.

The Viking oil play in Southwest Saskatchewan stands at approximately 6 billion barrels of original oil in place, making it second only to the Cardium in OOIP among non-oil-sands resources. Similar to the Cardium, the Viking is a legacy oil pool that has been developed since the 1950s with older technology, and that now stands to be rejuvenated by advancements in horizontal multi-stage fracking techniques.

Mid-Continent shale may have as much as 500 billion barrels of oil. Bakken Shale oil production alone may reach 500,000 barrels per day in 2011. The Three Forks is rumored to contain just as much oil as the Bakken.
 
More alternatives:

http://www.wired.com/magazine/2009/12/ff_new_nukes/all/1

Uranium Is So Last Century — Enter Thorium, the New Green Nuke

The thick hardbound volume was sitting on a shelf in a colleague’s office when Kirk Sorensen spotted it. A rookie NASA engineer at the Marshall Space Flight Center, Sorensen was researching nuclear-powered propulsion, and the book’s title — Fluid Fuel Reactors — jumped out at him. He picked it up and thumbed through it. Hours later, he was still reading, enchanted by the ideas but struggling with the arcane writing. “I took it home that night, but I didn’t understand all the nuclear terminology,” Sorensen says. He pored over it in the coming months, ultimately deciding that he held in his hands the key to the world’s energy future.

Published in 1958 under the auspices of the Atomic Energy Commission as part of its Atoms for Peace program, Fluid Fuel Reactors is a book only an engineer could love: a dense, 978-page account of research conducted at Oak Ridge National Lab, most of it under former director Alvin Weinberg. What caught Sorensen’s eye was the description of Weinberg’s experiments producing nuclear power with an element called thorium.

At the time, in 2000, Sorensen was just 25, engaged to be married and thrilled to be employed at his first serious job as a real aerospace engineer. A devout Mormon with a linebacker’s build and a marine’s crew cut, Sorensen made an unlikely iconoclast. But the book inspired him to pursue an intense study of nuclear energy over the next few years, during which he became convinced that thorium could solve the nuclear power industry’s most intractable problems. After it has been used as fuel for power plants, the element leaves behind minuscule amounts of waste. And that waste needs to be stored for only a few hundred years, not a few hundred thousand like other nuclear byproducts. Because it’s so plentiful in nature, it’s virtually inexhaustible. It’s also one of only a few substances that acts as a thermal breeder, in theory creating enough new fuel as it breaks down to sustain a high-temperature chain reaction indefinitely. And it would be virtually impossible for the byproducts of a thorium reactor to be used by terrorists or anyone else to make nuclear weapons.

Weinberg and his men proved the efficacy of thorium reactors in hundreds of tests at Oak Ridge from the ’50s through the early ’70s. But thorium hit a dead end. Locked in a struggle with a nuclear-armed Soviet Union, the US government in the ’60s chose to build uranium-fueled reactors — in part because they produce plutonium that can be refined into weapons-grade material. The course of the nuclear industry was set for the next four decades, and thorium power became one of the great what-if technologies of the 20th century.

Today, however, Sorensen spearheads a cadre of outsiders dedicated to sparking a thorium revival. When he’s not at his day job as an aerospace engineer at Marshall Space Flight Center in Huntsville, Alabama — or wrapping up the master’s in nuclear engineering he is soon to earn from the University of Tennessee — he runs a popular blog called Energy From Thorium. A community of engineers, amateur nuclear power geeks, and researchers has gathered around the site’s forum, ardently discussing the future of thorium. The site even links to PDFs of the Oak Ridge archives, which Sorensen helped get scanned. Energy From Thorium has become a sort of open source project aimed at resurrecting long-lost energy technology using modern techniques.

And the online upstarts aren’t alone. Industry players are looking into thorium, and governments from Dubai to Beijing are funding research. India is betting heavily on the element.

The concept of nuclear power without waste or proliferation has obvious political appeal in the US, as well. The threat of climate change has created an urgent demand for carbon-free electricity, and the 52,000 tons of spent, toxic material that has piled up around the country makes traditional nuclear power less attractive. President Obama and his energy secretary, Steven Chu, have expressed general support for a nuclear renaissance. Utilities are investigating several next-gen alternatives, including scaled-down conventional plants and “pebble bed” reactors, in which the nuclear fuel is inserted into small graphite balls in a way that reduces the risk of meltdown.

Those technologies are still based on uranium, however, and will be beset by the same problems that have dogged the nuclear industry since the 1960s. It is only thorium, Sorensen and his band of revolutionaries argue, that can move the country toward a new era of safe, clean, affordable energy.

Named for the Norse god of thunder, thorium is a lustrous silvery-white metal. It’s only slightly radioactive; you could carry a lump of it in your pocket without harm. On the periodic table of elements, it’s found in the bottom row, along with other dense, radioactive substances — including uranium and plutonium — known as actinides.

Actinides are dense because their nuclei contain large numbers of neutrons and protons. But it’s the strange behavior of those nuclei that has long made actinides the stuff of wonder. At intervals that can vary from every millisecond to every hundred thousand years, actinides spin off particles and decay into more stable elements. And if you pack together enough of certain actinide atoms, their nuclei will erupt in a powerful release of energy.

To understand the magic and terror of those two processes working in concert, think of a game of pool played in 3-D. The nucleus of the atom is a group of balls, or particles, racked at the center. Shoot the cue ball — a stray neutron — and the cluster breaks apart, or fissions. Now imagine the same game played with trillions of racked nuclei. Balls propelled by the first collision crash into nearby clusters, which fly apart, their stray neutrons colliding with yet more clusters. Voilà: a nuclear chain reaction.

Actinides are the only materials that split apart this way, and if the collisions are uncontrolled, you unleash hell: a nuclear explosion. But if you can control the conditions in which these reactions happen — by both controlling the number of stray neutrons and regulating the temperature, as is done in the core of a nuclear reactor — you get useful energy. Racks of these nuclei crash together, creating a hot glowing pile of radioactive material. If you pump water past the material, the water turns to steam, which can spin a turbine to make electricity.
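The billiard-ball picture boils down to a single number, the multiplication factor k: how many new fissions each fission triggers on average. A toy sketch of my own (not from the article) shows how sensitive the pile is to that number:

    def neutron_population(k, generations=10, start=1000):
        """Average neutron count, generation by generation, for multiplication factor k."""
        counts = [start]
        for _ in range(generations):
            counts.append(counts[-1] * k)   # each neutron spawns k on average
        return counts

    for k in (0.9, 1.0, 1.1):   # subcritical, critical, supercritical
        final = neutron_population(k)[-1]
        print(f"k = {k}: 1000 neutrons become ~{final:.0f} after 10 generations")

Keeping k pinned at 1 is what the control engineering in a reactor core is for; letting it run above 1 is what a bomb does.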

Uranium is currently the actinide of choice for the industry, used (sometimes with a little plutonium) in 100 percent of the world’s commercial reactors. But it’s a problematic fuel. In most reactors, sustaining a chain reaction requires extremely rare uranium-235, which must be purified, or enriched, from far more common U-238. The reactors also leave behind plutonium-239, itself radioactive (and useful to technologically sophisticated organizations bent on making bombs). And conventional uranium-fueled reactors require lots of engineering, including neutron-absorbing control rods to damp the reaction and gargantuan pressurized vessels to move water through the reactor core. If something goes kerflooey, the surrounding countryside gets blanketed with radioactivity (think Chernobyl). Even if things go well, toxic waste is left over.

When he took over as head of Oak Ridge in 1955, Alvin Weinberg realized that thorium by itself could start to solve these problems. It’s abundant — the US has at least 175,000 tons of the stuff — and doesn’t require costly processing. It is also extraordinarily efficient as a nuclear fuel. As it decays in a reactor core, its byproducts produce more neutrons per collision than conventional fuel. The more neutrons per collision, the more energy generated, the less total fuel consumed, and the less radioactive nastiness left behind.

Even better, Weinberg realized that you could use thorium in an entirely new kind of reactor, one that would have zero risk of meltdown. The design is based on the lab’s finding that thorium dissolves in hot liquid fluoride salts. This fission soup is poured into tubes in the core of the reactor, where the nuclear chain reaction — the billiard balls colliding — happens. The system makes the reactor self-regulating: When the soup gets too hot it expands and flows out of the tubes — slowing fission and eliminating the possibility of another Chernobyl. Any actinide can work in this method, but thorium is particularly well suited because it is so efficient at the high temperatures at which fission occurs in the soup.
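The self-regulation described here is a negative temperature feedback: hot salt expands, the fuel density in the core drops, and the multiplication factor falls below 1. A toy model (illustrative coefficients only, not a reactor simulation) makes the idea concrete:

    alpha = 0.002    # assumed fractional reactivity loss per deg C of overheating
    T_set = 700.0    # assumed design operating temperature, deg C

    def multiplication_factor(T):
        """Toy model: hotter salt expands, fewer fissile nuclei sit in the core,
        so the effective multiplication factor k drops."""
        return 1.0 - alpha * (T - T_set)

    for T in (650, 700, 750, 800):
        k = multiplication_factor(T)
        trend = "power rises" if k > 1 else "power falls" if k < 1 else "holds steady"
        print(f"T = {T} C  ->  k = {k:.2f}  ({trend})")

The reactor therefore hunts back toward its design temperature without any operator or control rod doing anything.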

In 1965, Weinberg and his team built a working reactor, one that suspended the byproducts of thorium in a molten salt bath, and he spent the rest of his 18-year tenure trying to make thorium the heart of the nation’s atomic power effort. He failed. Uranium reactors had already been established, and Hyman Rickover, de facto head of the US nuclear program, wanted the plutonium from uranium-powered nuclear plants to make bombs. Increasingly shunted aside, Weinberg was finally forced out in 1973.

That proved to be “the most pivotal year in energy history,” according to the US Energy Information Administration. It was the year the Arab states cut off oil supplies to the West, setting in motion the petroleum-fueled conflicts that roil the world to this day. The same year, the US nuclear industry signed contracts to build a record 41 nuke plants, all of which used uranium. And 1973 was the year that thorium R&D faded away — and with it the realistic prospect for a golden nuclear age when electricity would be too cheap to meter and clean, safe nuclear plants would dot the green countryside.

The core of this hypothetical nuclear reactor is a cluster of tubes filled with a fluoride thorium solution. 1// compressor, 2// turbine, 3// 1,000 megawatt generator, 4// heat exchanger, 5// containment vessel, 6// reactor core.
Illustration: Martin Woodtli

When Sorensen and his pals began delving into this history, they discovered not only an alternative fuel but also the design for the alternative reactor. Using that template, the Energy From Thorium team helped produce a design for a new liquid fluoride thorium reactor, or LFTR (pronounced “lifter”), which, according to estimates by Sorensen and others, would be some 50 percent more efficient than today’s light-water uranium reactors. If the US reactor fleet could be converted to LFTRs overnight, existing thorium reserves would power the US for a thousand years.
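The thousand-year claim checks out on a back-of-envelope basis using figures quoted elsewhere in the article; the ~100 GW size of the existing US reactor fleet is my assumption, not the article's:

    us_thorium_tons = 175_000       # US thorium reserves, from the article
    tons_per_gw_year = 1.0          # LFTR fuel use, from the reactor comparison below
    us_nuclear_fleet_gw = 100       # assumed size of the existing US fleet

    years = us_thorium_tons / (tons_per_gw_year * us_nuclear_fleet_gw)
    print(f"Roughly {years:,.0f} years of fuel for a {us_nuclear_fleet_gw} GW fleet")

That lands in the neighborhood of 1,750 years, so "a thousand years" is, if anything, conservative under these assumptions.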

Overseas, the nuclear power establishment is getting the message. In France, which already generates more than 75 percent of its electricity from nuclear power, the Laboratoire de Physique Subatomique et de Cosmologie has been building models of variations of Weinberg’s design for molten salt reactors to see if they can be made to work efficiently. The real action, though, is in India and China, both of which need to satisfy an immense and growing demand for electricity. The world’s largest source of thorium, India, doesn’t have any commercial thorium reactors yet. But it has announced plans to increase its nuclear power capacity: Nuclear energy now accounts for 9 percent of India’s total energy; the government expects that by 2050 it will be 25 percent, with thorium generating a large part of that. China plans to build dozens of nuclear reactors in the coming decade, and it hosted a major thorium conference last October. The People’s Republic recently ordered mineral refiners to reserve the thorium they produce so it can be used to generate nuclear power.

In the United States, the LFTR concept is gaining momentum, if more slowly. Sorensen and others promote it regularly at energy conferences. Renowned climatologist James Hansen specifically cited thorium as a potential fuel source in an “Open Letter to Obama” after the election. And legislators are acting, too. At least three thorium-related bills are making their way through the Capitol, including the Senate’s Thorium Energy Independence and Security Act, cosponsored by Orrin Hatch of Utah and Harry Reid of Nevada, which would provide $250 million for research at the Department of Energy. “I don’t know of anything more beneficial to the country, as far as environmentally sound power, than nuclear energy powered by thorium,” Hatch says. (Both senators have long opposed nuclear waste dumps in their home states.)

Unfortunately, $250 million won’t solve the problem. The best available estimates for building even one molten salt reactor run much higher than that. And there will need to be lots of startup capital if thorium is to become financially efficient enough to persuade nuclear power executives to scrap an installed base of conventional reactors. “What we have now works pretty well,” says John Rowe, CEO of Exelon, a power company that owns the country’s largest portfolio of nuclear reactors, “and it will for the foreseeable future.”

Critics point out that thorium’s biggest advantage — its high efficiency — actually presents challenges. Since the reaction is sustained for a very long time, the fuel needs special containers that are extremely durable and can stand up to corrosive salts. The combination of certain kinds of corrosion-resistant alloys and graphite could meet these requirements. But such a system has yet to be proven over decades.

And LFTRs face more than engineering problems; they’ve also got serious perception problems. To some nuclear engineers, a LFTR is a little … unsettling. It’s a chaotic system without any of the closely monitored control rods and cooling towers on which the nuclear industry stakes its claim to safety. A conventional reactor, on the other hand, is as tightly engineered as a jet fighter. And more important, Americans have come to regard anything that’s in any way nuclear with profound skepticism.

So, if US utilities are unlikely to embrace a new generation of thorium reactors, a more viable strategy would be to put thorium into existing nuclear plants. In fact, work in that direction is starting to happen — thanks to a US company operating in Russia.

Located outside Moscow, the Kurchatov Institute is known as the Los Alamos of Russia. Much of the work on the Soviet nuclear arsenal took place here. In the late ’80s, as the Soviet economy buckled, Kurchatov scientists found themselves wearing mittens to work in unheated laboratories. Then, in the mid-’90s, a savior appeared: a Virginia company called Thorium Power.

Reactor comparison (from the Wired article):

Uranium-Fueled Light-Water Reactor
  Fuel: Uranium fuel rods
  Fuel input per gigawatt output: 250 tons raw uranium
  Annual fuel cost for 1-GW reactor: $50-60 million
  Coolant: Water
  Proliferation potential: Medium
  Footprint: 200,000-300,000 square feet, surrounded by a low-density population zone

Seed-and-Blanket Reactor
  Fuel: Thorium oxide and uranium oxide rods
  Fuel input per gigawatt output: 4.6 tons raw thorium, 177 tons raw uranium
  Annual fuel cost for 1-GW reactor: $50-60 million
  Coolant: Water
  Proliferation potential: None
  Footprint: 200,000-300,000 square feet, surrounded by a low-density population zone

Liquid Fluoride Thorium Reactor
  Fuel: Thorium and uranium fluoride solution
  Fuel input per gigawatt output: 1 ton raw thorium
  Annual fuel cost for 1-GW reactor: $10,000 (estimated)
  Coolant: Self-regulating
  Proliferation potential: None
  Footprint: 2,000-3,000 square feet, with no need for a buffer zone

Founded by another Alvin — American nuclear physicist Alvin Radkowsky — Thorium Power, since renamed Lightbridge, is attempting to commercialize technology that will replace uranium with thorium in conventional reactors. From 1950 to 1972, Radkowsky headed the team that designed reactors to power Navy ships and submarines, and in 1977 Westinghouse opened a reactor he had drawn up — with a uranium thorium core. The reactor ran efficiently for five years until the experiment was ended. Radkowsky formed his company in 1992 with millions of dollars from the Initiative for Proliferation Prevention Program, essentially a federal make-work effort to keep those chilly former Soviet weapons scientists from joining another team.

The reactor design that Lightbridge created is known as seed-and-blanket. Its core consists of a seed of enriched uranium rods surrounded by a blanket of rods made of thorium oxide mixed with uranium oxide. This yields a safer, longer-lived reaction than uranium rods alone. It also produces less waste, and the little bit it does leave behind is unsuitable for use in weapons.

CEO Seth Grae thinks it’s better business to convert existing reactors than it is to build new ones. “We’re just trying to replace leaded fuel with unleaded,” he says. “You don’t have to replace engines or build new gas stations.” Grae is speaking from Abu Dhabi, where he has multimillion-dollar contracts to advise the United Arab Emirates on its plans for nuclear power. In August 2009, Lightbridge signed a deal with the French firm Areva, the world’s largest nuclear power producer, to investigate alternative nuclear fuel assemblies.

Until developing the consulting side of its business, Lightbridge struggled to build a convincing business model. Now, Grae says, the company has enough revenue to commercialize its seed-and-blanket system. It needs approval from the US Nuclear Regulatory Commission — which could be difficult given that the design was originally developed and tested in Russian reactors. Then there’s the nontrivial matter of winning over American nuclear utilities. Seed-and-blanket doesn’t just have to work — it has to deliver a significant economic edge.

For Sorensen, putting thorium into a conventional reactor is a half measure, like putting biofuel in a Hummer. But he acknowledges that the seed-and-blanket design has potential to get the country on its way to a greener, safer nuclear future. “The real enemy is coal,” he says. “I want to fight it with LFTRs — which are like machine guns — instead of with light-water reactors, which are like bayonets. But when the enemy is spilling into the trench, you affix bayonets and go to work.” The thorium battalion is small, but — as nuclear physics demonstrates — tiny forces can yield powerful effects.

Richard Martin (rmartin@newwest.net), editor of VON, wrote about the Large Hadron Collider in issue 12.04.
 
Wearable solar cells! Just the thing to have on when caught in a sudden shower:

http://www.futurepundit.com/archives/006816.html

Microphotovoltaic Cells Could Embed In Clothing

Very small solar cells open up the possibility of many applications.

    Sandia National Laboratories scientists have developed tiny glitter-sized photovoltaic cells that could revolutionize the way solar energy is collected and used.

    The tiny cells could turn a person into a walking solar battery charger if they were fastened to flexible substrates molded around unusual shapes, such as clothing.

Such cells could be placed on irregular building shapes, vehicle surfaces, and surfaces where conventional PV can't attach.

    Sandia lead investigator Greg Nielson said the research team has identified more than 20 benefits of scale for its microphotovoltaic cells. These include new applications, improved performance, potential for reduced costs and higher efficiencies.

    “Eventually units could be mass-produced and wrapped around unusual shapes for building-integrated solar, tents and maybe even clothing,” he said. This would make it possible for hunters, hikers or military personnel in the field to recharge batteries for phones, cameras and other electronic devices as they walk or rest.

The much lower use of silicon should cut costs, since silicon is a major portion of the cost of silicon-based PV. This suggests these cells might be able to compete on cost with the cheaper CdTe and CIGS thin-film PV that is currently underselling silicon PV.

    “So they use 100 times less silicon to generate the same amount of electricity,” said Okandan. “Since they are much smaller and have fewer mechanical deformations for a given environment than the conventional cells, they may also be more reliable over the long term.”

The conversion efficiency is pretty high - higher than the cheap thin films.

    Offering a run for their money to conventional large wafers of crystalline silicon, electricity presently can be harvested from the Sandia-created cells with 14.9 percent efficiency. Off-the-shelf commercial modules range from 13 to 20 percent efficient.

New discoveries for making better solar cells keep getting announced by research labs while a growing assortment of PV makers compete with new approaches for cutting manufacturing and installation costs. Some day PV is going to become a cheap way to generate electricity.

By Randall Parker at 2009 December 26 08:08 PM  Energy Solar

The true downside of such applications is that a large fraction of the PV cells will be shaded by the wearer, so fairly complex control software will be required to regulate the variable energy output of your T-shirt (e-shirt?).
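As a sketch of the kind of bookkeeping that software would be doing, here is a toy model: each micro-cell either sees light or it doesn't, and the controller just sums whatever the illuminated cells deliver (the cell count and per-cell output below are invented for illustration).

    import random

    CELL_PEAK_MILLIWATTS = 0.1   # assumed output of one unshaded micro-cell
    NUM_CELLS = 10_000           # assumed number of cells woven into the shirt

    def harvested_power(shaded_fraction):
        """Sum the output of the cells that happen to be in the light."""
        total_mw = 0.0
        for _ in range(NUM_CELLS):
            if random.random() > shaded_fraction:   # this cell is illuminated
                total_mw += CELL_PEAK_MILLIWATTS
        return total_mw / 1000.0                    # convert to watts

    for shade in (0.0, 0.5, 0.8):
        print(f"{shade:.0%} of cells shaded -> ~{harvested_power(shade):.1f} W")

Even with most of the shirt in shadow you would still trickle-charge a phone, but the output would bounce around with every step, which is exactly why the power conditioning matters.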
 