
A scary strategic problem - no oil

I don't think that fossil fuels are going to be depleted anytime soon, certainly not in our lifetime. Thirty years ago we saw predictions of oil reserves being depleted by 2007; reserve estimates have actually increased since then. The bottom line is that no one knows how much oil is left, and some experts think oil may be created by the planet itself. While reserves haven't declined, demand has increased and will continue to increase due to China and India.

In a world without oil, Canada and the US are well placed to replace the Middle East: Canada with its oil sands, coal, and oil shale, and the US with huge coal and oil-shale reserves. The future looks very bright for North America. ;D
 
Interesting how the facts contradict the conventional wisdom:



Wednesday, February 13, 2008
Gas And Asses
If you would like, you could consider the "Asses" in the title of this post to refer to an alternate method of transportation.

In a prior post I reviewed the trend for US crude supplied as it relates to recessions. This graph shows crude trends, and under normal circumstances, supply doesn't fall unless we enter a recession:

But gas is a different story, although consumption does drop during recessions. Here are some recession periods (go to the link to see all the graphs):

There's an overall trend here that's easy to miss, so take a look at this:

Gasoline consumption in the US has been falling for years. All those who scream about Hummers and wasteful consumers may be describing their neighbors, but they aren't describing the US as a whole. For about nine years gasoline consumption has been dropping, and 2007 consumption was the lowest in ten years.

Gas consumption in the US is not price invariant at all. Sales of gas-efficient cars have been steadily increasing, and those fuel-efficient cars will stay on the road for years to come. It's likely that this trend has multiple causes, among which are real declining incomes for a large section of the population, a growing number of retirees who are not forced to drive to work, home workers, and sincere efforts to conserve by a portion of the population. It may also reflect a shift in jobs toward major urban areas in which mass transport is a viable option for more workers.

Regardless, the idea that gas taxes need to be raised to force conservation is a stunningly stupid one. It appears that current gas prices are causing conservation, and that wasteful consumption is restricted to portions of the population that can frankly afford to pay higher taxes without changing their habits. It also appears that US efforts to conserve are not going to affect world trends much - the growth is coming from other areas.

Keeping the overall trend in mind, it is even more remarkable that retail sales show such growth in spending on groceries, gas and pharmaceuticals. The bottom line is that other types of consumption are being suppressed by high spending on necessities. The high spending on necessities is the product of declining real incomes in a large portion of the population, and high inflation for food and fuel.

So is it likely that the stimulus plan will generate much actual economic stimulus? I would say not. The bulk of the individual checks won't hit until May or so, and far more than half of that money will likely be used to pay down other debt or outstanding bills. If people are now using gift cards to buy groceries and gas, they are pushed hard enough to do the same with those checks. Probably it will be a windfall for credit card companies and utilities, but I doubt we'll see much of it in the stores. As the tax refund checks start to come back, we should see some discretionary spending this spring.

A large portion of inflation is coming from the high price of diesel fuel. I expect inflation to keep rising through at least the next few months.

Can government monetary policy do much to restrain inflation of the type we are seeing? I would argue not. There is increased world consumption of necessities, and those are the goods with rising costs. It's rather clear that if anything, US conditions are tight enough to restrain inflation by restraining reseller margins, but of course that has a natural limit.

// posted by MaxedOutMama @ 2/13/2008 10:49:00 AM
 
I apologize if this was posted before, I didn't read through all 18 pages before hitting the 'reply' button.

There is a documentary out there called 'A Crude Awakening', essentially a movie about oil and fossil fuels. It's definitely a good rental from your local Blockbuster, as it contains a ton of information and relays it in a very easy-to-understand manner. Some of the key commentators are past and current CEOs of oil companies.

It really puts the whole issue of the world running out of oil in perspective, as well as the pros and cons of the various alternative energy sources out there.
 
From the Economist:

http://www.economist.com/science/tq/displaystory.cfm?story_id=10715508

Ending a dammed nuisance

Feb 19th 2008
From Economist.com
A new generation of free-standing turbines will liberate hydroelectricity from its dependence on dams

IN TODAY’S green world, hydroelectric dams are often unwelcome. Though their power is renewable and, on the face of it, carbon-free, there are lots of bad things about them, too. Blocking a river with a dam also blocks the movement of fish upstream to spawn and the movement of silt downstream to fertilise fields. The vegetation overwhelmed by the rising waters decays to form methane—a far worse greenhouse gas than carbon dioxide. The capital cost is huge. And, not least, people are often displaced to make way for the new lake. The question, therefore, is whether there is a way to get the advantages without suffering the disadvantages. And the answer is that there may be.

The purpose of a dam is twofold: to house the turbines that create the electricity, and to provide a sufficient head of water pressure to drive them efficiently. If it were possible to develop a turbine that did not need such a water-head to operate, and that could sit in the riverbed, then a dam would be unnecessary. Such turbines could also be put in places that could not be dammed—the bottom of the sea, for example. And that is just what is starting to happen, with the deployment of free-standing underwater turbines.

The big disadvantage of free-standing turbines is that they are less efficient in transforming the mechanical energy of water into electrical energy than turbines in dams are. They are also subject to more wear and tear than turbines protected by huge amounts of concrete. They can be hard to get at to repair and maintain. And the generators they run, being electrical machines, need to be protected from the water that surrounds the rest of the turbine.

A discouraging list. But in the past three decades computing power has become cheaper, helping developers to simulate the behaviour of water and turbine blades—something that is hard to do with paper, pen and formulas. Moreover, prototypes can be built directly from the computer model. All this has helped scientists and industry to solve the weaknesses inherent in free-standing turbines.

The first new design was by Alexander Gorlov, a Russian civil engineer who worked on the Aswan High Dam in Egypt. He later moved to America where, with the financial assistance of the Department of Energy, he produced the first prototype of a turbine that could extract power from free-flowing currents “without building any dam”. The Gorlov Helical Turbine, as it is known, allows you to use any stream, whatever the direction of its flow. The vertical helical structure, which gives the device its name, provides a stability that previous designs lacked. It increases the amount of energy extracted from a stream from 20% to 35%. In addition, as the shaft is vertical the electric generator can be installed on one end above the water—without any need of waterproof boxes.

In 2001 Mr Gorlov won the Edison patent award for his invention, and the turbines have now been commercialised by Lucid Energy Technologies, an American company, and are being tested in pilot projects in South Korea and North America.

A second design is by Philippe Vauthier, another immigrant to America (he was originally a Swiss jeweller). The turbines made by his company, UEK, are anchored to a submerged platform. They are able to align themselves in the current like windsocks at an aerodrome so that they find the best position for power generation. As they are easy to install and maintain, they are being used in remote areas of developing countries, as well.

Finally, a design by OpenHydro, an Irish company, is not just a new kind of turbine but also a new design of underwater electric generator. Generators (roughly speaking) consist of magnets moving relative to coils. Why not have the magnets encapsulated in the external, fast moving part of a turbine? The turbine is then installed in an external housing, containing the coils. The result looks like an open-centre turbine contained within a tube. OpenHydro’s generators do not need lubricant, which considerably reduces the need for maintenance, and are said to be safer for marine life.

These new designs, combined with the fashion for extracting energy from the environment by windmills and solar cells, means money that previously shied away from the field is now becoming available. According to New Energy Finance, a specialist consultancy, investments in companies proposing to make or deploy free-standing turbines have risen from $13m in 2004 to $156m in 2007. Projects already underway include the installation by American Verdant Power of a tidal-turbine in New York's East River and pilot projects in Nova Scotia with UEK, OpenHydro and Canadian Clean Current.

And that, optimists hope, is just the beginning. Soon, many more investors will be searching for treasures buried in the ocean sea beds—or, rather, flowing above them.
 
An old idea gets recycled. While these plans will make only the smallest dent in the overall energy consumption figures, I still think these are good ideas because they provide an economic incentive to deal with difficult waste problems:

http://www.timesonline.co.uk/tol/news/environment/article3492378.ece

The power of cow dung can be electric
Chris Ayres in Los Angeles

It's not so much green energy as brown power: a dairy farm in California said yesterday that it had found a new way to generate electricity for households — using a vat of liquid cow manure, 33ft deep and big enough to cover five football fields.

“When most people see a pile of manure, they see a pile of manure. We saw it as an opportunity for farmers, for utilities, and for California,” said David Albers, a partner in the Vintage Dairy, near Fresno, which has 5,000 cows and calls its new facility the Vintage Dairy Biogas project.

As cow manure decomposes it produces methane, a greenhouse gas more damaging than carbon dioxide. Scientists say that controlling methane emissions from animals such as cows will be hugely important in preventing climate change.

Methane can also be captured and treated to produce renewable gas, which can be used instead of coal to run electricity-generating plants — the excretion of a single cow can produce about 100 watts of power.

Although other farms in California already generate natural gas from cow dung, this marks the first time that it has actually been supplied via a pipeline to a utility company, PG&E Corp.

The pipeline will allow PG&E to generate power for about 1,200 homes a day in California's agricultural heartland.

The energy is certainly renewable — as long as no count is made of the energy consumed during the farming of the grain that is fed to the cows — but no one could call it clean.

In addition to being a partner in the Vintage Dairy, Mr Albers is also president of BioEnergy Solutions, the company that funded and built the so-called digester, which turns the cow faeces into gas and saves farmers the cost of disposing of the waste.

BioEnergy Solutions now intends to build digesters at other farms, ultimately generating enough gas to supply 50,000 homes.
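The article's numbers can be sanity-checked in a few lines (a sketch using only the figures quoted above: 5,000 cows at roughly 100 watts each, 1,200 homes per farm, a 50,000-home target; the implied per-home draw is my inference, not a number from the article):

```python
# Rough sanity check on the Vintage Dairy figures quoted above.
cows = 5_000
watts_per_cow = 100            # article: "about 100 watts" per cow
homes_per_farm = 1_200         # homes supplied by one farm's digester
target_homes = 50_000          # BioEnergy Solutions' stated goal

farm_output_kw = cows * watts_per_cow / 1_000
watts_per_home = cows * watts_per_cow / homes_per_farm
farms_needed = target_homes / homes_per_farm

print(f"Farm output: {farm_output_kw:.0f} kW")                    # 500 kW
print(f"Implied average draw: {watts_per_home:.0f} W per home")   # ~417 W
print(f"Digesters needed for 50,000 homes: {farms_needed:.0f}")   # ~42
```

So the 50,000-home goal implies on the order of forty digesters of Vintage Dairy's size, assuming similar herds.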

The digester works by mixing the manure with microbes. This breaks down the faeces and the resulting gases are then captured.

At the Vintage farm, the digester prevents about 1,500 tons of methane gas from escaping into the atmosphere every year. It also helps to prevent groundwater pollution, a common side-effect of manure storage.

Mr Albers described PG&E as a customer and declined to give details of their agreement.

California's regulators — encouraged by the Governor, Arnold Schwarzenegger, a recent green convert — have ordered PG&E and other utilities to make renewable energy at least 20 per cent of their electricity supplies by 2010.

PG&E expects to reach 14 per cent this year, thanks in small part to Mr Albers's vat of dung.
 
I can't say I read all 18 pages of this thread, I can't say I read all of the first page. I pretty much only read the initial post. But, I thought I would point out something about this paragraph:

redleafjumper said:
The OPEC countries and especially Saudi Arabia have the largest reserves of oil.  Other nations, particularly Russia also have large reserves.  Canada doesn't have huge actual oil reserves, but our tar sands and oil put us in second place in the world after Saudi Arabia for being able to produce oil, even if the process is expensive.

True: Canada is the 2nd largest oil producer, next to Saudi Arabia
False: Canada doesn't have huge actual oil reserves

"It is estimated that there are 2.5 trillion barrels of oil trapped in the tar sands around Fort McMurray, Alberta. That far exceeds the reserves of even Saudi Arabia."
"Over the next 10 years another $87 billion is likely to be spent. By then, production will reach about 2 million barrels of oil each day. This is comparable to the major oil producers in the Middle East."

- Griffin, Ricky W., Ronald J. Ebert, and Frederick A. Starke. Business, Sixth Canadian Edition. Toronto, 2008.

So, if it were 10 years from today, and these figures are anywhere near realistic, Canada would be staring at 3,422 years of producing oil from the largest oil reserve in the world. That's just one oil reserve in Alberta, up in Fort Mac. I don't know what percentage of Canada's oil is estimated to be in Fort Mac, but regardless... I think we're gonna be OK. This whole "the world is gonna run out of oil in 50 years" thing, I think that's a bit... well... wrong.

Someone may want to check my calculations though... 2.5 trillion is a big number, could have easily messed up somewhere.
2.5 × 10^12 barrels ÷ 2 million barrels/day = 1,250,000 days
1,250,000 days ÷ 365.25 days/year ≈ 3,422 years

seems to work though.
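The arithmetic above can be double-checked in a couple of lines (a sketch using only the figures quoted from the textbook):

```python
# Years of production implied by the figures quoted above.
reserve_barrels = 2.5e12   # "2.5 trillion barrels" estimated in the tar sands
barrels_per_day = 2e6      # projected "2 million barrels of oil each day"

days = reserve_barrels / barrels_per_day
years = days / 365.25

print(f"{days:,.0f} days, or about {years:,.0f} years")  # 1,250,000 days, ~3,422 years
```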



 
Ballz,

I don't disagree with your post except that my paragraph differentiates between actual oil reserves and the type of reserves as found in the tar sands - reading that whole sentence clearly shows that point.  Canada does have huge oil deposits; most are just trapped in the tar sands.  That situation requires a considerably different extraction process than merely pumping oil out of the ground.  Also while it has become a long thread, it is useful to read at least some of the subsequent posts.

Cheers,
 
redleafjumper said:
Ballz,

I don't disagree with your post except that my paragraph differentiates between actual oil reserves and the type of reserves as found in the tar sands - reading that whole sentence clearly shows that point.  Canada does have huge oil deposits; most are just trapped in the tar sands.  That situation requires a considerably different extraction process than merely pumping oil out of the ground.  Also while it has become a long thread, it is useful to read at least some of the subsequent posts.

Cheers,

I jumped the gun there; I thought you were speaking as if it were a disadvantage to have it in tar sands, or that it wasn't worth as much. The major point I was leading into was not about Canada's oil, or Saudi oil, etc. My point is that we're not going to run out of oil in 50 years.
 
Trying to mandate a market will create all kinds of downstream problems

http://www.technologyreview.com/Biztech/20226/?nlid=921

The Mess of Mandated Markets
New federal biofuel standards passed last year will distort the development of innovative technologies.
By David Rotman

Few things prompt Washington policymakers to forget their professed belief in the efficiency of free markets faster than $100-a-barrel oil prices--or even the threat of them. In one of the most notable recent examples, as the price of crude oil edged toward the $100 mark late last year, the U.S. Congress passed, and President Bush quickly signed, the Energy Independence and Security Act of 2007.

Among its various provisions, the energy bill prescribes a minimum amount of biofuel that gasoline suppliers must use in their products each year through 2022. The new mandates, which significantly expand the Renewable Fuels Standard of 2005, would more than double the 2007 market for corn-derived ethanol, to 15 billion gallons, by 2015. At the same time, the bill ensures the creation of a new market for cellulosic biofuels made from such sources as prairie grass, wood chips, and agricultural waste. The standards call for the production of 500 million gallons of cellulosic biofuel by 2012, one billion gallons by 2013, and 16 billion gallons by 2022.

Not surprisingly, the ethanol industry is very happy. The Biotechnology Industry Organization, a Washington-based trade association whose members include both large manufacturers and startup companies developing new cellulosic technologies, suggests that "this moment in the history of transportation fuels development can be compared to the transition from whale oil to kerosene to light American homes in the 1850s." The new push for biofuels, the trade association continues, is "larger than the Apollo project or the Manhattan project" and will require the construction of 300 biofuel plants, each with a capacity of 100 million gallons, at a cost of up to $100 billion.

In short, the federal government has legislated the growth of a sizable industry. The often stated aim of the biofuel standards is to reduce greenhouse-gas emissions and dependence on foreign oil. And biofuels, particularly cellulosic ones, could arguably play a significant role in achieving both those goals (see "The Price of Biofuels," January/­February 2008). But quite apart from the value of ethanol and other biofuels, the creation of markets by federal law raises fundamental questions about the best way to implement a national energy policy. Can legislated markets survive economic conditions and policy priori­ties that change over the long term? And what role should the government play in promoting specific technologies?

Mandated consumption levels break the "one-to-one link" between market demand and the adoption of a technology, says Harry de Gorter, an associate professor of applied economics and management at Cornell University: "As an economist, I don't like it. Economists like to let the markets determine what [technology] has the best chances." The new biofuel mandates are "betting on a particular technology," he says. "It is almost impossible to predict the best technology. It is almost inevitable that [mandates] will generate inefficiencies." While de Gorter acknowledges that some economists might justify mandated markets as a way to promote a desired social policy, he questions the strategy's effectiveness. "Historically, there are no good examples of it working in alternative energy," he says.

One reason economists tend to be wary of mandated consumption levels is that they can have unintended consequences for related markets. Producing 15 billion gallons of conventional ethanol will require farmers to grow far more corn than they now do. And even with the increased harvest, biofuel production will consume around 45 percent of the U.S. corn crop, compared with 22 percent in 2007. The effects on the agricultural sector will be various and complex.

Perhaps most obvious will be the impact on the price of corn--and, indirectly, of food in general. Since it became apparent that the biofuel standards would become law, the price of corn has risen 20 percent, to around $5.00 a bushel, says Bruce Babcock, director of the Center for Agricultural and Rural Development at Iowa State University. He expects that prices will probably stay around that level for at least the next three years. Because corn is the primary feed for livestock in this country, that means higher prices for everything from beef to milk and eggs. (Less than 2 percent of the nation's corn crop is eaten directly by humans; more than 50 percent feeds animals.) High corn prices could also make it harder to switch to cellulosic biofuels, because farmers will be reluctant to grow alternative crops. With the price of corn so high, says Babcock, "who is going to replace corn with prairie grass?"

At Purdue University, Wallace Tyner, a professor of agricultural economics, has calculated how different types of government policies, including the new mandated consumption levels, will affect the economics of corn ethanol. One of his most striking findings (though one that would surprise few agricultural experts) is that the fuel struggles to compete with oil on cost, in part because of extreme sensitivity to the commodity price of corn.

Because ethanol is generally blended with gasoline at a concentration of 10 percent, its market value is directly tied to the price of oil. But Tyner's analysis illustrates the complexity of the interplay between the markets for oil, corn, and ethanol. In the absence of government subsidies or mandates, according to his model, no ethanol is produced until oil reaches $60 a barrel. But with oil at that price, ethanol is profitable only as long as corn stays around $2.00 a bushel, which limits production of the biofuel to around a half-billion gallons a year. As oil prices increase, so does ethanol production. But production levels continue to be limited by the price of corn, which rises along with both the demand for ethanol and the price of oil (farmers use a lot of gasoline). Even when oil reaches $100 a barrel, ethanol production will reach only about 10 billion gallons a year if there are no subsidies; and even then, ethanol is profitable only if corn prices stay below $4.15 a bushel. If oil hits $120 a barrel, ethanol production will, left to market forces, reach 12.7 billion gallons--still more than two billion short of the federal mandate.
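Tyner's unsubsidized price points can be tabulated to show the gap against the 15-billion-gallon mandate (these are just the data points quoted above; no behavior between them is implied):

```python
# Unsubsidized ethanol output vs. oil price, per Tyner's figures above
# (billions of gallons per year at each oil price, no subsidies or mandates).
unsubsidized_output = {
    60: 0.5,    # ethanol first profitable; corn must stay near $2.00/bu
    100: 10.0,  # profitable only if corn stays below $4.15/bu
    120: 12.7,  # still "more than two billion short" of the mandate
}
MANDATE = 15.0  # billions of gallons (the 2015 corn-ethanol mandate)

for oil_price, output in unsubsidized_output.items():
    shortfall = MANDATE - output
    print(f"${oil_price}/bbl oil -> {output} B gal/yr "
          f"({shortfall:.1f} B gal short of the mandate)")
```

Even at $120-a-barrel oil, the market alone leaves a 2.3-billion-gallon shortfall, which is exactly why the mandate amounts to an implicit tax.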

In other words, the federally mandated consumption levels mean ethanol will not, for the foreseeable future, be truly cost-­competitive with gasoline. Indeed, says Tyner, setting the ethanol market at 15 billion gallons will mean an "implicit tax" on gasoline consumers, who will have to pay to sustain the high level of biofuel production. When oil costs $100 a barrel, the consumer will pay a relatively innocuous "tax" of 42 cents per gallon of ethanol used (the additional price at the pump will usually be only a few pennies for blends that are 10 percent ethanol). But at lower oil prices, the additional cost of ethanol will be far more noticeable. If oil falls to $40 a barrel, the implicit tax for ethanol will be $1.05 a gallon--or $15.77 billion for all the nation's gasoline users. "If the price of oil drops substantially, is Congress going to say, 'We didn't really mean it'?" asks Tyner. "It gets really messy."
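Tyner's implicit-tax total roughly checks out against the per-gallon figure (a sketch; the article's $15.77 billion implies gallonage slightly above a flat 15 billion, so the product below is only an approximation):

```python
# Implicit "tax" on gasoline users if oil falls to $40/bbl, per the figures above.
tax_per_gallon = 1.05      # dollars per gallon of ethanol at $40/bbl oil
mandated_gallons = 15e9    # the 15-billion-gallon corn-ethanol mandate

total = tax_per_gallon * mandated_gallons
print(f"Total implicit tax: ${total / 1e9:.2f} billion")  # $15.75B vs. the $15.77B quoted
```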

History provides a lesson about the messi­ness of predicting the market for an energy technology. Almost three decades ago, as the price of oil reached $40 a barrel and many experts worried that it was headed for $80 or even $100, President Jimmy Carter signed the Energy Security Act of 1980. As is the case today, the high price of oil was straining the U.S. economy, and the Middle East was unstable. One key provision of the 1980 legislation created the U.S. Synthetic Fuels Corporation, which was meant to establish a domestic industry that produced liquid fuel from tar sands, shale, and coal. Despite the unknowns surrounding the economics of producing synthetic fuels on a large scale, engineers estimated that they could be produced for $60 a barrel. An initial production target was set at 500,000 barrels a day. But in the early 1980s, the price of oil fell to $20 a barrel. With no prospect of producing synthetic fuels at a price competitive with that of oil, the Synthetic Fuels Corporation was finally shuttered in 1986.

The corporation "didn't fail because of the technology," says John Deutch, who was undersecretary of energy in 1980 and is now an Institute Professor of chemistry at MIT. Rather, he says, it failed because "it focused on production goals, and that turned out to be a bad thing because the market prices went down." Deutch believes that instead of targeting specific production levels, government should participate in the development of alternative fuel technologies by helping to assess their economics and determine whether they meet environmental expectations.

The Synthetic Fuels Corporation and today's Renewable Fuels Standard differ in many ways. But the efforts behind them do reflect a common theme: the federal government's attempt to select a particular tech­nology and create a market for it. The "harsh reality" is that such measures "are unlikely to be effective over the long term," Deutch says. "And nowhere is this more obvious than in ethanol." He and other experts, such as de Gorter and Iowa State's Babcock, would prefer to see technology-neutral policies, such as a carbon or greenhouse-gas tax, that would allow the markets to choose the most cost-effective way of meeting political and environmental goals.

Besides creating the synthetic-fuels program, the 1980 energy bill also included a Biomass Energy and Alcohol Fuels Act, which provided $600 million to the Departments of Energy and Agriculture for research into biofuels made from cellulose or biomass. But that funding was slashed in subsequent years. And while the Energy Department is again aggressively funding research on biofuels, and the 2007 energy bill includes several measures supporting such work, overall federal funding for energy research and development has never fully rebounded from the cuts made during President Reagan's administration. It's one reason that, almost three decades after Jimmy Carter's energy bill, the United States still has no effective answer to high-priced imported oil. (Interpolation. The other reason is that in terms of physics and chemistry, there is no alternative to oil.)

Distorting the markets through federal mandates for biofuels won't help. What might: a well-considered federal policy that financially supports the development of promising new energy technologies and offers technology-neutral incentives for replacing petroleum.

David Rotman is Technology Review's editor.
Copyright Technology Review 2008.
 
Something like this (if it works as advertised) can provide a nice economic incentive to do something with garbage rather than landfilling. Ethanol is not an ideal biofuel, but if there is an inexpensive way to make it without burning food, then I'll buy that.

http://www.technologyreview.com/Energy/20199/?nlid=925

Ethanol from Garbage and Old Tires
A versatile new process for making biofuels could slash their cost.
By Kevin Bullis

As he leads a tour of the labs at Coskata, a startup based in Warrenville, IL, Richard Tobey, the company's vice president of research and development, pauses in front of a pair of clear plastic tubes packed with bundles of white fibers. The tubes are the core of a bioreactor, which is itself the heart of a new technology that Coskata claims can make ethanol out of wood chips, household garbage, grass, and old tires--indeed, just about any organic material. The bioreactor, Tobey explains, allows the company to combine thermochemical and biological approaches to synthesizing ethanol. Taking advantage of both, he says, makes Coskata's process cheaper and more versatile than either the technologies widely used today to make ethanol from corn or the experimental processes designed to work with sources other than corn.

Tobey's tour begins at the far end of the laboratory in two small rooms full of pipes, throbbing pumps, and pressurized tanks--all used to process synthesis gas (also known as syngas), a mixture of carbon dioxide, carbon monoxide, and hydrogen. This is the thermochemical part of Coskata's process: in a well-known technique called gasification, a series of chemical reactions carried out at high temperatures can produce syngas from almost any organic material. Ordinarily, chemical catalysts are then used to convert the syngas into a mixture of alcohols that includes ethanol. But making such a mixture is intrinsically inefficient: the carbon, hydrogen, and oxygen that go into the other alcohols could, in principle, have gone into ethanol instead. So this is where Coskata turns from chemistry to biology, using microbes to convert the syngas to ethanol more efficiently.

Down the hall from the syngas-processing equipment, Tobey shows off the petri dishes, flasks, and sealed hoods used to develop species of bacteria that eat syngas. The bioreactors sit at the far end of the room. Inside the bioreactors' tubes, syngas is fed directly to the bacteria, which produce a steady stream of ethanol.

Coskata's technology could be a big deal. Today, almost all ethanol made in the United States comes from corn grain; because cultivating corn requires a lot of land, water, and energy, corn-derived ethanol does little to reduce greenhouse-gas emissions and can actually cause other environmental damage, such as water pollution. Alternative ethanol sources, such as switchgrass, wood chips, and municipal waste, would require far fewer resources. But so far, technology for processing such materials has proved very expensive. That's why Coskata's low-cost technique has caught the attention of major investors, including General Motors, which earlier this year announced a partnership with the startup to help deploy its technology on the commercial scale worldwide.

Sipping Ethanol

Combining thermochemical and biological approaches in a hybrid system can make ethanol processing cheaper by increasing yields and allowing the use of inexpensive feedstocks. But Coskata's process has another advantage, too: it's fast. Though others have also developed syngas-fed bioreactors, Tobey says, they have been too slow. That's because the bacteria are suspended in an aqueous culture, and syngas doesn't dissolve easily in water. Coskata's new bioreactor, however, delivers the syngas to the bacteria directly.

The thin fibers packed into the bioreactor serve two functions. First, they act as scaffolding: the bacteria grow in biofilms on the outside of the fibers. Second, they serve as a delivery mechanism for the syngas. Even though each fiber is not much bigger than a human hair, Tobey says, it acts like a tiny plastic straw. The researchers pump syngas down the bores of the hollow fibers, and it diffuses through the fiber walls to reach the bacteria. Water flows around the outside of the fibers, delivering vitamins and amino acids to the bacteria and carrying away the ethanol the bacteria produce. But the water and the syngas, Tobey says, never meet.

Coskata has also improved the last steps of the process, in which the ethanol is separated from the water. Ordinarily, this is done using distillation, which is expensive and consumes 30 percent as much energy as burning the ethanol will release. Coskata instead uses a modified version of an existing technology called vapor permeation. Vapor permeation uses hydrophilic membranes to draw off the water, leaving pure ethanol behind. It also consumes half as much energy as distillation per gallon of fuel. Vapor permeation is difficult to use with most biological manufacturing processes, Tobey says, because biomass fed to the microörganisms washes out with the water and can clog up the system. But in Coskata's process, the bacteria feed only on syngas, not on biomass. So no extra filtration is required to make vapor permeation work.
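The separation savings can be put in per-gallon terms (a sketch; the roughly 76,000 BTU/gal lower heating value for ethanol is my assumption, not a figure from the article):

```python
# Energy spent separating ethanol from water, per the fractions quoted above.
ETHANOL_LHV_BTU = 76_000  # approx. lower heating value per gallon (my assumption)

# Distillation consumes "30 percent as much energy as burning the ethanol".
distillation = 0.30 * ETHANOL_LHV_BTU
# Vapor permeation consumes "half as much energy as distillation".
vapor_permeation = distillation / 2

print(f"Distillation:     {distillation:,.0f} BTU/gal")      # ~22,800 BTU/gal
print(f"Vapor permeation: {vapor_permeation:,.0f} BTU/gal")  # ~11,400 BTU/gal
```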

Better Bugs

Coskata continues working on its bacteria, trying to increase the amount of ethanol they can produce. The company now uses varieties of Clostridium, a genus that includes a species that makes botulism toxin and another that processes manure on farms. Coskata has started building an automated system for screening new strains of Clostridium according to their ability to make ethanol. Along the way, it has had to develop techniques for protecting its bacteria from being exposed to oxygen; the bacteria are anaerobic, and oxygen kills them at about the same concentrations at which carbon monoxide kills humans. The automated system should allow the company to sort through 150,000 new strains a year, up from a few thousand now.

The researchers can go only so far by sorting through random variations, however. Eventually, Tobey hopes to begin manipulating the microbes' genes directly, activating only those that improve ethanol production. Such engineering is fairly common now, but the Clostridium bacteria that Coskata uses haven't been studied much. So although Tobey knows what chemical steps the bacteria use to transform syngas into ethanol, he doesn't yet know the details of how genes regulate this process, and what role these genes play in the general processes that keep the bacteria alive. What's more, effective ways of manipulating the genes in these particular bacteria haven't yet been developed.

Even as Coskata continues to improve its microbes, it is planning to move the fuel production process out of the lab and scale it up to the commercial level. With the help of GM and other partners, the company will build a facility that's able to produce 40,000 gallons of ethanol per year. Coskata representatives say construction will begin within the year. The company's bioreactors should make it easy to adapt the technology to a larger scale, Tobey says; they can simply be lined up in parallel to achieve the needed output volumes. The next two or three years will reveal whether Coskata's process can start to replace significant amounts of gasoline with cheap ethanol.

Copyright Technology Review 2008.
 
All the oil there ever was, was created in a brief moment in time 7 million years ago when massive global warming hit the planet (via a "killer" asteroid).  Massive amounts of animal and plant matter were laid low and pressed under tons of ash and dust.  In order for new oil to be created, guess what has to happen?  Most oil alternatives right now cost 1.2 barrels of oil to make for each barrel they save!
 
fraserdw said:
All the oil there ever was, was created in a brief moment in time 7 million years ago when massive global warming hit the planet (via a "killer" asteroid).  Massive amounts of animal and plant matter were laid low and pressed under tons of ash and dust.  In order for new oil to be created, guess what has to happen?  Most oil alternatives right now cost 1.2 barrels of oil to make for each barrel they save!

This is Gospel?
 
Simple geology will tell you this theory is a crock: most oil and natural gas deposits are found in sedimentary layers spanning a wide range of ages (although generally speaking, the deeper the deposit, the older it is). Coal was formed as far back as 400 million years ago (fossils preserved in the coal deposits are proof of this), and most oil is thought to have formed between 10 and 160 million years ago.

Since the stuff in the oil sands is considered to be the remnants of a deposit of up to 18 trillion barrels of hydrocarbons, I don't think asteroid strikes had a lot to do with it.
 
More metrics for the "Green" crowd to ponder:

http://www.telegraph.co.uk/opinion/main.jhtml?view=DETAILS&grid=A1YourView&xml=/opinion/2008/03/23/do2303.xml

Wind power costs inflate

A further huge question mark has been raised over the Government's plan to build 7,000 offshore wind turbines round Britain's coasts, to help meet its EU target of 15 per cent of our electricity from 'renewables' by 2020.

The director of renewable generation for Centrica, our largest windfarm developer, last week revealed that the cost of this plan to create 33,000 megawatts (MW) of capacity has doubled in three years, from £40 billion to £80 billion.

But since, thanks to fluctuations in the wind, offshore turbines generate on average only 27.5 per cent of capacity, the actual power produced by these turbines would be only 9,000MW, putting its price at £8.8 million per MW.

The latest nuclear power station being built in Finland at a cost of £2.7 billion will produce 1600MW, 24 hours a day, representing £1.7 million per MW. In other words, six nuclear power stations could produce more electricity than all those windfarms for only a fifth of the price.

If Centrica really wants to help Britain keep its lights on, it could, for £80 billion, build 30 "carbon-free" nuclear power stations to generate 48,000MW of electricity, more than the average 47,000MW now produced by all Britain's power plants.
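The column's arithmetic is easy to verify. A quick sketch using the same quoted figures:

```python
# Reproducing the column's cost-per-megawatt arithmetic.
wind_capex_gbp = 80e9        # quoted cost of the 7,000-turbine plan
wind_capacity_mw = 33_000    # nameplate capacity
load_factor = 0.275          # average offshore output as a share of capacity

wind_effective_mw = wind_capacity_mw * load_factor     # ~9,075 MW actually produced
wind_cost_per_mw = wind_capex_gbp / wind_effective_mw  # ~8.8 million GBP per MW

nuclear_capex_gbp = 2.7e9    # quoted cost of the Finnish plant
nuclear_mw = 1_600
nuclear_cost_per_mw = nuclear_capex_gbp / nuclear_mw   # ~1.7 million GBP per MW

# The same 80 billion GBP would buy roughly 30 such nuclear plants:
plants = round(wind_capex_gbp / nuclear_capex_gbp)     # 30
total_nuclear_mw = plants * nuclear_mw                 # 48,000 MW

print(f"Wind:    {wind_cost_per_mw / 1e6:.1f}M GBP per MW")
print(f"Nuclear: {nuclear_cost_per_mw / 1e6:.1f}M GBP per MW")
print(f"{plants} nuclear plants -> {total_nuclear_mw:,} MW")
```

The numbers check out against the article: about £8.8 million per effective megawatt for the wind plan versus roughly £1.7 million per megawatt for nuclear, a factor of five.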

But since this would not count towards meeting our EU renewables target, to do anything so sensible would put us in serious breach of EU law.

Stand by for those lights to go out.
 
New engine design. The announced 27% improvement in fuel economy is very exciting, if it can be verified in independent testing. Some buses, M113s, and the old AVGP series of vehicles used two-stroke engines (the 6V53 and 53T series, if I recall), so the idea of powering large vehicles this way isn't far-fetched at all:

http://www.technologyreview.com/Energy/20494/?nlid=976

Nonelectric Hybrid Engines
A novel hybrid engine could slash fuel consumption.
By Duncan Graham-Rowe

A new kind of hybrid vehicle could offer reduced fuel consumption to consumers concerned about gas prices. Mechanical engineers in the United Kingdom have developed a novel kind of combustion engine that is able to switch between being a two-stroke and a four-stroke engine. The system, they say, can reduce fuel consumption by 27 percent.

The improved fuel consumption essentially comes from downsizing the engine, says Neville Jackson, technology director of Ricardo UK, an engineering firm in Shoreham-on-Sea that developed the new engine. "A smaller engine has less internal friction and delivers better fuel consumption," he says.

But small car engines, which are usually based on a four-stroke design, don't offer a lot of power. They can be particularly problematic when operated at low speeds with a high load, such as when accelerating uphill. Such conditions can even make a small engine stall if the driver doesn't downshift.

"Four strokes are most efficient at full throttle; with two strokes, it's the opposite," says Robert Kee, a mechanical engineer who specializes in combustion engines at Queen's University, in Belfast, Northern Ireland.

The difference between two- and four-stroke engines is that the latter carry out the four stages of air intake, compression, combustion, and exhaust in four strokes of a piston. A two-stroke engine, in contrast, does this in just two piston strokes.

Two-stroke engines are intrinsically simpler by design and have higher power-to-weight ratios at high loads and low speeds because they get twice as many power strokes per revolution. But traditional two-stroke engines require oil to be mixed in with the fuel, and therefore produce higher emissions. Because of this, they aren't typically used in cars. Instead, they're used for lightweight applications such as chainsaws, lawnmowers, and some motorbikes.

But now, researchers at Ricardo have developed a piston head that operates in both two- and four-stroke mode, and it can switch automatically between the two modes, depending on the needs of the engine. This allows a smaller engine to handle the low-speed, high-load conditions without stalling.

"This is an interesting concept," says Martti Larmi, head of the Internal Combustion Engine Laboratory at Helsinki University of Technology, in Finland.

The main challenge in building such an engine is perfecting the scavenging process, he says, when the residual gases from the previous combustion cycle are replaced with fresh air and fuel.

"You need some kind of pressure on the intake side to push out the gases that have already burned," says Larmi.

In a traditional two-stroke engine, the force of the fuel and air intake drives out the exhaust. Unfortunately, this process causes some unburned fuel to be lost as exhaust, resulting in higher emissions. Four-stroke engines force the spent fumes out of the cylinder through a cam-controlled valve using an upward stroke of the piston. During the following downstroke, fresh air and fuel are injected into the cylinder while the exhaust valve is closed.

Ricardo's engine, called 2/4SIGHT, uses valves like a four-stroke engine, but in two-stroke mode, the engine keeps both the intake and exhaust valves open at the same time so that the fuel and air in the cylinder are replenished each cycle, rather than every other cycle.

There has been a lot of interest in developing a low-emission two-stroke engine. But it's a difficult configuration to perfect because there is little time to get the fuel-air mix in and the exhaust out, says Larmi. "The danger here is that the fresh air intake can go directly out through the exhaust outlet," he says.

Ricardo is using a couple of tricks to get around this problem. First, the design of Ricardo's piston head uses reverse tumbling, a process in which the air intake is directed away from the exhaust valve, to reduce the chances of it flowing straight out of the cylinder. Ricardo has also swapped the cam-controlled valves for electro-hydraulic valves, which, along with the fuel injector, can be controlled by software.

Car manufacturers have shown an interest in building this sort of hybrid engine in the past, says Kee. "But there are a lot of challenges," he says. Indeed, both Toyota and Ricardo looked at this issue in the late 1980s and early '90s.

But in the past, the technology simply wasn't there. According to Ricardo, the only reason the company is able to make a viable system now is because of the software that controls the gas exchange and engine modes. "The engine's control system monitors driver demand," says Jackson. When more torque is required than would be possible in four-stroke mode, it switches, he says. However, the company will not reveal details about when, in the engine cycle, the mode is switched.

Ricardo's prototype, an adapted 2.1-liter V6 engine, has been tested by researchers at the University of Brighton and has been found to be able to produce the kind of performance one would normally expect from a three-to-four-liter engine. Based on the New European Driving Cycle, which is a standard performance test designed to gauge engine efficiency and emissions under typical car usage, the prototype has demonstrated fuel savings of 27 percent, and it reduces emissions by a similar amount. The next phase is to try to incorporate a prototype engine into a working vehicle, says Jackson.

Copyright Technology Review 2008.
 
The reason most plants are not configured this way is that the individual cost/benefit ratio is very low. The capital cost of converting existing systems is probably too high (except in special cases), but new plants could be configured this way:

http://www.theatlantic.com/doc/200805/recycled-steam.

by Lisa Margonelli
Waste Not

Forty years ago, the steel mills and factories south of Chicago were known for their sooty smokestacks, plumes of steam, and throngs of workers. Clean-air laws have since gotten rid of the smoke, and labor-productivity initiatives have eliminated most of the workers. What remains is the steam, billowing up into the sky day after day, just as it did a generation ago.

The U.S. economy wastes 55 percent of the energy it consumes, and while American companies have ruthlessly wrung out other forms of inefficiency, that figure hasn’t changed much in recent decades. The amount lost by electric utilities alone could power all of Japan.

A 2005 report by the Lawrence Berkeley National Laboratory found that U.S. industry could profitably recycle enough waste energy—including steam, furnace gases, heat, and pressure—to reduce the country’s fossil-fuel use (and greenhouse-gas emissions) by nearly a fifth. A 2007 study by the McKinsey Global Institute sounded largely the same note; it concluded that domestic industry could use 19 percent less energy than it does today—and make more money as a result.

Economists like to say that rational markets don’t “leave $100 bills on the ground,” but according to McKinsey’s figures, more than $50 billion floats into the air each year, unclaimed by American businesses. What’s more, the technologies required to save that money are, for the most part, not new or unproven or even particularly expensive. By and large, they’ve been around since the 19th century. The question is: Why aren’t we using them?

One of the few people who’s been making money from recycled steam is Tom Casten, the chairman of Recycled Energy Development. Casten, a former Eagle Scout and Marine, has railed against the waste of energy for 30 years; he says the mere sight of steam makes him sick. When Casten walks into an industrial plant, he told me, he immediately begins to reconfigure the pipes in his head, totting up potential energy savings. Steam, of course, can be cycled through a turbine to generate electricity. Heat, which in some industrial kilns reaches 7,000°F, can be used to produce more steam. Furnace exhaust, commonly disposed of in flares, can be mixed with oxygen to create the practical equivalent of natural gas. Even differences in steam pressure between one industrial process and another can be exploited, through clever placement of turbines, to produce extra watts of electricity.

By making use of its “junk energy,” an industrial plant can generate its own power and buy less from the grid. A case in point is the ArcelorMittal steel mill in East Chicago, Indiana, where a company called Primary Energy/EPCOR USA has been building on-site energy plants to capture heat and gases since 1996. Casten, Primary Energy’s CEO from 2003 to 2006, was involved in several projects that now sell cheap, clean power back to the mill. (interpolation: you can see this isn't quite as easy as made out)

As a result of Primary Energy’s projects, the mill has cut its purchases of coal-fired power by half, reduced carbon emissions by 1.3 million tons a year, and saved more than $100 million. In March, the plant won an EPA Energy Star award. Its utilities manager, Tom Riley, says he doesn’t foresee running out of profitable projects anytime soon. “You’d think you might,” he says, “but you can always find more … Energy efficiency is a big multiplier.”

Casten wants to help everyone see such possibilities, so he’s been combining EPA emissions figures with Google Earth images to let investors “peer” into smokestacks and visualize the wasted energy. Recycled Energy Development recently received $1.5 billion in venture funding, which should enable it to expand its reach greatly. Casten gives a whirlwind tour of the targets: natural-gas pipelines, he says, use nearly a tenth of the gas they carry to keep the fuel flowing. Capture some of the heat and pressure they lose, and the U.S. could take four coal-fired power plants offline (out of roughly 300). Another power plant could be switched off if energy were collected at the country’s 27 carbon-black plants, which make particles used in the manufacture of tires. And so on through facilities that make silicon, glass, ethanol, and orange juice, until, Casten hopes, he has throngs of competitors. “I always thought that if we were successful, people would emulate us and I’d be happy at the end of the day. I just didn’t think it would take 30 years.”

Yet in fact, Casten still has few competitors, and the improvements he’s made remain rare in American industry. With pressure growing to reduce greenhouse-gas emissions, the age of recycled steam may seem closer now than it has in the past, but because of a variety of cultural, financial, and—especially—regulatory barriers, its arrival is no sure thing.

The first barrier is obvious from a trip through ArcelorMittal’s four miles of interconnected pipes, wires, and buildings. Steel mills are noisy, hot, and smelly—all signs of enormous interdependent energy systems at work. In many cases, putting waste energy to use requires mixing the exhaust of one process with the intake of another, demanding coordination. But engineers have largely been trained to focus only on their own processes; many tend to resist changes that make those processes more complex. Whereas European and Japanese corporate cultures emphasize energy-saving as a strategy that enhances their competitiveness, U.S. companies generally do not. (DuPont and Dow, which have saved billions on energy costs in the past decade, are notable exceptions. ArcelorMittal’s ownership is European.)

In some industries, investments in energy efficiency also suffer because of the nature of the business cycle. When demand is strong, managers tend to invest first in new capacity; but when demand is weak, they withhold investment for fear that plants will be closed. The timing just never seems to work out. McKinsey found that three-quarters of American companies will not invest in efficiency upgrades that take just two years to pay for themselves. “You have to be humbled,” Matt Rogers, a director at McKinsey, told me, “that with a creative market economy, we aren’t getting there,” even with high oil prices.

Some of these problems may fade if energy costs remain high. But industry’s inertia is reinforced by regulation. The Clean Air Act has succeeded spectacularly in reducing some forms of air pollution, but perversely, it has chilled efforts to reuse energy: because many of these efforts involve tinkering with industrial exhaust systems, they can trigger a federal or local review of the plant, opening a can of worms some plant managers would rather keep closed.

Much more problematic are the regulations surrounding utilities. Several waves of deregulation have resulted in a hodgepodge of rules without providing full competition among power generators. Though it’s cheaper and cleaner to produce power at Casten’s projects than to build new coal-fired capacity, many industrial plants cannot themselves use all the electricity they could produce: they can’t profit from aggressive energy recycling unless they can sell the electricity to other consumers. Yet byzantine regulations make that difficult, stifling many independent energy recyclers. Some of these competitive disadvantages have been addressed in the latest energy bill, but many remain.

Ultimately, making better use of energy will require revamping our operation of the electrical grid itself, an undertaking considerably more complicated than, say, creating a carbon tax. For the better part of a century, we’ve gotten electricity from large, central generators, which waste nearly 70 percent of the energy they burn. They face little competition and are allowed to simply pass energy costs on to their customers. Distributing generators across the grid would reduce waste, improve reliability, and provide at least some competition.

Opening the grid to competition is one of the more important steps to take if we’re serious about reducing fossil-fuel use and carbon emissions, yet no one’s talking about doing that. Democratic legislators are nervous about creating incentives for cleaner, cheaper generation that may also benefit nuclear power. Neither party wants to do the dirty work of shutting down old, wasteful generators. And of course the Enron debacle looms over everything.

Technocratic changes to the grid and to industrial plants don’t easily capture the imagination. Recycling industrial energy is a solution that looks, well, gray, not green. Steel plants, coated with rust, grime, and a century’s worth of effluvia, do not make for inspiring photos. Yet Casten, pointing to the 16 heat-recycling contraptions that sit on top of the coke ovens at the East Chicago steel plant, notes that in 2004 they produced as much clean energy as all the grid-connected solar panels in the world. Green power may pay great dividends years from now. Gray power, if we would embrace it, is a realistic goal for today.
 
New oil find in Brazil.

From Stratfor.

The Geopolitical Diary:  Blue-Skying Brazil

Brazil is a rising power politically, economically and militarily. Not only is it South America’s largest country in terms of population, economic heft, military strength and land area, its geopolitical power is expanding while most of its traditional competitors — namely Argentina and Venezuela — are contracting.

But while Brazil is almost certain in the next few years to evolve into a regional hegemon — a step up from the region’s most powerful state — it is still difficult to see Brazil playing a leading role on the world stage. South America’s geography is too fractured for any power to control the whole space, and the continent is too remote from the world’s power centers — 7,000 miles from Buenos Aires to Brussels, more than 10,000 miles from Santiago to Singapore — for any of its powers ever to be a major global player.

Unless, that is, something changes. And for a few hours on Monday, it appeared that that something had indeed changed.

Initial reports from the Brazilian government asserted that a new oil find in the Carioca offshore block contains 33 billion barrels of crude. Within a few hours, however, an announcement that seemed to have global implications fizzled. By nightfall Petroleo Brasileiro, the state-influenced (and quite competent) national oil firm, had formally denied that test drilling had even reached the depth necessary to confirm or deny the presence of oil — much less a mammoth find.

Offshore region rich in oil
Brazil only began exploring the region in question in 2007, and it already has generated probable finds of at least 13 billion barrels of oil equivalent. Many, many more discoveries not only are possible, they are likely. What has been found to date already has doubled Brazil’s reserves.

This crude will not come online cheaply or quickly, however, and much uncertainty remains in these heady early days of exploration in Brazil’s ultradeep. But with potential discoveries of this size it is worth exploring a possible future.

Brazil has recently become self-sufficient in oil production — not counting the recent (and likely future) finds. And that got our analytical team thinking.

‘What if’ exercise
What would a world look like with a Latin American Saudi Arabia? How would things change on the global scene? At Stratfor we undertake what we term “blue sky” exercises from time to time, albeit typically in a much more compact geography and on a much shorter time line. These exercises help us think outside the tactical minutiae of day-to-day events, and prevent us from becoming too wedded to our own predictions. It is not every day that something happens that can change global economic and political interactions on such a grand scale.

So rather than tightly edit our analysts’ responses to this question, here are some of their responses in the raw:

Should Brazil become a significant oil producer, global interest in Latin America will increase in proportion — not only from the United States, but also China, Russia, Europe and others. Competition for access to — and potentially control of — the resources, for security of the shipping routes, and for influence over the Brazilian government and energy companies also would rise. A resource-powerful Brazil, coupled with China’s labor, India’s tech and labor pool, and Russia’s energy and arms could also revive the BRIC (Brazil, Russia, India, and China) concept, perhaps making it a more viable bloc of formerly second-tier players, and bringing some counterbalance to U.S. global hegemony.

Brazil is too far away from energy consumers like India and China to tap without great cost. The United States is a much closer consumer. In time this would lessen U.S. energy dependence on the Middle East, especially Saudi Arabia — leaving that region for other energy consumers, like the aforementioned India and China. Such a shift largely would regionalize energy routes, leaving the United States looking at its own hemisphere for energy supplies, Europe to the former Soviet Union, and Asia to the Middle East (leaving Africa as a swing player). Though this may look like a more peaceable reality, it would be far from it, and could actually lead to more instability as no power would have much of an interest in stabilizing energy supplies going to other regions.

Canada’s tar sands hold anywhere from 800 billion to 1.2 trillion barrels of oil. Oil shale deposits in the U.S. Rocky Mountains are estimated at around 800 billion barrels. The success of tapping these deposits is uncertain, and technological and economic factors must still play out. But in 15 to 20 years, substantial oil flows from Brazil, coupled with these potential new North American sources (though more difficult and expensive to extract) and only moderate efficiency gains, could guarantee almost complete energy independence for the entire Western Hemisphere.
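To put those reserve figures in perspective, a rough calculation. The consumption rate used here is an assumed outside figure (U.S. oil consumption circa 2008 was commonly put at about 20 million barrels per day), not a number from the piece:

```python
# Rough scale of the North American deposits mentioned above.
tar_sands_low, tar_sands_high = 800e9, 1.2e12   # barrels, Canadian tar sands
oil_shale = 800e9                               # barrels, U.S. Rocky Mountain shale

total_low = tar_sands_low + oil_shale           # 1.6 trillion barrels
total_high = tar_sands_high + oil_shale         # 2.0 trillion barrels

# Assumed figure (not from the piece): U.S. oil consumption circa 2008
# was roughly 20 million barrels per day.
us_bbl_per_year = 20e6 * 365

years_low = total_low / us_bbl_per_year
years_high = total_high / us_bbl_per_year

print(f"Implied supply at 2008 U.S. consumption: "
      f"{years_low:.0f} to {years_high:.0f} years")
```

Even at the low end, and ignoring the (very real) extraction difficulties, the deposits amount to a couple of centuries of U.S. consumption at current rates.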

A legitimate and proximate alternative oil source means the primary geopolitical motivation for immense U.S. investment in military operations in the Middle East begins to slowly evaporate. Though mastery of the world’s oceans remains a core geopolitical imperative for Washington, the disproportionate focus of the U.S. Navy on the Persian Gulf and the maintenance of the Strait of Hormuz becomes far less critical. Suddenly freeing up the energy and capability the Pentagon devotes there would lead to a very robust and flexible — but far more evenly distributed — global U.S. naval presence. This could also be just the opening for the Navy, which in many ways has failed to re-evaluate its post-Cold War stance, to fundamentally remake itself for the 21st century.

The region with the most to worry about from this development is the Middle East. From Washington’s view, getting oil from a relatively friendly and stable country to its south is far, far preferable than dealing with the chaos of the distant Middle East. Saudi Arabia and the other major Gulf powers will become distant not only from their biggest energy customer, but also from their biggest security guarantor. With a diminished U.S. interest in the Middle East, regional fault lines are more likely to erupt, spelling more instability for this already largely volatile region. Israel in particular has much to lose as it sees its regional security framework — which is built around having the United States deeply involved in the Middle East — weaken, and its alliance with the United States strained as a result.
 
Fuel cell technology advances some more. Liquid hydrocarbon fuels have high energy density and are relatively stable and easy to handle under normal conditions (what do you think you are doing at the self-serve gas station?), and alcohol-based fuels have similar advantages. The energy density of a high-energy battery isn't that great, but so long as fuel is flowing through the fuel cell, you have a current (and the energy density of the fuel is what counts).

http://www.technologyreview.com/Energy/20813/page1/

More-Powerful Fuel Cells
A cheap polymer material increases the power output of methanol fuel cells by 50 percent.
By Katherine Bourzac

Methanol fuel cells have the potential to replace batteries as a lightweight power source for portable electronic devices. But fuel-cell materials are expensive, and fuel cells that consume methanol are inefficient. In particular, the membranes used in methanol fuel cells are expensive and waste fuel. Now researchers at MIT have developed a cheap membrane material that increases the power output of methanol fuel cells by 50 percent.

The energy density of a methanol fuel cell "compares to the best high-energy-density batteries," says Robert Savinell, a chemical engineer at Case Western Reserve University, in Cleveland, who was not involved in the research. And because they weigh less than batteries, methanol fuel cells are a promising power source for portable electronics. For the military, tanks of methanol for refilling fuel cells would be lighter than extra batteries that would have to be carried on long missions. The energy density of methanol fuel cells could also be an advantage in portable consumer electronic devices such as laptops and iPods. But commercialization of methanol fuel cells has been limited because of their price: they require a thick internal membrane made of an expensive polymer. And even with this expensive material, they use fuel inefficiently.

To overcome these limitations, Paula Hammond, a chemical engineer at MIT, has made a fuel-cell membrane out of layers of polymers whose electrochemical properties can be precisely tuned to prevent fuel waste. Indeed, says Savinell, Hammond has solved a problem that chemists have been trying to overcome for years.

Methanol fuel cells have two compartments separated by a membrane. On one side, methanol is stripped of protons and electrons. The protons are carried through the membrane to the other compartment, where they are combined with oxygen to form water. The electrons, which can't cross the membrane, are forced into an external current that can be used to power electronic devices.

Because water is being created inside the fuel cell, the membrane is wet. Methanol, which is very soluble in water, is absorbed by conventional fuel-cell membranes and can cross over to the other side. This wastes fuel and makes the cathode, the oxidizing end of the cell, work harder. "Everyone's concerned about methanol crossover," says Merlin Bruening, a chemist at Michigan State University. Researchers have tried many different approaches to improving methanol fuel-cell membranes, but all have entailed trade-offs. "The challenge is to maintain stability and conductivity [to protons]," while decreasing methanol crossover, says Bruening.

Hammond synthesizes fuel-cell membranes using a technique called layer-by-layer assembly. She starts with a very thin membrane of the polymer used in conventional fuel cells. She dips it into a water solution of a positively charged polymer, then into one of a negatively charged polymer; the process is repeated until many layers are built up. The result, explains Hammond, is "a polymer backbone that resists the permeation of methanol" while still conducting protons.

The resulting 100-nanometer-thick membrane conducts two orders of magnitude less methanol than conventional, 50-micrometer-thick membranes do. And fuel cells incorporating it have a greater power output.
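One way to appreciate that result: to a first approximation, crossover through a membrane scales as the material's permeability divided by its thickness, so a thinner membrane of the same material would actually leak more methanol. A rough sketch under that simplifying assumption:

```python
# Crossover ~ permeability / thickness: a thinner membrane of the same
# material would leak *more* methanol, which makes the result striking.
conventional_nm = 50_000   # 50-micrometer conventional membrane
layered_nm = 100           # 100-nanometer layer-by-layer membrane

thickness_ratio = conventional_nm / layered_nm  # new membrane is 500x thinner
crossover_reduction = 100                       # "two orders of magnitude" less

# Implied drop in the material's intrinsic methanol permeability:
permeability_gain = thickness_ratio * crossover_reduction

print(f"Thinner by:  {thickness_ratio:.0f}x")
print(f"Implied permeability improvement: ~{permeability_gain:,.0f}x")
```

In other words, the layered material blocks methanol on the order of tens of thousands of times better per unit thickness than the conventional polymer, which is why such a thin film can still cut crossover a hundredfold.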

Hammond says that methanol is a better candidate to power portable fuel cells than hydrogen because it's a liquid and not nearly as flammable. "It's a dense power source that's safe to carry around," she says.

Savinell says that Hammond's work could have applications beyond methanol fuel cells. By picking the right polymers and varying assembly conditions including pH, says Savinell, "you can customize and optimize [the films] for any application." Layer-by-layer films might be used to improve the conductivity of hydrogen fuel-cell membranes and to increase the efficiency of ethanol fuel cells. Ethanol is safer than methanol but has similar drawbacks as a feedstock for fuel cells: ethanol seeps across the polymer membranes.

"The real promise is the power of the technology to make new materials," says Savinell. Hammond is now working on new fuel-cell membranes that contain none of the expensive conventional polymer.

Copyright Technology Review 2008.
 
The current energy crisis in the US is the sole fault of the Democrat-led Congress. Gas prices have increased dramatically since 2006. Congress refuses to allow offshore drilling and drilling in Alaska. Finally, just last week the Senate blocked development of Colorado's immense oil shale reserves, thought to hold 2-3 trillion barrels. They want to sue OPEC. They want alternative energy that isn't currently available. They have blocked nuclear power, which is safe and clean energy. If we simply built more nuclear plants for power, existing oil supplies would force a dramatic decline in oil prices. Until the politicians show some backbone towards the green lobby, things will only get worse, in the US at least.
 
http://www.powerlineblog.com/archives2/2008/05/020589.php

With 94% of the world's oil supply locked up by foreign governments, most of which are hostile to the United States, the relatively puny American oil companies do not have access to enough crude oil to significantly affect the market and help bring prices down. Thus, Exxon Mobil, a small oil company, buys 90% of the crude oil that it refines for the U.S. market from the big players, i.e., mostly hostile foreign governments. The price at the U.S. pump is rising because the price the big oil companies charge Exxon Mobil and the other small American companies for crude oil is going up.

This is obviously a tough situation for the American consumer. The irony is that it doesn't have to be that way. The United States--unlike, say, France--actually has vast petroleum reserves. It would be possible for American oil companies to develop those reserves, play a far bigger role in international markets, and deliver gas at the pump to American consumers at a much lower price, while creating many thousands of jobs for Americans. This would be infinitely preferable to shipping endless billions of dollars to Saudi Arabia, Russia and Venezuela.
 