Hurricane winds can rupture undersea pipes

WASHINGTON (UPI) — U.S. researchers say they’ve determined undersea forces produced by strong hurricanes are powerful enough to rupture underwater oil pipelines.

The scientists at the U.S. Naval Research Laboratory said the pipelines could crack or rupture unless they are buried or their supporting foundations are built to withstand hurricane-induced currents.

“Major oil leaks from damaged pipelines could have irreversible impacts on the ocean environment,” the researchers said, noting a hurricane’s winds can raise waves 66 feet or more above the ocean surface.

Based on unique measurements taken during a powerful hurricane, the researchers said their study is the first to show hurricanes propel underwater currents with enough force to dig up the seabed, potentially creating underwater mudslides and damaging pipes or other equipment resting on the bottom.

They said they’re not sure what forces underwater oil pipelines are built to withstand. However, “Hurricane stress is quite large, so the oil industry better pay attention,” said Hemantha Wijesekera, who led the study.

The findings are to appear in the June 10 issue of the journal Geophysical Research Letters.

Sourced and published by Henry Sapiecha

Coupled Water Tower/Wind Turbine Controller
Andras Tanczos
Helsinki, Finland


A coupled water tower/wind turbine controller stores wind energy in the water towers of the drinking water network. When winds are strong, the extra electrical energy generated by the wind turbine is used to pump water into the water tower. When there is no wind, this energy can be released through a hydro-turbine as the water flows back to the wells. The water tower’s pump and the hydro-turbine are used to control the water level in the reservoir, and the electricity from the wind turbine is used either for pumping the water or for supplying the electrical grid. The controller can also be installed on existing water towers and on water tanks placed on top of buildings.
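
The storage logic described above can be sketched as a simple control loop. This is an illustrative sketch only, not the inventor’s actual controller: the reservoir size, head height, efficiencies and one-second time step are all assumptions.

```python
# Illustrative control loop for the scheme described above: store surplus
# wind energy as pumped water, recover it through the hydro-turbine when
# the wind dies. Reservoir size, head, efficiencies and the one-second
# time step are all assumptions, not figures from the invention.

RESERVOIR_CAPACITY_M3 = 500.0     # assumed tower reservoir volume
HEAD_M = 40.0                     # assumed height of the water column
RHO_G = 1000.0 * 9.81             # water density (kg/m^3) * gravity (m/s^2)

def control_step(wind_power_w, grid_demand_w, reservoir_m3,
                 pump_eff=0.80, turbine_eff=0.85):
    """One-second control step. Returns (new_reservoir_m3, power_to_grid_w)."""
    surplus_w = wind_power_w - grid_demand_w
    if surplus_w > 0 and reservoir_m3 < RESERVOIR_CAPACITY_M3:
        # Strong wind: demand is met and the surplus pumps water up the tower.
        pumped_m3 = surplus_w * pump_eff / (RHO_G * HEAD_M)
        pumped_m3 = min(pumped_m3, RESERVOIR_CAPACITY_M3 - reservoir_m3)
        return reservoir_m3 + pumped_m3, grid_demand_w
    if surplus_w < 0 and reservoir_m3 > 0:
        # No (or weak) wind: release stored water through the hydro-turbine.
        shortfall_w = -surplus_w
        released_m3 = min(shortfall_w / (RHO_G * HEAD_M * turbine_eff),
                          reservoir_m3)
        recovered_w = released_m3 * RHO_G * HEAD_M * turbine_eff
        return reservoir_m3 - released_m3, wind_power_w + recovered_w
    # Tank full (or empty): pass through whatever wind power covers demand.
    return reservoir_m3, min(wind_power_w, grid_demand_w)
```

In the strong-wind branch the grid still sees its full demand while the excess pumps water; in the calm branch the turbine tops up the shortfall until the reservoir empties.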

Sourced and published by Henry Sapiecha 8th Sept 2009


Claims to save hugely on heating bills




When the combustion process is improved, more value is gained from the wood used. Excessive smoke is unburnt fuel. SmartBurn enables this fuel (smoke) to be burnt in the fire instead of being released into the atmosphere. SmartBurn reduces carbon emissions (as soot and sap).


Each SmartBurn prevents approximately 15 kg of smoke haze and particulate emissions from entering the atmosphere.

SmartBurn contains a mixture of non-toxic natural ingredients and for best results SmartBurn should be replaced every 3 months.

SmartBurn is also effective in lounge open fireplaces and kitchen stoves.

SmartBurn is proudly Australian Invented, Manufactured and Owned.

This exciting technology has been Internationally Patented and the name SmartBurn has been Trademarked.


Sourced and published by Henry Sapiecha 29th May 2009


Scientists find source of carbon lava


ALBUQUERQUE (UPI) — U.S. and French scientists say they have discovered the origin of carbon-based lavas erupting from a Tanzanian volcano.

The researchers, led by the University of New Mexico, analyzed gas samples collected from inside the active crater of Tanzania’s Oldoinyo Lengai volcano — the only volcano that is actively producing carbon-based lavas. The geochemical analyses revealed a very small degree of partial melting of minerals in the Earth’s upper mantle is the source of the rare carbon-derived lava.

Carbon-based lavas, known as carbonatites, occur in the geologic record, but the Oldoinyo Lengai volcano, located in the East African Rift in northern Tanzania, is the only place on Earth where they are actively erupting. The researchers said the lava expelled from the volcano is highly unusual in that it contains almost no silica and more than 50 percent carbonate minerals. Typically, lavas contain high levels of silica, which raises their melting point above 1,652 degrees Fahrenheit. The lavas of the Oldoinyo Lengai volcano erupt as a liquid at approximately 1,004 degrees Fahrenheit.
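
The Fahrenheit figures quoted above convert to round Celsius values (900 °C and 540 °C), which suggests the measurements were originally reported in Celsius. A quick conversion check:

```python
# The Fahrenheit figures in the text convert to round Celsius values,
# suggesting the measurements were originally reported in Celsius.
def f_to_c(deg_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (deg_f - 32.0) * 5.0 / 9.0

silica_melting_c = f_to_c(1652)   # 900.0 C: typical silica-rich lava melting point
lengai_eruption_c = f_to_c(1004)  # 540.0 C: Oldoinyo Lengai eruption temperature
```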

The research by the scientists from the University of New Mexico, the Scripps Institution of Oceanography at the University of California-San Diego and the Research Center for Petrographics and Geochemicals in Nancy, France, appears in the journal Nature.

Copyright 2009 by United Press International

Sourced and published by Henry Sapiecha 18th May 2009


Study shows cooling lengthens tool’s life


WEST LAFAYETTE, Ind. (UPI) — A Purdue University researcher says he’s discovered that cooling cutting tools can give them a longer life and sharper cutting capability.

Professor Rado Gazo found cryogenically treating router bits, as well as cooling them while they cut, increased the tools’ lives — in some cases doubling them.

He said cryogenically treating the bits to harden them, blowing cooled air on them during use — or doing both — improved the life of the tools and kept cuts clean longer.

Cryogenic treating requires cooling the tools to minus 300 degrees Fahrenheit and then bringing them back to ambient temperature.

Gazo said he used a router bit that had not been cryogenically frozen or exposed to cool air during use as a control and cut more than 100 miles of tool path in a medium-density fiberboard.

Bits that were not frozen, but were subjected to 40-degree and 20-degree Fahrenheit air during use, had as much as a 25 percent increase in tool life. A bit cryogenically frozen, but not cooled during use, showed an increase in tool life of about 65 percent over the control.
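
For a rough sense of scale, taking the control bit’s “more than 100 miles” as a baseline of exactly 100 miles (an assumption for illustration), the reported percentages work out as follows:

```python
# Rough arithmetic on the reported gains, assuming the control bit's
# "more than 100 miles" of tool path is taken as exactly 100 miles.
baseline_miles = 100.0
cooled_air_miles = baseline_miles * 1.25   # up to 25% longer life (cool air only)
cryo_only_miles = baseline_miles * 1.65    # about 65% longer life (cryo only)
combined_miles = baseline_miles * 2.00     # "in some cases doubling" (both treatments)
```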

The research appears in the early online edition of the Journal of Materials Processing Technology.

Copyright 2009 by United Press International

Sourced and published by Henry Sapiecha 9th May 2009

Robots Meet Reality

Andy Greenberg, 11.08.07, 6:00 AM ET


In Pictures: Putting Robots To Work

Drive all by themselves. But TerraMax’s autonomous driving technology could save lives by doing more mundane chores, including automatically following another car in a convoy or providing a warning system aimed at preventing a human driver from making dangerous mistakes.


TerraMax is the largest–and easily the most terrifying–of the 11 robotic vehicles that participated in the final race of DARPA’s Urban Challenge in early November, a milestone event that showcased the robotic cars’ ability to follow complex routes and negotiate traffic completely under their own control over a 60-mile urban course. (See: “Viva La Robot Revolution!”) The race, sponsored by the Defense Advanced Research Projects Agency, the Pentagon’s research wing, offered $3.5 million in prizes designed to jump-start the robotics industry and help fulfill Congress’s ambitious mandate that one-third of all military vehicles be unmanned by 2015.

But the race also underscored how far away that goal still is: At one point, two robotic SUVs collided. Another mistook a driveway for a road. TerraMax itself came within inches of plowing into a concrete pillar and had to be taken off the course.

Taken together, all of these imperfections prove to many roboticists that the dream of a totally driverless fleet of military vehicles is still too complex–both technically and politically–to be more than science fiction. But that doesn’t mean it’s a waste of time. What DARPA’s race really demonstrated, they argue, is that robotic driving technology is ready to work together with human drivers–not to replace them.

“This was a fun event, but it clearly shows that the world is not ready for autonomous driving,” says Sebastian Thrun, the head of the Stanford team whose robotic Passat, “Junior,” took the competition’s second-place prize. In the near term, Thrun says, these autonomous driving technologies should be put to work in warning systems and automatic stopping controls, devices that he says could reduce the 95% of vehicular deaths that are caused by human error. Thrun points out that more than 42,000 automobile casualties occur in the United States every year. “It’s a number that keeps me up at night,” Thrun says. “If we could cut that in half, it would be an incredible achievement.”

The key to applying imperfect robotic technology to present problems, says autonomous-driving researcher Jay Gowdy, is to combine humans’ ability to understand their surroundings with a robot’s ability to measure and react consistently.

“We’re not building autonomous chauffeurs,” says Gowdy. “We’re building robotic horses.” Like a horse, Gowdy says, a robotic car of the near future might control much of the moment-by-moment decision-making that goes into getting from point A to point B. But if the robotic car were “spooked,” he says, a human driver could take control.


That kind of robotic integration is well on its way. Gowdy works for Natick, Mass.-based Cognex (Nasdaq: CGNX), a company that has developed lane-departure warning systems that “watch” the lane lines on the road. Installed in trucks, those sensors can alert a sleepy driver who is weaving out of his or her lane.

Adaptive cruise control, pioneered by companies like Mercedes-Benz and Lexus, uses the same laser and radar scanners installed on DARPA’s robotic cars to maintain a set distance from other vehicles on a highway. Sensor developers like IBEO and its parent company, SICK, in Waldkirch, Germany, are working on electronic eyes that could one day help cars spot–and so avoid–pedestrians, animals or other obstacles.

Off-road, where traffic doesn’t complicate matters, robotic driving is even more practical. Caterpillar (NYSE: CAT), which sponsored the three top teams in this year’s DARPA challenge, now equips some of its bulldozers with a combination of GPS and laser scanners to allow for semi-autonomous earth-moving. The driver has merely to guide the vehicle back and forth, and the blade robotically positions itself to create a perfectly flat surface.
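
The blade-leveling behavior described above can be sketched as a simple feedback loop. This is an illustrative proportional controller under stated assumptions, not Caterpillar’s actual system; the function name, gain and step limit are invented for the example.

```python
# Illustrative proportional controller for holding a dozer blade at a
# target grade using a GPS/laser elevation measurement. An assumed
# design for illustration, not Caterpillar's actual control system.
def blade_correction(measured_elev_m, target_elev_m,
                     gain=0.5, max_step_m=0.05):
    """Return the blade height adjustment (metres) for this control cycle."""
    error_m = target_elev_m - measured_elev_m
    step_m = gain * error_m
    # Clamp the adjustment so the blade moves smoothly, not in jumps.
    return max(-max_step_m, min(max_step_m, step_m))
```

Each cycle the blade is nudged a fraction of the remaining error toward the target grade, with the clamp preventing abrupt cuts when the terrain measurement jumps.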

Red Whittaker, the head of Carnegie Mellon’s Tartan Racing Team, whose robotic Chevy Tahoe called “Boss” took the top prize of $2 million in the most recent DARPA race, cites another off-road application: farming. Whittaker, who farms about 300 acres of land in his spare time, points out that Trimble, the company that provided global-positioning systems for many of the robots in the race, also sells a system called “EZ Steer,” a small steering-wheel attachment that robotically guides tractors. “Farmland goes for miles–you want straight, even, careful rows. You don’t want to compact the land you’re driving on, so you drive in the same tracks year after year after year,” he says. “A good guidance system creates much higher quality and higher performance.”


If all of these developing technologies mean that DARPA’s dollars are funding commercial applications more than military advances, it wouldn’t be the first time, says Stanford’s Sebastian Thrun. The Internet, Thrun points out, was also originally sponsored by DARPA, with technology built by university and industry teams. “Did the military intend to foster porn-surfing on the Web?” he asks. “I doubt it.”

Whether DARPA’s autonomous driving initiative spurs more military or civilian spin-offs isn’t as important as simply making driving safer, Thrun says.

“A life saved is a life saved,” he says. “In these moments of disruptive technology, everyone benefits.”


Sourced and published by Henry Sapiecha 1st May 2009

How To Save The Biodiesel Industry


Government dithering and high commodity prices make for a tough environment.


BURLINGAME, Calif.–Can the biodiesel industry be saved? It’s remotely possible–but not unless the government steps in to jump-start the besieged market.

Biodiesel, a low-carbon fuel usually made with soy, palm or canola oil, first grabbed the spotlight a few years ago. That was when Congress started promoting the green fuel as a replacement for traditional diesel. Private-equity firms started pumping hundreds of millions of dollars into companies like Seattle’s Imperium Renewables and Green Earth Fuels, of Houston, hoping to get in on the ground floor of a nascent market.

Federal government mandates and tax breaks, driven by the broader goal of fighting pollution and cutting reliance on foreign oil, were supposed to create a mass market, even though biodiesel was often more expensive than regular diesel fuel.

It hasn’t happened. Starting in mid-2007, prices of the canola and soy oils used to make biodiesel soared. That pushed up the cost of the green fuel and wounded producers’ bottom lines. With oil peaking at $147 a barrel last summer, biodiesel still made economic sense for some customers, since regular diesel prices climbed to an average $4.77 a gallon. Biodiesel didn’t look bad by comparison.


But then petroleum prices tanked. That widened the price gap and made the green option uneconomical for even the most die-hard environmentalists. Commodity prices have since come down, but not enough to bridge the gap. The recession has damped demand for energy overall and made it nearly impossible for fledgling clean-fuel ventures, including biodiesel makers, to get credit to expand.

“The market conditions are very, very tough right now,” says Joe Jobe, head of the National Biodiesel Board in Jefferson City, Mo. Of the nation’s 176 biodiesel operators, “it’s very difficult to say how many of them are still operating.”

The industry’s woes illustrate the hazards of building a business around the prices of two volatile, and often unrelated, commodities–in this case, raw vegetable oil and petroleum. They also show that not all green fuels are created equal. Lots of environmentalists have hopped off the biodiesel bandwagon, charging that increased demand for commodities like palm oil will lead to deforestation and, in turn, even more greenhouse-gas emissions from countries like Malaysia and Indonesia.

Sourced and published by Henry Sapiecha 16th April 2009

Electricity In The Air

Wireless power technologies are moving closer to becoming viable options.

This year probably won’t be the tipping point for wireless electricity. But judging from all the new techniques and applications of this awe-inspiring technology, getting power through the airwaves could soon be viable.

Fulton Innovations showcased blenders that whir wirelessly and laptops that power up without a battery at the Consumer Electronics Show (CES) earlier this month. The devices are all powered by electromagnetic coils built into the charging surface, and there’s not a plug in sight.

Fulton’s wireless electricity technology is called eCoupled, and the company hopes it can be used across a wide range of consumer devices. Fulton was one of half a dozen companies that wowed consumers at CES.

In Pictures: 10 Wireless Electricity Technologies

The eCoupled system uses a wireless powering technique called “close proximity coupling,” which relies on circuit boards and coils to communicate and transmit energy using magnetic fields. The technology is efficient but works only at close range: typically, the coils must be bigger than the distance the energy needs to travel. What it lacks in distance, it makes up for in intelligence.

In conjunction with the Wireless Power Consortium, Fulton, a subsidiary of Amway, has developed a standard that can send digital messages back and forth using the same magnetic field used to power devices. These messages are used to distinguish devices that can and can’t be charged wirelessly, and to relay information like power requirements or how much battery is left in a device.
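
A toy sketch of that kind of in-band negotiation follows. The message fields and flow here are invented for illustration; they are not the actual eCoupled or Wireless Power Consortium protocol.

```python
# Toy model of in-band negotiation between a charging pad and a device:
# small digital messages ride the same magnetic link that carries power.
# Message fields and flow are invented for illustration only.

def device_reply(ping):
    """A chargeable device answers the pad's identification ping."""
    if ping != "IDENTIFY":
        return None
    return {"chargeable": True, "power_req_w": 5.0, "battery_pct": 40}

def pad_negotiate(reply, max_power_w=10.0):
    """The pad delivers power only to devices that identify themselves
    as chargeable, capped at the pad's own maximum output."""
    if not reply or not reply.get("chargeable"):
        return 0.0   # e.g. keys or coins on the pad draw no power
    return min(reply["power_req_w"], max_power_w)
```

The point of the exchange is the filtering step: an object that cannot answer the ping gets no power, and one that can is served only what it asks for, which is how the pad distinguishes a phone from a coin.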


Using this technique, an industrial van parked outside the Fulton booth at CES charged a set of power tools from within its carrying case. The van was tricked out by Leggett & Platt (NYSE: LEG)–a diversified manufacturing company based in Carthage, Mo., and an eCoupled licensee–and is designed to solve its customers’ biggest headache: arriving at the job site with a dead set of tools. Fulton, which teamed up with Bosch to design the setup, already has test vehicles rolling around in the field and plans to sell them to utility and other industrial companies by the end of the year.

Texas Instruments (NYSE: TXN) announced last November that it will manufacture a chip set that will reduce the manufacturing cost of integrating eCoupled wireless power into consumer electronic devices.

Sourced and published by Henry Sapiecha 16th April 2009

Fiddling With The Earth’s Climate




Scientists, including Obama’s science advisor, get tied in knots over geoengineering.

Oil and gas are so deliciously tempting that humans are having no success in slowing down global warming the way scientists agree we should, by going easy at the fossil fuel buffet.

So like surgeons who use liposuction to deal with obesity, scientists are considering ways to deal with the consequences of our unhealthy carbon diet. They are thinking about blowing soot into the stratosphere, hanging sunshades in space and sprinkling the oceans with fertilizer to create blooms of carbon-sucking phytoplankton.

These approaches are aimed at cooling the earth either by letting less sunlight in or by letting more heat bounce back to space through the removal of heat-trapping gases like carbon dioxide. The big idea–fighting or reversing atmospheric changes with large-scale tinkering with the earth–is called geoengineering, and it’s tying scientists in knots.

President Obama’s science advisor, John Holdren, got twisted up himself last week. In his first interview since he was appointed, he mentioned to the Associated Press that he and the administration had discussed geoengineering approaches. Holdren later had to write an e-mail clarifying his position in response to fears that he and the administration were planning something specific. They aren’t.

“I said that the approaches that have been surfaced so far seem problematic in terms of both efficacy and side effects, but we have to look at the possibilities and understand them because if we get desperate enough it will be considered,” Holdren wrote.

This highlights why geoengineering is such an extraordinarily touchy scientific subject and why there is such deep ambivalence in the scientific community about it. Almost no one thinks that humans should be trying to change the atmosphere on a global scale. But then again, aren’t we already doing that by removing carbon from the ground in the form of fossil fuels and depositing it in the atmosphere as carbon dioxide on a massive scale? And what if we don’t solve the problem in time?



What complicates things is that the scientists who are most concerned with the pace of global warming and the destruction that might ensue are the ones who are forcing themselves to think about radical solutions. It terrifies them because they know better than anyone that the climate is massively complex and that unintended consequences lurk everywhere.

Nobel laureate Paul Crutzen, best known for his work on ozone depletion, has advanced the idea of injecting sulfur particles into the atmosphere to reflect sunlight away from earth. James Lovelock, a hero to early environmentalists who proposed the Gaia hypothesis, has advocated placing long, vertical wave-driven pipes in the ocean that would pump nutrient-rich water to the surface to fertilize algae that would consume carbon dioxide.

Sourced and published by Henry Sapiecha 16th April 2009

Sugar-Based Biofuels



Madison, Wis.-based Virent Energy Systems has a low-temperature, low-pressure catalytic process for turning carbohydrates (sugars) into gasoline, diesel and other fuels. Its 70 employees now make a gallon or so daily. Targeting gasoline as its first fuel, Virent hopes within five years to raise that production to 10 million to 15 million gallons annually. Virent has pulled in more than $30 million in venture funding and has strategic relationships with the likes of Cargill, Honda Motor and Royal Dutch Shell.

Sourced and published by Henry Sapiecha 31st March 2009