Scientists have unveiled a new initiative, dubbed the Institute for Advanced Architecture, to lay the groundwork for a supercomputer that would be more than 1,000 times faster than any current offering.
Commercial supercomputer makers have recently begun to flirt with petaflop performance, meaning computers capable of completing 1,000 trillion floating-point operations (flops) per second. The Sandia and Oak Ridge national lab scientists aim to leapfrog that benchmark by three orders of magnitude and are targeting one million trillion operations per second, known as exascale computing.
(Exa is the metric prefix for quintillion, or 10^18.)
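To put those prefixes in perspective, here is a rough back-of-the-envelope comparison in Python. The speeds and the workload size are illustrative round numbers, not benchmarks of any particular system.

```python
# Rough scale comparison: teraflop vs. petaflop vs. exaflop machines.
# All figures are illustrative round numbers, not measured benchmarks.
TERAFLOP = 1e12   # 1 trillion floating-point operations per second
PETAFLOP = 1e15   # 1,000 trillion ops/s -- the "petaflop" mark
EXAFLOP = 1e18    # 1,000,000 trillion ops/s -- the exascale target

workload = 1e21   # a hypothetical simulation needing 10^21 operations

for name, speed in [("teraflop", TERAFLOP), ("petaflop", PETAFLOP), ("exaflop", EXAFLOP)]:
    seconds = workload / speed
    print(f"{name:>8} machine: {seconds:,.0f} s (~{seconds / 86400:,.1f} days)")

# The same job takes roughly 32 years at a teraflop, 12 days at a
# petaflop, and under 17 minutes at an exaflop.
```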
"Both the [Department of Energy's] Office of Science and the National Nuclear Security Administration have identified exascale computing as a critical need in roughly the 2018 timeframe," said Sudip Dosanjh, the project's head. "We certainly think that there is a national competitiveness issue."
Ultrafast computers are integral to simulating complex systems, like the Earth's climate, nuclear warhead explosions or the protein interactions inside cells. They continue to progress thanks to the well-known -- though often questioned -- Moore's Law, which has allowed chip makers to pack roughly twice as many transistors into the same amount of space about every two years. More transistors have meant more so-called flops, a common measure of computing speed. Ten years ago, Sandia's ASCI Red became the first teraflop computer, and in December 2000, Wired called 100-teraflop performance "unheard of."
Now, though, new challenges have presented themselves. The researchers say that moving data from the supercomputer's thousands of processors into its memory will require them to design new architectures that reduce the need to move data around.
"Some people say that flops are almost free, that really what you are paying for is moving the data," Dosanjh said.
In addition, power and reliability require new solutions when you've got thousands or millions of processors.
"The power budget for all computers seems to be going up rapidly. We need a machine you can afford to run," Dosanjh said, and one that actually works. With a million computing nodes working together, the odds are high that one of them will break, over the course of even a small calculation.
With current technologies, "an exascale computer might only stay running for a few minutes," said Dosanjh.
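The arithmetic behind that estimate is simple: if node failures are independent, the mean time between failures for the whole machine shrinks roughly in proportion to the number of nodes. A minimal sketch, assuming a purely illustrative per-node failure rate:

```python
# Why a million-node machine is fragile: with roughly independent node
# failures, the system-level mean time between failures (MTBF) is about
# the per-node MTBF divided by the node count.
# The 5-year per-node figure is an assumed, illustrative value.
HOURS_PER_YEAR = 8766

node_mtbf_years = 5
for nodes in (10_000, 100_000, 1_000_000):
    system_mtbf_minutes = node_mtbf_years * HOURS_PER_YEAR * 60 / nodes
    print(f"{nodes:>9,} nodes -> system MTBF ~ {system_mtbf_minutes:,.1f} minutes")

# At a million nodes the whole system fails every few minutes, which is
# why exascale designs lean heavily on checkpointing and fault tolerance.
```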
The Sandia-Oak Ridge collaboration has $7.4 million in fiscal year 2008 funding from the National Nuclear Security Administration and the Department of Energy, but it's not just nuclear weapons research that is driving the push for faster supercomputers. Researchers of many stripes have come to depend on the inexorable upward scaling of computing power.
Gavin Schmidt, a climate modeler at NASA Goddard, said that he's built the regularity of computational upgrades into the way he designs his climate simulations, which are so computing-intensive they can take several months of processing to complete.
"Generally speaking we don't do experiments that last more than three months," Schmidt said. "If you want to do an experiment that would last for six months, it's best to just wait a few months, and then [with faster computers] it only takes two months to run."
According to a semiannual list of the world's top 500 supercomputers, compiled in November 2007, IBM's BlueGene/L System is the fastest computer in the world, with benchmark performance of about 480 teraflops, or almost half a petaflop. That rig is a joint development of IBM and the National Nuclear Security Administration, and is housed at California's Lawrence Livermore National Laboratory.
With the research team trying to vault several orders of magnitude over any current system, Dosanjh said the new institute would need $20 to $30 million a year to accomplish its goals.
Even as individual supercomputers have grown in speed, distributed-computing initiatives, like the Folding@Home program, have enabled researchers to tap into thousands of users' computers and PS3s to solve some types of scientific problems.
Monday, February 18, 2008
Time travel could be possible ... in the future
Roger Highfield, Science Editor
It may take more than a nuclear-powered De Lorean or a spinning police box, but time travel could actually be a possibility for future generations, according to an eminent professor of physics.
The way Prof Amos Ori's proposed machine would work rests on Einstein's theory of general relativity, a theory of gravity that shows how time can be warped by the gravitational pull of objects.
Bend time enough and you can create a loop and the possibility of temporal travel.
Prof Ori’s theory, set out in the prestigious science journal Physical Review, rests on a set of mathematical equations describing hypothetical conditions that, if established, could lead to the formation of a time machine, technically known as “closed time-like curves.”
In the blends of space and time, or spacetime, in his equations, time would be able to curve back on itself, so that a person travelling around the loop might be able to go further back in time with each lap.
In the past, one of the major challenges has been the alleged need for an exotic material with strange properties - what physicists call negative energy density - to create these time loops.
“This is no longer an issue,” he told The Daily Telegraph.
“You can construct a time machine without exotic matter,” he said.
It is now possible to use any material, even dust, so long as there is enough of it to bend spacetime into a loop.
Even though Prof Ori, of the Technion-Israel Institute of Technology, believes his new work strengthens the possibility of a real Tardis, he would not speculate on when a time machine would be built, or even if it would ever be possible.
“There are still some open questions.”
The main remaining issue is the stability of spacetime, the very fabric of the cosmos, in time travel scenarios.
But overcoming this obstacle may require the next generation of theory under development, called quantum gravity, which attempts to blend general relativity with the ideas of the quantum theory, the mathematical ideas that rule the atomic world.
Time travel has long been a fascination: HG Wells grappled with the scientific issues in his 1895 science fiction classic, The Time Machine; Dr Who is still fighting the time war; and Hollywood insisted all that was needed for time travel was a De Lorean and a good flash of lightning.
But more serious work on general relativity first raised the astonishing possibility of time travel in the 1940s.
In the half century since, many eminent physicists have argued against time travel because it undermines ideas of cause and effect and creates paradoxes: a time traveller could go back and kill his own grandfather, so that he is never born in the first place.
In 1990, the world's best-known scientist, Prof Stephen Hawking, proposed a “chronology protection conjecture”, which flatly says the laws of physics disallow time machines.
Three years later, Prof Ori concluded that the possibility of constructing a time machine from conventional materials could not be ruled out.
Prof Hawking then fought back with his Cambridge University colleague Michael Cassidy and they concluded that time loops are extremely unlikely.
Tongue in cheek, Prof Hawking added that there is experimental evidence that time travel doesn’t exist: “We have no reliable evidence of visitors from the future. (I’m discounting the conspiracy theory that UFOs are from the future and that the government knows and is covering it up. Its record of cover-ups is not that good.)”
But now, in Physical Review, Prof Ori has provided some more advanced solutions to the problems of time travel outlined by the likes of Prof Hawking, helping to realise an idea that dates back millennia and appears in 18th century literature, Harry Potter, Dickens, sci-fi movies and much more besides.
The Chinese Government's Plans for Nanotechnology
Alexis Madrigal February 17, 2008
BOSTON, MA - China aims to leapfrog the United States in technological development with substantial investment in nanotechnology, but whether those efforts will actually pay off is still unclear. That was the message from University of California at Santa Barbara researchers presenting their findings on the state of Chinese nanotechnology here at the AAAS annual meeting.
Richard Applebaum and Rachel Parker from the Center for Nanotechnology in Society at UCSB conducted about sixty interviews with Chinese officials to piece together a picture of the current state of Chinese nanotechnology. Applebaum set the specific research effort within the context of China's stated overarching goal to "leapfrog" the West by using a combination of learning from the West (i.e. technology transfer) and increasing domestic research capacity ("indigenous innovation" or zizhu chuangxin).
Nanotechnology research is one of four Chinese "science Megaprojects" that have the central purpose of catching the country up to US research by 2020. Still, for all the big talk, the actual government investment is not overwhelming. The researchers estimated that the Chinese government only invested $400 million from 2002 to 2007, although that investment is expected to rise considerably.
They highlighted several international partnerships related to nanotechnology, including the Tsinghua-Foxconn Nanotechnology Research Center and the Zhejiang-California NanoSystems Institute, but didn't go into much detail about what types of projects are being developed in those centers.
Right now, most nanotech research is being pushed by the central and regional governments with little private capital contributing to the national output. There are a lot of questions about whether or not that is a sustainable model for developing a high-tech industry, Applebaum noted. (It should also be noted, though, that some would question whether the venture capital model is sustainable either.)
It also leads to strange applications of nanotechnology in high-profile venues. Parker said that the Olympic village parking lots being constructed in Beijing will have a nanopolymer coating that will absorb exhaust. It was just an off-hand mention, but I am officially intrigued by the idea of coating our parking lots with pollution-absorbing material. I can't vouch for the environmental safety of that solution, but I'd love to know how they're doing it. The coating could be something like this pollution-absorbing concrete that uses titanium dioxide to degrade pollutants.
Saturday, February 16, 2008
And the 14 Grand Engineering Challenges of the 21st Century Are...
By Chuck Squatriglia February 15, 2008
Before you can save the world, you'd better write a to-do list so nothing gets overlooked. Some of the world's brightest minds have done just that by laying out this century's greatest engineering challenges.
The panel of 18 engineers, technologists and futurists included Google co-founder Larry Page and genomics pioneer J. Craig Venter. They spent more than a year pondering how best to improve life on Earth and came up with 14 Grand Engineering Challenges, a list the National Academy of Engineering deemed so momentous it should be capitalized.
The list, announced this afternoon, addresses four themes the committee considered "essential for humanity to flourish" - environmental sustainability, health, reducing our vulnerability and adding to the joy of living.
"We chose engineering challenges that we feel can, through creativity and committment, be realistically met, most of them early in this century," said committee chair William J. Perry, the former Secretary of Defense who teaches engineering at Stanford University. "Some can be, and should be, achieved as soon as possible."
What are they?
Make solar energy affordable.
Provide energy from fusion.
Develop carbon sequestration methods.
Manage the nitrogen cycle.
Provide access to clean water.
Restore and improve urban infrastructure.
Advance health informatics.
Engineer better medicines.
Reverse-engineer the brain.
Prevent nuclear terror.
Secure cyberspace.
Enhance virtual reality.
Advance personalized learning.
Engineer the tools for scientific discovery.
The committee, which also included such luminaries as futurist Ray Kurzweil and robotics guru Dean Kamen, decided not to make any predictions or focus on gee-whiz gadgets. They felt it more important to outline broad objectives that might influence research funding and governmental policy.
The 14 challenges they laid out were culled from hundreds of suggestions from engineers, scientists, policymakers and ordinary people around the world.
"Meeting these challenges would be game changing," said Charles M. Vest, president of the NAE. "Success with any of them could dramatically improve life for everyone."
So... what should we check off first?
Tuesday, February 12, 2008
Russians Plan New Space Platform
By Bill Christensen
posted: 15 January 2008
The Russian space agency stated that it intends to develop a space platform from which missions to the moon and to Mars could be launched. According to agency director Anatoly Perminov, the space platform project should be up and working after 2020. Russia plans its first moon mission for 2025.
The International Space Station will be decommissioned sometime between 2015 and 2025; by that time, the new space platform should be available.
As far as I know, the phrase "space platform" was first used in a scary short story by E. B. White, "The Morning of the Day They Did It," published in The New Yorker magazine in 1950.
"We had arranged a radio hookup with the space platform, a gadget the Army had succeeded in establishing six hundred miles up, in the regions of the sky beyond the pull of gravity. The army, after many years of experimenting with rockets, had not only got the platform established but had sent two fellows there in a Spaceship, and also a liberal supply of the New Weapon."
(Read more about the space platform)
The concept of a space platform is at least several years older. In an annual report delivered by Secretary of Defense James Forrestal in 1948, an "earth satellite vehicle program" was mentioned. Forrestal remarked, "The earth-satellite vehicle program, which is being carried out independently by each military service, was assigned to the committee on guided missiles for coordination." It was specifically described as a platform from which missiles could be launched; it could function as an unmanned station.
The "earth satellite" was presented to the public in a great retro painting done by Frank Tinsely. Note the thoughtful details, including an astronomical observatory, cosmic ray traps, a sun power plant, rocket air lock, search radar and television sender. (See this more detailed drawing of the satellite base.)
Several years earlier, in 1946, General Curtis E. LeMay mentioned something similar in a research program announcement. He called for "flight and survival equipment for use above the atmosphere, including space vehicles, space bases and devices for use therein."
Computing That’s Light Years Ahead
A new year brings new trends: in American sports, soccer looks poised to become the new basketball; in health and lifestyle features, fifty is touted as the new thirty; NYC hipsters have been alerted that Brooklyn is the new Manhattan; and, this year's fashion runways suggest that green is the new black.
In the world of technology, however, similar analogies are less ephemeral, and can come to mark quantum leaps forward in the realm of human progress. Just think: photographs vs. still-life paintings; phones vs. telegraphs; cars vs. horse and buggies; television vs. movie theaters; the computer vs. calculators. . .
What if plastic were about to become the new silicon, and computing were on the verge of becoming as fast and fluid as light?
The development of a viable electro-optic polymer has been in the sights of the fiber optic communications industry for decades, because it has been viewed as holding the key to unleashing waves of inexpensive bandwidth. Billions of dollars have been spent by thousands of researchers at large and small companies alike in this pursuit, all to no avail.
After fifty years of competitive research, a small nanotech company from Wilmington, Delaware, named Third-Order Nanotechnologies, has developed a materials breakthrough that could be suitable for making commercially viable photonic chips—chips that hold the promise to be the "silicon" of a new era in computing. In fact, Third-Order's inexpensive plastic photonic chips have shown the potential to be a thousand times more powerful than silicon chips.
In the same fashion that silicon was the material that shaped the twentieth century, Third-Order's third-generation materials just might mold the twenty-first. The company's patented electro-optic plastics would broadly replace more expensive, lower-performance materials that are currently used in fiber-optic ground, wireless, and satellite communication networks, bringing low-cost universal bandwidth along with it.
With this new all-optical platform, the potential exists for Promethean growth in a myriad of different markets. If the first iteration of the Internet created e-mail and Web pages, and the Megabit Internet gave birth to killer applications such as VoIP and streaming music and video, imagine what Third-Order's Gigabit Internet might be like. A billion instantly available television channels. . . ? Lifelike, super-high definition video conferencing with dozens of people at once. . . ? Photorealistic virtual reality role-playing games experienced with thousands of people from around the world. . . ?
One dramatic application for optical computing that may be crucial for national security purposes is instantaneous, "Where's Waldo?"-style facial recognition. With optical computing, faces of suspected wrongdoers may be distinguished with a higher degree of accuracy and one thousand to one million times faster than silicon. With optical computers the size of sticks of butter able to be inserted into traffic lights and security cameras (replacing rooms filled with dozens of bulky desktops), this nimble security application would be both more rigorous and cost-effective than existing solutions.
Third-Order's CEO, Hal Bennett, is both an inventor and a visionary. He would welcome the opportunity to discuss with you Third-Order's technological breakthrough to bring the Gigabit Internet to the home. In the meantime, we would be happy to provide you with a company media kit, and encourage you to visit www.Third-Order.com for more information.
Monday, February 11, 2008
Robot future poses hard questions
Public debate is needed about the future use of robots in society
Scientists have expressed concern about the use of autonomous decision-making robots, particularly for military use.
As they become more common, these machines could also have negative impacts on areas such as surveillance and elderly care, the roboticists warn.
The researchers were speaking ahead of a public debate at the Dana Centre, part of London's Science Museum.
Discussions about the future use of robots in society had been largely ill-informed so far, they argued.
Autonomous robots are able to make decisions without human intervention. At a simple level, these can include robot vacuum cleaners that "decide" for themselves when to move from room to room or to head back to a base station to recharge.
Military forces
Increasingly, autonomous machines are being used in military applications, too.
Samsung, for example, has developed a robotic sentry to guard the border between North and South Korea.
It is equipped with two cameras and a machine gun.
The development and eventual deployment of autonomous robots raised difficult questions, said Professor Alan Winfield of the University of the West of England.
"If an autonomous robot kills someone, whose fault is it?" said Professor Winfield.
"Right now, that's not an issue because the responsibility lies with the designer or operator of that robot; but as robots become more autonomous that line or responsibility becomes blurred."
Professor Noel Sharkey, of the University of Sheffield, said there could be more problems when robots moved from military to civil duties.
"Imagine the miners strike with robots armed with water cannons," he said. "These things are coming, definitely."
The researchers criticised recent research commissioned by the UK Office of Science and Innovation's Horizon Scanning Centre and released in December 2006.
Robot rights
The discussion paper was titled Utopian Dream or Rise of the Machines? It addressed issues such as the "rights" of robots, and examined developments in artificial intelligence and how this might impact on law and politics.
In particular, it predicted that robots could one day demand the same citizen's rights as humans, including housing and even "robo-healthcare".
"It's poorly informed, poorly supported by science and it is sensationalist," said Professor Owen Holland of the University of Essex.
"My concern is that we should have an informed debate and it should be an informed debate about the right issues."
The robo-rights scan was one of 246 papers commissioned by the UK government and compiled by a group of futures researchers, the Outsights-Ipsos Mori partnership and the US-based Institute for the Future (IFTF).
At the time, Sir David King, the government's chief scientific adviser, said: "The scans are aimed at stimulating debate and critical discussion to enhance government's short and long-term policy and strategy."
Other scans examined the future of space flight and developments in nanotechnology.
Raised questions
The Dana Centre event will pick up some of these issues.
"I think that concerns about robot rights are just a distraction," said Professor Winfield.
"The more pressing and serious problem is the extent to which society is prepared to trust autonomous robots and entrust others into the care of autonomous robots."
Caring for an ageing population also raised questions, he said.
Robots were already being used in countries like Japan to take simple measurements, such as heart rate, from elderly patients.
Professor Sharkey, who worked in geriatric nursing in his youth, said he could envisage a future when it was "much cheaper to dump a lot of old people" in a large hospital, where they could be cared for by machines.
Scenarios like these meant that proper debate about robotics was imperative, he added.
"In the same way as we have an informed nuclear debate, we need to tell the public about what is going on in robotics and ask them what they want."
Mobile: Biz Stone, Co-founder, Twitter
As we increasingly realise the web as a vital social utility and important marketplace we cannot ignore an even bigger potential. The power of the internet is not limited to the PC. Twitter has emerged to create a seamless layer of social connectivity across SMS, IM, and the web. Operating on the simple concept of status, Twitter asks one question: "What are you doing?" Friends, family and colleagues stay connected through short responses.
The potential for this simple form of hybrid communication technology is strong. For example, a person in India may text "Follow Biz" and get online via Twitter over SMS in a matter of seconds. Biz might be updating from the US on a PC. Nevertheless, the updates are exchanged instantly.
Our future holds in store the promise of increased connectivity to a powerful social internet which truly extends to every little spot on our Planet Earth. We're all affected by and defined by each other's actions. What are you doing?
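As a thought experiment only, and not a description of Twitter's actual architecture, the "seamless layer" Stone describes can be pictured as a simple fan-out of one status update to followers on whatever channel each happens to use. Every name and function below is hypothetical.

```python
# Hypothetical sketch of status fan-out across channels (SMS, IM, web).
# Illustrates the concept of channel-agnostic delivery; it does not
# reflect Twitter's real internals or API.
from dataclasses import dataclass
from typing import List


@dataclass
class Follower:
    name: str
    channel: str  # "sms", "im", or "web"


def deliver(follower: Follower, author: str, status: str) -> None:
    # A real system would hand off to an SMS gateway, an IM bridge, or a
    # web feed here; this sketch just prints what would be sent where.
    print(f"[{follower.channel}] to {follower.name} -> {author}: {status}")


def fan_out(author: str, status: str, followers: List[Follower]) -> None:
    for follower in followers:
        deliver(follower, author, status)


followers = [
    Follower("ravi", "sms"),  # e.g. signed up by texting "Follow Biz"
    Follower("ana", "web"),
    Follower("li", "im"),
]
fan_out("biz", "Heading into the office", followers)
```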
Advertising: Maurice Lévy, Chairman and CEO, Publicis Groupe
Five years is an eternity in technology, but from our vantage point a few things are clear about what the internet and internet advertising will look like in 2012. One, virtually all media will be digital, and digital will enable almost all kinds of advertising. Two, online advertising will depend more than ever on the one element which has always been at the heart of impactful advertising, both analogue and digital: creativity. The explosion of media channels means this is a glorious time to think and act creatively. In art history terms, we are at the dawn of the Renaissance after the Dark Ages.
Just as the Renaissance broke down the distinctions between sacred and profane art forms and between individual and community, so we are seeing a similar exciting blurring today - and this will only intensify. Linear media is fast giving way to liquid media, where you can move seamlessly in and out of different settings. Prescribed time - the 7 o'clock news, the Friday night out at the cinema, etc - is now becoming multitasking time. People are no longer willing to put up with interruptions for a commercial break during their entertainment experience, and so we have to find incredibly creative solutions to interact with them and engage them in genuine and honest ways. This implies a brave new world of engagement and involvement between marketers and consumers and will also mean co-production between marketers and media owners. Scale will be critical: in five years' time, around 2 billion people will be constant internet users and mobile internet computing will be ubiquitous. What a great time to be in the business!
Video: Chad Hurley, CEO and co-founder, YouTube
In five years, video broadcasting will be the most ubiquitous and accessible form of communication. The tools for video recording will continue to become smaller and more affordable. Personal media devices will be universal and interconnected. People will have the opportunity to record and share video with a small group of friends or everyone around the world.
Today, eight hours of new video are uploaded to YouTube every minute. This will grow exponentially over the next five years. Our goal is to allow every person on the planet to participate by making the upload process as simple as placing a phone call. This new video content will be available on any screen - in your living room or in your pocket - and will bring together all the diverse media which matters to you, from videos of family and friends to news, music, sports, cooking and more.
In the next five years, users will be at the centre of their video experience, you will have more access to more information, and the world will be a smaller place.
Social networking: Chris DeWolfe, CEO and co-founder, MySpace
In only a few years, social networks have become a staple in the internet landscape as the social networking phenomenon allowed people to "put their lives online". A person's profile became a representation of who they really were in the offline world, and allowed them to transfer their offline world online.
More than ever, social networks are blurring online and offline worlds, evolving into social destinations that are driving the direction of the larger web and affecting industries like advertising, music and politics.
Predicting the future of social networks exclusively misses the larger point - these evolving online social destinations are laying the groundwork for the new social web which we believe is becoming infinitely more personal, more portable, and more collaborative.
First, as we expand these social destinations to all corners of the world, we must always think in terms of the individual. With millions of people using social websites, there's an increasing demand to make everyone's web experience personal. In the same way a home or office is your physical address, we expect your personal, online social profile to become your internet address. When I give out www.myspace.com/chrisdewolfe to friends and colleagues, everyone knows where to find me online.
We expect aspects of all socially-based sites to become increasingly portable. In terms of mobile, we expect to have relationships with every carrier and device-maker in the world and we expect that half of our future traffic will come from non-PC users.
Social activity is happening everywhere and we expect applications and features to be more fluid, based on the online population that want content where they want it, when they want it, and how they want it. Social activity should be portable and we expect the industry will continue to move in that direction.
Lastly, online social destinations work best when creativity and development are collaborative concepts. From personal profiles, to the widget economy, to the OpenSocial standard - the future of the social web will harness the savvy of the masses to produce more relevant and meaningful social experiences, ultimately pushing the larger industry to be more innovative and progressive.
Lowering the barrier to entry for a new generation of developers will lead to a more collaborative and dynamic web and directly affect the tools and feature sets available on socially-based sites. Supporting a more collaborative web creates a more global and participatory internet experience for everyone.
The evolution of social networks is kick-starting a broad global shift for how people, content and culture collide on the web. Right now we're looking at the tip of the iceberg for what the social web will look like in the future. Fundamentally, all social destinations must expand while staying personal, they must engage users while empowering portability, and they must work with up and coming innovators and major web leaders to both collaborate and contribute to the larger web community.
Friday, February 8, 2008
Ecotopias Aren't Just for Hippies Anymore — and They're Sprouting Up Worldwide
By Frank Bures 01.18.08 | 6:00 PM
In the 1970s, environmental idealists had a vision of Ecotopia: Everyone recycled, there was no pollution, and we all worshipped trees and co-ops. Today's eco-communities are less crunchy and a lot more high tech. In addition to using renewable energy sources, these projects aim to limit their impact on surrounding ecosystems by building with green materials, promoting earth-friendly transportation, and recycling water and waste. The race for the first carbon-neutral, zero-emissions community is on.
Costa Rica (Costa Rica)
Dockside Green (Victoria, British Columbia)
Dongtan (Chongming Island, China)
Green Mountain (Libya)
Guangtang Chuangye Park (Liuzhou, China)
Masdar (Abu Dhabi, United Arab Emirates)
Northstowe (Cambridge, England)
Norway (Norway)
Treasure Island (San Francisco)
Vauban (Freiburg, Germany)
Växjö (Växjö, Sweden)
Wednesday, February 6, 2008
The Phone Glove
There is a rational argument to be made for the Bluetooth glove phone, a reassemblage of parts from a Bluetooth headset into a driving glove that was constructed by British gadget guy and television personality Jason Bradbury. This glove integrates our telecommunications devices into a stylish and functional clothing accessory while keeping the bulky phone out of the way. OK, maybe a glove phone isn't a great leap forward in technology or ergonomics, but it's hard to deny the goofy fun of answering a call with your thumb and pinky finger. In fact, Bradbury's tinkering hints at a trend that has received a lot of academic attention: wearable computing. Many futurists believe that our communications devices will eventually become cheap and ubiquitous enough to simply be integrated into the elements of our everyday attire. And it's already happening. Several Bluetooth helmets have been developed for skiing and motorcycling from companies such as Marker and Motorola, and jackets that plug into all of your gear and create a personal area network are available from ScotteVest (www.scottevest.com).
Future Paper
There is no shortage of designs for futuristic electronic paper. Some versions of this magical stuff would let entire books fit on a single memory-enabled polymer sheet, displayed with infinitely reconfigurable, magnetically charged pigments; others rely on futuristic printing technologies that could yield mass-produced, super-cheap color screens applied like wallpaper. One of the more interesting e-paper concepts comes from the techno-futurists at Lunar Design, who have imagined a product for the year 2015 called MicroMedia Paper. Available in packs of ten for around $35, these ultra-thin, mini color screens would work using replaceable "power sticker" batteries and would be controlled using touch-sensitive buttons and a volume dial that adjusts the integrated speakers. Video, pictures and teleconferencing imagery could be transferred using a built-in wireless connection.
Taxi of the Future (It might happen)
Written by: Redyam
One thing that politicians realised early on was that the current system of transport for the country was a mess, and needed dramatic change.
The rapid advancement of computers and Artificial Intelligence over the last few decades meant that a new form of computer-controlled transport could be developed, one far more efficient than the human-controlled cars of the late 20th century.
The original concept of owning your own car had all but disappeared, replaced instead with an A.I.-controlled 'taxi' service available to every individual.
One would arrange for a taxi via an internet-like control centre installed in every house, and within a few minutes a pod would be available. The user would then input a destination into the transport pod and be whisked away automatically.
Instead of using the old over-ground network of roads and motorways, the pods would move about underground through a network of tunnels. Using recent advancements in electricity and magnetism, a pod would glide through the tunnels effortlessly, quickly and efficiently, only rising above the ground once the destination was reached.
The system's A.I. would ensure that any problems were dealt with immediately, giving a smooth ride.
Even though the expense of the system was tremendous, the benefit of virtually eliminating noise and air pollution, road accidents and congestion was great.
The system also gave a shot in the arm to Britain's economy, rapidly accelerating the country's GDP, which had come to a standstill in earlier years because of congestion.
All of the existing above-ground roads were turned into grassy areas and cycle lanes, and the damage caused to the earth's ozone layer was dramatically reduced.
The Cardboard House
The Cardboard House represents the reduction of technology and the simplification of needs. By demonstrating that we are able to recycle 100% of the building components at extremely low cost, the Cardboard House is a direct challenge to the housing industry to reduce housing and environmental costs.
Stutchbury and Pape, working in association with the Ian Buchan Fell Housing Research Unit at University of Sydney, see this project as a genuine temporary housing option.
A cardboard house places the least demand on resources and encourages people to shift their preconceptions about the “typical Australian house”. Many Australians enjoy camping on their holidays, easily shifting their lifestyle from the rigidity of the urban home to the freedom of the campsite.
Being extremely low cost and transportable, the Cardboard House could be used in a wide variety of applications: you could live in one while your permanent house is being built or renovated, use it as emergency housing, or use it for short-term accommodation.
Why choose cardboard?
Cardboard is not a traditional building material; however, the introduction of innovative bonding, cutting and structural techniques has provided the opportunity to use this lightweight and recyclable material in a more creative fashion.
All the material in the house is recycled, and recyclable, making it an excellent environmentally sustainable option for housing. The Cardboard House is made of recycled cardboard supplied by Visy Industries. This is completed with a waterproof roof made from HDPE plastic, which also forms the material of the flexible under-floor water tanks and the novel kitchen and bathroom 'pods'.
How it all goes together
The Cardboard House is conceived as a kit of parts comprising a flat pack of frames and infill floor and wall panels. It uses minimal fixings: nylon wing nuts, hand-tightened polyester tape stays and Velcro fastenings are used to assemble the frames and protective skin system.
The building can be assembled by two people over a six-hour period using appropriate scaffolding, and is transportable in a light commercial vehicle.
A series of repetitive portal frames are both spaced and stabilised by a standardised secondary structure, similar to the interlocking spacer sheets found in wine boxes. Once assembled, the structure provides a creative architectural frame from which the house derives its aesthetic.
Fixed and moveable furnishings, floor systems, door and opening frames, lighting and other services all relate to the structure and layout.
The roof covering is a lightweight material that is as transportable as the structure. Similar to a tent fly, the roof fabric assists in holding down the building, providing a diffuse light in the day and a glowing box at night.
Water is collected in bladders underneath the floor which double as ballast to hold down the lightweight building.
A composting toilet system produces nutrient-rich water for gardening.
Low-voltage lighting can be powered using a 12-volt car battery or small photovoltaic cells mounted on the roof framing.
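The 12-volt lighting claim is easy to sanity-check. In the rough sketch below, the battery capacity, lighting load and usable-capacity figures are illustrative assumptions, not part of the architects' design.

# Rough runtime estimate for the Cardboard House's 12-volt lighting.
# The battery capacity, lighting load and usable fraction are assumptions.
BATTERY_VOLTS = 12.0
BATTERY_AMP_HOURS = 45.0   # a typical small car battery (assumption)
LIGHTING_WATTS = 20.0      # a few low-voltage lamps (assumption)
USABLE_FRACTION = 0.5      # avoid deep-discharging a car battery (assumption)

usable_wh = BATTERY_VOLTS * BATTERY_AMP_HOURS * USABLE_FRACTION
hours = usable_wh / LIGHTING_WATTS
print(f"usable energy: {usable_wh:.0f} Wh, about {hours:.0f} hours of light per charge")
# usable energy: 270 Wh, about 14 hours of light per charge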
What are the implications for the future of housing?
The architects see this project as a genuine housing option. Extremely low cost, transportable, lightweight and flexible, the building could be used in a wide variety of applications. The Cardboard House is a prototype that may serve future housing needs in a way that is both responsible and beautiful.
Historical or theoretical precedents
Paper and cardboard have been used in domestic housing in Japan for many centuries, where rice-paper screens (shoji) were both cheap and safe in earthquake-prone regions. Folded cardboard, in the manner of origami, was also used for lightweight enclosures, simulating paper sculpture.
Contemporary Japanese architect Shigeru Ban has used tubular and flat cardboard to great effect for housing, civic buildings, large exhibition pavilions and emergency shelters.
In Australia, pioneering work was carried out at the University of New South Wales by Vincent Sedlack, and just last year Adriano Pupilli, an honours student at the University of Sydney, designed and built a full-size bay of a 5-bedroom house with Col James. This attracted local attention and directly led to the invitation to showcase cardboard as a potential building material in the future.
Tuesday, February 5, 2008
Businesses find real opportunity in the virtual world of Second Life
L.A. Lorek
Wandering through virtual streets in Second Life, Nappy Bread has met all kinds of people, from beggars to business moguls, at open houses, parties and public relations events.
Nappy Bread is an avatar, a computerized image, belonging to Dean McCall, 36, chief executive of Salsa.net, a technology company in San Antonio.
"Second Life is really like life," he said. "There's a whole culture and economy going on."
The virtual and the real worlds intersect in Second Life, a 3-D digital world "imagined, created and owned by its residents," says Linden Lab, the site's creator. The online virtual gaming community has more than 940,000 members, ranging in age from 18 to 85 and hailing from more than 80 countries.
Every day, residents of Second Life spend more than $400,000 in real money on everything from T-shirts to real estate, San Francisco-based Linden says.
That kind of economy has attracted the notice of businesses. This week, Reuters announced it was opening a virtual news bureau with a reporter, "Adam Reuters."
The game has created a $70 million economy, said Guhan Selvaretnam, Reuters' vice president of media strategies.
"It's the only virtual world community that has a self-sustaining economy," he said. "We think of it as covering a growing city in the world. This world just happens to be virtual."
Residents start out with a small amount of Linden Dollars, Second Life's official currency. People earn more money by making and selling items, by holding events or by using talents such as singing or playing a musical instrument.
The average resident of Second Life has an annual income of $1,000 U.S., but the highest earners make as much as $250,000, Selvaretnam said. Basic accounts are free, but landowners pay a monthly lease fee starting at $9.95. Fees go up from there.
A 16-acre island sells for $1,250, with a monthly maintenance fee of $195. A 64-acre island costs $5,000, with a monthly maintenance fee of $780.
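To put those land prices in perspective, here is a quick back-of-the-envelope calculation of what a year of island ownership would cost at the figures quoted above; the one-year horizon is just an assumption chosen for illustration.

# First-year cost of Second Life island ownership, using the prices quoted above.
# The one-year ownership period is an arbitrary assumption for illustration.
islands = {
    "16-acre island": {"purchase": 1250, "monthly_fee": 195},
    "64-acre island": {"purchase": 5000, "monthly_fee": 780},
}

for name, cost in islands.items():
    first_year = cost["purchase"] + 12 * cost["monthly_fee"]
    print(f"{name}: ${first_year:,} in the first year "
          f"(${cost['purchase']:,} up front plus ${cost['monthly_fee']}/month)")
# 16-acre island: $3,590 in the first year ($1,250 up front plus $195/month)
# 64-acre island: $14,360 in the first year ($5,000 up front plus $780/month)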
Reuters would not say how much it's spending to be part of Second Life, but Selvaretnam said it was a small investment. The company bought an island and paid for a customized name for business reporter Adam Pasick, aka Adam Reuters.
He will cover events, interview residents and uncover interesting stories within Second Life. So far, he has filed stories that interweave the virtual with the real. The first dealt with Congress investigating the virtual economies of Second Life and World of Warcraft.
Another story concerns Ginko Financial, which is offering 40 percent interest on deposits. Adam Reuters questions whether the bank is a "pioneer or pyramid." He also reports the bank has $220,000 in deposits.
The virtual world is a bit like the Wild West, said Linda Zimmer, chief executive of MarCom:Interactive, based in Anaheim, Calif. She writes a blog, "Business Communicators of Second Life," on business interests in the community.
"We will eventually see similar kinds of regulations come into Second Life related to banking," Zimmer said. "We are going to have to address similar issues in this virtual world that we do in the real world."
In addition to Reuters, American Apparel, a Los Angeles-based maker of trendy T-shirts, and Starwood Hotels, based in White Plains, N.Y., have opened in the virtual game.
Second Life has become a new way for businesses to reach consumers, said Zimmer, who goes by the avatar ZnetLady Isbell.
Starwood's W Hotels created a replica of a real hotel it plans to build in two years, she said. Intel, IBM, Nike, Toyota, Amazon.com, and countless public relations agencies and marketing firms sponsor events.
"There is a really interesting gold rush going on there," said Kami Watson-Huyse, a San Antonio public relations expert and blogger. Her avatar is KamiChat Watson.
Artists, entertainers and other creative people have found it to be a good gathering place, she said. Next month, Duran Duran will play a live concert in Second Life. Countless other musicians earn real money playing in the virtual world.
The danger for businesses is commercializing a virtual world that was created, in part, to escape commercialization, Zimmer said.
"I think what businesses are going to have to do is invest time to see the value in what they do and how it affects people in the virtual world."
French Pilot Flies First Manned Electric Plane
Posted on Thu Jan 10 2008
By: Ianto Everett
Electric planes are nothing new, if you count remote-control enthusiasts flying model aircraft around the local park, but in France in late December, pilot and test engineer Christian Vandamme successfully flew the first full-size electric airplane.
The Electra F-WMDJ is built from wood and fabric and powered by lithium-polymer batteries. On December 23rd the aircraft took off near the southern Alpine town of Gap and flew around the Alps for 48 minutes, covering a distance of 50 kilometers (30 miles).
The French company APAME, which developed the aircraft, was launched only 18 months ago with financing from various French aerospace companies. While 50 kilometers is still a fairly short distance, the flight proves that the concept works, and APAME is committed to developing a commercially viable airplane for leisure pilots.
In the past, the major hurdle for electric-powered aircraft has been the weight of the battery, but with rapidly developing battery technology, electric aircraft look set to become a viable option. As APAME states, as well as being environmentally friendly, a major advantage for pilots is the vast reduction in fuel costs: the company estimates that its engine, which costs about the same as a standard engine, costs only one euro per hour to run, compared with around 60 euros per hour for a conventional airplane.
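Taken at face value, those running costs make the savings easy to quantify. The quick sketch below assumes 100 flight hours a year, a figure chosen purely for illustration rather than anything APAME has published.

# Rough yearly running-cost comparison using the per-hour figures quoted above.
# The 100 flight hours per year is an assumption, not a figure from the article.
ELECTRIC_COST_PER_HOUR = 1        # euros, per APAME's estimate
CONVENTIONAL_COST_PER_HOUR = 60   # euros, per APAME's estimate
HOURS_PER_YEAR = 100              # assumed annual flying time for a leisure pilot

electric = ELECTRIC_COST_PER_HOUR * HOURS_PER_YEAR
conventional = CONVENTIONAL_COST_PER_HOUR * HOURS_PER_YEAR
print(f"electric:     {electric} euros per year")
print(f"conventional: {conventional} euros per year")
print(f"saving:       {conventional - electric} euros per year")
# Output: electric 100, conventional 6000, saving 5900 euros per year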
Earlier this year the company also successfully performed a test flight of a mono-wing, microlight-style aircraft, and it has other designs in development. It's not just private aircraft that may soon be flying greener, however, as both NASA and Boeing are researching hydrogen fuel cell-powered commercial passenger jets, and Richard Branson, owner of Virgin Airlines, has invested millions of dollars in researching greener fuel options.
And now? There is still a lot of work to be done before numerous ultralights and small planes fly with electric engines, but this flight shows that it's possible.
Future Solar Power
If things go according to plan, construction on a giant solar tower could begin in Australia in 2006. The 3,280-foot tall tower will be surrounded by a vast greenhouse that will heat air to drive turbines around the base of the tower. It is estimated that the power station will be able to generate 200 megawatts of electricity, enough to power 200,000 households.
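Those two figures allow a simple sanity check: 200 megawatts spread across 200,000 households works out to an average of one kilowatt per household. The sketch below runs the arithmetic, treating the figure as continuous average draw rather than peak output, which is an assumption of the calculation, not a claim from the project.

# Sanity check on the solar tower's quoted capacity: 200 MW serving
# 200,000 households implies an average of 1 kW per household.
# Treating this as continuous average draw (not peak output) is an assumption.
capacity_watts = 200e6   # 200 megawatts
households = 200_000

per_household_kw = capacity_watts / households / 1000
annual_kwh = per_household_kw * 24 * 365   # if the station ran flat out all year
print(f"average draw per household: {per_household_kw:.1f} kW")
print(f"upper-bound annual energy:  {annual_kwh:,.0f} kWh per household")
# average draw per household: 1.0 kW
# upper-bound annual energy:  8,760 kWh per household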
Solar energy requires no additional fuel to run and is pollution free. Sunlight can be captured as usable heat or converted into electricity using solar, or photoelectric, cells or through synchronized mirrors known as heliostats that track the sun's movement across the sky. Scientists have also developed methods for using solar power to replace a gas-powered engine by heating hydrogen gas in a tank, which expands to drive pistons and power a generator.
Drawbacks of solar energy include high initial cost, and the need for large spaces. Also, for most solar energy alternatives, productivity is subject to the whims of air pollution and weather, which can block sunlight.
The best forecasters will be computers
PAUL SAFFO
Technology Forecaster
When I began my career as a forecaster over two decades ago, it was a given that the core of futures research lay beyond the reach of traditional quantitative forecasting and its mathematical tools. This meant that futures researchers would not enjoy the full labor-saving benefits of number-crunching computers, but at least it guaranteed job security. Economists and financial analysts might one day wake up to discover that their computer tools were stealing their jobs, but futurists would not see machines muscling their way into the world of qualitative forecasting anytime soon.
I was mistaken. I now believe that in the not too distant future, the best forecasters will not be people, but machines: ever more capable "prediction engines" probing ever deeper into stochastic spaces. Indicators of this trend are everywhere from the rise of quantitative analysis in the financial sector, to the emergence of computer-based horizon scanning systems in use by governments around the world, and of course the relentless advance of computer systems along the upward-sweeping curve of Moore's Law.
We already have human-computer hybrids at work in the discovery/forecasting space, from Amazon's Mechanical Turk, to the myriad online prediction markets. In time, we will recognize that these systems are an intermediate step towards prediction engines in much the same way that human "computers" who once performed the mathematical calculations on complex projects were replaced by general-purpose electronic digital computers.
The eventual appearance of prediction engines will also be enabled by the steady uploading of reality into cyberspace, from the growth of web-based social activities to the steady accretion of sensor data sucked up by an exponentially growing number of devices observing and, increasingly, manipulating the physical world. The result is an unimaginably vast corpus of raw material, grist for the prediction engines as they sift and sort and peer ahead. These prediction engines won't ever exhibit perfect foresight, but as they and the underlying data they work on co-evolve, it is a sure bet that they will do far better than mere humans.
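Saffo describes these human-computer hybrids only in broad strokes. As a purely illustrative sketch, and not anything he proposes, here is a toy forecast aggregator that weights each source by its past accuracy, the kind of mechanical combination a primitive "prediction engine" might start from; the sources and numbers are invented.

# Toy forecast aggregator: pool probability forecasts from several sources,
# weighting each source by its historical accuracy (Brier score).
# Purely illustrative; the sources and numbers below are invented.

def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and 0/1 outcomes (lower is better)."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

def aggregate(current_forecasts, history):
    """Weighted linear opinion pool: weight each source by 1 / (Brier score + epsilon)."""
    eps = 1e-6
    weights = {source: 1.0 / (brier_score(past, actual) + eps)
               for source, (past, actual) in history.items()}
    total = sum(weights.values())
    return sum(weights[s] * p for s, p in current_forecasts.items()) / total

# Each source's track record: (past probability forecasts, actual 0/1 outcomes).
history = {
    "prediction_market": ([0.7, 0.2, 0.9], [1, 0, 1]),   # well calibrated
    "human_analyst":     ([0.9, 0.8, 0.1], [1, 0, 1]),   # less reliable
}
current = {"prediction_market": 0.65, "human_analyst": 0.40}
print(f"pooled probability: {aggregate(current, history):.2f}")   # about 0.63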
Mum's diet shapes a child's future weight
By Ellen Connolly
January 27, 2008 01:00am
AUSTRALIAN scientists have made the world-first discovery that a pregnant woman's diet determines whether her baby grows into a fat adult or a skinny one.
The research suggests women who are overweight before they fall pregnant, and during it, may condemn their children to a life of overeating and obesity.
It reveals that a mother's diet during pregnancy affects the baby's brain circuits, determining appetite and energy expenditure in their offspring.
"This suggests that mothers should think twice about overindulging, or using the excuse that they're eating for two during pregnancy," University of NSW professor Margaret Morris said.
Pre-natal period programs a child's future appetite
Unlike previous studies, the groundbreaking work highlights the pre-natal period as a critical time for "programming of post-natal and adult appetite".
It found that even before a woman falls pregnant, she is potentially "programming" a child's future appetite.
"The major finding is the dramatic increase in body fat in offspring of overweight and obese mothers," Professor Morris said.
Mothers fed a high-fat diet had offspring that were heavier, with more body fat and altered appetite regulators in the brain, meaning they overate, she said.
The results are supported by a study published in the British Journal of Nutrition last year. It found that mothers who eat junk food during pregnancy may produce children who crave the same foods.
Professor Morris will present her findings at the Australian Neuroscience Society conference in Hobart this week.
She said the study was particularly relevant, given that about 30 per cent of mothers enter pregnancy in an overweight or obese condition.
Study mated overweight female rats with healthy males
The study was conducted using overweight female rats who mated with healthy males.
The females continued to be fed a high-fat Western diet during and after pregnancy, Professor Morris said.
"The mums were overeating for that whole period. We found the offspring were a third heavier than the rats fed a low-fat diet," she said.
Professor Morris said the brain pathways regulating appetite in rats were similar to those in humans, suggesting similar trends could be expected in people.
Sydney University nutritionist Dr Jenny O'Dea said it had become "quite well accepted" that a woman's diet during pregnancy impacted on the fetus.
"We also know that obesity during pregnancy more often than not causes gestational diabetes and high blood pressure," Dr O'Dea said.
Pregnant women should not 'eat for two'
She said that although nutritional needs were high during pregnancy, women should not be "eating for two".
Professor Morris studied mothers who were already overweight before they fell pregnant. The experiment also found that their offspring showed signs of developing diabetes at a young age.
The findings are particularly relevant for overweight mothers, highlighting the importance of maintaining a normal weight before and during pregnancy.
Further research will examine how methods of intervention during breastfeeding can reverse bad nutritional habits and overeating.
Susie Burrell, a pediatric dietitian at The Children's Hospital at Westmead, said the study sent a powerful message to women planning to fall pregnant.
"They need to get their weight under control before conceiving, and those who are pregnant need to have minimum weight-gain during pregnancy," Ms Burrell said.
She said an increasing number of women were overweight before they fell pregnant, creating a "snowball effect".
"Their babies are more likely to have a high birth weight. This then leads to lifestyle diseases such as type 2 diabetes and heart disease."
Future Flyers: Pushing Forward for Personal Aircraft
By Tariq Malik
Ask science fiction fans about the future and no doubt they'll tell you it's full of flying cars.
The concept of personal flying machines seems to be as old as aviation itself, yet cars, trucks and buses still claim the transportation throne.
However, NASA researchers, as well as private aviation engineers, are working to develop the next step in personal air transportation.
The Personal Air Vehicle Exploration (PAVE) program at NASA's Langley Research Center in Hampton, Virginia, for example, is working to develop easy-to-use aircraft that may one day take you from your garage to the airport and on to your destination, saving time - and hopefully dollars - otherwise spent on a commercial flight.
"That's always been the dream," said Andrew Hahn, PAVE vehicle sector analyst at Langley, of flying cars. "Our plan is to have a flying demonstrator by [2009]."
While Hahn and PAVE manager Mark Moore work to develop roadable air vehicles, other Langley researchers in a separate project are hoping to streamline the interfaces between personal aircraft and the thousands of small airports across the country to serve future flyers.
Neither car, nor plane
The challenge of fusing cars with airplanes is that, in the end, the final vehicle tends to be inefficient on the road and in the air.
"It was heavier and costlier and not a particularly good car or airplane," Hahn said of the vehicular combination. "So at least with the foreseeable technology, it didn't look too attractive."
Meeting Department of Transportation regulations for road vehicles, he told SPACE.com, also adds 1,000 pounds of weight to a vehicle.
PAVE is aimed at developing a small aircraft capable of short jaunts on city streets, enough to get to a nearby airport and back, researchers said.
The first step is a demonstrator aircraft, called TailFan and expected to fly by the 2009 fiscal year, to tackle some of the biggest challenges to personal air vehicles. Among them are ease-of-use, environmental issues like noise and pollution, as well as affordability.
"Airplanes are just not affordable to the bulk of people," Hahn said. "And those who can afford them aren't buying them in droves."
For TailFan, PAVE researchers are taking a horse-like approach to navigation, building a system smart enough to know where to fly but allowing human aviators enough control to stay attentive in case of an emergency.
"It's sort of like this interesting middle ground...like having a robotic copilot," Hahn said. "Full autonomy is not going to be reliable enough for people to trust their lives to."
A roadable vehicle like the Gridlock Commuter is still about 10 years away, he added.
Smaller airport alternative
Another, potentially nearer-term flight alternative is also under study by LARC researchers.
Dubbed the Small Aircraft Transportation System (SATS) program, the project is developing some of the basic technologies needed to make the more than 5,000 small airports across the country a greater means of point-to-point transport.
"Nearly all of the people in this country live within 30 minutes of a small airport," said Jerry Hefner, SATS project manager at Langley, in a telephone interview. "We're working to move general aviation and small aircraft from enthusiasts to a mode of transportation."
SATS is primarily aimed at boosting volume operations at small airports without towers or radar, as well as developing new technologies and operating procedures to help pilots land in low visibility at minimally equipped airports. Increasing single-pilot safety with systems such as synthetic vision and heads-up displays, and integrating those small aircraft into the larger national airspace, are also project goals.
"The idea is that you won't have to leave home an hour early to get to the airport, then spend two hours waiting for security and check-in," Hefner said.
Developing that navigation and airport infrastructure is a vital component for the future of personal air travel, according to aircraft developers.
"Without it, the vehicle will be primarily a novelty," said Paul Moller, an aeronautical engineer spearheading development of the Skycar vertical takeoff and landing aircraft at Moller International in Davis, California. "A highway in the sky is a really critical tool."
SATS researchers are planning a flight demonstration of their technologies in 2005 at Danville Regional Airport in Danville, Virginia. The effort is a joint project with the Federal Aviation Administration (FAA), U.S. Department of Transportation and the National Consortium for Aviation Mobility.
Automobile aviators
The U.S. government has only approved two vehicles that could both drive on city streets and take to the air.
The first was the Airphibian, a 150-horsepower vehicle with a detachable tail and wings that could be removed for street driving, developed by Robert Fulton in 1946 and approved by the FAA's precursor, the Civil Aeronautics Administration. Moulton Taylor's Aerocar, with a body covered in fiberglass, also gained FAA certification and could fly at 120 miles (194 kilometers) per hour.
While both vehicles were flyable, they failed to find enough support to get off the ground commercially. Hahn even doubts the true roadability of both vehicles.
"The Airphibian is a three-wheeled concept - only needs to meet motorcycle regulations - which did work, but I doubt was fully roadable [at] 65 mph on a highway for hours on end," he said. "The Taylor Aerocar is fully roadable, but is not compliant with the [Department of Transportation's] regulations for cars."
Flying in your future
Many researchers and engineers see robust personal air transportation as an inevitable alternative to congested airport hubs and freeways frozen by traffic.
"People are going to stop driving if they can't get anywhere," Moller told SPACE.com.
In addition to Moller's Skycar, a number of private groups are pursuing their own versions of a personal flying machine. Engineers at Israel's Urban Aeronautics Ltd. are developing an aerial utility vehicle dubbed X-Hawk, while Huntsville, Alabama's MACRO Industries is hard at work on the SkyRider X2R.
If those vehicles catch on, Hahn said, they could even drive a wedge between rural and urban lifestyles, with people living on small airports and flying to their urban workplace a hundred miles away or more.
Many people already live in airport communities where aircraft taxiways start at the end of their driveways. In Florida's Spruce Creek community just south of Daytona Beach, for example, 4,500 people and 600 aircraft live in close quarters, with 1,500 people parked off nearby taxiways.
"There's just a lot of advantages to it," said Peter Rouse, Spruce Creek's airport manager and a longtime resident. "And it's a pleasant place to be, that's the main thing."
Ask science fiction fans about the future and no doubt they'll tell you it's full of flying cars.
The concept of personal flying machines seems to be as old as the aviation itself, yet cars, trucks and buses still claim the transportation throne.
However NASA researchers, as well as private aviation engineers, are working to develop the next step in personal air transportation.
The Personal Air Vehicle Exploration (PAVE) program at NASA's Langley Research Center in Hampton Virginia, for example, is working to develop easy-to-use aircraft that may one day take you from your garage to the airport and on to your destination, saving time - and hopefully dollars - otherwise be spent on a public flight.
"That's always been the dream," said Andrew Hahn, PAVE vehicle sector analyst at Langley, of flying cars. "Our plan is to have a flying demonstrator by [2009]."
While Hahn and PAVE manager Mark Moore work to develop roadable air vehicles, other Langley researchers in a separate project are hoping to streamline the interfaces between personal aircraft and the thousands of small airports across the country to serve future flyers.
Neither car, nor plane
The challenge of fusing cars with airplanes is that, in the end, the final vehicle tends to be inefficient on the road and in the air.
"It was heavier and costlier and not a particularly good car or airplane," Hahn said of the vehicular combination. "So at least with the foreseeable technology, it didn't look too attractive."
Meeting Department of Transportation regulations for road vehicles, he told SPACE.com, also adds a 1,000 pounds of weight to a vehicle.
PAVE is aimed at developing a small aircraft capable of short jaunts on city streets, enough to get to a nearby airport and back, researchers said.
The first step is a demonstrator aircraft, called TailFan and expected to fly by the 2009 fiscal year, to tackle some of the biggest challenges to personal air vehicles. Among them are ease-of-use, environmental issues like noise and pollution, as well as affordability.
"Airplanes are just not affordable to the bulk of people," Hahn said. "And those who can afford them aren't buying them in droves."
For TailFan, PAVE researchers are taking a horse-like approach to navigation, building a system smart enough know where to fly but allowing human aviators enough control to stay attentive in case of an emergency.
"It's sort of like this interesting middle ground...like having a robotic copilot," Hahn said. "Full autonomy is not going to be reliable enough for people to trust their lives to."
A roadable vehicle like the Gridlock Commuter is still about 10 years away, he added.
Smaller airport alternative
Another, potentially nearer-term flight alternative is also under study by LARC researchers.
Dubbed the Small Aircraft Transportation System (SATS) program, the project is developing some of the basic technologies to make use of the more than 5,000 small airports across as a greater means of point-to-point transport.
"Nearly all of the people in this country live in a 30 minutes of a small airport," said Jerry Hefner, SATS project manager at Langley, in a telephone interview. "We're working to move general aviation and small aircraft from enthusiasts to a mode of transportation."
SATS is primarily aimed at boosting volume operations at small airports without towers or radar, as well developing new technologies and operating procedures to help pilots land in low-visibility at minimally equipped airports. Increasing single-pilot crew safety, with systems such as synthetic vision, heads-up displays, and integrating those small aircraft into the larger national airspace are also project goals.
"The idea is that you won't have to leave home an hour early to get to the airport, then spend two hours waiting for security and check-in," Hefner said."
Developing that navigation and airport infrastructure is a vital component for the future of personal air travel, according to aircraft developers.
"Without it, the vehicle will be primarily a novelty," said Paul Moller, an aeronautical engineer spearheading development of the Skycar vertical takeoff and landing aircraft at Moller International in Davis, California. "A highway in the sky is a really critical tool."
SATS researchers are planning a flight demonstration of their technologies in 2005 at Danville Regional Airport in Danville, Virginia. The effort is a joint project with the Federal Aviation Administration (FAA), U.S. Department of Transportation and the National Consortium for Aviation Mobility.
Automobile aviators
The U.S. government has only approved two vehicles that could both drive on city streets and take to the air.
The first was the Airphibian, a 150-horsepower vehicle with a detachable tail and wings that could be removed for street driving; it was developed by Robert Fulton in 1946 and approved by the FAA's precursor, the Civil Aeronautics Administration. Moulton Taylor's Aerocar, with a body covered in fiberglass, also gained FAA certification and could fly at 120 miles (193 kilometers) per hour.
While both vehicles were flyable, they failed to find enough support to get off the ground commercially. Hahn even doubts the true roadability of both vehicles.
"The Airphibian is a three-wheeled concept - only needs to meet motorcycle regulations - which did work, but I doubt was fully roadable [at] 65 mph on a highway for hours on end," he said. "The Taylor Aerocar is fully roadable, but is not compliant with the [Department of Transportation's] regulations for cars."
Flying in your future
Many researchers and engineers see robust personal air transportation as an inevitable alternative to congested airport hubs and freeways frozen by traffic.
"People are going to stop driving if they can't get anywhere," Moller told SPACE.com.
In addition to Moller's Skycar, a number of private groups are pursuing their own versions of a personal flying machine. Engineers at Israel's Urban Aeronautics Ltd. are developing an aerial utility vehicle dubbed X-Hawk, while Huntsville, Alabama's MACRO Industries is hard at work on the SkyRider X2R.
If those vehicles catch on, Hahn said, they could even drive a wedge between rural and urban lifestyles, with people living on small airports and flying to their urban workplace a hundred miles away or more.
Many people already live in airport communities where aircraft taxiways start at the end of their driveways. In Florida's Spruce Creek community just south of Daytona Beach, for example, 4,500 people and 600 aircraft live in close quarters, with 1,500 people parked off nearby taxiways.
"There's just a lot of advantages to it," said Peter Rouse, Spruce Creek's airport manager and a longtime resident. "And it's a pleasant place to be, that's the main thing."
Trains of the future
The MAGLEV
There's something very unusual about these futuristic trains. They don't have any wheels! Instead, they float above the track. Maglev is short for magnetic levitation [mag-NET-ick lev-ee-TAY-shun], which means using magnets to make the train rise up from the track, and it means they can go faster. Some Maglevs have already been built in Germany and Japan, where trains have already run at an incredible 552 kph (343 mph) on a test track. It is expected that Maglev trains will reach speeds of up to 800 kph (500 mph) by 2020.
Train experts are proposing an even more futuristic development of Maglevs. It is envisaged that these 'floating' trains would run through vacuum tubes. A vacuum is a space without any air in it. It would mean trains could go faster - up to 3,000 kph (1,864 mph). Not only is it super-fast, but it also wouldn't take as much energy as other kinds of transport.
Vacuum tubes could be built all over the world - under the sea and across continents. A tube could be built under the Atlantic Ocean, and that would mean you could get from England to America in less than two hours (it takes 6 or 7 hours by plane). It would cost about £20 billion to build a tube like that.
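As a rough sanity check on that travel-time claim, here is a minimal back-of-the-envelope sketch in Python. The roughly 5,500 km London-to-New York distance is an assumption of the sketch, not a figure from the article; only the 3,000 kph speed comes from the text above.

# Rough check of the transatlantic vacuum-tube travel time.
distance_km = 5_500      # assumed London-New York great-circle distance
cruise_kph = 3_000       # top speed quoted above
hours = distance_km / cruise_kph
print(f"Nonstop crossing at {cruise_kph} kph: about {hours:.1f} hours")
# Prints ~1.8 hours, consistent with the "less than two hours" figure,
# ignoring the time spent accelerating and braking at each end.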
Tony Roche, who is the president of the Institution of Mechanical Engineers said "Some might see this as a pie-in-the-sky idea, but a lot can happen in 50 years."
This was written by Robin Gray of the National Railway Museum.
Monday, February 4, 2008
Solar shingles could be your home’s power source of the future
New advances in the production of photovoltaic solar panels allow them to be made in the style of roofing shingles. Instead of a big, clunky block of solar panels, the solar cells can be integrated to look like normal roof tiles, making your choice of power not only inconspicuous but also more pleasing to the eye.
The National Institute of Standards and Technology has been testing new photovoltaic panels in Maryland, calibrating their output and performance for widespread use in the future. All of the test models have areas in the roof where homeowners can nail or mount something without the circuitry being destroyed. The test results will be more of a technological study than a buyer's guide, yet they will surely influence what we see in the future as effective solar energy consumption. — Andrew Dobrow
Saturday, February 2, 2008
Honda's vision of the future -- a car powered by hydrogen
Michael Taylor,
The future of driving, if Honda has anything to say about it, came to a Monterey County race track Tuesday in the form of a dark red sedan that is slated to be the first fuel cell car on the planet to come off a production line.
The Honda FCX looks like a slightly futuristic version of a blend of cars, especially those made by Honda Motor Co. But by one particular yardstick, the car is special -- it doesn't run on fossil fuel. Instead, a fuel cell car uses hydrogen.
"This is the first purpose-built fuel cell vehicle to be put on the road in the hands of retail customers," said Stephen Ellis, fuel cell marketing manager for American Honda Motor Co. "It's not a car that is remade from some other platform."
Fuel cell cars have been made by several of the world's biggest carmakers, but by and large they were cobbled together from an existing gas- or electric-powered vehicle. Honda itself earlier made a homely looking fuel cell car, one of which has been in use by a Los Angeles family for more than a year.
Honda says that within two years it plans to produce and lease to the public an untold number of cars based on the concept car the company put on display Tuesday. Tentative plans call for leasing the car for perhaps $600 or $700 a month. Automakers typically lease experimental cars to the public rather than sell them outright as a way of retaining control of them.
On Tuesday, Honda rented Laguna Seca Raceway to show off the only two FCX cars the company says exist in the world. Reporters were allowed to take the cars -- each is worth as much as $2 million, according to industry insiders -- around a portion of the race track, past signs encouraging "acceleration," "braking" and other exhortations.
The car performed like any moderately sporty sedan. It is quiet, it has a low center of gravity, and it's relatively fast.
What makes the car unlike any other sedan is its fuel cell stack, a sandwich of plates that generate electricity through an electro-chemical process using a combination of hydrogen and oxygen. The front wheels are driven by an electric motor. The only emission is water vapor.
The hydrogen can be refined from a number of sources, including coal, natural gas and methane.
Being a concept car, the FCX at the race track was far from the finished product. Every time a driver mentioned a possible problem, the reply was that it's a concept car and the problem will be fixed when it's in regular production.
A fuel cell car in regular production? Honda knows it faces enormous barriers as it tries to introduce a completely new way to propel a car.
The biggest problem is where to fuel it. Gov. Arnold Schwarzenegger's long-touted "hydrogen highway" is behind schedule, said Honda's FCX product planner, Christine Ra.
Still, a few stations accommodate fuel cell cars, and more are planned, said Catherine Dunwoody, executive director of the California Fuel Cell Partnership, a group of companies that promotes the technology.
"There are 23 in California, mostly in Southern California," Dunwoody said Tuesday, "and 14 more are on the way. Most fuel cell cars fuel at one or two stations, and we need to move to the point where any car can find a station."
UC Davis environmental science Professor Joan Ogden, who specializes in fuel cells, said a study she has seen says that in the next 10 years, there will be a "roll-out of hydrogen cars and stations" in California.
Others think it will take longer.
"Fuel cell cars have real promise to do double duty -- help the climate and end our oil addiction," said David Friedman, research director for vehicle programs at the Union of Concerned Scientists in Washington, D.C. "But that future is 20 to 30 years away. All the car companies are working really hard to make fuel cell vehicles a reality, and they deserve praise. Yet there are real hurdles to overcome."
Friedman cited problems of making a fuel cell system start in minus-40 degree weather and making the systems as durable as possible.
"We have to get a fuel cell vehicle that is durable and cheap enough," Friedman said, "and make sure the hydrogen is clean enough. No one will cheer if, at the end of the day, we make all our hydrogen from coal and melt the planet."
As for the economics, Honda Vice President Ben Knight said a fuel cell car can get the equivalent of a gasoline-powered car's 65 miles per gallon. An FCX filled with 8.8 pounds of hydrogen can go about 270 miles, he said.
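Those two numbers are easy to cross-check. The sketch below is a rough estimate only; the rule of thumb that one kilogram of hydrogen carries about as much energy as one gallon of gasoline (both roughly 120 MJ on a lower-heating-value basis) is an assumption of the sketch, not something Honda states in the article.

# Back-of-the-envelope check of the FCX range and mileage figures.
POUNDS_PER_KG = 2.2046
tank_kg = 8.8 / POUNDS_PER_KG          # ~4.0 kg of hydrogen on board
range_miles = 270                      # range quoted by Honda
miles_per_kg = range_miles / tank_kg
# Treat 1 kg of hydrogen as roughly 1 gasoline-gallon equivalent (assumption).
print(f"About {miles_per_kg:.0f} miles per kg of hydrogen, "
      f"or roughly {miles_per_kg:.0f} mpg gasoline-equivalent")
# ~68 mpg-equivalent, in line with the 65 mpg figure Knight cites.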
One unknown is how much a hydrogen retailer -- probably one of the big oil companies -- would charge for hydrogen. Honda also is developing a home refueling station that draws natural gas from a home's utility supply and processes it for hydrogen use.
Then there is the real-world question of what a fuel cell car is like when you have one, day in and day out. Jon Spallino knows.
In June 2005, American Honda began leasing a 2005 Honda FCX to Spallino, a 41-year-old Redondo Beach businessman with a wife and two daughters. The Spallinos became what apparently is the only American family to use a fuel cell car every day, for such things, Spallino says, as "going to the shopping center, to the soccer field and to ballet lessons."
Asked what stood out, Spallino said, "the lack of trouble. I expected technical problems. All that happened was one flat tire."
He said he fills up the car about once a week at Honda's U.S. headquarters in Torrance, and otherwise it behaves like a normal car. Except that he does get a lot of attention, given that "Honda Fuel Cell Powered FCX" is written in giant letters on the side of the car.
"I finally ended up carrying a stack of brochures explaining the car," Spallino said. "All of that was part of the fun."
Fuel cells: electric power from hydrogen fuel
Fuel cells create electricity through an electrochemical process that combines hydrogen and oxygen. Vehicles running on fuel cells would need to be supplied with gaseous hydrogen extracted from a hydrocarbon fuel, such as coal, natural gas, or methane. Honda is developing a home refueling station that draws gas from the home's utility supply and processes it for hydrogen use.
How fuel cells work
Hydrogen fuel is fed into the anode of the fuel cell. Helped by a catalyst, hydrogen atoms are split into electrons and protons.
Electrons are channeled through a circuit to produce electricity.
Protons pass through the proton exchange membrane.
Oxygen enters the cathode and combines with the electrons and protons to form water.
Water vapor and heat are released as byproducts of the reaction.
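For readers who want a number behind those steps, here is a minimal sketch of the textbook calculation for a single cell's ideal voltage. The constants are standard physical-chemistry values (the Gibbs free energy of forming liquid water and the Faraday constant), not figures taken from the article.

# Ideal (reversible) voltage of one hydrogen fuel cell:
# H2 + 1/2 O2 -> H2O(liquid), at 25 C and standard pressure.
GIBBS_J_PER_MOL = 237_100        # |delta G| for the reaction, J per mol of H2
ELECTRONS_PER_H2 = 2             # each H2 releases two electrons at the anode
FARADAY = 96_485                 # charge of one mole of electrons, coulombs
ideal_volts = GIBBS_J_PER_MOL / (ELECTRONS_PER_H2 * FARADAY)
print(f"Ideal cell voltage: {ideal_volts:.2f} V")   # ~1.23 V
# Real cells deliver well under a volt at useful current, which is why a
# vehicle "stack" strings hundreds of cells in series.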
Electric Cars
VOLT FROM THE BLUE General Motors made a huge splash at the Detroit auto show in January when it unveiled its Chevy Volt concept. The car is designed to run on electricity alone, with batteries that can recharge either via an external outlet or an onboard gasoline engine, delivering 150 miles per gallon for drivers who commute up to 60 miles a day.
TECH Hybrid cars, which mate battery-powered electric motors with internal-combustion engines, are available now from Honda, Toyota and Ford, and most other manufacturers have models on the way. All-electric cars, including the Tesla Roadster, are having a harder go of it, mostly because of the limitations of current battery technology. The Chevy Volt concept, unveiled this year, cleverly straddles the fence between hybrid and all-electric cars. The Volt uses a 1.0-liter, three-cylinder gasoline engine only to charge a lithium-ion battery pack (yet to be developed) that powers the 120-kilowatt (160-horsepower) motor. Owners could also charge the Volt using a household outlet.
GREEN BENEFITS The Toyota Prius, a conventional hybrid, gets between 45 and 50 miles per gallon and produces half the amount of greenhouse-gas emissions of 30mpg sedans. If GM's Volt happens (battery technology won't be ready until at least 2010), it could drive on electricity alone for about 40 miles. That 40-mile-or-less drive describes 78 percent of American commuters. Even if the Volt drew its power from the dirtiest coal-fired plants, it would still produce less than half the emissions of a typical new car.
ECONOMICS Over 15,000 miles, the Prius costs an estimated $650 to fuel, compared with $1,300 for the 27mpg Toyota Camry. Driving the Volt 15,000 miles on electricity alone would cost only about $300 on the electric bill. Owners who drove 60 miles a day would see 150 miles per gallon and an average annual fuel cost of $116. But delivering a family car like the Volt at a price people could swallow—figure $25,000 in today's dollars—will be an enormous challenge.
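The $300 electricity figure is straightforward to reproduce. In the sketch below, both inputs are assumptions rather than numbers from the article: roughly 0.2 kWh per mile of electric driving and about 10 cents per kilowatt-hour for household electricity.

# Rough reconstruction of the "$300 for 15,000 electric miles" estimate.
kwh_per_mile = 0.2        # assumed consumption (~8 kWh for a 40-mile charge)
price_per_kwh = 0.10      # assumed residential electricity price, dollars
annual_miles = 15_000
annual_cost = annual_miles * kwh_per_mile * price_per_kwh
print(f"Estimated electricity cost for {annual_miles:,} miles: ${annual_cost:.0f}")
# Prints ~$300, matching the article's estimate.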
OUTLOOK The first plug-in hybrids won't arrive before 2011; lithium-ion batteries have to get stronger, smaller and much cheaper to make the Volt viable and affordable for the masses. In the meantime, conventional hybrids will continue to dominate the alt-fuel market.
EST. MARKET SHARE IN 2027: 30%
Future Fuel (Bio Diesel)
TECH Cleaner-burning than ethanol, with vastly better fuel economy, biodiesel could soon move beyond the fryer-grease fringe it's often associated with. Made largely from soybean oil or recycled cooking oil, biodiesel runs fine in unmodified diesel engines at up to a 20 percent blend with 80 percent petroleum diesel, a combination known as B20. Mercedes is conservative with its warrantied vehicles, recently announcing that drivers of Bluetec and CDI diesels could run B5; the same is true for Jeep's Liberty and the upcoming Grand Cherokee diesel. (In Europe, Citroën and Peugeot diesels can run up to B30.) Pure liquid biodiesel thickens at low temperatures, however, creating challenges for cold-climate storage and operation.
GREEN BENEFITS Biodiesel's greenhouse-gas emissions are about one third lower than gasoline's. A University of Minnesota study found that biodiesel creates 93 percent more energy than is used to produce it, compared with just 25 percent for ethanol.
ECONOMICS B20 fuel contains only 2 percent less energy than regular diesel, so it delivers terrific mileage: 20 to 40 percent better than gasoline. Prices vary by region because of the limited supply sources, and it costs more than either gasoline or petroleum diesel.
OUTLOOK Despite efforts toward large-scale biodiesel production, for now it's a mere drop in a huge national reservoir of gasoline. Just 225 million gallons were produced in the U.S. last year, about as much gasoline as the nation guzzles in a day. There are only about 1,000 pumps nationwide, and any growth in that number depends on a rise in diesel usage. Biodiesel has better prospects in Europe, where diesel holds half the car market and the European Union produces nearly 90 percent of the world's biodiesel.
EST. MARKET SHARE IN 2027: 4% (B20)
Friday, February 1, 2008
Microsoft wants to buy Yahoo! for $44.6 billion
The proposed US$31 per share represents a 62% premium over the current price.
Microsoft already has a plan for integrating the two companies' employees.
Microsoft made a proposal this Friday (the 1st) to purchase Yahoo! in a deal valued at $44.6 billion. The company's objective is to increase its competitiveness in the market for online services, and especially in search.
The proposed US$31 per share represents a 62% premium over yesterday's closing price of Yahoo! shares on the New York Stock Exchange, $19.18. On the news, from around 10 a.m. (Brasilia time), Yahoo! shares surged 54% in pre-market trading on Wall Street, quoted at $29.70, while Microsoft slipped 2%, to $31.95.
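The 62% premium follows directly from the two share prices in the report; a one-line check:

# Premium implied by Microsoft's offer over Yahoo!'s prior close.
offer = 31.00           # offer per share, dollars
prior_close = 19.18     # Yahoo! close on the New York Stock Exchange
premium = offer / prior_close - 1
print(f"Premium over the prior close: {premium:.0%}")   # ~62%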
In a statement, the software giant said the agreement would create a more efficient company, with total synergies of $1 billion a year, generating more value for advertisers along with operating efficiencies.
Microsoft also revealed that it has already developed a plan to integrate the two companies' employees.
Microsoft expects to obtain regulatory approval for the deal, with completion planned for the second half of 2008.
"We have great respect for Yahoo, and together we can offer an increasingly attractive range of solutions to consumers, publishers and advertisers, while positioning ourselves better to compete in the market for online services," said Steve Ballmer, CEO of Microsoft.
Revitalising the company
The announcement was made before U.S. markets opened, one day after former Yahoo! chief executive Terry Semel left the company's board.
Semel's departure came after Yahoo! announced a plan to cut 1,000 jobs as part of an effort to revitalize the company.
Yahoo! co-founder Jerry Yang had replaced Semel as chief executive with the aim of boosting the Californian firm's profits and share price.
Yahoo! suffered a drop in profits in the fourth quarter of 2007 and for the year as a whole, and warned that 2008 would also be difficult as it undergoes a reorganization to boost its main source of income, advertising sales.
The company recorded a fourth-quarter net profit of $205.7 million, a decline of 23.5%, and a full-year profit decline of 12.1%, to $660 million.
Currently its competitor Google takes in more than 32% of worldwide Internet advertising revenue, compared with less than 20% for Yahoo, even though just two years ago the two groups held very similar positions.
Yahoo! was founded in 1994 by Stanford University students Jerry Yang and David Filo. The company is headquartered in Sunnyvale, California.
Http://g1.globo.com/Noticias/Economia_Negocios/0,, MUL283125-9356 ,00-MICROSOFT + + BUY OR YAHOO + O + + + PO + US BILHOES.html
Carbon Capture And Storage To Combat Global Warming Examined
ScienceDaily (Jun. 12, 2007) — While solar power and hybrid cars have become popular symbols of green technology, Stanford researchers are exploring another path for cutting emissions of carbon dioxide, the leading greenhouse gas that causes global warming.
Carbon capture and storage, also called carbon sequestration, traps carbon dioxide after it is produced and injects it underground. The gas never enters the atmosphere. The practice could transform heavy carbon spewers, such as coal power plants, into relatively clean machines with regard to global warming.
"The notion is that the sooner we wean ourselves off fossil fuels, the sooner we'll be able to tackle the climate problem," said Sally Benson, executive director of the Global Climate and Energy Project (GCEP) and professor of energy resources engineering. "But the idea that we can take fossil fuels out of the mix very quickly is unrealistic. We're reliant on fossil fuels, and a good pathway is to find ways to use them that don't create a problem for the climate."
Carbon capture has the potential to reduce more than 90 percent of an individual plant's carbon emissions, said Lynn Orr, director of GCEP and professor of energy resources engineering. Stationary facilities that burn fossil fuels, such as power plants or cement factories, would be candidates for the technology, he said.
Capturing carbon dioxide from small, mobile sources, such as cars, would be more difficult, Orr said. But with power plants comprising 40 percent of the world's fossil fuel-derived carbon emissions, he added, the potential for reductions is significant.
Not only can a lot of carbon dioxide be captured, but the Earth's capacity to store it is also vast, he added.
Estimates of worldwide storage capacity range from 2 trillion to 10 trillion tons of carbon dioxide, according to the Intergovernmental Panel on Climate Change (IPCC) in its report on carbon capture and storage. Global emissions in 2004 totaled 27 billion tons, according to the U.S. Department of Energy's Energy Information Administration.
If all human-induced emissions were sequestered, enough capacity would exist to accommodate more than 100 years' worth of emissions, according to Benson, coordinating lead author of the IPCC chapter on underground geological storage.
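Dividing the capacity range by the 2004 emissions figure gives a rough sense of the timescales involved. This is only a sketch of the arithmetic behind Benson's statement; the observation that stationary sources account for roughly 40 percent of fossil-fuel emissions comes from Orr's figure above.

# Years of emissions that the quoted storage-capacity range could hold.
capacity_low_tons = 2e12       # IPCC low estimate, tons of CO2
capacity_high_tons = 10e12     # IPCC high estimate, tons of CO2
emissions_2004_tons = 27e9     # global CO2 emissions in 2004, tons
years_low = capacity_low_tons / emissions_2004_tons
years_high = capacity_high_tons / emissions_2004_tons
print(f"Roughly {years_low:.0f} to {years_high:.0f} years of total "
      "2004-level emissions would fit in the estimated capacity")
# ~74 to ~370 years. Since only stationary sources (about 40 percent of
# fossil-fuel emissions) are practical capture candidates, the same
# capacity would stretch considerably further in practice.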
With fossil fuels already comprising 85 percent of the world's energy consumption, and their use rapidly increasing due to the growth of developing countries, such as China and India, the need to find solutions to curb carbon emissions becomes even more crucial, Benson said.
From the air to the earth
In the capture process, carbon dioxide is extracted from a mix of waste gases. The most common method sends the exhaust through a chimney containing a three-dimensional mesh. As the gas goes up, a chemical solvent drizzles down, soaking up the gas where the two substances meet. The carbon dioxide is then extracted from the liquid and compressed, now ready for storage.
The best storage options today lie in geologic sequestration: storage in old oil fields, natural gas reservoirs, deep saline aquifers and unminable coal beds, hundreds to thousands of meters underground.
The carbon dioxide is pumped down through wells, like those used to extract oil, and dissolves or disperses in its reservoir.
Viable locations must have a caprock, or an impermeable layer above the reservoir shaped like an upside-down bowl, that traps the gas and keeps it from escaping, the researchers said.
Safety smarts
"The goal of carbon sequestration is to permanently store the carbon dioxide," Benson said, "permanent meaning very, very long-term, geological time periods."
The greatest concern surrounding carbon dioxide storage is the potential for it to leak, researchers said.
The most obvious worry, said Benson, is that leakage would lead to more global warming, defeating the purpose of storage in the first place.
"People think, it would have been sort of sad going through all this trouble," said Tony Kovscek, associate professor of energy resources engineering and a researcher on a GCEP project on carbon sequestration in coal.
But studies have shown that leakage, if it happened at all, would be insignificant, Benson said. The IPCC reported that 99 percent retention of the carbon dioxide that is stored would be "very likely" over 100 years and "likely" over 1,000 years, she said.
"If you do it right, if you select the site correctly and monitor, it can be near permanent," Benson said.
Of greater concern to the researchers are the potential risks of carbon sequestration to human health, mainly through asphyxiation and groundwater contamination.
The threat of asphyxiation, or suffocation due to carbon dioxide displacing oxygen, is very low, the researchers said, because of the unlikelihood of a rapid leakage, which would have to occur to cause a problem.
Drinking water contamination, Benson said, is the more probable danger. For example, if carbon dioxide enters the groundwater somehow, it can increase the water's acidity, potentially leaching toxic chemicals, such as lead, from rocks into the water, she said.
To address these risks, scientists are studying reservoir geology to better understand what happens after injecting carbon dioxide underground.
"You need to carefully select places that won't leak, and do a good job of engineering the injection systems and paying attention to where the carbon dioxide is actually going," Orr said.
While a thorough technical understanding of the risks will reveal best practices, the scientists also stressed the need for good management to see that proper procedures are followed.
Benson points to a familiar technology as a model for thinking about and tackling risk.
"People often ask, is geological storage safe? It's a very difficult question to answer. Is driving safe?" she expounded. "You might say yes or no, but what makes driving something we're willing to do? You get automakers to build good cars, we have driver training, we don't let children drive, we have laws against drunk driving - we implement a whole system to ensure that the activity is safe."
Policy and progress
Engineers have more than three decades of experience putting carbon dioxide into oil reservoirs, where it increases oil production by making the oil expand and "thin out" such that it flows more easily, Benson said.
"That experience gives us confidence that we know how to drill the wells, push the [carbon dioxide] in and say something about what will happen when it gets down there," said Orr.
Currently, three industrial-scale projects are pumping millions of tons of carbon dioxide into the ground every year. Two of them represent the first efforts at storage in deep saline aquifers.
A Stanford team also has begun researching storage of carbon dioxide in deep coal beds. In coal, chemical bonds form between the carbon dioxide and the coal, making the method potentially more secure than others, the researchers said.
Even better, the process can free natural gas that sits on the coal's surface. Natural gas is a relatively clean fossil fuel, which can then be burned in place of coal, said Mark Zoback, professor of geophysics and a researcher on the project on storage in coal.
The project, which is funded by GCEP and GEOSEQ (a partnership involving the Department of Energy, several national labs, government groups and industry partners), is still in its early stages, the researchers said.
Of all the projects, only one is turning a profit without recovering oil. Sleipner, an industrial-scale project run by Norwegian oil company Statoil, injects carbon dioxide into a deep saline aquifer beneath the North Sea floor.
Its economic success, scientists say, is due to the presence of Norway's high carbon taxes, which give green technologies an advantage by discouraging carbon emissions.
Carbon taxes are charged to a company for every ton of carbon dioxide it emits, so that it becomes increasingly costly to be dirty. Thus the taxes encourage companies to be green.
When a clean technology is expensive (incorporating carbon capture and storage into a power plant costs $30 to $70 per ton of carbon dioxide), taxes on emissions level the playing field and help make it viable.
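To put that cost range in household terms, the sketch below converts it to cents per kilowatt-hour of coal-fired electricity. The emission factor of roughly one kilogram of carbon dioxide per kilowatt-hour for a typical coal plant is an assumption of the sketch, not a figure from the article.

# What $30-$70 per ton of captured CO2 adds to the cost of coal power.
co2_tons_per_kwh = 0.001        # assumed ~1 kg CO2 per kWh from coal
for cost_per_ton in (30, 70):
    added_cents = cost_per_ton * co2_tons_per_kwh * 100
    print(f"At ${cost_per_ton} per ton: about {added_cents:.0f} cents added per kWh")
# Roughly 3 to 7 cents per kWh, a significant premium on typical generation costs.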
A policy framework, therefore, is essential for making carbon capture and storage economical, the Stanford researchers said.
"We need thousands of projects," Benson said. "That's the kind of thing that will only happen if there are global policies to address these issues. That's the number one critical thing."
With the proper development, Benson believes that carbon sequestration could be ripe for industry in the next 20 years.
'A family of solutions'
Critics of carbon sequestration argue that the technology will divert attention from research on long-term clean energy options, such as renewable power. Worse, they fear it will prolong fossil fuel use, if fossil fuels from some stationary sources can be used more cleanly.
But the researchers continually emphasize the need to adopt other technologies in addition to carbon sequestration.
"Geological sequestration is going to be one of a family of solutions for addressing the greenhouse gas issue," said Zoback.
Energy efficiency and renewable energy are already feasible today and also can define the long-term energy picture, he said.
"[Carbon dioxide] sequestration, on the other hand, is only a bridge technology," he added. "Maybe we have another hundred years of using fossil fuels, and then we'll be on to better and smarter things, one hopes. If we're going to be creating greenhouse gases for another hundred years, it's a huge problem right now, so you have to get on this point. But nonetheless, our dependence on fossil fuels is not going to last forever."
Adapted from materials provided by Stanford University.
Thursday, January 31, 2008
Small is Beautiful Nanotechnology
Nanotechnology is the science of construction on scales of a billionth of a metre. It involves making things using beams, girders, pumps and wheels just one millionth of a millimetre long. Microelectromechanical machines with parts a thousandth of a millimetre across are now made by the million, and sold for use as sensors in such things as airbags, computer joysticks and inkjet printers - being so small makes them extraordinarily sensitive to movement. But compared to what's coming, this is crude. The future could be 1,000 times smaller and 1,000 times more unpredictable.
In 1959 the Nobel prize winner Richard Feynman proposed, almost jokingly, that there was "plenty of room at the bottom"; that things could be made very small, an atom at a time. K Eric Drexler, in a 1986 classic called Engines of Creation, mapped out the possibilities of a nanotechnological world describing self-replicating machines the size of molecules that could do whatever you want.
Realists point out that some of these things already exist anyway: the manufactured ones are called drugs and the self-replicating ones are called immune cells. But the implications of nanotechnology go further: little nanosubmarines that would roam around your body repairing tissues and preventing heart disease, or computers in your ballpoint pen that will blink when the ink gets low. The possibilities, like the tools themselves, are endless.
Colin Humphreys, professor of materials science at Cambridge University, argues that one of the most interesting things about small lumps of matter is that their properties change dramatically as the samples shrink. "Silicon is a good example. Bulk silicon doesn't emit light. But if you make silicon very small, it emits light. It's a fundamental change in its properties that occurs when you get to 2-3 nanometers," he says.
Explanations for such phenomena lie in the realm of quantum physics, where matter in bulk can have hard, toe-stubbing solidity while the same substance on the atomic scale can seem so much empty space and random possibilities. Now physicists are creating semiconductor wafers only molecules thick to build what they call "quantum wells" and "quantum dots" for ever smaller, faster computers. Other potential payoffs include light-emitting diodes so efficient and so durable that they could one day cut electric lighting costs by 80%, while nanotechnology will also have huge benefits in the area of keyhole surgery.
Douglas Philp, of Birmingham University, believes in looking at what nature does, and learning from it. He argues that replicating the way molecules work in life could be used to make things perfectly - no polluted reactions, no untidy catalysts, no faulty molecules. "The ultimate nanotechnologist is, in fact, life itself. We could spend hours discussing why life is very good at doing certain things on the nanometer scale," he says. "What we are saying is: okay, let's try and learn how nature does it, and apply that."
"You can apply the same Darwinian principles in chemistry that you can apply in biology," Philp says. "You could imagine a coffee cup that got better at keeping the coffee warm, because you are challenging it when you put the coffee in. As long as you have some selection criteria, the system will actually evolve."
What it will evolve into nobody knows: computers that will assemble themselves from buckets of goo as soon as you download the software; superweapons; the end of world hunger; the colonisation of the asteroids; extended lifetimes - anything you want. But Philp sounds a note of caution. "My personal view is that we are not, in the next 20 years, going to be carrying computers the size of cigarette lighters. There are some market issues here as well. How is it an advantage to have a computer that small? I don't know. My computer is quite small enough as it is. How fast is a computer? I mostly use mine to type papers."
Source: The Guardian
Nanotechnology: Why It Matters
Interest in nanotech is strong because standard silicon techniques have nearly reached their limit--CPUs and similar products can't get much smaller with current technology because makers can't keep stuffing more and more transistors in the same space. With nanotech, they can.
Materials shrunk to a few billionths of a meter go crazy. Magnets demagnetize, and conventional techniques of semiconductor information processing--used for everything from storing data to moving bits and bytes around your PC--don't work. But though the rules change, they can be exploited in ways that offer more, not less, functionality and speed. And it will all eventually cost less, too.
This is the world of nanotechnology, and you're already starting to live in it. "The whole trillion-dollar information technology industry is based on the continuing drive of miniaturization," says Thomas Theis, director of physical sciences at IBM's Watson Research Center. Imagine, he says, how big that economy can be when you can get a million times the complexity of today's information systems for the same dollars.
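The "million times the complexity" figure is easy to sanity-check: a million-fold gain is about twenty doublings, since 2^20 is roughly 1.05 million, so the open question is how long each doubling takes. The snippet below just does that arithmetic; the doubling cadences are assumptions for illustration, not numbers from the article.

```python
import math

def years_to_gain(gain, doubling_period_years):
    """Years needed to reach a given complexity gain at a fixed doubling cadence."""
    doublings = math.log2(gain)  # log2(1e6) ~= 19.93 doublings
    return doublings * doubling_period_years

# Assumed cadences (years per doubling), purely illustrative.
for period in (1.5, 2.0, 3.0):
    years = years_to_gain(1_000_000, period)
    print(f"doubling every {period} years -> about {years:.0f} years to a million-fold gain")
```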
Nanotech research by government and private industry promises to create breakthroughs across information technology--creating dramatically faster, smaller, and cheaper devices that will permit ubiquitous computing, some forms of which we haven't conceived of yet--along with enhancing just about everything else humans make.
"We and others are using nanotechnology to create smaller and smaller chips that have more and more power and communicate with everything around them," comments Nantero CEO Greg Schmergel. "Everything in your home and office and car will have intelligence and the information you need."
How Robots Will Affect Future Generations by Brian Huse
What does the future hold for robot applications? How will robots affect society in five years, 10 years, 20? These are typical questions received by the Robotic Industries Association. What follows is a look forward, based on correspondence I recently sent to a student, that addresses in a small way a very big question: ''How will robots affect future generations?''
Robots in Your Every Day Life
Let's start with life as we know it. Did you know that your life is affected virtually every day by robots?
If you ride in a car, an industrial robot helped build it. If you eat cookies, such as the Milano brand from Pepperidge Farm, there are robot assembly lines to help make and pack them. The computer you use to send e-mails and use for research almost certainly owes its existence, in part, to industrial robots. Industrial robots are even used in the medical field, from pharmaceuticals to surgery.
From the manufacturing of pagers and cell phones to space exploration, robots are part of the everyday fabric of life.
Robots: Past and Present
Thirty years ago, a person who pondered robots would probably never have guessed that robot technology would be so pervasive, and yet so overlooked. A 19-year-old author named Isaac Asimov, who in 1939 started writing science fiction about humanoid robots, inspired some of the first popular notions about robots. Before him it was Karel Capek, a Czech playwright, who coined the word 'robot' in his 1921 play ''R.U.R.'' And even in millennia past, some folks conceived of artificial people built of wire and metal, even stone, known by some as ''automatons,'' or manlike machines.
Today, robots are doing human labor in all kinds of places. Best of all, they are doing the jobs that are unhealthy or impractical for people. This frees up workers to do the more skilled jobs, including the programming, maintenance and operation of robots.
A simplified definition of a robot is that it must be a device with three or more axes of motion (e.g. shoulder, elbow, wrist) and an end effector (tool), and that it may be reprogrammed for different tasks. (This disqualifies most of the toy ''robots'' sold at stores.)
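That simplified definition translates almost directly into a checklist. The sketch below just encodes the three tests -- three or more axes of motion, an end effector, and reprogrammability; the Device type and the example devices are hypothetical, made up to show how the criteria sort industrial machines from toys.

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    axes_of_motion: int      # independently controllable joints/axes
    has_end_effector: bool   # a tool mounted at the end of the arm
    reprogrammable: bool     # can be given new tasks in software

def is_robot(d: Device) -> bool:
    """The article's simplified test: 3+ axes, an end effector, reprogrammable."""
    return d.axes_of_motion >= 3 and d.has_end_effector and d.reprogrammable

# Hypothetical examples, not products named in the article.
welding_arm = Device("6-axis welding arm", axes_of_motion=6,
                     has_end_effector=True, reprogrammable=True)
toy = Device("toy 'robot'", axes_of_motion=2,
             has_end_effector=False, reprogrammable=False)

print(is_robot(welding_arm))  # True
print(is_robot(toy))          # False -- fails the simplified definition
```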
Robots that work on cars and trucks weld and assemble parts or lift heavy ones -- the types of jobs that risk injury to the back, arm or wrist -- or they work in environments filled with hazards like excessive heat, noise or fumes: dangerous places for people. Robots that assemble and pack cookies or other foodstuffs do so without the risk of carpal tunnel injury that their human counterparts face. Robots that make computer chips work at dimensions so tiny that a person couldn't even do some of the precision work required.
In the health industry, robots are helping to research and develop drugs, package them, and even assist doctors in complicated surgery such as hip replacements and open-heart procedures. The main reason robots are used in any application is that they do the work so much better: quality and production improve vastly, or costs come down, so that companies can be the best at what they do while keeping workers safe.
Robots Keep the Economy Rolling
High-quality products can lead to higher sales, which means the company that uses technology like robots is more likely to stay alive and vital, which is good for the economy. In addition to improving quality, robots improve productivity, another key element to economic health.
To think about how robots might affect future generations, consider what happened a few hundred years ago when the industrial revolution began. For instance, in 1794 Eli Whitney invented the cotton gin, and later the concept of interchangeable parts for mass production of manufactured products. His inventions spurred growth in the United States, increased productivity in a variety of industries, and created more job opportunities as companies throughout the world adopted his technology and ideas.
In 1865 John Deere invented the cast steel plow blade, giving farmers a tool to greatly increase productivity. The light bulb came in 1880. The airplane appeared in 1906. Assembly lines, TVs, plastics, and many other inventions came in the decades to follow, further changing the face of the industrialized world.
In 1961, Joseph Engelberger sold the first industrial robot to General Motors Corporation, where it performed machine loading and unloading duties in an environment that was hot and dirty, and in fact dangerous to humans. That was 40 years ago...before personal computers and the Internet. A lot of technology evolved that helped make the industrial robot the affordable, successful machine it is today.
A Future in Service Robots?
Who knew all the effects the robot would have? Maybe Mr. Engelberger, often referred to as the ''Father of Robotics,'' could foresee much of what was to come. He eventually sold his company, Unimation, and became a pioneer in service robots -- a sector of robotics still in its infancy, but one predicted to eventually exceed the market for industrial robots. He lectures even today that service robots must meet the following criteria to succeed:
Magnificent physical execution (they have to be really, really good at what they do);
Sensory perception (one or more of the five senses, like sight, touch, etc.);
A ''quasi-structured'' living environment (things have to be predictable);
Prior knowledge of their environment and duties (programmed with expert skills and knowledge);
A good cost/benefit standard (reasonable cost compared to expected duties).
These are high standards indeed! Most people can do service tasks very efficiently compared to any current robotic alternative. Most service robots would cost far more than human labor does at this time (although Mr. Engelberger did demonstrate a successful business model for a cost-effective system for hospital robot ''gofers'' when he created the HelpMate company).
The opportunity for robotics arises when you ask whether there are enough skilled people to do certain tasks at a reasonable price -- elder care, for example, an industry greatly lacking in skilled labor. Much thought has been put into the development of robotic helpers for the infirm and elderly.
Untapped Robot Applications Abound
According to the RIA, 90% of companies with manufacturing applications suited to robotics have not installed their first robot. Yet more than 115,000 robots are installed in the U.S. today, making it second only to Japan. Material handling and assembly are among the leading applications poised for growth within the robotics industry.
The future for robots is bright. But how will robots affect future generations? Sometimes you can get ideas about the future by looking into the past and thinking about the changes we've seen as a result of other great inventions, like the cotton gin, the airplane or the Internet. Perhaps one day we will have true robotic ''helpers'' that guide the blind or assist the elderly. Maybe they'll be modular devices that can switch from lawn mower to vacuum cleaner to dishwasher and window washer.
Maybe one day ''robots'' will be so small they will travel through your blood stream delivering life-saving drugs to eliminate disease. Perhaps they will have a major role in the educational and entertainment industries. Law enforcement and security may become major users of robotics. (Robots already have been deployed for such hazardous tasks as bomb disposal, hostage recovery, and search and rescue operations, including at the World Trade Center.)
Certainly, robots will always have a role in manufacturing. They are invaluable to the trend of product miniaturization, and they provide an economical solution for manufacturing the high-quality products mandated for success in a global economy.
Industrial robots are somewhat underrated in today's society, but the world owes much to the productivity and quality measures imparted by robotics. Their effect on future generations may well be the assistance they provide in manufacturing faster computers, more intelligent vehicles and better consumer and health products.
Donald A. Vincent, Executive Vice President of RIA and a 25-year veteran of the industry, wrote this assessment of the future of robots in the Handbook of Industrial Robotics:
''After a quarter-century of being involved with robotics, I have concluded that the robotics industry is here to stay. And robotics does not stop here. Sojourner (was) the first, but certainly not the last, intelligent robot sent by humans to operate on another planet, Mars. Robotics, robots, and their peripheral equipment will respond well to the challenges of space construction, assembly, and communications; new applications in agriculture, agri-industries, and chemical industries; work in recycling, cleaning, and hazardous waste disposal to protect our environment and the quality of our air and water; safe, reliable and fast transportation relying on robotics in flight and on intelligent highways. Robotics prospered in the 1900s; it will thrive and proliferate in the twenty-first century.''