Youth from impoverished urban areas in Kenya, Tanzania and Ghana will join a climb to the summit of Mount Kilimanjaro to raise awareness of climate change.
The climb, which kicks off on 28 February, is the fourth annual ascent to the "rooftop of Africa" organised by the Kilimanjaro Initiative, a Nairobi-based non-governmental organisation, in partnership with the United Nations. This year, ten underprivileged youth will join 25 other people from the private and public sectors in a group led by Timothy Challen, the founder of the Kilimanjaro Initiative.
Timothy Challen founded the Kilimanjaro Initiative in 2005 after he was shot during an armed robbery in Nairobi, Kenya. Following extensive surgery and a long period of recovery back home in Switzerland, Tim returned to East Africa with a desire to help create a safer urban environment.
This year, the aim of the climb is to raise awareness of the dangers linked to climate change by highlighting the need to protect our environment and the importance of providing greener, safer and better communities for all. Upon reaching the summit under the banner of the UN's global UNite to Combat Climate Change campaign, participants will use a satellite telephone to call UN Secretary-General Ban Ki-moon.
Rising temperatures, increased precipitation and extreme weather conditions will have a direct impact on where and how people live. Not only will the economic and public infrastructures of communities be affected, but urban centres may also be dangerously stretched in order to accommodate displaced populations. Consequently, social ills such as unemployment, lack of educational facilities, inadequate health care systems and criminal activities may increase in urban centres.
On 25 February 2009, the Kilimanjaro Initiative officially began its operations as a Kenyan-based International NGO with the support of an international network of committed partners including UN entities such as the UN Federal Credit Union (UNFCU), the UN Environment Programme (UNEP), the UN Human Settlements Programme (UN-HABITAT) and the UN Office on Sport for Development and Peace (UNOSDP) as well as civil society organisations including the NGO 'Play Soccer'.
Since 2006 the Kilimanjaro Initiative has brought close to 100 people to the summit of Mount Kilimanjaro, including young people, athletes, musicians and members of the private and public sectors. The Initiative's main objective is to encourage young people to have self-belief and to assist in providing opportunities that will enable them to take on a constructive role in their communities. The climb of Mount Kilimanjaro exemplifies how sport can be used as a tool towards the development of a community. The mountain reminds participants that in order to overcome adversity they must understand their environment, unite as a team and persevere.
Wangari Maathai Calls On Armies to Join the Billion Tree Campaign
The world's armies and UN peacekeepers around the globe should join the Billion Tree Campaign as it strives to reach its target of seven billion trees planted by the end of 2009, according to Nobel Peace Laureate Wangari Maathai.
Speaking during the UN Environment Programme (UNEP)'s Governing Council meeting, Wangari Maathai, who is the co-patron of the Billion Tree Campaign, appealed to Heads of State around the world.
"Imagine all soldiers marching for the planet," the Nobel Peace Prize Laureate said.
"While the armies of the world are waiting to fight an enemy that comes with a gun, we have another enemy, an unseen enemy, an enemy that is destroying our environment," she added. "The enemy that takes away our topsoil, takes away our waters, destroys our forests, destroys the air we breathe, clears the forest."
"This is the unseen enemy and it cannot be fought with a gun. This enemy can be fought with a tree," Wangari Maathai said. "So you can imagine how wonderful it would be if every soldier on this planet started seeing himself and herself as a soldier for the planet. Holding a gun on one side and a tree seedling on the other, to fight this unseen enemy which is actually more dangerous to us than the other enemy."
Her words come as a growing number of governments, communities and people around the world join the Billion Tree Campaign. The campaign, which is under the patronage of Wangari Maathai and His Serene Highness Prince Albert II of Monaco, has now catalysed the planting of 2.6 billion trees in 165 countries around the world, far exceeding its original target.
On 22 February, Peruvian President Alan Garcia Perez personally planted the 40 millionth tree in Lima, concluding the country's National Tree Campaign of Afforestation and Reforestation. Peru plans to plant another 60 million trees by 2010.
So far, the roll of honour of the countries where the biggest number of trees has been planted is headed by Ethiopia (700 million trees), Mexico (470 million trees) and Turkey (400 million trees).
Search on for Environmentally Sensitive Projects
Energy efficiency, natural resource management & use of sustainable products key criteria for Middle East Real Estate Awards
The search is on for unique projects and developments that not only contribute towards architectural excellence but also remain sensitive to the environment, for this year's Middle East Real Estate Awards, the organisers of Cityscape Abu Dhabi have announced.
The awards will be presented at a dinner attended by over 500 industry leaders on 19 April 2009 during Cityscape Abu Dhabi, the international real estate and investment development event, with the deadline for submission of entries set at 19 March 2009.
"The judges are looking for projects that demonstrate innovative design and development concepts combined with architectural excellence and functionality," said a statement from Mark Goodchild, Exhibition Director for Cityscape Abu Dhabi and the Middle East Real Estate Awards. "As well as quality of design, construction and visual appeal, the judges are looking for compatibility with neighbouring land uses and sensitivity to environmental issues," Goodchild added.
Cityscape Abu Dhabi takes place from 19-22 April at the Abu Dhabi National Exhibition Centre and will bring together international and regional investors, developers, architects, government authorities, regulators and professional bodies to discuss market challenges.
The awards cover six categories, with entries in some cases separated into completed projects and future projects.
There will be a Best Sustainable Development Award where judges will be looking in particular for innovative use of new materials, products or construction methods which deliver energy efficient, high performing green buildings. The judges will also be making an award for a world class and distinctive Waterfront Development which shows sensitivity and maximises water as a resource.
There will be a Commercial and Retail Project Award where developments will need to show world class standards of design, safety, security, energy efficiency and innovation. Similar criteria will apply to a Residential Project Award.
A Mixed Use Project Award seeks a development with a strong design, sensitivity to the surrounding environment and urban community as well as showing flair and intelligence in relation to the present and future needs of the community.
The Best Urban Design and Master Planning Award will go to an outstanding project which best encourages or promotes a sense of place or significantly benefits an entire neighbourhood, town, city or region.
President Obama Calls for Carbon Cap, More Clean Energy Investment
Naming energy as one of the three areas of investment "that are absolutely critical to our economic future," President Barack Obama called for a greater investment in clean energy technologies and a cap on carbon emissions. "We have known for decades that our survival depends on finding new sources of energy, yet we import more oil today than ever before," the president said in his address to Congress. He noted that the American Recovery and Reinvestment Act (ARRA) provides a key investment that will save or create 3.5 million jobs, including jobs "constructing wind turbines and solar panels ... and expanding mass transit," and that the ARRA will double the US supply of renewable energy in the next three years. Claiming that "the country that harnesses the power of clean, renewable energy will lead the 21st century," President Obama declared that "it is time for America to lead again."
"We will soon lay down thousands of miles of power lines that can carry new energy to cities and towns across this country," said the president, "and we will put Americans to work making our homes and buildings more efficient, so that we can save billions of dollars on our energy bills. But to truly transform our economy, protect our security, and save our planet from the ravages of climate change, we need to ultimately make clean, renewable energy the profitable kind of energy. So I ask this Congress to send me legislation that places a market-based cap on carbon pollution and drives the production of more renewable energy in America. And to support that innovation, we will invest USD15 billion a year to develop technologies like wind power and solar power; advanced biofuels, clean coal, and more fuel-efficient cars and trucks built right here in America."
President Obama also pointed out the example of Greensburg, Kansas, "a town that was completely destroyed by a tornado, but is being rebuilt by its residents as a global example of how clean energy can power an entire community, how it can bring jobs and businesses to a place where piles of bricks and rubble once lay." DOE, by the way, was heavily involved in the effort to rebuild Greensburg using renewable energy and green building principles.
"We will soon lay down thousands of miles of power lines that can carry new energy to cities and towns across this country," said the president, "and we will put Americans to work making our homes and buildings more efficient, so that we can save billions of dollars on our energy bills. But to truly transform our economy, protect our security, and save our planet from the ravages of climate change, we need to ultimately make clean, renewable energy the profitable kind of energy. So I ask this Congress to send me legislation that places a market-based cap on carbon pollution and drives the production of more renewable energy in America. And to support that innovation, we will invest USD15 billion a year to develop technologies like wind power and solar power; advanced biofuels, clean coal, and more fuel-efficient cars and trucks built right here in America."
President Obama also pointed out the example of Greensburg, Kansas, "a town that was completely destroyed by a tornado, but is being rebuilt by its residents as a global example of how clean energy can power an entire community, how it can bring jobs and businesses to a place where piles of bricks and rubble once lay." DOE, by the way, was heavily involved in the effort to rebuild Greensburg using renewable energy and green building principles.
28 Million Scouts to Mobilise for Earth Hour
The Scouts, the world's largest youth movement with more than 28 million members in 160 countries, are leading thousands of community groups around the world in mobilising their supporters for Earth Hour, the global expression of a desire for serious and sustained action on climate change.
“It is possible for everyone to take action against global warming," said Luc Panissod, Acting Secretary General of the World Organisation of the Scout Movement, and James Leape, Director General of WWF International, in a joint letter to Scouting's global network earlier this month.
What was described as "an opportunity to talk to your neighbours about the environment and climate change" is the latest expression of a partnership between the Scouts and WWF that goes back decades.
"We see that Scouts all over the world have a great interest in the environment and are leaders in their community," said Luc Panissod. "Earth Hour offers an opportunity for Scouts to demonstrate this commitment to tackling climate change and engage with their family and friends."
"The young are vitally concerned with the future and many are well aware that climate change is the greatest threat to the planet’s future. We are delighted that the Scouts are again working with us to secure the environment for generations to come," said James Leape.
With more than 681 cities in 76 countries already signed up to turn their lights out on March 28, Earth Hour 2009 is shaping up as one of the greatest voluntary actions the world has ever seen.
Executive Director of Earth Hour, Andy Ridley, said community groups are playing a vital role in getting more and more people from around the world engaged in the lights out campaign.
“Earth Hour is driven by citizens and grassroots groups thinking globally and acting locally. No matter how big or small your organisation, I urge you to get involved in Earth Hour and really make a difference in your community and in the world,” Ridley said.
Among other community groups working to ensure the largest possible participation in Earth Hour is the Church of Sweden, which will ring its bells across the country to signal the start of Earth Hour at 8.30pm on March 28.
Sydney Ferries, which has been an enthusiastic supporter of Earth Hour since 2007, will this year be sounding the horns of all its ferries operating on Sydney Harbour, heralding Earth Hour in the city where the campaign first began.
In the United States the National Education Association, representing 3.2 million teachers and education professionals, has also pledged its support for Earth Hour, as has the 1.4 million-strong American Federation of Teachers.
President’s Proposed EPA Budget Provides Strengthened Environmental Protection
The Obama administration today proposed a budget of USD10.5 billion for the US Environmental Protection Agency, the largest in the agency’s 39-year history. The increase of USD3 billion from 2008 funding levels will further ensure the protection of public health and the environment for all Americans.
“The president’s budget proposes critical resources to protect the American people and the places where they live, work and play,” said EPA Administrator Lisa P. Jackson. “We are no longer faced with the false choice of a strong economy or a clean environment. The president’s budget shows that making critical and responsible investments in protecting the health and environment of all Americans will also lead to a more vibrant and stable economy. With these proposed resources, and the president’s strong environmental agenda, it should be overwhelmingly clear that EPA is back on the job.”
Last week, President Obama announced the American Recovery and Reinvestment Act of 2009, which includes USD7.22 billion for EPA-administered projects and programs to protect human health and the environment.
Some key highlights of 2010 budget initiatives include:
• USD3.9 billion for the Clean Water State Revolving Fund and Drinking Water State Revolving Fund grants to support approximately 1,000 clean water projects and 700 drinking water projects - this year’s largest single investment. In addition to the funds recently invested through the ARRA, this funding is a critical step in addressing the water infrastructure needs in thousands of communities across the country. EPA will work with state and local partners to develop a sustainability policy, including management and pricing, conservation, security and a plan for adequate long-term state and municipal funding for future capital needs.
• A new USD475 million, multi-agency Great Lakes Initiative to protect the world’s largest fresh water resource. EPA will coordinate with federal partners, states, tribes, localities and other entities to protect, maintain and restore the chemical, biological and physical integrity of the lakes. EPA and its partners will address invasive species, non-point source pollution, habitat restoration, contaminated sediment and other critical issues.
• A USD19 million increase for the greenhouse gas emissions inventory and related activities that will provide data critical for implementing a comprehensive climate change bill. EPA’s funding for climate change investments is the foundation for working with key stakeholders and Congress to develop an economy-wide cap-and-trade program to reduce greenhouse gas emissions approximately 83 per cent below 2005 levels by 2050.
• Strengthening EPA’s core research, enforcement and regulatory capabilities. The budget request also proposes reinstating the Superfund excise taxes that expired. Reinstating the Superfund taxes would collect over USD1 billion annually to fund the cleanup of the nation’s most contaminated sites.
More information on EPA’s FY 2010 budget request: http://www.whitehouse.gov/omb/budget/
Study Finds Hemlock Trees Dying Rapidly, Affecting Forest Carbon Cycle
New research by US Forest Service Southern Research Station (SRS) scientists and partners suggests the hemlock woolly adelgid is killing hemlock trees faster than expected in the southern Appalachians and rapidly altering the carbon cycle of these forests. SRS researchers and co-operators from the University of Georgia published the findings in the most recent issue of the journal Ecosystems.
"The study marks the first time that scientists have tracked the short-term effects hemlock woolly adelgid infestations are having on the forest carbon cycle," said Chelcy Ford, SRS ecologist and co-author of the paper.
Eastern hemlock, a keystone species in the streamside forests of the southern Appalachian region, is already experiencing widespread decline and mortality because of hemlock woolly adelgid (a tiny non-native insect) infestation. The pest has the potential to kill most of the region's hemlock trees within the next decade. As a native evergreen capable of maintaining year-round transpiration rates, hemlock plays an important role in the ecology and hydrology of mountain ecosystems. Hemlock forests provide critical habitat for birds and other animals; their shade helps maintain the cool water temperatures required by trout and other aquatic organisms in mountain streams.
Scientists conducted the study in mixed hardwood forests along the edges of two streams at the SRS Coweeta Hydrologic Laboratory, a 5,600-acre research facility and experimental forest in the Nantahala Mountain Range of western North Carolina.
Researchers compared the rates of decline of adelgid-infested hemlock trees with those of a small number of uninfested trees that had been girdled (their bark severely wounded to initiate tree mortality). Over a three-year period they tracked changes in the carbon cycle of these hemlock stands, measuring components of the forest carbon cycle including tree growth, leaf litter and fine root biomass, and soil respiration.
"While we expected that girdled trees would decline quickly, we were surprised to find that hemlock declines just as quickly from adelgid infestation," said Ford. "This research shows that hemlock woolly adelgid infestation is rapidly impacting the carbon cycle in these tree stands. The study also supports the widely held belief that adelgid-infested hemlock trees in the South are declining much faster than the reported nine year decline of some infested hemlock trees in the North-east."
The study showed, among other things, that very fine roots in the girdled and hemlock woolly adelgid-infested plots declined by 38 per cent and 22 per cent, respectively, during the three year period. In addition, in the first year after girdling and infestation, researchers found soil respiration was approximately 20 per cent lower than they expected.
The authors suggest that infrequent frigid winter temperatures in the southern Appalachians may not be enough to suppress adelgid populations. The authors believe this could be one explanation of why infested hemlocks appear to be declining faster in the South than in the Northeast. The authors also point out that other tree species are quick to occupy the space given up by their dying hemlock neighbours.
"Perhaps because of increased light in the canopy and reduced competition for soil nutrients and water, other species are already increasing their growth," said Ford. "We'll continue to monitor this, but, it's still too early to predict just how different these forests will look 50 or 100 years from now."
"The study marks the first time that scientists have tracked the short-term effects hemlock woolly adelgid infestations are having on the forest carbon cycle," said Chelcy Ford, SRS ecologist and co-author of the paper.
Eastern hemlock, a keystone species in the streamside forests of the southern Appalachian region, is already experiencing widespread decline and mortality because of hemlock woolly adelgid (a tiny non-native insect) infestation. The pest has the potential to kill most of the region's hemlock trees within the next decade. As a native evergreen capable of maintaining year-round transpiration rates, hemlock plays an important role in the ecology and hydrology of mountain ecosystems. Hemlock forests provide critical habitat for birds and other animals; their shade helps maintain the cool water temperatures required by trout and other aquatic organisms in mountain streams.
Scientists conducted the study in mixed hardwood forests along the edges of two streams at the SRS Coweeta Hydrologic Laboratory, a 5,600-acre research facility and experimental forest in the Nantahala Mountain Range of western North Carolina.
Researchers compared rates of decline of adelgid-infested hemlock trees to a small number of girdled (severely wounded the bark of a tree to initiate tree mortality) trees that were not infested. Researchers tracked changes in the carbon cycle of these hemlock stands over a three year period. Scientists measured components of the forest carbon cycle – including tree growth, leaf litter and fine root biomass, and soil respiration – over the three year period.
"While we expected that girdled trees would decline quickly, we were surprised to find that hemlock declines just as quickly from adelgid infestation," said Ford. "This research shows that hemlock woolly adelgid infestation is rapidly impacting the carbon cycle in these tree stands. The study also supports the widely held belief that adelgid-infested hemlock trees in the South are declining much faster than the reported nine year decline of some infested hemlock trees in the North-east."
The study showed, among other things, that very fine roots in the girdled and hemlock woolly adelgid-infested plots declined by 38 per cent and 22 per cent, respectively, during the three year period. In addition, in the first year after girdling and infestation, researchers found soil respiration was approximately 20 per cent lower than they expected.
The authors suggest that infrequent frigid winter temperatures in the southern Appalachians may not be enough to suppress adelgid populations. The authors believe this could be one explanation of why infested hemlocks appear to be declining faster in the South than in the Northeast. The authors also point out that other tree species are quick to occupy the space given up by their dying hemlock neighbours.
"Perhaps because of increased light in the canopy and reduced competition for soil nutrients and water, other species are already increasing their growth," said Ford. "We'll continue to monitor this, but, it's still too early to predict just how different these forests will look 50 or 100 years from now."
Biodegradable Mulch Films on the Horizon
Eco-friendly alternatives to plastic mulch films prove effective in tomato production
In 1999, more than 30 million acres of agricultural land worldwide were covered with plastic mulch, and those numbers have been increasing significantly since then. With the recent trend toward "going green", researchers are seeking environmentally friendlier alternatives to conventional plastic mulch.
Plastic mulch can provide earlier crop maturity, higher yields, increased quality, improved disease and insect resistance, and more efficient water and fertiliser use, but carries a high cost financially and environmentally when it comes to removing the estimated one million tons of mulch film used internationally each year.
Mathieu Ngouajio, of the Department of Horticulture at Michigan State University, led a study comparing black and white biodegradable mulch films in two thicknesses to traditional plastic mulch in the production of tomato. The results of the study were published in the American Society for Horticultural Science journal HortTechnology.
The lowest soil temperatures were recorded under the white films, a finding also associated with the white films' higher rate of degradation. Breakdown of the white mulch occurred early and exposed the bed to weed growth, creating competition for nutrients between weeds and tomato. As the weeds grew, they tore the mulch, leading to further degradation. Furthermore, the weeds hosted a large insect population that reduced the quality of the tomato.
"The [conventional] LDPE mulch provided 100 per cent weed control in both years, which confirms why this is the preferred mulch used by most vegetable growers," Ngouajio remarked. Weed control levels for both thicknesses of the black biodegradable mulch were more than 90 per cent. Black biodegradable mulch performed well in the field, producing tomato crops similar to conventional mulch during both years of the study.
The study authors explain that there are three factors to be resolved before black biodegradable mulch can be seen as a viable replacement for conventional methods. First, more research is needed to produce mulch that can fully break down in the field. Second, biodegradable mulch must be able to withstand the stresses of being applied to fields by machine. Last, the price of biodegradable mulch needs to be economically acceptable compared to conventional mulch after factoring in the savings for removal and disposal.
Study: Soybean Oil Reduces Carbon Footprint in Swine Barns
One of agriculture's most versatile crops could one day play a role in combating climate change, Purdue University research shows.
In addition to using soybeans in beverages, biofuel, lip balm, crayons, candles and a host of other products, Purdue agricultural engineers Al Heber and Jiqin Ni found that soybean oil reduces greenhouse gas emissions when sprayed inside swine finishing barns.
Heber and Ni led a team of Purdue and University of Missouri researchers in the year-long project, which monitored the effectiveness of soybean oil on dust and odour within hog facilities. Additional research is needed to address problems with oil spraying and substantiate the study's findings, the researchers said.
"This project provided baseline measurements of the greenhouse gas contributions of swine finishing barns," Heber said. "In addition to the baseline measurements, we now have some data on an abatement technology to reduce the carbon footprint contribution of a pound of pork."
Greenhouse gases are chemical compounds that contribute to the greenhouse effect, a condition in which heat is trapped in the lower atmosphere, producing global warming. In 2005, agricultural practices were responsible for 7.4 per cent of total greenhouse gas emissions in the United States, according to the US Environmental Protection Agency.
The Purdue study was conducted at a northern Missouri farm during a 12-month period ending in July 2003. Oil was sprayed in one of two monitored barns. Each barn housed about 1,100 pigs, Ni said.
The treated barn was sprayed with five cubic centimetres of oil per square metre of floor for one minute per day. The spray system was similar to the spray technology used to treat crop fields with pesticides.
"We tested three different methods of pollution mitigation: soybean oil sprinkling, misting with essential oils, and misting with essential oils and water," Ni said. "Our original intent was to see if those three methods would control dust, as well as odour emissions, ammonia, hydrogen sulphide, methane and carbon dioxide emissions."
Compared with the unsprayed monitored barn, the oil-treated barn showed an average 20 per cent decrease in methane emissions and a 19 per cent average reduction in carbon dioxide emissions. Methane and carbon dioxide are greenhouse gases.
Dust reduction was even more significant. The treated barn emitted about 65 per cent less particulate matter than the untreated barn. Researchers suspected controlling dust also would lead to reduced greenhouse gas escapes, Heber said.
"The spray takes out dust, and since dust carries odour and it absorbs other gases, there was a scientific reason why it might take out those greenhouse gases," Heber said.
"We saw a reduction in odour, but it wasn't statistically significant. That may be because we didn't take enough air samples. All we can say is that there was a trend in odour reduction."
Several challenges stand in the way of using soybean oil in swine barns, including safety, cleaning and the cost of application, Heber said.
"First of all, soybean oil is more expensive now than it was when we did the study," Heber said. "Whereas we thought it would cost less than a dollar per pig marketed to treat the barn, around 60 cents, since then the price of soybean oil has increased dramatically, and so the economics are not as good. Also, the application of oil can create a safety hazard for the producer.
"In addition, some of the oil ended up on the floor, the pigs, the feeders and fans. This makes the cleaning process more difficult. The producer we worked with indicated it took an additional day of power washing to clean that barn. That's an extra expense."
While soybean oil shows promise as a greenhouse gas control agent, it is too early to declare the findings conclusive, Heber and Ni said.
"There are technical problems with this practice, but those may be overcome through good engineering," Heber said.
"We need to do more research to get a better idea of the effectiveness of this technology and its benefit on environmental protection," Ni said.
Oceanic Seesaw Links Northern and Southern Hemisphere During Abrupt Climate Change
Very large and abrupt changes in temperature recorded over Greenland and across the North Atlantic during the last Ice Age were actually global in extent, according to an international team of researchers led by Cardiff University.
New research, published in the journal Nature, supports the idea that changes in ocean circulation within the Atlantic played a central role in abrupt climate change on a global scale.
Using a sediment core taken from the seafloor in the South Atlantic, the team were able to create a detailed reconstruction of ocean conditions in the South Atlantic during the final phases of the last ice age.
Dr Stephen Barker, Cardiff University's School of Earth and Ocean Sciences and lead author on the paper, said, "During this period very large and abrupt changes in temperature were observed across the North Atlantic region. However, evidence for the direct transmission of these shifts between the northern and southern hemispheres has so far been lacking".
The new study suggests that abrupt changes in the north were accompanied by equally abrupt but opposite changes in the south. It provides the first concrete evidence of an immediate seesaw connection between the North and South Atlantic. The data shows, for example, that an abrupt cooling in the north would be accompanied by a rapid southerly shift of ocean fronts in the Southern Ocean, followed by more gradual warming across the south.
Dr Barker explains, "The most intuitive way to explain these changes is by varying the strength of ocean circulation in the Atlantic. By weakening the circulation, the heat transported northwards would be retained in the south."
Climate physicist, Dr Gregor Knorr, co-author of the study and now based at the Alfred Wegener Institute in Germany, said, "Our new results agree with climate models that predict a rapid transmission of climate signals between the two hemispheres as a consequence of abrupt changes in ocean circulation."
The study has wide implications for our understanding of abrupt climate change. Dr Ian Hall, School of Earth and Ocean Sciences, said, "While it is unlikely that an abrupt change in climate, related to changes in ocean circulation, will occur in the near future, our results suggest that if such an extreme scenario did occur, its effects could be felt globally within years to decades."
University of Alberta and NINT Researchers Make Solar Energy Breakthrough
The University of Alberta and the National Research Council's National Institute for Nanotechnology (NINT) have engineered an approach that is leading to improved performance of plastic solar cells (hybrid organic solar cells). The development of inexpensive, mass-produced plastic solar panels is a goal of intense interest for many of the world's scientists and engineers because of the high cost and shortage of the ultra-high purity silicon and other materials normally required.
Plastic solar cells are made up of layers of different materials, each with a specific function, called a sandwich structure. Jillian Buriak, a professor of chemistry at the U of A, NINT principal investigator and member of the research team, uses a simple analogy to describe the approach.
"Consider a clubhouse sandwich, with many different layers. One layer absorbs the light, another helps to generate the electricity, and others help to draw the electricity out of the device. Normally, the layers don't stick well, and so the electricity ends up stuck and never gets out, leading to inefficient devices. We are working on the mayonnaise, the mustard, the butter and other 'special sauces' that bring the sandwich together, and make each of the layers work together. That makes a better sandwich, and makes a better solar cell, in our case".
After two years of research, these U of A and NINT scientists have, by only working on one part of the sandwich, seen improvements of about 30 per cent in the efficiency of the working model. Michael Brett, professor of electrical and computer engineering, NINT principal investigator and member of the research team is optimistic, "our team is so incredibly cross-disciplinary, with people from engineering, physics and chemistry backgrounds all working towards this common goal of cheap manufacturable solar cells. This collaboration is extremely productive because of the great team with such diverse backgrounds, [although] there is still so much more for us to do, which is exciting." This multidisciplinary approach, common at the National Institute for Nanotechnology, brings together the best of the NRC and the University of Alberta.
The team estimates it will be five to seven years before plastic solar panels will be mass produced but Buriak adds that when it happens solar energy will be available to everyone. She says the next generation of solar technology belongs to plastic.
"Plastic solar cell material will be made cheaply and quickly and in massive quantities by ink jet-like printers."
Polar Research Reveals New Evidence of Global Environmental Change
Multidisciplinary research from the International Polar Year (IPY) 2007-2008 provides new evidence of the widespread effects of global warming in the polar regions. Snow and ice are declining in both polar regions, affecting human livelihoods and local plant and animal life in the Arctic, as well as global ocean and atmospheric circulation and sea level. These are but a few findings reported in “State of Polar Research”, released by the World Meteorological Organisation (WMO) and the International Council for Science (ICSU). In addition to lending insight into climate change, IPY has aided our understanding of pollutant transport, species’ evolution and storm formation, among many other areas.
The wide-ranging IPY findings result from more than 160 endorsed science projects assembled from researchers in more than 60 countries. Launched in March 2007, the IPY covers a two-year period to March 2009 to allow for observations during the alternate seasons in both Polar Regions. A joint project of WMO and ICSU, IPY spearheaded efforts to better monitor and understand the Arctic and Antarctic regions, with international funding support of about USD 1.2 billion over the two-year period.
“The International Polar Year 2007 – 2008 came at a crossroads for the planet’s future” said Michel Jarraud, Secretary-General of WMO. “The new evidence resulting from polar research will strengthen the scientific basis on which we build future actions.”
Catherine Bréchignac, President of ICSU, adds “the planning for IPY set ambitious goals that have been achieved, and even exceeded, thanks to the tireless efforts, enthusiasm, and imagination of thousands of scientists, working with teachers, artists, and many other collaborators.”
IPY has provided a critical boost to polar research during a time in which the global environment is changing faster than ever in human history. It now appears clear that the Greenland and Antarctic ice sheets are losing mass, contributing to sea level rise. Warming in the Antarctic is much more widespread than was thought prior to IPY, and it now appears that the rate of ice loss from Greenland is increasing.
Researchers also found that in the Arctic, during the summers of 2007 and 2008, the minimum extent of year-round sea ice decreased to its lowest level since satellite records began 30 years ago. IPY expeditions recorded an unprecedented rate of sea-ice drift in the Arctic as well. Due to global warming, the types and extent of vegetation in the Arctic shifted, affecting grazing animals and hunting.
Other evidence for global warming comes from IPY research vessels that have confirmed above-global-average warming in the Southern Ocean. A freshening of the bottom water near Antarctica is consistent with increased ice melt from Antarctica and could affect ocean circulation. Global warming is thus affecting Antarctica in ways not previously identified.
IPY research has also identified large pools of carbon stored as methane in permafrost. Thawing permafrost threatens to destabilise the stored methane, a greenhouse gas, and send it into the atmosphere. Indeed, IPY researchers along the Siberian coast observed substantial emissions of methane from ocean sediments.
In the area of biodiversity, surveys of the Southern Ocean have uncovered a remarkably rich, colourful and complex range of life. Some species appear to be migrating poleward in response to global warming. Other IPY studies reveal interesting evolutionary trends, such as many present-day deep-sea octopuses having originated from common ancestor species that still survive in the Southern Ocean.
IPY has also given atmospheric research new insight. Researchers have discovered that North Atlantic storms are major sources of heat and moisture for the Polar Regions. Understanding these mechanisms will improve forecasts of the path and intensity of storms. Studies of the ozone hole have benefited from IPY research as well, with new connections identified between the ozone concentrations above Antarctica and wind and storm conditions over the Southern Ocean. This information will improve predictions of climate and ozone depletion.
Many Arctic residents, including indigenous communities, participated in IPY’s projects. Over 30 of these projects addressed Arctic social and human science issues, including food security, pollution, and other health issues, and will bring new understanding to addressing these pressing challenges. “IPY has been the catalyst for the development and strengthening of community monitoring networks across the North,” said David Carlson, Director of the IPY International Programme Office. “These networks stimulate the information flow among communities and back and forth from science to communities.”
IPY leaves as its legacy enhanced observational capacity, stronger links across disciplines and communities, and an energised new generation of polar researchers. “The work begun by IPY must continue”, said Jarraud. “Internationally coordinated action related to the Polar Regions will still be needed in the next decades,” he said. Bréchignac concurs, “This IPY has further strengthened the ICSU-WMO relationship on polar research coordination, and we must continue to assist the scientific community in its quest to understand and predict polar change and its global manifestations at this critical time.”
The increased threats posed by climate change make polar research a special priority. The “State of Polar Research” document not only describes some of the striking discoveries during IPY, it also recommends priorities for future action to ensure that society is best informed about ongoing polar change and its likely future evolution and global impacts. A major IPY science conference will take place in Oslo in June 2010.
For more information about IPY, including the “State of Polar Research” report, visit www.ipy.org.
Global Seed Vault Marks One-Year Anniversary with Four-Ton Shipment of Critical Food Crops
With new evidence warning climate change threatens food production, scientists gather in Svalbard to discuss crop diversity and the vault's role in averting agricultural disaster
Four tons of seeds, almost 90,000 samples of hundreds of crop species, from food crop collections maintained by Canada, Ireland, Switzerland, USA, and three international agricultural research centres in Syria, Mexico and Colombia, were delivered to the Svalbard Global Seed Vault as it celebrated its one-year anniversary. The repository, located near the village of Longyearbyen on the Norwegian archipelago of Svalbard, has in one year amassed a collection of more than 400,000 unique seed samples – some 200 million seeds.
"We are especially proud to see such a large number of countries work quickly to provide samples from their collections for safekeeping in the vault," said Norwegian Agriculture Minister Lars Peder Brekk. "It shows that there are situations in the world today capable of transcending politics and inspiring a strong unity of purpose among a diverse community of nations."
"The vault was opened last year to ensure that one day all of humanity's existing food crop varieties would be safely protected from any threat to agricultural production, natural or man made. It's amazing how far we have come toward accomplishing that goal," said Cary Fowler, Executive Director of the Global Crop Diversity Trust, which operates the seed vault in partnership with the Norwegian government and the Nordic Genetic Resource Centre in Sweden.
For example, in its first year of operation, the vault at Svalbard has so far received duplicates of nearly half of the crop samples maintained by the gene banks of the international agricultural research centres of the Consultative Group on International Agricultural Research (CGIAR).
These international gene banks are seen as the custodians of the crown jewels of crop diversity. This diversity has been instrumental in the breeding of new varieties responsible for the remarkable productivity gains made in global agriculture in recent decades, and in averting food crises when farm production has been threatened by natural disasters, plant diseases, and plant pests.
To mark the anniversary of the vault, experts on global warming and its effects on food production have gathered in Longyearbyen to discuss how climate change could pose a major threat to food production, and to examine crop diversity's role in averting crisis. They include the authors of a study published last month in Science magazine warning that by the end of this century the average temperatures during growing seasons in many regions will probably be higher than the most extreme heat recorded over the last 100 years. Scientists will need crop diversity to breed new varieties able to flourish in such dramatically different conditions.
"This means that the vital importance of crop diversity to our food supply, which inspired the creation of the seed vault, is neither remote nor theoretical but immediate and real," said David Battisti, a climate change expert at the University of Washington and one of the lead authors of the paper.
"When we see research indicating that global warming could diminish maize production by 30 per cent in southern Africa in only 20 years' time, it shows that crop diversity is needed to adapt agriculture to climate change right now," added Frank Loy, former Under Secretary of State for Global Affairs and an advisor to President Obama's transition team on environment and climate change, who is also attending.
With its new acquisitions, the vault is now providing a secure second home for a third of humanity's most important crop varieties, and a level of security for crop diversity conservation that was not available until a year ago. More gene banks and countries are in the process of signing agreements and preparing seed collections to deposit in the vault.
Seeds arriving for the vault anniversary include samples of 32 varieties of potatoes in addition to oat, wheat, barley, and native grass species from two of Ireland's national gene banks. Ireland's participation and its inclusion of potato varieties is particularly appropriate for an occasion celebrating crop diversity. It was a lack of diversity that is believed to have made Ireland's potato crop particularly vulnerable to the devastating blight of the mid-1800s that led to the deaths of more than one million people.
In addition to Ireland's contribution, 3,800 samples of wheat and barley have come from Switzerland's national seed bank in Changins. The United States is sending 20,000 samples, representing 361 crop species, from the seed repository maintained by the federal Department of Agriculture. They include samples of crop varieties that originally came from 151 countries and are now part of the US collection.
Like all seeds coming to the vault, the samples arriving today are duplicates of seeds from other collections. The vault is intended to serve as a fail-safe backup should the original samples be lost or damaged or, more dramatically, to provide something of a Noah's ark for agriculture in the event of a global catastrophe.
CO2 Drop and Global Cooling Caused Antarctic Glacier to Form
Global climate rapidly shifted from a relatively ice-free world to one with massive ice sheets on Antarctica about 34 million years ago. What happened? What changed? A team of scientists led by Yale geologists offers a new perspective on the nature of changing climatic conditions across this greenhouse-to-icehouse transition, one that refutes earlier theories and has important implications for predicting future climate changes.
Detailed in the February 27 issue of Science, their data disproves a long-held idea that massive ice growth in the Antarctic was accompanied by little to no global temperature change.
This report shows that, before the Southern Hemisphere ice expansion, high-latitude temperatures were at least 10°C (about 18°F) warmer than previously estimated and that there was a five to ten degree Celsius drop in surface-water temperature during the climate transition.
"Previous reconstructions gave no evidence of high-latitude cooling," according to senior author Mark Pagani, professor of geology and geophysics at Yale. "Our data demonstrate a clear temperature drop in both hemispheres during this time."
Their conclusions are based on sea-surface "temperature proxies" – calculations of temperature based on the distribution of specific organic molecules from ancient plankton that only lived at certain temperatures and were later preserved in ocean sediments. These molecules were assayed in ocean cores collected by the Integrated Ocean Drilling Program (IODP) and earlier marine programs that study Earth history by coring deep-ocean sediments and crust around the world.
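For readers unfamiliar with how such proxies work, the short sketch below illustrates the general mechanics only: an index derived from the relative abundance of temperature-sensitive molecules in a sediment sample is converted to a temperature through an empirical calibration. The index, calibration coefficients and sample values here are hypothetical and are not the proxy or calibration used in the study.

    # Illustrative sketch of a sea-surface "temperature proxy": an index computed
    # from the relative abundance of temperature-sensitive organic molecules in a
    # sediment sample is converted to a temperature via an empirical calibration.
    # The index, calibration and sample values below are hypothetical.

    def proxy_index(warm_adapted: float, cold_adapted: float) -> float:
        """Fraction of 'warm-adapted' molecules among those measured in a sample."""
        return warm_adapted / (warm_adapted + cold_adapted)

    def index_to_sst(index: float, slope: float = 50.0, intercept: float = -10.0) -> float:
        """Convert the index to a sea-surface temperature (deg C) with a linear calibration."""
        return slope * index + intercept

    # Two hypothetical sediment layers bracketing the greenhouse-to-icehouse transition.
    samples = {
        "pre-transition": {"warm_adapted": 7.0, "cold_adapted": 3.0},
        "post-transition": {"warm_adapted": 5.0, "cold_adapted": 5.0},
    }

    for name, counts in samples.items():
        idx = proxy_index(**counts)
        print(f"{name}: index={idx:.2f}, estimated SST={index_to_sst(idx):.1f} C")

With these invented numbers the index falls from 0.70 to 0.50, which the toy calibration reads as a drop from 25°C to 15°C, in the same ballpark as the cooling described in the study.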
"Temperatures in some regions, just before the Antarctic glaciers formed, were surprisingly higher than current climate models predicted, suggesting that these models underestimate high-latitude warming under high CO2 conditions," said lead author Zhonghui Liu, Pagani's post doctoral associate who is now an assistant professor at the University of Hong Kong. Further, he said, the substantial cooling that occurred in both Northern and Southern high latitudes suggests that a decline in CO2 level, rather than a localised change of ocean circulation drove the climate transition.
The ice formed over Antarctica in about 100,000 years, which is an "overnight" shift in geological terms. "Just over thirty-five million years ago, there was an ice sheet where there had been subtropical temperatures before," said Co-author Matthew Huber of Purdue University.
Another theory refuted by this study is the notion that ice-expansion also occurred in the Northern Hemisphere during this time, a supposition poorly supported by physical evidence of glacier formation in that region, say the Yale scientists.
There are about 70 metres of vertical sea level rise represented in the ice sheets of Antarctica. And, there are many questions regarding the glacier's stability, the temperature thresholds that would cause radical glacier melting, and the rate at which it would change, according to Pagani. "Our findings point to the difficulty of modelling accurate temperatures under higher CO2 in this critical region."
Commercial Ships Spew Half as Much Particulate Pollution as World's Cars
Globally, commercial ships emit almost half as much particulate pollution into the air as the total amount released by cars, according to a new study. Ship pollutants affect both the Earth's climate and the health of people living along coastlines.
The study is the first to provide a global estimate of maritime shipping's total contribution to air particle pollution based on direct measurements of emissions. The authors estimate that worldwide, ships emit 0.9 teragrams, about 900,000 tonnes or roughly 2 billion pounds, of particulate pollution each year. Shipping also contributes almost 30 per cent of smog-forming nitrogen oxide gases.
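As a quick check on the quoted mass, the conversion from teragrams is straightforward arithmetic; the sketch below uses only the 0.9-teragram figure from the study.

    # Unit check on the annual particulate estimate quoted above.
    GRAMS_PER_TERAGRAM = 1e12
    GRAMS_PER_TONNE = 1e6
    GRAMS_PER_POUND = 453.592

    emissions_tg = 0.9
    grams = emissions_tg * GRAMS_PER_TERAGRAM
    print(f"{emissions_tg} Tg = {grams / GRAMS_PER_TONNE:,.0f} tonnes "
          f"= {grams / GRAMS_PER_POUND:,.0f} pounds")
    # 0.9 Tg = 900,000 tonnes, or roughly 2 billion pounds.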
"Since more than 70 per cent of shipping traffic takes place within 250 miles of the coastline, this is a significant health concern for coastal communities," says lead author Daniel Lack, a researcher at the National Oceanic and Atmospheric Administration (NOAA)'s Earth System Research Laboratory in Boulder, Colorado. He and his colleagues reported their findings on 25 February 2009 in the Journal of Geophysical Research – Atmospheres, a publication of the American Geophysical Union (AGU).
Earlier research by one of the study's authors, James Corbett, of the University of Delaware, in Newark, linked particulate pollution to premature deaths among coastal populations.
Commercial ships emit both particulate pollution and carbon dioxide. Carbon dioxide from ships makes up roughly three per cent of all human-caused emissions of the gas. But particulate pollution and carbon dioxide have opposite effects on climate. The particles have a global cooling effect at least five times greater than the global warming effect from ships' carbon dioxide emissions, Lack says.
Lack is also with the Boulder-based Cooperative Institute for Research in Environmental Sciences, which is jointly supported by NOAA and the University of Colorado, Boulder.
During the summer of 2006, Lack and colleagues, aboard the NOAA ship Ronald H. Brown, analysed the exhaust from over 200 commercial vessels, including cargo ships, tankers and cruise ships, in the Gulf of Mexico, Galveston Bay, and the Houston Ship Channel. The researchers also examined the chemistry of particles in ship exhaust to understand what makes ships such hefty polluters.
Ships emit sulphates, the same polluting particles associated with diesel-engine cars and trucks that prompted improvements in on-road vehicle fuel standards. Sulphate emissions from ships vary with the concentration of sulphur in ship fuel, the authors find. Globally, fuel sulphur content is capped under the International Convention for the Prevention of Pollution from Ships. As a result of the cap, some ships use "cleaner," low-sulphur fuels, while others continue to use the high-sulphur counterparts.
Yet, sulphates make up just under half of shipping's total particle emissions, according to the new study. The other half, composed of organic pollutants and sooty black carbon, is not directly targeted by today's regulations.
A 2008 study by Lack's team focused exclusively on soot (http://www.agu.org/sci_soc/prrl/2008-23.html). Emissions of these non-sulphate particles, the earlier study found, depend on the operating speed of the engine and the amount of lubricating oil needed to deal with wear and tear from burning less-refined fuels. "Fortunately, engines burning 'cleaner,' low-sulphur fuels tend to require less complex lubricants. So the sulphur fuel regulations have the indirect effect of reducing the organic particles emitted," says Corbett.
One surprising result of burning low-sulphur fuels is that, although total particle emissions diminish, the time that particles spend in the air appears to increase. It's while they're airborne that particles pose a risk to human health and affect climate.
Lack and colleagues find that the organic and black carbon portion of ship exhaust is less likely to form cloud droplets. As a result, these particles remain suspended for longer periods of time before being washed to the ground through precipitation.
Prehistoric Global Cooling Caused By CO2, Research Finds
Ice in Antarctica suddenly appeared, in geologic terms, about 35 million years ago. For the previous 100 million years the continent had been essentially ice-free.
The question for science has been, why? What triggered glaciers to form at the South Pole?
Matthew Huber, assistant professor of earth and atmospheric sciences at Purdue University, says no evidence of global cooling during the period had been found.
"Previous evidence points paradoxically to a stable climate at the same time this event, one of the biggest climate events in Earth's history, was happening," Huber says.
However, in a paper published this week in the journal Science, a team of researchers found evidence of widespread cooling. Additional computer modelling suggests that the cooling was caused by a reduction of greenhouse gases in the atmosphere.
Even after the continent of Antarctica had drifted to near its present location, its climate was subtropical. Then, 35.5 million years ago, ice formed on Antarctica in about 100,000 years, which is an "overnight" shift in geological terms.
"Our studies show that just over thirty-five million years ago, 'poof,' there was an ice sheet where there had been subtropical temperatures before," Huber says. "Until now we haven't had much scientific information about what happened."
Before the cooling occurred at the end of the Eocene epoch, the Earth was warm and wet, and even the north and south poles experienced subtropical climates. The dinosaurs were long gone from the planet, but there were mammals and many reptiles and amphibians. Then, as the scientists say, poof, this warm, wet world, which had existed for millions of years, changed dramatically. Temperatures fell sharply, many species of mammals as well as most reptiles and amphibians became extinct, Antarctica was covered in ice, and sea levels fell.
History records this as the beginning of the Oligocene epoch, but the cause of the cooling has been the subject of scientific discussion and debate for many years.
The research team found that, before the event, ocean surface temperatures near present-day Antarctica averaged 77 degrees Fahrenheit (25 degrees Celsius).
Mark Pagani, professor of geology and geophysics at Yale University, says the research found that air and ocean surface temperatures dropped as much as 18 degrees Fahrenheit during the transition.
"Previous reconstructions gave no evidence of high-latitude cooling," Pagani says. "Our data demonstrate a clear temperature drop in both hemispheres during this time."
The research team determined the temperatures of the Earth millions of years ago by using temperature "proxies," or clues. In this case, the geologic detectives looked for the presence of biochemical molecules, which were present in plankton that only lived at certain temperatures. The researchers looked for the temperature proxies in seabed cores collected by drilling in deep-ocean sediments and crusts from around the world.
"Before this work we knew little about the climate during the time when this ice sheet was forming," Huber says.
Once the team identified the global cooling, the next step was to find what caused it.
To find the answer, Huber used modern climate modelling tools to look at the prehistoric climate. The models were run on a cluster-type supercomputer on Purdue's campus.
"That's what climate models are good for. They can give you plausible reasons for such an event," Huber says. "We found that the likely culprit was a major drop in greenhouse gases in the atmosphere, especially CO2. From the temperature data and existing proxy records indicating a sharp drop in CO2 near the Eocene-Oligocene boundary, we are establishing a link between the sea surface temperatures and the glaciations of Antarctica."
Huber says the modelling required an unusually large computing effort. Staff at Information Technology at Purdue assisted in the computing runs.
"My simulations produced 50 terabytes of data, which is about the amount of data you could store in 100 desktop computers. This represented 8,000 years of climate simulation," Huber says.
The computation required nearly 2 million computing hours over two years on Pete, Purdue's 664-CPU Linux cluster.
"This required running these simulations for a long time, which would not have been allowed at a national supercomputing centre," Huber says. "Fortunately, we had the resources here on campus, and I was able to use Purdue's Pete to do the simulation."
The research was supported in part by funding from the National Science Foundation.
Source: PURDUE UNIVERSITY
Clemson Scientists Launch Rockets to Test Atmospheric Conditions
Clemson University space physicists have travelled around the world to launch rockets to test atmospheric conditions.
Scientists most recently launched a salvo of four rockets over Alaska to study turbulence in the upper atmosphere. The launches took place at Poker Flat Research Range north of Fairbanks as part of a NASA sounding rocket campaign.
Associate professor of physics and astronomy Gerald Lehmacher is the principal investigator for the experiment and was assisted by graduate students Shelton Simmons and Liyu Guo.
“After six days of cloudy and snowy weather, we had perfect conditions with a clear, moonless night sky over interior Alaska,” said Lehmacher. “We needed excellent viewing conditions from three camera sites to photograph the luminescent trails the payloads produced in the upper atmosphere.”
The rockets were 35-foot, two-stage Terrier Orions. They released trimethyl aluminium, which creates a glowing vapour trail nearly 87 miles up. Sensitive cameras on the ground track the trails, and from these Lehmacher and his team can analyse upper-atmospheric winds by tracking how the vapour trails form, billow, disperse and diffuse. Two of the rockets carried an additional deployable payload with instrumentation to measure electron density, neutral temperature and turbulence.
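The wind analysis rests on a simple principle: triangulated positions of a feature on the glowing trail at two times give a displacement, and displacement over elapsed time gives a horizontal wind estimate at that altitude. The sketch below illustrates only that core step; the positions and timing are invented, and the real analysis involves triangulation from the three camera sites and treatment of how the trails diffuse.

    # Minimal sketch of trail-tracking wind estimation: the displacement of a
    # trail feature between two times, divided by the elapsed time, gives the
    # horizontal wind at that altitude. All numbers are invented for illustration.
    import math

    def wind_from_positions(p1, p2, dt_seconds):
        """Wind speed (m/s) and bearing (degrees from north) from two
        (east_m, north_m) positions of the same trail feature dt_seconds apart."""
        de, dn = p2[0] - p1[0], p2[1] - p1[1]
        speed = math.hypot(de, dn) / dt_seconds
        bearing = math.degrees(math.atan2(de, dn)) % 360
        return speed, bearing

    # A trail feature drifts 900 m east and 300 m north over 10 seconds.
    speed, bearing = wind_from_positions((0.0, 0.0), (900.0, 300.0), 10.0)
    print(f"wind of about {speed:.0f} m/s toward {bearing:.0f} degrees")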
The instrumented sections are a collaboration of Clemson with Penn State University and the Leibniz-Institute for Atmospheric Physics in Germany. The University of Alaska assisted in the study with ground-based laser radar and other optical instruments. The project is sponsored by a NASA grant for three years.
In January, Clemson physicists travelled to Norway to carry out a joint experiment with Japanese scientists to study atmospheric winds and circulation from heating created by electrical currents associated with Northern Lights displays. The measurements were made with instruments flown on a Japanese S-310 rocket launched from the Andoya Rocket Range in northern Norway, as well as a suite of sensitive radar and camera instruments on the ground.
The experiment was a collaboration between the Japan Aerospace Exploration Agency and the department of physics and astronomy at Clemson. Professor Miguel Larsen was the investigator responsible for the wind measurement aboard the instrumented rocket and was assisted by three undergraduate students, Lucas Hurd, Matt Jenkins and Matt Henderson.
Source: CLEMSON UNIVERSITY
European Satellites Provide New Insight into Ozone-Depleting Species
Using data from the satellite-based MIPAS and GOME-2 instruments, scientists have for the first time detected important bromine species in the atmosphere. These new measurements will help scientists to better understand sources of ozone-depleting species and to improve simulations of stratospheric ozone chemistry.
Despite the detection of bromine monoxide (BrO) in the atmosphere some 20 years ago, bromine nitrate (BrONO2) was first observed in 2008 when scientists from the Karlsruhe Institute of Technology discovered the gas’s weak signal with data from MIPAS (the Michelson Interferometer for Passive Atmospheric Sounding).
"By comparing the novel MIPAS BrONO2 dataset with model calculations and BrO measurements by SCIAMACHY on Envisat, our general understanding of stratospheric bromine chemistry has been clearly confirmed," said Michael Höpfner of Germany’s Karlsruhe Institute of Technology. "These new observations also enable an independent estimation of the total amount of bromine in the stratosphere, which is important for understanding the origins of stratospheric bromine."
The stratospheric ozone layer that protects life on Earth from harmful ultraviolet rays is vulnerable to the presence of certain chemicals in the atmosphere, such as chlorine and bromine. Despite its much smaller concentrations, bromine is, after chlorine, the second most important halogen species destroying ozone in the stratosphere.
Because chlorine levels in the stratosphere have been dropping since the ban on man-made chlorofluorocarbons (CFCs), bromine will become even more important in stratospheric ozone chemistry. Bromine’s importance will increase in part because there are more natural sources, such as volcanoes, for bromine emissions than for chlorine.
Volcanoes have long been known to play an important role in influencing stratospheric ozone chemistry because of the gases and particles they shoot into the atmosphere. New findings from space suggest they are also a very important source of atmospheric bromine.
The reactive chemical bromine monoxide (BrO) has been measured in a number of volcanic plumes around the globe, but until recently it had never been measured by a space instrument.
In August 2008, the Kasatochi Volcano in Alaska's Aleutian Islands erupted explosively, sending a cloud of volcanic ash and gas more than 11 km into the atmosphere.
The following day, scientists from the Brussels-based Belgian Institute for Space Aeronomy identified high bromine concentrations in the vicinity of the volcano with Envisat’s SCIAMACHY instrument and the Global Ozone Monitoring Experiment-2 (GOME-2) instrument aboard MetOp-A. (MetOp-A, developed by ESA and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT), is Europe's first polar-orbiting satellite dedicated to operational meteorology.)
"Because of the good regional coverage of the GOME-2 instrument, the transport of the Kasatochi BrO plume could be followed for six days after the eruption," Michel Van Roozendael from the Belgian Institute for Space Aeronomy said. "Using the Lagrangian dispersion model, results show that the volcanic BrO was directly injected into the upper troposphere/lower stratosphere at altitudes ranging from 8 to 12 km.
"The total mass of reactive bromine released in the atmosphere was estimated around 50 to 120 tons, which corresponds to approximately 25 per cent of the previously estimated total annual mass of reactive bromine emitted by volcanic activity."
Study Predicts When Invasive Species Can Travel More Readily By Air
Global airlines be forewarned: June 2010 could be a busy month for invasive plants, insects and animals seeking free rides to distant lands.
A new study forecasts when climate factors such as temperature, humidity and rainfall will match at geographically distant airline departure and destination points, which could help to shuffle invasive species, and the diseases they may carry, across the globe along existing flight routes. The findings provide a framework that could help people who monitor airline flights, and the people, baggage and cargo aboard, to plan more efficiently and accurately for detecting and intercepting invasives.
Andy Tatem, who holds a joint position at the Emerging Pathogens Institute and the University of Florida’s geography department, said his model uses the latest forecast data for climate change and air traffic volumes.
“The problem is that as the global transport networks expand, we’re getting more and more invasive species and pathogens coming from different parts of the world that have survived isolated for thousands of years,” said Tatem, who joined UF in January. “But now they have this high-speed link going between different regions of the world.”
The study was published online Jan. 22 in the journal Ecography; the work was carried out while Tatem held his previous position at the University of Oxford.
Tatem predicts a peak risk will be reached in June 2010, when multiple factors converge to create a month when the climate factors at many flight origin and destination airports would be most similar.
“The model shows us that climatic shifts are not greatly significant over the next few years,” Tatem said. “But the great increase in traffic volumes from expanding economies in India and China are likely to have a significant effect on moving species. This gives us much more of a detailed idea on the importance of key risk factors and how these change over time, compared to previous work we did in 2007.”
Tatem reached his conclusions by comparing fine-scale global climate models for 2009 and 2010 prepared by the Hadley Centre for Climate Prediction and Research with models forecasting traffic volumes on existing airline networks, prepared by OAG Worldwide. The airline models include more than 35 million scheduled flights between 3,570 airports on more than 44,000 different routes.
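The general mechanics of combining the two data sources can be sketched in a few lines: score how climatically similar an origin and a destination are, then weight that similarity by the scheduled traffic on the route. The sketch below is purely illustrative; the airports, climate values, similarity measure and weighting are invented and are not Tatem's model.

    # Illustrative sketch: rank routes by an invented "relative risk" that
    # combines climate similarity between airports with monthly flight volume.
    # Airports, climate values and the similarity measure are hypothetical.
    import math

    # Hypothetical June climate normals: (mean temp C, relative humidity %, rainfall mm)
    climate = {
        "origin_hub": (27.0, 80.0, 180.0),
        "dest_a": (26.0, 78.0, 160.0),   # climatically similar destination
        "dest_b": (12.0, 55.0, 40.0),    # climatically dissimilar destination
    }
    flights_per_month = {"dest_a": 400, "dest_b": 900}

    def climate_similarity(a, b):
        """Crude similarity in (0, 1]: 1 when identical, decaying with distance
        in a roughly normalised climate space."""
        scales = (10.0, 20.0, 100.0)  # rough spreads for temperature, humidity, rainfall
        d = math.sqrt(sum(((x - y) / s) ** 2 for x, y, s in zip(a, b, scales)))
        return math.exp(-d)

    for dest, n_flights in flights_per_month.items():
        s = climate_similarity(climate["origin_hub"], climate[dest])
        print(f"{dest}: similarity={s:.2f}, flights={n_flights}, relative risk={s * n_flights:.0f}")

With these invented numbers the busier but climatically dissimilar route ends up with the lower score, which is the qualitative point: traffic volume alone does not determine risk.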
But exactly how native species wind up aboard an outbound passenger or freight aircraft is still being studied. Tatem said it can be a combination of goods, transport and people bringing things aboard either accidentally or knowingly.
“Some studies have shown that mosquitoes can fly on randomly, or they may get into baggage,” he said. “But some things, like plant pathogens, happen when people purposely bring fruit aboard, or they may bring in a plant that makes it through inspections, or they may just have seeds stuck in the soles of their shoes.”
These activities compound over the entire global system, threatening local economies, public health and native ecosystems. In 2007, a biological invasion was documented from a single invasive insect in a study conducted by York University biologists Amro Zayed and Laurence Packer. A different 2007 study by Andrew Liebhold, published in American Entomologist, examined records of US Department of Agriculture inspectors encountering invasive species in airline baggage. Liebhold, a research entomologist with the North-eastern Research Station of the US Forest Service, reported that infested fruit, mainly from the tropics, was the most commonly intercepted commodity, and that flies, cicadas, planthoppers, aphids and scale insects were the most commonly intercepted invasive insects.
Liebhold said Tatem’s study provided fascinating predictions about expected trends in the accidental transport of invasive species among continents.
“Unfortunately, unwitting air passengers have too frequently provided transport of plant pests and human diseases and this trend has increased with elevated intercontinental passenger traffic,” Liebhold said. “Hopefully, government agencies will pay attention to these results and utilise them to strengthen inspection activities at airports in order to protect the world from the devastating impacts of alien species on natural ecosystems as well as on human health.”
UN Reaches Landmark Agreement to Reduce Global Mercury Pollution
Obama Administration Reverses US Position, Takes Leadership Role in Negotiations
Representatives from more than 140 countries today committed to reduce global mercury pollution, which will help protect the world's citizens from the dangerous neurotoxin. This agreement was propelled by the United States' reversal in policy, which also influenced policy reversals of other countries, including China and India. The announcement is a historic step forward in the fight against mercury pollution, according to scientists and policy experts at the Natural Resources Defence Council (NRDC).
"This is great news for reducing mercury pollution around the world, and shows a commitment from the Obama Administration to international environmental issues," said Susan Egan Keane, policy analyst for NRDC. "The United States has taken a leadership role that will chart a new course on mercury protections around the world. We have set a strong example that is already influencing others to do the same."
The committed countries will reduce risks to human health and the environment from mercury by coordinating global cuts in the use and release of mercury into our air, water and land. The United Nations Environment Program Governing Council, which is meeting this week in Nairobi, Kenya, will now develop a legally binding treaty to be enacted by 2013. The treaty will include actions to reduce global mercury pollution and human exposure to the chemical, by reducing intentional use of mercury in industrial processes and products and reducing emissions from coal plants and smelters. It will also address the problems posed by mercury waste sites.
"Today we have won a momentous human health victory that will reduce illness and save lives both here and abroad," said Keane. "This globally coordinated plan will substantially reduce mercury contamination in fish, prevent the contamination of our water, and shield our children from a dangerous chemical."
Mercury is a dangerous neurotoxin and global pollutant that moves thousands of miles from its original source. It travels through air and water, accumulates in large predatory fish, and poisons people mainly through the consumption of contaminated fish, including tuna. It is especially dangerous for pregnant women, babies and small children, as it can gravely impede brain development.
Coal-fired power plants are the largest source of mercury air emissions worldwide, emitting 50 tons of mercury pollution every year in the US alone. As the price of oil has risen, coal has become a more economically attractive source of energy in countries where it is abundant and inexpensive. Currently, coal-fired power plants supply 75 per cent of China's energy; over the next eight years, China is expected to add more than 560 new coal plants, a pace of more than one new plant each week. Chemical manufacturing facilities in the European Union, India and China and small-scale gold mines in the developing world are also among the biggest mercury pollution sources.
NRDC has worked to enact mercury protections at the national and global levels for decades. Last year, NRDC successfully advocated for a new US ban on the export of mercury, working closely with members of Congress, including the bill's sponsor, then-Senator Obama.
World's Fisheries Face Climate Change Threat
A new international study has warned that millions of people dependent on fisheries in Africa, Asia and South America could face unprecedented hardship as a consequence of climate change.
Researchers examined the fisheries of 132 nations to determine which were the most vulnerable, based on the potential environmental impact of climate change, how dependent their economy and diet were on fisheries, and the capacity of the country to adapt.
Climate change can affect the temperature of inland lakes, the health of reefs and how nutrients circulate in the oceans, the researchers say.
They identified 33 countries as "highly vulnerable" to the effects of global warming on fisheries.
These countries produce 20 per cent of the world's fish exports, and 22 are already classified by the UN as "least developed". Inhabitants of vulnerable countries are also more dependent on fish for protein: 27 per cent of their dietary protein comes from fish, compared with 13 per cent in other countries. Two-thirds of the most vulnerable nations identified are in tropical Africa.
The study, led by the Malaysia-based WorldFish Centre, was published in Fish and Fisheries this month.
Using an approach developed from the Intergovernmental Panel on Climate Change's methods for assessing the vulnerability of nations to climate change as a whole, the authors determined that both coastal and landlocked African countries such as Guinea, Malawi, Senegal and Uganda; Asian countries including Bangladesh, Cambodia, Pakistan and Yemen; and Colombia and Peru in South America are among the most vulnerable.
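The framework the authors adapted combines three components: exposure to climate impacts, sensitivity (how dependent diet and economy are on fisheries) and adaptive capacity. The sketch below is a hypothetical illustration of that kind of index, not the study's actual calculation; the equal weighting and the country figures are invented for the example.

# Hypothetical sketch of an IPCC-style vulnerability index for fisheries,
# combining exposure, sensitivity and adaptive capacity. The weighting and
# the figures below are illustrative only, not values from the study.

def vulnerability(exposure, sensitivity, adaptive_capacity):
    """All inputs scaled 0-1; higher exposure/sensitivity and lower
    adaptive capacity raise the score."""
    return (exposure + sensitivity + (1.0 - adaptive_capacity)) / 3.0

countries = {
    # name: (exposure to climate impacts, dependence on fisheries, adaptive capacity)
    "Country A (fish-dependent, low-income)": (0.7, 0.8, 0.2),
    "Country B (coastal, diversified economy)": (0.6, 0.3, 0.8),
}

for name, (e, s, a) in countries.items():
    print(f"{name}: vulnerability {vulnerability(e, s, a):.2f}")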
The 33 countries should be a priority for climate change adaptation efforts and, more importantly, their fisheries should be maintained or enhanced to ensure they can make contributions to poverty reduction, say the authors.
Edward Allison, director of policy, economic and social science at WorldFish Centre and the paper's lead author, says that to ensure fisheries continue to support the poorest people policies should be implemented on two fronts, mitigation and adaptation.
But while mitigation can be valuable, because of relationships between emission reductions, energy saving and responsible fisheries, "the challenge of adaptation is both significant and potentially urgent", he says.
"Policy support for adaptation involves supporting measures to reduce exposure of fishing people to climate related risks, reducing dependence of peoples' livelihoods on climate sensitive resources, and supporting people's capacity to anticipate and cope with climate-related changes", he concludes.
Electricity Systems Can Cope With Large Scale Wind Power
Recent research proves that Dutch power stations are able to cope at any time in the future with variations in demand for electricity and supply of wind power, as long as use is made of up-to-date wind forecasts. PhD candidate Bart Ummels demonstrates that there is no need for energy storage facilities. Ummels will receive his PhD on this topic in February.
Wind is variable and can only partially be predicted. The large-scale use of wind power in the electricity system is therefore tricky. PhD candidate Bart Ummels investigated the consequences of using a substantial amount of wind power within the Dutch electricity system. He used simulation models, such as those developed by Dutch transmission system operator TenneT, to pinpoint potential problems (and solutions).
His results indicate that wind power requires greater flexibility from existing power stations. Sometimes larger reserves are needed, but more frequently power stations will have to decrease production in order to make room for wind-generated power. It is therefore essential to continually recalculate the commitment of power stations using the latest wind forecasts. This reduces potential forecast errors and enables wind power to be integrated more efficiently.
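The effect of recomputing schedules against fresher forecasts can be illustrated with a highly simplified merit-order dispatch. The sketch below is not Ummels' TenneT-based simulation; the demand, wind and plant figures are hypothetical and serve only to show conventional output being backed down when an updated forecast shows more wind.

# Simplified merit-order illustration (not Ummels' TenneT-based simulation) of
# why recomputing the dispatch against the latest wind forecast matters.
# All figures are hypothetical.

demand_mw = 14_000
day_ahead_wind_mw = 3_000   # forecast available when units were first scheduled
latest_wind_mw = 4_800      # updated short-term forecast

# Conventional plants listed cheapest-first: (name, capacity in MW)
plants = [("nuclear", 1_500), ("coal", 4_000), ("gas_ccgt", 6_000), ("gas_peakers", 3_000)]

def dispatch(residual_demand_mw):
    """Fill the demand left after wind with the cheapest plants first."""
    schedule, remaining = {}, residual_demand_mw
    for name, capacity in plants:
        output = min(capacity, remaining)
        schedule[name] = output
        remaining -= output
    return schedule

print("Scheduled on day-ahead wind:", dispatch(demand_mw - day_ahead_wind_mw))
print("Rescheduled on latest wind: ", dispatch(demand_mw - latest_wind_mw))
# Comparing the two schedules shows conventional output being backed down by
# 1,800 MW (here all of it from the gas plant) to make room for the extra wind.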
Ummels looked at wind power capacities of up to 12 GW, 8 GW of which offshore, enough to meet about one third of the Netherlands' demand for electricity. Dutch power stations are able to cope at any time in the future with variations in demand for electricity and supply of wind power, as long as use is made of up-to-date, improved wind forecasts. It is TenneT's task to integrate large-scale wind power into the electricity grid. Lex Hartman, TenneT's Director of Corporate Development, says, “In a joint effort, TU Delft and TenneT further developed the simulation model that can be used to study the integration of large-scale wind power. The results show that in the Netherlands we can integrate between 4 GW and 10 GW into the grid without needing any additional measures.”
Surpluses
Instead of the common question 'What do we do when the wind isn't blowing?', the more relevant question is 'Where do we put all the electricity if it is very windy at night?'. This is because, for instance, a coal-fired power station cannot simply be turned off. One solution is provided by the international trade in electricity, because other countries can often use the surplus. Moreover, a broadening of the 'opening hours' of the international electricity market benefits wind power. At the moment, utilities determine one day ahead how much electricity they intend to purchase or sell abroad. Wind power can be better used if the time difference between the trade and the wind forecast is smaller.
No energy storage
Ummels' research also demonstrates that energy storage is not required. The results indicate that the international electricity market is a promising and cheaper solution for the use of wind power.
Making power stations more flexible is also better than storage. The use of heating boilers, for instance, means that combined heat and power plants operate more flexibly, which can consequently free up capacity for wind power at night.
The use of wind power in the Dutch electricity system could lead to a reduction in production costs of EUR1.5 billion annually and a reduction in CO2 emissions of 19 million tons a year.
Stimulus Funds to Recharge Electric Infrastructure
The state is vetting hundreds of conservation and renewable energy projects as it readies for President Obama's stimulus money, including a long-term plan to modernise its aging electric grid.
"The future of the grid is a priority,'' said state Secretary of Energy and Environmental Affairs Ira Bowles about the need to upgrade electric infrastructure to boost alternative energy production.
But Massachusetts may not be ready for several years to fully implement one key stimulus initiative. The USD787 billion package has USD37.5 billion aimed at energy investments, including USD11 billion for electric grid updates and so-called “smart” meters.
Obama has called for the installation of 40 million smart meters in American homes. Advocates say that could be a key to long-term conservation and efficiency.
Smart meters, which enable two-way radio communication between the utility and monitors in the home, offer consumers real-time information on their energy usage.
The goal is behaviour modification, promoting energy conscious behaviour by matching usage with pricing options. Cell phone users, for example, can already track costs per call in real time.
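A simple, hypothetical example shows how matching usage to pricing works in practice: the same day's consumption is billed under a flat tariff and under a time-of-use tariff. The rates and the usage profile below are invented for illustration and do not describe any utility's actual pricing.

# Hypothetical illustration of the kind of price signal a smart meter enables:
# the same daily usage billed under a flat tariff versus a time-of-use tariff.
# Tariff rates and the usage profile below are made up for illustration.

hourly_usage_kwh = [0.8] * 7 + [1.5] * 3 + [1.0] * 7 + [2.5] * 4 + [1.2] * 3  # 24 hours

flat_rate = 0.15  # USD per kWh

def time_of_use_rate(hour):
    """Cheap overnight, expensive during the late-afternoon peak."""
    if 17 <= hour < 21:
        return 0.30
    if hour < 7:
        return 0.08
    return 0.15

flat_bill = sum(hourly_usage_kwh) * flat_rate
tou_bill = sum(u * time_of_use_rate(h) for h, u in enumerate(hourly_usage_kwh))
print(f"Flat tariff:        USD{flat_bill:.2f}")
print(f"Time-of-use tariff: USD{tou_bill:.2f}")
# Shifting the 17:00-21:00 peak usage to cheaper hours is where the savings come from.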
Smart meters “are the kind of technology that does lend itself to major federal intervention,” said Bowles. He said the systems need to be researched before utilities, consumers and regulators can go forward in Massachusetts.
Nstar is launching a pilot program in 2010, providing smart meters to 2,750 volunteer customers at no cost to them. The pilot program was mandated by the Legislature in 2008 as part of the Green Communities Act.
"The program will give Nstara chance to see what kind of reaction we get from residential customers when we ask them to conserve on some of the hottest days of the year when the overall electricity grid is pushed to extremes," said Nstar spokeswoman Caroline Allen. The grid is the interconnected network for delivering electricity from suppliers to consumers.
Like Bowles, Allen believes the major financial push from Washington for smart meter installation could be a tipping point.
"Anything that defrays the costs would be beneficial in getting new technologies in place," Allen said. "Consistent with other policies like renewable energy and energy efficiency, investments in the grid and emerging smart grid programs will only help move things along."
Massachusetts' list of potential energy projects will contain plenty of "low-hanging fruit," such as weatherisation and public building conservation, that can be started quickly and help create jobs in the conservation and renewable energy sectors, Bowles said.
The immediate focus "is on energy efficiency and more renewable energy," he said. But as the state makes long-term plans for transforming the grid and installing smart meters, it will look to models in other states. For example, in Texas, electricity distributor Oncor has embarked on a USD690 million project to install three million smart meters within four years.
Oncor households will pay for the program at USD2.22 a month over the next 11 years. If used properly, the system will generate conservation savings from six per cent to 15 per cent annually by allowing consumers access to near real-time information about usage via a monitor in the home or the Internet, Oncor spokesman Jim Greer said.
“Advanced metering systems are fundamental for our future and critical for the state as an economic engine,” Greer said. The company estimates conservation savings could negate the need for two or three CO2-emitting power plants to meet expected increases in demand.
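As a rough check of the figures quoted above, the arithmetic below totals the per-household fee over the stated period and compares it with the quoted project cost; any difference is not explained here and may reflect financing, installation or operating costs.

# Quick check of the Oncor figures quoted above (project cost, per-household fee,
# meter count). Any gap between collections and the quoted project cost is not
# explained here; it may reflect financing, installation or operating costs.

meters = 3_000_000
monthly_fee_usd = 2.22
years = 11
project_cost_usd = 690_000_000

total_collected = meters * monthly_fee_usd * 12 * years
print(f"Total collected over {years} years: USD{total_collected:,.0f}")
print(f"Quoted project cost:               USD{project_cost_usd:,.0f}")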
Some three million more smart meters are expected to be installed by other Texas utility companies by 2015. A Bay State smart meter expert, who develops the software code for the devices being used across the country, said the future has at last arrived.
"Electric meters have been undergoing a transformation over the past few decades, but this transformation has now increased in size and scope," said Skip Ashton of Boston-based Ember, which develops radio chips, networking software and application code used in smart meters. "Installation has been slow because utilities do not rush into new technologies."
"The future of the grid is a priority,'' said state Secretary of Energy and Environmental Affairs Ira Bowles about the need to upgrade electric infrastructure to boost alternative energy production.
But Massachusetts may not be ready for several years to fully implement one key stimulus initiative. The USD787 billion package has USD37.5 billion aimed at energy investments, including USD11 billion for electric grid updates and so-called “smart” meters.
Obama has called for the installation of 40 million smart meters in American homes. Advocates say that could be a key to long-term conservation and efficiency.
Smart meters, which enable two way radio communication between the utility and monitors in the home, offer consumers real time info on their energy usage.
The goal is behaviour modification, promoting energy conscious behaviour by matching usage with pricing options. Cell phone users, for example, can already track costs per call in real time.
Smart meters “are the kind of technology that does lend itself to major federal intervention,” said Bowles. He said the systems need to be researched before utilities, consumers and regulators can go forward in Massachusetts.
Nstar is launching a pilot program in 2010, providing smart meters to 2,750 volunteer customers at no cost to them. The pilot program was mandated by the Legislature in 2008 as part of the Green Communities Act.
"The program will give Nstara chance to see what kind of reaction we get from residential customers when we ask them to conserve on some of the hottest days of the year when the overall electricity grid is pushed to extremes," said Nstar spokeswoman Caroline Allen. The grid is the interconnected network for delivering electricity from suppliers to consumers.
Like Bowles, Allen believes the major financial push from Washington for smart meter installation could be a tipping point.
"Anything that defrays the costs would be beneficial in getting new technologies in place," Allen said. "Consistent with other policies like renewable energy and energy efficiency, investments in the grid and emerging smart grid programs will only help move things along."
Massachusetts' list of potential energy projects will contain plenty of "low-hanging fruit," such as weatherisation and public building conservation, that can be started quickly and help create jobs in the conservation and renewable energy sectors, Bowles said.
The immediate focus "is on energy efficiency and more renewable energy," he said. But as the state makes long-term plans for transforming the grid and installing smart meters, it will look to models in other states. For example, in Texas, electricity distributor Oncor has embarked on a USD690 million project to install three million smart meters within four years.
Oncor households will pay for the program at USD2.22 a month over the next 11 years. If used properly, the system will generate conservation savings from six per cent to 15 per cent annually by allowing consumers access to near real-time information about usage via a monitor in the home or the Internet, Oncor spokesman Jim Greer said.
“Advanced metering systems are fundamental for our future and critical for the state as an economic engine,” Greer said. The company estimates conservation savings could negate the need for two or three CO2-emitting power plants to meet expected increases in demand.
Some three million more smart meters are expected to be installed by other Texas utility companies by 2015. A Bay State smart meter expert, who develops the software code for the devices being used across the country, said the future has at last arrived.
"Electric meters have been undergoing a transformation over the past few decades, but this transformation has now increased in size and scope," said Skip Ashton of Boston-based Ember, which develops radio chips, networking software and application code used in smart meters. "Installation has been slow because utilities do not rush into new technologies."
Climate Change Is Not Taken Seriously Because Media Is Not Highlighting Its Significance
Climate change will not be taken seriously until the media highlights its significance, say researchers at the University of Liverpool.
Dr Neil Gavin, from the School of Politics and Communication Studies, believes the way the media handles issues like climate change shapes the public’s perception of its importance. Limited coverage is unlikely to convince readers that climate change is a serious problem that warrants immediate and decisive action.
Researchers found that the total number of articles on climate change printed over three years was fewer than one month’s worth of articles featuring health issues. The articles offered mixed messages about the seriousness and imminence of problems facing the environment.
Dr Gavin explains, “Our research suggests that the media is not treating these issues with the seriousness that scientists would say they deserve. The research company Ipsos-MORI found that 50 per cent of people think the jury is still out on the causes of global warming. The limited amount of media coverage, which tends to be restricted to the broadsheets, means that this statistic is unlikely to alter in the short-term.
“Climate change, therefore, may not be high enough on the media agenda to stimulate the sort of public concern that prompts concerted political action. The media may well continue to focus its attention on health, the economy or crime, thereby drawing public attention away from the issue of climate change.”
“This is more likely when resources are stretched, government popularity is on the wane, or where more pressing, non-climate-related issues force the government to direct expenditure or invest its political capital and energy elsewhere.”
“Even if the British Government wanted to push climate change further up the media agenda, it is not necessarily in a position to shape the debate that takes place in the media,” he added.
Global 100 Most Sustainable Corporations list unveiled in Davos
Corporate Knights Inc. and Innovest Strategic Value Advisors today announced the fifth annual Global 100 list of the most sustainable large corporations in the world.
The Global 100 includes companies from 15 countries encompassing all sectors of the economy that were evaluated according to how effectively they manage environmental, social and governance risks and opportunities, relative to their industry peers.
Five Canadian companies are included in this year’s list of the 100 most sustainable large companies in the world, including EnCana Corp and Royal Bank of Canada, both of whom are members of the EXCEL Partnership, a GLOBE Foundation initiative uniting leading Canadian companies committed to integrating environmental, economical and social performance with their business strategies.
Other Canadian corporations on the top 100 list are Toronto-Dominion Bank, Telus Corp., and TransCanada Corp.
"I’m proud that RBC and our employees continue to be recognised for our efforts to do business in a sustainable and responsible manner," said Gordon M. Nixon, President and Chief Executive Officer, RBC. "RBC is committed to doing better for our clients, our investors, our employees and our communities, through a focused approach to corporate responsibility, a disciplined strategy, sound risk management, strong balance sheet, and a diversified business mix."
The United States led the way with 20 Global 100 companies (four more than in 2008). The United Kingdom followed with 19 (down from 24 in 2008) and Japan improved by two on its 2008 tally with a total of 15 companies qualifying in 2009. Rounding out the top five countries with the most constituents were France (eight) and Germany (seven), while Canada, Finland, and Sweden each registered five Global 100 constituents. Two-thirds (65/100) of the 2008 companies remained on the list in 2009.
As the Global 100 companies are meant to isolate those firms best equipped to thrive in the long-term because of their holistic approach to managing stakeholder relationships, this year for the first time, The Global 100 traced back all 100 constituents to their year of origin to see what kind of longevity they had demonstrated to date.
The average age of the 2009 Global 100 companies was 102 years, ranging from Stora Enso OYJ (founded in 1122) to Telus Corporation (founded in 1999). In all, 46 of the 2009 Global 100 companies have been in existence for at least 100 years.
Toby Heaps, Editor of Corporate Knights magazine, says, "While markets go up and down, companies like the Global 100 members that prudently take care of the interests of all their stakeholders, offer the best bet for society and investors in the long-term."
From its inception in February 2005, the Global 100 Most Sustainable Corporations has outperformed its benchmark (the MSCI World Index) by 480 basis points per annum to end of year 2008.
Matthew Kiernan, CEO of GLOBE Award winner Innovest, a New York-based investment advisory firm, whose analysis underpins the list, notes: "The continuing out-performance of the Global 100, even in the midst of the current global financial crisis, provides eloquent testimony, and yet more evidence, for investors, company executives, governments and civil society alike. Superior positioning and performance on environmental, social, and governance issues does provide a valuable leading indicator of better-managed, more agile, ‘future-proof’ companies."
"And we expect this ‘sustainability premium’ to become even larger in the coming years," he added.
This year’s Global 100 were recognised at the Davos World Economic Forum at a private dinner hosted by Corporate Knights and Innovest. The dinner discussion explored the question of what will be the next motor to power the global economy, and how investors and policy makers can best rev it up. The dinner featured remarks from billionaire investor George Soros, Chairman of Soros Fund Management, Lord Nicholas Stern, and Nobel laureate economist Joseph Stiglitz.
Costa Rican Hotels Improved Sustainability with the Rainforest Alliance
Hotels that signed on with the Rainforest Alliance to execute sustainable tourism practices in environmental, social and managerial processes improved their compliance with baseline criteria set by the Sustainable Tourism Certification Network of the Americas, according to a new study released by the Rainforest Alliance.
In a period of 18 months, five hotels in the Sarapiquí region in northern Costa Rica increased overall compliance to 7.8 from 4.5 on a scale of one to ten, with one being non-compliance and ten being full compliance. Criteria cover environmental aspects such as wastewater treatment and wildlife protection, social aspects such as worker safety and community interaction, and business aspects such as profitability and quality of services.
Collectively, hotels increased compliance in all criteria categories, showing their dedication to advancing sustainability, even if they were only able to afford small improvements at first. Areas with the highest level of improvement were “Socio-cultural Activities” (for example, supporting local artisans or hiring local people), “Monitoring and Corrective Action” (for example, monitoring water consumption or writing hotel management policies) and “Solid Waste.”
“These hotels improved interactions with their environment, communities, guests and staff and set themselves apart in a competitive global marketplace,” said Ronald Sanabria, director of Rainforest Alliance’s sustainable tourism program. “As a result, the Sarapiquí region is seen as a major destination for sustainable tourism.”
Hacienda Pozo Azul spent USD6,000 to implement its sustainable program, which included buying an anaerobic digester to process wastewater and organic wastes, and the small hotel has so far seen a savings of USD150 per month on reduced electricity bills alone. Other hotels included in the study were Ara Ambigua, La Quinta Country Inn, Organisation for Tropical Studies-La Selva and Selva Verde Lodge and Rainforest Reserve. The hotels are continuing to improve their compliance ratings, and the Rainforest Alliance is collecting data on 258 other hotels completing the same program.
The Rainforest Alliance’s Best Management Practices (BMP) program provides tourism operations with dynamic workshops and seminars, training materials, technical assistance and diagnostic evaluations.
The Rainforest Alliance recently partnered with the United Nations Environment Programme (UNEP), United Nations Foundation and the United Nations World Tourism Organisation (UNWTO) to establish the Global Sustainable Tourism Criteria, which are universal guidelines for sustainable tourism.
The Rainforest Alliance works to conserve biodiversity and ensure sustainable livelihoods by transforming land-use practices, business practices and consumer behaviour.
For more information, visit www.rainforest-alliance.org.
Restructuring the US Transport System: The Potential of High-Speed Rail
"Aside from the overriding need to stabilise atmospheric carbon dioxide (CO2) levels to stabilise climate, there are several other reasons for countries to restructure their transport systems, including the need to prepare for falling oil production, to alleviate traffic congestion, and to reduce air pollution," says Lester R. Brown, President of Earth Policy Institute, in a, recent release. "The US car-cantered transportation model that much of the world aspires to will not likely be viable over the long term even for the United States, much less for everywhere else."
The shape of future transportation systems centres around the changing role of the automobile. This in turn is being influenced by the transition from a predominantly rural global society to a largely urban one. By 2020 close to 55 per cent of us will be living in cities, where the role of cars is diminishing.
With world oil output close to peaking, there will not be enough economically recoverable oil to support a world fleet expansion along US lines or, indeed, to sustain the US fleet. Oil shocks are now a major security risk. The United States, where 88 per cent of the 133 million working people travel to work by car, is dangerously vulnerable.
While the future of transportation in cities lies with a mix of light rail, buses, bicycles, cars, and walking, the future of intercity travel over distances of 500 miles or less belongs to high-speed trains. Japan, with its high-speed bullet trains, has pioneered this mode of travel. Operating at speeds up to 190 miles per hour, Japan’s bullet trains carry almost a million passengers a day.
While the first European high-speed line, from Paris to Lyon, did not begin operation until 1981, Europe has made strides since then. As of early 2007 there were 3,034 miles of high-speed rail operating in Europe, with 1,711 more miles to be added by 2010. The goal is to have a Europe-wide high-speed rail system integrating the new eastern countries into a continental network by 2020.
Carbon dioxide emissions per passenger mile on Europe’s high-speed trains are one third those of its cars and only one fourth those of its planes. In the Plan B economy, CO2 emissions from trains will essentially be zero, since they will be powered by green electricity. In addition to being comfortable and convenient, these rail links reduce air pollution, congestion, noise, and accidents.
In the United States, the threat of climate change and the insecurity of oil supplies both argue for the construction of a high-speed electrified rail system, for both passenger and freight traffic. The relatively small amount of additional electricity needed could come from renewable sources, mainly wind farms.
Any meaningful global effort to cut transport CO2 emissions begins with the United States, which consumes more gasoline than the next 20 countries combined. Three initiatives are needed. One is phasing in a gasoline tax for the next 12 years and offsetting it with a reduction in income taxes. This would raise the US gasoline tax to that prevailing today in Europe. Combined with the rising price of gas itself, such a tax should encourage a shift to more fuel-efficient cars. The second measure is raising the fuel-efficiency standard to 45 miles per gallon by 2020, a larger increase than the 35 miles per gallon approved by Congress in late 2007. Third, reaching CO2 reduction goals depends on a heavy shift of transportation funds from highway construction to urban transit and intercity rail construction.
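To put the proposed fuel-economy standards in perspective, a back-of-envelope comparison (assuming a hypothetical 12,000 miles driven per year) shows the difference between the 35 and 45 miles-per-gallon targets:

# Hypothetical back-of-envelope comparison of the two fuel-economy standards
# mentioned above (35 mpg vs 45 mpg), assuming 12,000 miles driven per year.

annual_miles = 12_000
for mpg in (35, 45):
    gallons = annual_miles / mpg
    print(f"{mpg} mpg: {gallons:,.0f} gallons per year")
# Under this assumption the 45 mpg standard uses roughly 76 fewer gallons
# per car per year than the 35 mpg standard.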
Obama Starts Delivering on Environmental Promises
Not in recent memory has the word of a politician meant so much to so many. Barack Obama said repeatedly that his plan to stimulate the economy would be directly tied to cleaning up the environment of the United States. True to his word, it looks like that is exactly what he intends to do.
But can he do it? Two of the tougher questions are:
1. Can the US really turn around its past reluctance to embrace environmental issues?
2. Will the infamous US federal bureaucracy and political partisanship hold back needed reforms?
Since his inauguration he has done several things that show that he will be true to his word.
He signed an executive order directing the Environmental Protection Agency (EPA) to re-examine whether states should be allowed to impose their own tougher auto emission standards rather than a less onerous national standard. California sought to impose its own standard which was tougher than that imposed by previous administrations, and 13 other states agreed to adopt similar legislation.
He has also announced a task force whose first order of business is to find ways to create new jobs that pay well, reduce pollution, and lessen America’s reliance on foreign oil. By embracing the path to a greener economy, Obama is placing America in a position where it can be a leader in the environmental movement rather than a not-so-innocent bystander.
Most recently, he has ordered new and faster efforts to make appliances more energy efficient. The thinking is that more energy-efficient appliances such as stoves, lamps, dishwashers, washers and dryers will save consumers money and help stimulate the economy. While the appliances may be more expensive in the short term, the White House estimates that the tighter standards could save consumers more than USD500 billion over the next 30 years.
President Obama said in his first days as President that he would strive to have renewable resources generate the equivalent of 10 per cent of the United States' electricity usage by 2012. This is a lofty goal and one that has the support of the environmental community.
It may be early but if President Obama continues with this path and along the way gets the support of the environmental movement it will be very hard to beat him in 4 years no matter who the opposing candidate is. More importantly, future presidential candidates will have no choice but to see what kind of power the environmental movement can bring to them.
In the interim we would do well to pay attention to the message that President Obama is trying to deliver. More pointedly, who is the US learning from – Europe, Asia or even Canada?
A question we all might ask of ourselves is "What am I willing or able to do to help him?"
Team North Builds an Innovative Solar Powered Home
North House, an advanced solar-powered home being developed by a team of students from Ontario and British Columbia, will demonstrate Canada’s commitment to sustainability and promote alternative energy sources, its university organisers say.
Team North, which involves students and faculty at the University of Waterloo, Ryerson University and Simon Fraser University, along with industry partners, is one of only two Canadian entries selected to participate in the prestigious 2009 Solar Decathlon competition, sponsored by the US Department of Energy and the National Renewable Energy Laboratory.
The decathlon will be held from October 9-18 on the National Mall in Washington, D.C., drawing 20 university teams with prototype solar homes from around the world. The teams will build a full-scale house to compete in 10 categories measuring the quality and performance of a solar-powered home.
Team North is developing North House as a marketable and interactive solar-powered home for people with active lifestyles. The team aims to combine green building, solar and interactive technologies in order to reduce energy demand, foster a conservation ethic, and boost the quality of life for all Canadians.
"North House will offer powerful solutions by using energy more efficiently and using energy from renewable sources," said Maun Demchenko, Team North’s director of public relations. "North House will serve as a vehicle for teaching the public about solar technologies and how they can be used in new and existing housing. It will showcase innovative and sustainable green construction building practices in Canada on a world stage."
North House, deploying the latest in energy-efficient technologies and materials, will demonstrate how design can promote lifestyles that reduce energy use while maintaining a high quality of life.
Such an integrated approach to new construction draws on the interactions of all building components and systems to create a more comfortable building, save energy, and reduce environmental impact.
In 2007 a team of Montreal-area architecture and engineering students competed in the Solar Decathlon with a design entry entitled "Lumen-Essence". The only Canadian entry in the competition that year, Team Montreal placed 8th overall out of 20 teams from around the world.
This year’s competition features two Canadian entries. In addition to the North House entry, the Alberta Solar Decathlon Team from the University of Calgary is developing an energy-efficient design that incorporates traditional Western Canadian wooden post-and-beam structure, a south-facing deck, a rooftop patio accessible from inside the home, and integrated photovoltaic cells incorporated into the roof, clerestory windows, roof balcony railing and solar louvers to maximise solar collection.
Honda Civic GX Is the Greenest Vehicle of 2009
Recognising Honda’s dedication to fuel-efficient and alternative fuel technologies, three Honda vehicles earned recognition from the American Council for an Energy-Efficient Economy (ACEEE) as among the “greenest vehicles of 2009”, with the Civic GX natural gas car taking the title of greenest vehicle for the sixth consecutive year. In ACEEE’s 12th annual “Green Book® Online” ranking of environmentally responsible vehicles (available at www.greenercars.org), the natural gas-powered Civic GX ranked first, with the gasoline Fit and Civic Hybrid joining the list of the 12 most environment-friendly vehicles available to the public. This is the ninth year in a row that a Honda vehicle has received the number one ranking.
“Honda is proud to receive this recognition from ACEEE and will continue to look for new ways to reduce our environmental impact,” said Dan Bonawitz, executive vice president of American Honda. "As part of our philosophy to be a company that society wants to exist, Honda is deeply committed to developing a wide range of both fuel-efficient and alternative fuel vehicles available to the public.”
Using a single measure that incorporates fuel economy, health-related pollution impacts and global warming emissions, all vehicles are analysed and given a "Green Score." This score is used in ACEEE’s ranking system, resulting in a ranking of each vehicle’s total environmental performance, including a list of the 12 "greenest" and 12 "meanest" vehicles. The Civic GX, first introduced in 1998, is the cleanest internal combustion vehicle certified by the Environmental Protection Agency, and is 90 per cent cleaner than the average gasoline-powered car on the road today.
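ACEEE's exact weighting is not given here, so the sketch below only illustrates the general idea of such a composite score: combine fuel economy, health-related pollution and greenhouse-gas emissions into a single figure of merit. The weights, normalisation constants and vehicle figures are all hypothetical, not ACEEE's actual methodology or data.

```python
# Hypothetical illustration of a composite "green score" in the spirit of a
# ranking that blends fuel economy, health-related pollution and CO2 emissions.
# Weights and example figures are invented for demonstration only.

def green_score(mpg: float, pollution_index: float, co2_g_per_mile: float) -> float:
    """Higher is greener. Each term is normalised to a rough 0-1 scale."""
    fuel_term = min(mpg / 60.0, 1.0)                        # reward fuel economy
    pollution_term = 1.0 - min(pollution_index, 1.0)        # penalise health-related pollution
    climate_term = 1.0 - min(co2_g_per_mile / 500.0, 1.0)   # penalise CO2 per mile
    # Equal weights, chosen arbitrarily for this sketch.
    return round(100 * (fuel_term + pollution_term + climate_term) / 3, 1)

# Hypothetical example vehicles (all figures made up):
print(green_score(mpg=36, pollution_index=0.1, co2_g_per_mile=220))  # natural-gas compact
print(green_score(mpg=42, pollution_index=0.2, co2_g_per_mile=200))  # gasoline hybrid
print(green_score(mpg=15, pollution_index=0.8, co2_g_per_mile=550))  # large SUV
```

A real scoring system would also weight lifecycle factors and verified emissions data; the point here is simply how one number can rank vehicles whose strengths lie on different dimensions.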
Honda has a long history of environmental leadership including the introduction of America’s first hybrid, the Honda Insight, delivery of the first fuel cell vehicle in the US, and the first vehicles to meet stricter emissions standards, including:
• The first gasoline Low Emissions Vehicle (LEV), the 1996 Honda Civic.
• The first gasoline Ultra-Low Emissions vehicle (ULEV), the 1998 Honda Accord.
• The first gasoline Super Ultra-Low Emissions Vehicle (SULEV), the 2000 Honda Accord.
• The first Advanced Technology Partial-Zero Emissions Vehicle (AT-PZEV), the 2001 Civic GX natural gas vehicle.
The American Council for an Energy-Efficient Economy is a non-profit organisation dedicated to advancing energy efficiency as a means of promoting both economic prosperity and environmental protection.