Tag Archive: Carbon Dioxide


File:Boomstronken.jpg

Description: Tree stumps (Boomstronken); photo by Fruggo, June 2003.

Attribution: Fruggo from nl

Creative Commons Attribution-Share Alike 3.0 Unported

…..

New Research Shows Tree Roots Regulate CO2, Keep Climate Stable

Climate News Network | February 19, 2014 8:30 am

The argument, put forward by a team from Oxford and Sheffield Universities in the journal Geophysical Research Letters, begins with temperature. Warmer climates mean more vigorous tree growth and more leaf litter, and more organic content in the soil. So the tree’s roots grow more vigorously, said Dr. Christopher Doughty of Oxford and colleagues.

They get into the bedrock, and break up the rock into its constituent minerals. Once that happens, the rock starts to weather, combining with carbon dioxide. This weathering draws carbon dioxide out of the atmosphere, and in the process cools the planet down a little. So mountain ecosystems—mountain forests are usually wet and on conspicuous layers of rock—are in effect part of the global thermostat, preventing catastrophic overheating.

The tree is more than just a sink for carbon: it is an agent of chemical weathering that removes carbon from the air and locks it up in carbonate rock.

That mountain weathering and forest growth are part of the climate system has never been in much doubt: the questions have always been about how big a forest’s role might be, and how to calculate its contribution.

Keeping climate stable

U.S. scientists recently studied the rainy slopes of New Zealand’s Southern Alps to begin to put a value on mountain ecosystem processes. Dr. Doughty and his colleagues measured tree roots in the tropical rain forests of Peru, from the Amazon lowlands up to 3,000 meters of altitude in the higher Andes.

They measured root growth to 30 cm below the surface every three months over a period of years. They recorded the thickness of the soil’s organic layer, matched their observations with local temperatures, and began to calculate the rate at which tree roots might turn Andean granite into soil.

Then they scaled up the process, and extended it through long periods of time. Their conclusion: that forests served to moderate temperatures in a much hotter world 65 million years ago.

Read More Here


ENS

Nuclear power plants world-wide, in operation, as of 18 January 2013

Number of reactors in operation, worldwide

…..

WashingtonsBlog

Former NRC Commissioner: Trying To Solve Global Warming By Building Nuclear Power Plants Is Like Trying To Solve Global Hunger By Serving Everyone Caviar

And Nuclear Pumps Out a Lot of Carbon Dioxide

It is well-documented that nuclear energy is very expensive and bad for the environment.

Former U.S. Nuclear Regulatory Commissioner Peter Bradford notes:

If asked whether we should increase our reliance on caviar to fight world hunger, most people would laugh. Relying on an overly expensive commodity to perform an essential task spends too much money for too little benefit, while foreclosing more-promising approaches.

That is nuclear power’s fundamental flaw in the search for plentiful energy without climate repercussions, though reactors are also more dangerous than caviar unless you’re a sturgeon.

***

Nuclear power is so much more expensive than alternative ways of providing energy that the world can only increase its nuclear reliance through massive government subsidy—like the $8 billion loan guarantee offered by the federal government to a two-reactor project in Georgia approved by the Nuclear Regulatory Commission earlier this year.

***

Many more such direct government subsidies will be needed to scale up nuclear power to any great extent.

***

John Rowe, former chief executive of Exelon Corp., an energy company that relies heavily on nuclear power, recently said, “At today’s [natural] gas prices, a new nuclear power plant is out of the money by a factor of two.” He added, “It’s not something where you can go sharpen the pencil and play. It’s economically wrong.” His successor, Christopher Crane, recently said gas prices would have to increase roughly fivefold for nuclear to be competitive in the U.S.

***

Countries that choose power supplies through democratic, transparent and market-based methods aren’t building new reactors.

Indeed, nuclear is not only crazily expensive, but it also pumps out a huge amount of carbon dioxide during construction, and crowds out development of clean energy.

Nuclear may also provide a lower return on energy invested than renewable forms of alternative energy. In other words, it might take more energy to create nuclear energy than other forms of power … which is worse for the environment.

Read More Here

…..

ENS

Number of reactors in operation, worldwide, 2013-01-18 (IAEA 2013, modified)

Nuclear Power Plants July 2012

…..

ENS

Number of reactors under construction

Number of reactors under construction, 2013-01-18 (IAEA 2013, modified)

…..

ENS

Nuclear share in electricity generation
Nuclear share in electricity generation, 2011 (IAEA 2012, modified)

…..

WashingtonsBlog

Nuclear Power Is Expensive and Bad for the Environment … It’s Being Pushed Because It Is Good For Making Bombs

Since the 1980s, the U.S. Has Secretly Helped Japan Build Up Its Nuclear Weapons Program … Pretending It Was “Nuclear Energy” and “Space Exploration” …

As demonstrated below, nuclear energy is expensive and bad for the environment.

The real reason it is being pushed is because it is good for helping countries like Japan and the U.S. build nuclear weapons.

Nuclear Energy Is Expensive

Forbes points out:

Nuclear power is no longer an economically viable source of new energy in the United States, the freshly-retired CEO of Exelon, America’s largest producer of nuclear power [who also served on the president’s Blue Ribbon Commission on America’s Nuclear Future], said in Chicago Thursday.

And it won’t become economically viable, he said, for the foreseeable future.

***

“I’m the nuclear guy,” Rowe said. “And you won’t get better results with nuclear. It just isn’t economic, and it’s not economic within a foreseeable time frame.”

U.S. News and World Report notes:

After the Fukushima power plant disaster in Japan last year, the rising costs of nuclear energy could deliver a knockout punch to its future use in the United States, according to a researcher at the Vermont Law School Institute for Energy and the Environment.

“From my point of view, the fundamental nature of [nuclear] technology suggests that the future will be as clouded as the past,” says Mark Cooper, the author of the report. New safety regulations enacted or being considered by the U.S. Nuclear Regulatory Commission would push the cost of nuclear energy too high to be economically competitive.

The disaster insurance for nuclear power plants in the United States is currently underwritten by the federal government, Cooper says. Without that safeguard, “nuclear power is neither affordable nor worth the risk. If the owners and operators of nuclear reactors had to face the full liability of a Fukushima-style nuclear accident or go head-to-head with alternatives in a truly competitive marketplace, unfettered by subsidies, no one would have built a nuclear reactor in the past, no one would build one today, and anyone who owns a reactor would exit the nuclear business as quickly as possible.”

Alternet reports:

An authoritative study by the investment bank Lazard Ltd. found that wind beat nuclear and that nuclear essentially tied with solar. But wind and solar, being simple and safe, are coming on line faster. Another advantage wind and solar have is that capacity can be added bit by bit; a wind farm can have more or less turbines without scuttling the whole project. As economies of scale are created within the alternative energy supply chains and the construction process becomes more efficient, prices continue to drop. Meanwhile, the cost of stalled nukes moves upward.

AP noted last year:

Nuclear power is a viable source for cheap energy only if it goes uninsured.

***

Governments that use nuclear energy are torn between the benefit of low-cost electricity and the risk of a nuclear catastrophe, which could total trillions of dollars and even bankrupt a country.

The bottom line is that it’s a gamble: Governments are hoping to dodge a one-off disaster while they accumulate small gains over the long-term.

The cost of a worst-case nuclear accident at a plant in Germany, for example, has been estimated to total as much as €7.6 trillion ($11 trillion), while the mandatory reactor insurance is only €2.5 billion.

“The €2.5 billion will be just enough to buy the stamps for the letters of condolence,” said Olav Hohmeyer, an economist at the University of Flensburg who is also a member of the German government’s environmental advisory body.

The situation in the U.S., Japan, China, France and other countries is similar.

***

“Around the globe, nuclear risks — be it damages to power plants or the liability risks resulting from radiation accidents — are covered by the state. The private insurance industry is barely liable,” said Torsten Jeworrek, a board member at Munich Re, one of the world’s biggest reinsurance companies.

***

In financial terms, nuclear incidents can be so devastating that the cost of full insurance would be so high as to make nuclear energy more expensive than fossil fuels.

***

Ultimately, the decision to keep insurance on nuclear plants to a minimum is a way of supporting the industry.

“Capping the insurance was a clear decision to provide a non-negligible subsidy to the technology,” Klaus Toepfer, a former German environment minister and longtime head of the United Nations Environment Programme (UNEP), said.

See this and this.
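The insurance-gap figures quoted above make the scale of the implicit subsidy easy to see. A one-line calculation, using only the two numbers from the AP excerpt, shows how little of the worst case is actually covered:

```python
# Scale of the insurance gap quoted above: estimated worst-case accident cost
# in Germany vs. the mandatory reactor insurance (figures from the article).
worst_case_eur = 7.6e12       # EUR 7.6 trillion worst-case accident estimate
mandatory_cover_eur = 2.5e9   # EUR 2.5 billion mandatory insurance

coverage = mandatory_cover_eur / worst_case_eur
gap = worst_case_eur / mandatory_cover_eur

print(f"mandatory cover is {coverage:.4%} of the worst case, a {gap:,.0f}x gap")
```

In other words, the required insurance covers about three hundredths of one percent of the estimated worst case, a gap of roughly three thousand to one.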

This is an ongoing battle, not ancient history. As Harvey Wasserman reports:

The only two US reactor projects now technically under construction are on the brink of death for financial reasons.

If they go under, there will almost certainly be no new reactors built here.

***

Georgia’s double-reactor Vogtle project has been sold on the basis of federal loan guarantees. Last year President Obama promised the Southern Company, parent to Georgia Power, $8.33 billion in financing from an $18.5 billion fund that had been established at the Department of Energy by George W. Bush. Until last week most industry observers had assumed the guarantees were a done deal. But the Nuclear Energy Institute, an industry trade group, has publicly complained that the Office of Management and Budget may be requiring terms that are unacceptable to the builders.

***

The climate for loan guarantees has changed since this one was promised. The $535 million collapse of Solyndra prompted a rash of angry Congressional hearings and cast a long shadow over the whole range of loan guarantees for energy projects. Though the Vogtle deal comes from a separate fund, skepticism over stalled negotiations is rising.

So is resistance among Georgia ratepayers. To fund the new Vogtle reactors, Southern is forcing “construction work in progress” rate hikes that require consumers to pay for the new nukes as they’re being built. Southern is free of liability, even if the reactors are not completed. Thus it behooves the company to build them essentially forever, collecting payment whether they open or not.

All that would collapse should the loan guarantee package fail.

Bad for the Environment

Alternet points out:

Mark Cooper, senior fellow for economic analysis at the Vermont Law School … found that the states that invested heavily in nuclear power had worse track records on efficiency and developing renewables than those that did not have large nuclear programs. In other words, investing in nuclear technology crowded out developing clean energy.

Many experts also say that the “energy return on investment” from nuclear power is lower than many other forms of energy. In other words, non-nuclear energy sources produce more energy for a given input.

And decentralizing energy production and storage is the real solution for the environment … not building more centralized nuclear plants.

Read More Here

…..


European Geosciences Union (EGU)

23 January 2014

Mineral weathering by fungi
Mineral weathering by fungi (Credit: Joe Quirk)

UK researchers have identified a biological mechanism that could explain how the Earth’s atmospheric carbon dioxide and climate were stabilised over the past 24 million years. When CO2 levels became too low for plants to grow properly, forests appear to have kept the climate in check by slowing down the removal of carbon dioxide from the atmosphere. The results are now published in Biogeosciences, an open access journal of the European Geosciences Union (EGU).

“As CO2 concentrations in the atmosphere fall, the Earth loses its greenhouse effect, which can lead to glacial conditions,” explains lead author Joe Quirk from the University of Sheffield. “Over the last 24 million years, the geologic conditions were such that atmospheric CO2 could have fallen to very low levels – but it did not drop below a minimum concentration of about 180 to 200 parts per million. Why?”

Before fossil fuels, natural processes kept atmospheric carbon dioxide in check. Volcanic eruptions, for example, release CO2, while weathering on the continents removes it from the atmosphere over millions of years. Weathering is the breakdown of minerals within rocks and soils, many of which include silicates. Silicate minerals weather in contact with carbonic acid (rain and atmospheric CO2) in a process that removes carbon dioxide from the atmosphere. Further, the products of these reactions are transported to the oceans in rivers where they ultimately form carbonate rocks like limestone that lock away carbon on the seafloor for millions of years, preventing it from forming carbon dioxide in the atmosphere.
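The article describes silicate weathering qualitatively. As a rough illustration, the textbook chemistry for an idealized calcium silicate (wollastonite, CaSiO3) lets us estimate how much CO2 ends up locked away per tonne of rock; the choice of mineral is a simplification, not something the article specifies:

```python
# Net CO2 drawdown from idealized silicate weathering. Weathering consumes
# two CO2 per formula unit (CaSiO3 + 2CO2 + H2O -> Ca2+ + 2HCO3- + SiO2);
# carbonate precipitation in the ocean returns one (Ca2+ + 2HCO3- -> CaCO3
# + CO2 + H2O), so the net drawdown is one mole of CO2 per mole of silicate.
M_CaSiO3 = 40.08 + 28.09 + 3 * 16.00   # molar mass of wollastonite, ~116.17 g/mol
M_CO2 = 12.01 + 2 * 16.00              # molar mass of CO2, ~44.01 g/mol

def net_co2_locked_per_tonne_rock():
    moles_rock = 1_000_000 / M_CaSiO3   # mol of silicate in one tonne of rock
    return moles_rock * M_CO2 / 1000    # kg of CO2 locked away as CaCO3

print(round(net_co2_locked_per_tonne_rock()))  # ~379 kg CO2 per tonne of rock
```

So on this idealized chemistry, each tonne of calcium silicate weathered ultimately buries a few hundred kilograms of CO2 as seafloor limestone.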

Forests increase weathering rates because trees, and the fungi associated with their roots, break down rocks and minerals in the soil to get nutrients for growth. The Sheffield team found that when the CO2 concentration was low – at about 200 parts per million (ppm) – trees and fungi were far less effective at breaking down silicate minerals, which could have reduced the rate of CO2 removal from the atmosphere.

“We recreated past environmental conditions by growing trees at low, present-day and high levels of CO2 in controlled-environment growth chambers,” says Quirk. “We used high-resolution digital imaging techniques to map the surfaces of mineral grains and assess how they were broken down and weathered by the fungi associated with the roots of the trees.”

As reported in Biogeosciences, the researchers found that low atmospheric CO2 acts as a ‘carbon starvation’ brake. When the concentration of carbon dioxide falls from 1500 ppm to 200 ppm, weathering rates drop by a third, diminishing the capacity of forests to remove CO2 from the atmosphere.

The weathering rates by trees and fungi drop because low CO2 reduces plants’ ability to perform photosynthesis, meaning less carbon-energy is supplied to the roots and their fungi. This, in turn, means there is less nutrient uptake from minerals in the soil, which slows down weathering rates over millions of years.
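The negative feedback described above (less CO2, less photosynthesis, slower weathering, slower CO2 removal) can be sketched as a toy model. All parameters below are illustrative assumptions; only the anchor that weathering efficiency falls by about a third as CO2 drops from 1500 to 200 ppm comes from the study:

```python
# Toy negative-feedback sketch of the 'carbon starvation' brake.
def weathering_efficiency(co2_ppm):
    # Linear between the study's two anchor points, floored below 200 ppm.
    lo, hi = 200.0, 1500.0
    frac = max(0.0, min(1.0, (co2_ppm - lo) / (hi - lo)))
    return 2/3 + frac / 3          # 1.0 at 1500 ppm, ~0.67 at 200 ppm

def simulate(co2_ppm, source=1.0, sink_coeff=0.0049, steps=2000):
    # Constant (volcanic) CO2 source vs. a weathering sink proportional to
    # both the concentration and the efficiency above; step to equilibrium.
    for _ in range(steps):
        sink = sink_coeff * co2_ppm * weathering_efficiency(co2_ppm)
        co2_ppm += source - sink
    return co2_ppm

# Starting from a high-CO2 world or a low-CO2 one, the brake pulls the
# concentration toward the same floor.
print(round(simulate(1500.0)), round(simulate(250.0)))
```

The point of the sketch is only the shape of the feedback: because the sink weakens at low CO2 while the source does not, concentrations stabilize above a floor instead of falling indefinitely.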

“The last 24 million years saw significant mountain building in the Andes and Himalayas, which increased the amount of silicate rocks and minerals on the land that could be weathered over time. This increased weathering of silicate rocks in certain parts of the world is likely to have caused global CO2 levels to fall,” Quirk explains. But the concentration of CO2 never fell below 180-200 ppm because trees and fungi broke down minerals at low rates at those concentrations of atmospheric carbon dioxide.

“It is important that we understand the processes that affect and regulate climates of the past and our study makes an important step forward in understanding how Earth’s complex plant life has regulated and modified the climate we know on Earth today,” concludes Quirk.

Press Release Page Link

More information

This research is presented in the paper ‘Weathering by tree root-associating fungi diminishes under simulated Cenozoic atmospheric CO2 decline’ published in the EGU open access journal Biogeosciences on 23 January 2014.

Full citation: Quirk, J., Leake, J. R., Banwart, S. A., Taylor, L. L., and Beerling, D. J.: Weathering by tree-root-associating fungi diminishes under simulated Cenozoic atmospheric CO2 decline, Biogeosciences, 11, 321-331, doi:10.5194/bg-11-321-2014, 2014.

The team is composed of J. Quirk, J. R. Leake, S. A. Banwart, L. L. Taylor and D. J. Beerling, from the University of Sheffield, UK.

Dr. Joe Quirk
Post Doctoral Research Associate
Department of Animal and Plant Sciences
University of Sheffield, UK
Tel: +44 (0)114 22 20093
Email: j.quirk@sheffield.ac.uk

Prof. David Beerling (Principal Investigator)
Department of Animal and Plant Sciences
University of Sheffield, UK
Tel: +44 (0)114 22 24359
Email: d.j.beerling@sheffield.ac.uk

Bárbara Ferreira
EGU Media and Communications Manager
Munich, Germany
Tel: +49-89-2180-6703
Email: media@egu.eu


Earth Watch Report  –  Volcanic  Activity

Dieng volcano (Central Java, Indonesia): dangerous gas emissions, alert level raised to orange

Thursday, Mar 28, 2013, 18:10 | By: T
Volcano Discovery

VSI raised the alert level to Siaga (3 out of 4) because significant changes were observed at the crater lake. The most spectacular change was the lake water turning dark brown on 24 March.
In addition, a significant increase in CO2 concentration within 500 m of the Timbang crater was measured: from 0.01% (by volume) in early March to 2.5% between 11 and 15 March. Emissions of the magmatic gas H2S also increased. The elevated gas concentrations are becoming a significant hazard (illustrated by a cat found suffocated by CO2).
It is therefore strongly advised not to approach the Timbang crater within one kilometer, to avoid the risk of suffocation due to the high CO2 concentrations (note that CO2 is odorless and lethal when inhaled in large quantities).

CLIMATE SCIENCE

As predators decline, carbon emissions rise
by Staff Writers
Vancouver, Canada (SPX) Feb 22, 2013


Trisha Atwood at a stream site in UBC Malcolm Knapp Research Forest. Photo by: Amanda Klemmer.

“Predators are disappearing from our ecosystems at alarming rates because of hunting and fishing pressure and because of human induced changes to their habitats,” says Trisha Atwood, a PhD candidate in the Department of Forest and Conservation Sciences in the Faculty of Forestry at UBC.

For their study, published in the journal Nature Geoscience, Atwood and her colleagues wanted to measure the role predators play in regulating carbon emissions to better understand the consequences of losing these animals.

Read Full Article Here

CLIMATE SCIENCE

Geo-engineering against climate change

by Staff Writers
Washington DC (SPX)



One such technology involves dispersing large quantities of iron salts in the oceans to fertilize otherwise barren parts of the sea and trigger the growth of algal blooms and other photosynthesizing marine life.

Photosynthesis requires carbon dioxide as its feedstock and when the algae die they will sink to the bottom of the sea taking the locked in carbon with them.

Unfortunately, present plans for seeding the oceans with iron fail to take into account several factors that could scupper those plans, according to Daniel Harrison of the University of Sydney Institute of Marine Science, NSW.

Writing in the International Journal of Global Warming, Harrison has calculated the impact of iron-seeding schemes in terms of the efficiency of spreading the iron, the effect it is most likely to have on algal growth, and the tonnage of carbon dioxide per square kilometer of ocean surface that would actually be absorbed, compared with the hypothetical figures suggested by advocates of the approach.

“If society wishes to limit the contribution of anthropogenic carbon dioxide to global warming, then the need to find economical methods of carbon dioxide sequestration is now urgent.” Harrison’s new calculations take into account not only the carbon dioxide that will certainly be sequestered permanently in the deep ocean, but also subtract the many losses due to ventilation, nutrient stealing, greenhouse gas production and the carbon dioxide emitted by burning fossil fuels to produce the iron salts and to power their transportation and distribution at sea.

His calculations suggest that on average, a single ocean iron fertilization will result in a net sequestration of just 10 tonnes of carbon per square kilometer sequestered for a century or more at a cost of almost US$500 per tonne of carbon dioxide.
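Taking Harrison’s two headline numbers at face value, a quick conversion (using the standard 44.01/12.01 mass ratio of CO2 to carbon, which is general chemistry rather than a figure from the article) gives the cost of fertilizing each square kilometer of ocean:

```python
# Back-of-envelope scale of the iron fertilization figures quoted above.
C_PER_KM2 = 10.0           # tonnes of carbon sequestered per km^2 (article)
USD_PER_T_CO2 = 500.0      # cost per tonne of CO2 (article)
CO2_PER_C = 44.01 / 12.01  # ~3.66 tonnes of CO2 per tonne of carbon

co2_per_km2 = C_PER_KM2 * CO2_PER_C          # ~36.6 t CO2 per km^2
cost_per_km2 = co2_per_km2 * USD_PER_T_CO2   # ~US$18,300 per km^2

print(f"{co2_per_km2:.1f} t CO2/km^2, ${cost_per_km2:,.0f}/km^2")
```

At roughly US$18,000 per square kilometer for a few dozen tonnes of CO2, the per-area economics make clear why Harrison doubts the approach can scale.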

Read Full Article Here

Crossroads News : Changes In The World Around Us And Our Place In It

Pollution

FARM NEWS

More Potent than CO2, N2O Levels in California May be Nearly Three Times Higher Than Previously Thought

by Staff Writers
Berkeley CA (SPX)


This map shows the amount of nitrous oxide emissions in California (in nanomoles per square meter per second). The “x” represents the location of the measurement tower in Walnut Grove, CA.

Using a new method for estimating greenhouse gases that combines atmospheric measurements with model predictions, Lawrence Berkeley National Laboratory (Berkeley Lab) researchers have found that the level of nitrous oxide, a potent greenhouse gas, in California may be 2.5 to 3 times greater than the current inventory.

At that level, total N2O emissions, which are believed to come primarily from nitrogen fertilizers used in agricultural production, would account for about 8 percent of California’s total greenhouse gas emissions.

The findings were recently published in a paper titled “Seasonal variations in N2O emissions from central California” in Geophysical Research Letters. Earlier this year, using the same methodology, the researchers found that levels of methane, another potent greenhouse gas, in California may be up to 1.8 times greater than previous estimates.

“If our results are accurate, then it suggests that N2O makes up not 3 percent of California’s total effective greenhouse gases but closer to 10 percent,” said Marc Fischer, lead researcher on both studies.

“And taken together with our previous estimates of methane emissions, that suggests those two gases may make up 20 to 25 percent of California’s total emissions. That’s starting to become roughly comparable to emissions from fossil fuel CO2.”

Accurate estimates of California’s greenhouse gas emissions are important as the state works to reduce emissions to 1990 levels by 2020, as mandated by a law known as AB 32. The vast majority of the reduction efforts have been focused on CO2.

Nitrous oxide, better known as laughing gas, is an especially potent greenhouse gas because it traps far more infrared radiation than either carbon dioxide or methane. “It’s present in the atmosphere at tiny concentrations, one-thousandth that of CO2, but it is very potent,” Fischer said. “It has a global warming potential of approximately 300, meaning it is 300 times more active than CO2 per unit mass. And it’s 10 to 15 times more potent than methane.”
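Fischer’s potency comparison can be made concrete with global warming potentials (GWPs): multiplying a gas’s mass by its GWP gives the mass of CO2 with the same century-scale warming effect. The N2O value of 300 is quoted in the article; the methane value of 25 below is inferred from the “10 to 15 times” ratio rather than stated directly:

```python
# CO2-equivalent comparison using the global warming potentials quoted above
# (N2O ~300 from the article; CH4 ~25 inferred from the 10-15x ratio).
GWP = {"CO2": 1, "CH4": 25, "N2O": 300}

def co2_equivalent(tonnes, gas):
    # Mass of CO2 with the same warming effect as `tonnes` of `gas`.
    return tonnes * GWP[gas]

# One tonne of N2O warms like 300 tonnes of CO2, and like 12 tonnes of CH4.
print(co2_equivalent(1, "N2O"))                              # 300
print(co2_equivalent(1, "N2O") / co2_equivalent(1, "CH4"))   # 12.0
```

This is why a gas present at one-thousandth the concentration of CO2 can still account for a meaningful slice of the state’s effective emissions.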

Worldwide levels of N2O have been rising rapidly for decades, and the major culprit was recently confirmed to be the heavy use of nitrogen fertilizers to grow the world’s food. Other less significant sources of N2O emissions include wetlands, animal and industrial waste and automobiles.

The standard method for estimating emissions levels has been to do what is called a “bottom-up inventory.” This process involves listing all the activities that emit N2O, assigning an emission factor for each activity, then tallying up the emissions. However, this method can result in large uncertainties because of the way N2O is produced.

“The biogeochemical processes that produce N2O are sensitive to environmental conditions and very small changes in things like temperature, moisture, the type of soil and when the fertilizer is applied,” Fischer said.

“All those factors can result in big differences in the amount of N2O that’s produced. If you try to use a single number for a given patch of land, you’re almost certainly going to get a variable result.”
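The bottom-up method Fischer describes, list the emitting activities, assign each an emission factor, and tally, is simple to sketch. The activities and factors below are invented for illustration only; they are not California’s actual inventory, and the single-number factors are exactly the simplification Fischer is criticizing:

```python
# Minimal sketch of a bottom-up emissions inventory: activity level times an
# emission factor, summed over activities. All numbers here are made up.
activities = {
    # activity: (activity level, N2O emission factor per unit of activity)
    "fertilizer applied (kt N)": (100.0, 0.01),
    "manure managed (kt N)":     (40.0, 0.005),
    "vehicles (bn km driven)":   (300.0, 0.0002),
}

def bottom_up_total(acts):
    # Tally emissions across all listed activities.
    return sum(level * factor for level, factor in acts.values())

print(round(bottom_up_total(activities), 3))
```

Because each activity gets one fixed factor, any real-world variability in soil, moisture or timing is invisible to the tally, which is why top-down atmospheric measurements like Fischer’s can disagree with it so sharply.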

While there are models that try to capture these factors, “it is still likely the numbers are going to have relatively large uncertainties, especially compared to things like burning fossil fuels to make CO2, where pretty much every mole of carbon becomes CO2,” Fischer said.

The method that Fischer and his colleagues describe in their paper compares measurements taken from a 2,000-foot tower in Walnut Grove, California, with model predictions of expected N2O levels based on the bottom-up inventory to arrive at the new estimate.

“This is the first study of its kind to look at a full annual cycle of emissions (actually, it’s two years) from a large region of California that includes the sources that we believe are most important,” Fischer said. “In general, we found that the measured signals were much bigger than the predicted signals.”

Research reveals link between late arrival of next ice age and emissions

Mankind’s emissions of fossil carbon and the resulting increase in temperature could prove to be our salvation from the next ice age.

According to new research from the University of Gothenburg, Sweden, the current increase in the extent of peatland is having the opposite effect.

“We are probably entering a new ice age right now. However, we’re not noticing it due to the effects of carbon dioxide,” says researcher Professor Lars Franzén.

Looking back over the past three million years, the earth has experienced at least 30 periods of ice age, known as ice age pulses. The periods in between are called interglacials.

The researchers believe that the Little Ice Age of the 16th to 18th centuries may have been halted as a result of human activity. Increased felling of woodlands and growing areas of agricultural land, combined with the early stages of industrialisation, resulted in increased emissions of carbon dioxide which probably slowed down, or even reversed, the cooling trend.

“It is certainly possible that mankind’s various activities contributed towards extending our ice age interval by keeping carbon dioxide levels high enough,” explains Lars Franzén, Professor of Physical Geography at the University of Gothenburg.

“Without the human impact, the inevitable progression towards an ice age would have continued. The spread of peatlands is an important factor.”

Peatlands act as carbon sinks, meaning that they absorb carbon dioxide from the atmosphere. They are a dynamic landscape element and currently cover around four percent of the earth’s land area. Most peatlands are found in temperate areas north and south of the 45th parallel.

Around 16 percent of Sweden is covered by peatland. Peatlands grow in height and spread across their surroundings by waterlogging woodlands. They are also one of the biggest terrestrial sinks of atmospheric carbon dioxide. Each year, around 20 grams of carbon are absorbed by every square metre of peatland.

“By using the National Land Survey of Sweden’s altitude database, we have calculated how much of Sweden could be covered by peatlands during an interglacial. We have taken a maximum terrain incline of three degrees as our upper limit, and have also excluded all lakes and areas with substrata that are unsuitable for peatland formation.”

The researchers found that around half of Sweden’s surface could be covered by peat. In such a case, the carbon dioxide sink would increase by a factor of between six and ten compared with the current situation.
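The article’s per-area uptake figure makes the current Swedish sink easy to estimate. Sweden’s land area of roughly 450,000 km² is an assumption of this sketch, not a number from the article:

```python
# Current Swedish peatland carbon sink from the figures above. Sweden's land
# area (~450,000 km^2) is an assumption here, not from the article.
SWEDEN_KM2 = 450_000
PEAT_FRACTION = 0.16        # article: around 16 percent of Sweden is peatland
UPTAKE_G_PER_M2 = 20.0      # article: ~20 g of carbon per m^2 per year

peat_m2 = SWEDEN_KM2 * 1e6 * PEAT_FRACTION            # km^2 -> m^2
tonnes_c_per_year = peat_m2 * UPTAKE_G_PER_M2 / 1e6   # grams -> tonnes

print(f"{tonnes_c_per_year / 1e6:.2f} Mt C per year")  # ~1.44 Mt C/yr
```

Applying the researchers’ six-to-tenfold factor to this baseline would put a hypothetical half-peat-covered Sweden at roughly 9 to 14 Mt of carbon per year.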

“If we accept that rising levels of carbon dioxide in the atmosphere lead to an increase in global temperature, the logical conclusion must be that reduced levels lead to a drop in temperature.”

The relationship between carbon dioxide and temperature is not linear. Instead, lower levels result in a greater degree of cooling than the degree of warming achieved by a corresponding increase.
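This asymmetry is consistent with the standard simplified expression for CO2 radiative forcing, dF = 5.35 ln(C/C0) W/m², a widely used approximation that the Gothenburg study does not itself quote: equal steps in ppm are larger relative changes on the way down than on the way up, so removing CO2 cools more than adding the same amount warms:

```python
import math

# Logarithmic CO2 radiative forcing (standard simplified expression, an
# assumption of this sketch rather than a formula from the article).
def forcing(c_ppm, c0_ppm=280.0):
    # Change in radiative forcing (W/m^2) relative to a baseline c0_ppm.
    return 5.35 * math.log(c_ppm / c0_ppm)

up = forcing(280 + 100)    # +100 ppm: roughly +1.63 W/m^2
down = forcing(280 - 100)  # -100 ppm: roughly -2.36 W/m^2
print(round(up, 2), round(down, 2))
assert abs(down) > up      # the cooling exceeds an equal-sized increase
```

A 100 ppm drop from a 280 ppm baseline produces nearly half again as much forcing change as a 100 ppm rise, which matches the researchers’ claim that lowering CO2 cools more than a corresponding increase warms.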

“There have been no emissions of fossil carbon during earlier interglacials. Carbon sequestration in peatland may therefore be one of the main reasons why ice age conditions have occurred time after time.”

Using calculations for Swedish conditions, the researchers are also producing a rough estimate of the global carbon sink effect if all temperate peatlands were to grow in the same way.

“Our calculations show that the peatlands could contribute towards global cooling equivalent to five watts per square metre. There is a great deal of evidence to suggest that we are near the end of the current interglacial.”

Carbon Eaters on the Black Sea

by Staff Writers
Greenbelt MD (SPX)



This brilliant cyan pattern scattered across the surface of the Black Sea is a bloom of microscopic phytoplankton. The multitude of single-celled algae in this image are most likely coccolithophores, one of Earth’s champions of carbon pumping.

Coccolithophores constantly remove carbon dioxide from the atmosphere and slowly send it down to the seafloor, an action that helps to stabilize the Earth’s climate.

This image of the swirling blue bloom was captured on July 15, 2012, by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Aqua satellite. Note that the image is rotated so that north is to the right.

Ocean scientist Norman Kuring of NASA’s Goddard Space Flight Center suggested the bloom was likely Emiliania huxleyi, though it is impossible to know the species for sure without direct sampling of the water.

Coccolithophores use carbon, calcium, and oxygen to produce tiny plates of calcium carbonate (coccoliths). Often called “stones” by researchers, coccoliths resemble hubcaps.

During their lifespan, coccolithophores remove carbon from the air, “fix” or integrate it into what is effectively limestone, and take it with them to the seafloor when they die and sink or when they are consumed (and eventually excreted) by zooplankton and fish.

These micro-stones are thought to speed up the ocean’s biological pump, according to William Balch, a senior research scientist at the Bigelow Laboratory for Ocean Sciences and a member of the Suomi NPP science team.

Without this dense calcium carbonate ballast for sinking particles to the depths, less carbon dioxide would be drawn down into the ocean. The net result would be higher atmospheric concentrations of carbon dioxide.

But as Balch points out, the ever-increasing amount of carbon dioxide in our air could upset this biological pump. Excess carbon dioxide is making the ocean more acidic, which may change the conditions that promote coccolithophore growth.

“Ocean acidification is highly relevant to coccolithophores,” said Balch.

“We are trying to understand if it would slow the ocean’s biological pump by inhibiting coccolithophore calcification. If they can’t calcify, they can’t make their limestone plates that pull all the sinking particulate carbon to the seafloor.”

Related Links
Earth Observatory
Carbon Worlds – where graphite, diamond, amorphous, fullerenes meet

Environmental

Rising CO2 in atmosphere also speeds carbon loss from forest soils

by Staff Writers
Bloomington IN (SPX)

Wood Pile


The research was conducted at the Duke Forest Free Air Carbon Dioxide Enrichment site in North Carolina, where mature loblolly pine trees were exposed to increased levels of carbon dioxide for 14 years, making it one of the longest-running carbon dioxide enrichment experiments in the world. Image courtesy Will Owens.

Elevated levels of atmospheric carbon dioxide accelerate carbon cycling and soil carbon loss in forests, new research led by an Indiana University biologist has found. The new evidence supports an emerging view that although forests remove a substantial amount of carbon dioxide from the atmosphere, much of the carbon is being stored in living woody biomass rather than as dead organic matter in soils.

Richard P. Phillips, lead author on the paper and an assistant professor of biology in the IU College of Arts and Sciences, said that after nearly two decades of research on forest ecosystem responses to global change, some of the uncertainty has been lifted about how forests are storing carbon in the wake of rising carbon dioxide levels.

“It’s been suggested that as trees take up more carbon dioxide from the atmosphere, a greater amount of carbon will go to roots and fungi to acquire nutrients, but our results show that little of this carbon accumulates in soil because the decomposition of root and fungal detritus is also increased,” he said.

Carbon stored in soils, as opposed to in the wood of trees, is desirable from a management perspective in that soils are more stable over time, so carbon can be locked away for hundreds to thousands of years and not contribute to atmospheric carbon dioxide increases.

The research was conducted at the Duke Forest Free Air Carbon Dioxide Enrichment site in North Carolina. At this site, mature loblolly pine trees were exposed to increased levels of carbon dioxide for 14 years, making it one of the longest-running carbon dioxide enrichment experiments in the world.

Researchers were able to calculate the age of the carbon cycling through the soil by growing roots and fungi into mesh bags that contained uniquely labeled soils. The soils were then analyzed for their organic composition.

The authors also report that nitrogen cycled faster in this forest as the demand for nutrients by trees and microbes became greater under elevated CO2.

“The growth of trees is limited by the availability of nitrogen at this site, so it makes sense that trees are using the ‘extra’ carbon taken up under elevated CO2 to prime microbes to release nitrogen bound up in organic matter,” Phillips said.

“What is surprising is that the trees seem to be getting much of their nitrogen by decomposing root and fungal detritus that is less than a year old.”

The two-fold effects of microbial priming, where microbes are stimulated to decompose old soil organic matter via an increase in new carbon and other energy sources, and the faster turnover of recently fixed root and fungal carbon, are enough to explain the rapid carbon and nitrogen cycling that is occurring at the Duke Forest FACE site.

“We call it the RAMP hypothesis – Rhizo-Accelerated Mineralization and Priming – and it states that root-induced changes in the rates of microbial processing of carbon and nitrogen are key mediators of long-term ecosystem responses to global change,” Phillips added.

“Most ecosystem models have limited representations of roots, and none of them include processes such as priming. Our results demonstrate that interactions between roots and soil microbes play an underappreciated role in determining how much carbon is stored and how fast nitrogen is cycled. Including these processes in models should lead to improved projections of long-term carbon storage in forests in response to global environmental change,” he said.

“Roots and fungi accelerate carbon and nitrogen cycling in forests exposed to elevated CO2,” by Phillips; Ina C. Meier, a post-doctoral researcher at IU and the University of Göttingen (Germany); Emily S. Bernhardt of Duke University; A. Stuart Grandy and Kyle Wickings of the University of New Hampshire; and Adrien C. Finzi of Boston University, was published July 9 in the online early edition of Ecology Letters. Free access to the research article will be available until October.

Related Links
Indiana University
Forestry News – Global and Local News, Science and Application

Grassroots approach to conservation developed

by Staff Writers
Urbana IL (SPX)

Flora And Fauna


Prescribed fire is applied on reserves, but grazing is typically excluded and herbaceous vegetation is often dominated by grasses. Credit: Ryan Harr. Ecological Society of America

A new strategy to manage invasive species and achieve broader conservation goals is being tested in the Grand River Grasslands, an area within the North American tallgrass prairie ecoregion. A University of Illinois researcher along with his colleagues at Iowa State and Oklahoma State Universities enlisted private landowners in a grassroots community-building effort to establish a more diverse landscape for native wildlife.

The Grand River Grasslands has three main problems that pose challenges to conservation efforts: invasive juniper trees, tall fescue, and heavy grazing of cattle. U of I ecologist Jim Miller and his team developed a new model for conservation that begins by raising landowners’ awareness of these problems and providing strategies, such as moderate livestock grazing and regularly scheduled controlled burns. Miller and his team identified landowners who are interested in trying something different – who will, in turn, transfer their newfound knowledge and understanding to larger groups of people in the region.

“We conducted a survey and learned that people recognize burning as a legitimate management tool but don’t have experience with it,” Miller said. “Most of the landowners have never participated in a controlled burn, so we’ve essentially lost a fire culture in much of that part of the country.”

Miller’s team invited landowners to hands-on educational field days at nearby nature reserves to show them how grazing and burning techniques work. They got experience with drip torches and learned how to work with the wind and moisture levels.

“We followed that up with a burn at one of the landowner’s savannahs that he was trying to restore,” Miller said. “It went really well and was a key step for us in our process because now we’re getting landowners to try these new strategies on their own properties.”

Miller said the next step in the model is to encourage the landowners to champion these new practices to the larger community. “They go down to the coffee shop and meet their neighbors and friends and tell them about the success they’re having with the new practices to control the juniper trees and tall fescue and how well their cattle are doing on these pastures. The neighbors start to pick up on this, and then we have the whole process repeat itself with a larger group of landowners.

“If we’re successful with this, we’ll start to see changes, not just on individual properties here and there for key landowners but over the whole landscape or the whole region,” he said.

According to Miller, the fastest-growing group of landowners in the area is non-traditional. They don’t live in the region or come from a farming background, but they instead buy land to hunt deer, turkey, quail, or maybe just to birdwatch. He said that on land with intensive cattle grazing, the cedars can be kept at bay.

“Without burning or grazing, the cedars will take over,” Miller said. “Trees seem like a good thing to wildlife enthusiasts, but they don’t see that their land will go from being an open grassland to a closed-canopy cedar stand in 20 to 25 years. Under those conditions, there are no deer, no turkey, no quail – it’s a biological desert, and it’s too late to do much with it. We think we can make the most inroads with the non-traditional owners.”

Juniper, a fire-intolerant woody plant, has become invasive largely because of fire suppression. The particular species at issue is also called eastern redcedar.

Although the name redcedar may sound appealing for patio furniture, decking, or biofuels, Miller said there is no market for this type of tree. The trees produce a prodigious seed rain that facilitates rapid colonization of an area when left unchecked. Using a survey from aerial photography dating back to 1983, Miller estimated a 3 percent increase in cedar coverage per year.
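For context, a 3 percent annual increase compounds quickly: at that rate, coverage doubles in roughly 23 years, which lines up with the 20-to-25-year closed-canopy timeline Miller describes. A quick back-of-the-envelope check:

```python
import math

def doubling_time(annual_rate):
    """Years for coverage to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_rate)

def coverage_after(initial, annual_rate, years):
    """Coverage after compounding growth for a given number of years."""
    return initial * (1 + annual_rate) ** years

# At 3% per year, coverage doubles in about 23.4 years.
print(round(doubling_time(0.03), 1))
```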

Tall fescue, an exotic invasive plant that forms a monoculture, greens up early in the spring, making it difficult to burn.

“Heavy stocking of cattle is an issue,” Miller said. “Cattle quickly reduce available forage to the point that some ranchers feed hay by July and August. That’s not quality habitat for grassland birds, which have seen the steepest declines in North America since we’ve been monitoring bird populations.”

“There are at least two things necessary for this model to work: ecological potential in the landscape and some level of social readiness,” Miller said. “In the Grand River Grasslands, there is ecological potential, but landowners don’t all recognize that eastern redcedar trees are invasive. We’re working on that.”

Miller says that with conservation, you need a plurality, a variety of approaches, because one size doesn’t fit all.

“We’re providing a model or a road map for a different way of doing things in conservation,” Miller said. “We need to go beyond the traditional jewels-in-the-crown or fortress conservation models, characterized by national parks and other set-asides. Paying people to take their land out of production and creating state and national parks or reserves just aren’t enough. This model may not work everywhere, but in some landscapes we think this can work, and we’re trying to provide an initial example to demonstrate how it could work.

“It’s meant to be a dialogue among our team, landowners, and other resource management professionals, such as biologists who work for the Department of Natural Resources, not us telling them what they need to do,” he said.

“Nature reserves as catalysts for landscape change” was published in Frontiers in Ecology and the Environment, a journal of the Ecological Society of America. Lois Wright Morton, David Engle, Diane Debinski, and Ryan Harr contributed. Photos were provided by Ryan Harr, Devin McGranahan, and Dave Engle.

Related Links
University of Illinois College of Agricultural, Consumer and Environmental Sciences
Darwin Today At TerraDaily.com

 

 

Eddies, not sunlight, spur annual bloom of tiny plants in North Atlantic

by Staff Writers
Seattle WA (SPX)


File image.

On a recent expedition to the inhospitable North Atlantic Ocean, scientists at the University of Washington and collaborators studying the annual growth of tiny plants were stumped to discover that the plankton had started growing before the sun had a chance to offer the light they need for their growth spurt. For decades, scientists have known that springtime brings the longer days and calmer seas that force phytoplankton near the surface, where they get the sunlight they need to flourish.

But in research results published this week in the journal Science, scientists report evidence of another trigger.

Eric D’Asaro and Craig Lee, oceanographers in the UW’s Applied Physics Laboratory and School of Oceanography, are among the researchers who found that whirlpools, or eddies, that swirl across the North Atlantic sustain phytoplankton in the ocean’s shallower waters, where the plankton can get plenty of sunlight to fuel their growth even before the longer days of spring start.

The eddies form when heavier, colder water from the north slips under the lighter, warmer water from the south. The researchers found that the eddies cause the bloom to happen around three weeks earlier than it would if it were spurred by spring’s longer days alone.

“That timing makes a significant difference if you think about the animals that eat the phytoplankton,” said D’Asaro, the corresponding author on the paper.

Many small sea animals spend the winter dozing in the deep ocean, emerging in the spring and summer to feed on the phytoplankton.

“If they get the timing wrong, they’ll starve,” Lee said. Since fish eat the animals, a reduction in their number could harm the fish population.

Scientists believe that climate change may affect oceanic circulation patterns such as the one that causes the eddies. They’ve found some evidence that warm waters from the subtropics are penetrating further to the north, Lee said.

“If the climate alters the circulation patterns, it might alter the timing of the bloom, which could impact which animals grow and which die out,” he said.

Learning about the circulation of the ocean also helps scientists forecast changes in the ocean, a bit like meteorologists are able to forecast the weather, said D’Asaro.

The scientists didn’t set out to look at the kind of large-scale mixing that they found. In April 2008, Lee and co-author Mary Jane Perry of the University of Maine arrived in a storm-lashed North Atlantic aboard an Icelandic research vessel.

They launched robots (specially designed by Lee and D’Asaro) in the rough seas. A float that hovered below the water’s surface followed the motion of the ocean, moving around “like a giant phytoplankton,” said D’Asaro.

Lurking alongside the float were 6-foot-long, teardrop-shaped Seagliders, also designed at the UW, that dove to depths of up to 1,000 meters, or 3,280 feet. After each dive, working in areas 20 to 50 kilometers, or 12 to 31 miles, around the float, the gliders rose to the surface, pointed their antennas skyward and transmitted their stored data back to shore via satellite.

The float and gliders measured the temperature, salinity and speed of the water, and gathered information about the chemistry and biology of the bloom itself. Soon after measurements from the float and gliders started coming in, the scientists saw that the bloom had started, even though conditions still looked winter-like.

“It was apparent that some new mechanism, other than surface warming, was behind the bloom’s initiation,” said D’Asaro.

To find out what, the researchers needed sophisticated computer modeling.

Enter first author Amala Mahadevan, with Woods Hole Oceanographic Institution, who used 3-D computer models to look at information collected at sea by Perry, D’Asaro and Lee.

She generated eddies in a model using the north-to-south oceanic temperature variation. Without eddies, the modeled bloom happened several weeks later and lacked the spatial and temporal structure actually observed in the North Atlantic.

In the future, the scientists hope to put the North Atlantic bloom into a broader context. They believe much can be learned by following the phytoplankton’s evolution across an entire year, especially with gliders and floats outfitted with new sensors. The sensors would look at the tiny animals that graze on the phytoplankton.

“What we’re learning about eddies is that they’re a critical part of life in the ocean,” said Perry. “They shape ocean ecosystems in countless ways.”

Related Links
University of Washington
Water News – Science, Technology and Politics

**********************************************************************************************************

Cyber Space

ONR Sensor and Software Suite Hunts Down More Than 600 Suspect Boats

by Staff Writers
Arlington VA (SPX)


File image.

A new sensor and software suite sponsored by the Office of Naval Research (ONR) recently returned from West Africa after helping partner nations track and identify target vessels of interest as part of an international maritime security operation, officials announced July 10. Researchers deployed the system, called “Rough Rhino,” aboard U.S. aircraft, ships and partner nation ships operating in waters off the coast of Senegal and Cape Verde.

Sailors and Coast Guardsmen could access and control the sensors both afloat and ashore, as well as share information in a real-time common operating picture.

“It provides a comprehensive maritime domain awareness picture for dark, gray and light targets: vessels that range from those with no electronic emissions to those that cooperatively report their names and positions,” said Dr. Michael Pollock, ONR’s division director for electronics, sensors and networks.

Rough Rhino was responsible for finding targets during the most recent two-week African Maritime Law Enforcement Partnership (AMLEP) operation. The primary missions are aimed at assisting and building the host nation’s capability to interdict and counter narcotics, human trafficking and illegal fishing.

On any given day, the distributed intelligence, surveillance and reconnaissance (ISR) system tracked more than 600 targets and identified vessels of interest, leading to 24 boardings by Gambian, Senegalese and U.S. maritime security teams. In future operations, Gambia and Senegal will continue to work with African partner nations to build and maintain maritime security and safety.

“Rough Rhino provided them one of the clearest maritime operational pictures that they’ve ever seen,” said Pollock. “They could detect, locate, quantify and confirm detailed activities of all vessels in their respective countries’ exclusive economic zones.”

AMLEP provided an opportunity to test the prototype Rough Rhino system in an operationally and tactically relevant environment, allowing designers and developers to see firsthand where the system needs improvement.

The system includes: radar, optics, electronic surveillance and integrated software modified and developed by ONR contractors and the Naval Research Laboratory. The system was installed on the Naval Research Laboratory’s VXS-1 P-3, USS Simpson and Senegalese ships SNS Poponguine and SNS Djiffere.

“The unique aspect to this project is how the research directly supports an ongoing operation and how we can immediately ingest operator feedback,” said Pollock. He added that the software is rewritten annually from the ground up to keep up with changing technology, sensor improvements, and fleet and operator needs.

To date, the system has participated in five major operations, including AMLEP 2011 and 2012. Participants particularly liked the system’s ease of use, requiring little training, and clarity, as well as its information storage and retrieval abilities, which can be used to support after-action reviews and legal prosecutions.

AMLEP is a joint mission conducted by the U.S. Africa Command, U.S. Naval Forces Africa, U.S. Coast Guard Atlantic Area and multiple West African navies and coast guards. AMLEP is the operational portion of the Africa Partnership Station (APS) initiative in which African navies employ their professional skill, knowledge and experience to combat crime at sea.

Since 2007, the U.S. Navy has worked alongside African partner navies and coast guards through a series of APS training events and regional exercises to improve maritime safety and security. Additionally, operations such as AMLEP provide participants with numerous opportunities to operate together and develop productive relationships through real-world situations.

Related Links
Office of Naval Research
21st Century Pirates

 

 

10 Crazy IT Security Tricks That Actually Work

By Roger A. Grimes, Infoworld

Network and endpoint security may not strike you as the first place to scratch an experimental itch. After all, protecting the company’s systems and data should call into question any action that may introduce risk. But IT security threats constantly evolve, and sometimes you have to think outside the box to keep ahead of the more ingenious evildoers.

And sometimes you have to get a little crazy.

Charles Babbage, the father of the modern computer, once said, “Propose to a man any principle, or an instrument, however admirable, and you will observe the whole effort is directed to find a difficulty, a defect, or an impossibility in it. If you speak to him of a machine for peeling a potato, he will pronounce it impossible: If you peel a potato with it before his eyes, he will declare it useless, because it will not slice a pineapple.”

The world of network security is no different. Offer a new means for IT defense, and expect to meet resistance. Yet, sometimes going against the wave of traditional thinking is the surest path to success.

In that vein, we offer 10 security ideas that have been — and in many cases still are — shunned as too offbeat to work but that function quite effectively in helping secure the company’s IT assets. The companies employing these methods don’t care about arguing or placating the naysayers. They see the results and know these methods work, and they work well.

Innovative security technique No. 1: Renaming admins

Renaming privileged accounts to something less obvious than “administrator” is often slammed as a wasteful, “security by obscurity” defense. However, this simple security strategy works. If the attacker hasn’t already made it inside your network or host, there’s little reason to believe they’ll be able to readily discern the new names for your privileged accounts. If they don’t know the names, they can’t mount a successful password-guessing campaign against them.

Even bigger bonus? Never in the history of automated malware — the campaigns usually mounted against workstations and servers — has an attack attempted to use anything but built-in account names. By renaming your privileged accounts, you defeat hackers and malware in one step. Plus, it’s easier to monitor and alert on log-on attempts to the original privileged account names when they’re no longer in use.
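That last point, alerting on log-ons to the retired names, is easy to sketch: once the real accounts have new names, any authentication attempt against the old built-in names is suspicious by definition. A minimal illustration in Python; the log format and account names here are invented for the example:

```python
# Once "administrator" is only a decoy name, any log-on attempt against it
# deserves an alert. Assumed log format: "timestamp user=<name> result=<ok|fail>"
DECOY_NAMES = {"administrator", "admin", "root"}

def suspicious_logons(log_lines):
    """Return log lines whose user field is a retired (decoy) account name."""
    hits = []
    for line in log_lines:
        fields = dict(
            part.split("=", 1) for part in line.split() if "=" in part
        )
        if fields.get("user", "").lower() in DECOY_NAMES:
            hits.append(line)
    return hits

log = [
    "2012-07-10T03:14:01 user=administrator result=fail",
    "2012-07-10T09:22:45 user=jsmith result=ok",
]
print(suspicious_logons(log))
```

Any hit here is worth investigating: legitimate users have no reason to type the old names.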

Innovative security technique No. 2: Getting rid of admins

Another recommendation is to get rid of built-in privileged accounts wholesale: administrator, domain admin, enterprise admin, and every other account and group that has widespread, privileged permissions by default.

When this is suggested, most network administrators laugh and protest, the same response security experts got when they recommended local Administrator accounts be disabled on Windows computers. Then Microsoft followed this recommendation, disabling local Administrator accounts by default on every version of Windows starting with Vista/Server 2008 and later. Lo and behold, hundreds of millions of computers later, the world hasn’t come crashing down.

True, Windows still allows you to create an alternate Administrator account, but today’s most aggressive computer security defenders recommend getting rid of all built-in privileged accounts, at least full-time. Still, many network admins see this as going a step too far, an overly draconian measure that won’t work. Well, at least one Fortune 100 company has eliminated all built-in privileged accounts, and it’s working great. The company has no evidence of having been compromised by an APT (advanced persistent threat). And nobody is complaining about the lack of privileged access, either on the user side or from IT. Why would they? They aren’t getting hacked.

Innovative security technique No. 3: Honeypots

Modern computer honeypots have been around since the days of Clifford Stoll’s “The Cuckoo’s Egg,” and they still aren’t as respected or as widely adopted as they deserve to be. A honeypot is any computer asset that is set up solely to be attacked. Honeypots have no production value. They sit and wait, and they are monitored. When a hacker or malware touches them, they send an alert to an admin so that the touch can be investigated. They provide low noise and high value.

The shops that use honeypots get notified quickly of active attacks. In fact, nothing beats a honeypot for early warning — except for a bunch of honeypots, called a honeynet. Still, colleagues and customers are typically incredulous when I bring up honeypots. My response is always the same: Spend a day spinning one up and tell me how you feel about honeypots a month later. Sometimes the best thing you can do is to try one.
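The core idea is simple enough to sketch in a few lines. The following is a toy illustration in Python, not a production honeypot: it listens on an otherwise unused TCP port and records an alert on any connection, because a machine with no production purpose should never be touched at all:

```python
import socket
import threading
import time

def run_honeypot(host="127.0.0.1", port=0, stop_after=1):
    """Listen on a port with no production purpose; any touch is an alert."""
    alerts = []
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))   # port 0 lets the OS pick a free port
    srv.listen(5)
    bound_port = srv.getsockname()[1]

    def serve():
        for _ in range(stop_after):
            conn, addr = srv.accept()
            alerts.append("touch from %s:%d on port %d"
                          % (addr[0], addr[1], bound_port))
            conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return bound_port, alerts

port, alerts = run_honeypot()

# Simulate an attacker's touch: any connection at all triggers an alert.
probe = socket.create_connection(("127.0.0.1", port))
probe.close()

# Give the listener thread a moment to register the connection.
deadline = time.time() + 2
while not alerts and time.time() < deadline:
    time.sleep(0.01)
print(alerts)
```

A real deployment would forward the alert to a monitoring system rather than a list, but the low-noise property is exactly this: every event is interesting.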

Innovative security technique No. 4: Using nondefault ports

Another technique for minimizing security risk is to install services on nondefault ports. Like renaming privileged accounts, this security-by-obscurity tactic goes gangbusters. When zero-day, remote buffer overflow threats become weaponized by worms, computer viruses, and so on, they always — and only — go for the default ports. This is the case for SQL injection surfers, HTTP worms, SSH discoverers, and any other malware that hunts for a commonly advertised remote port.

Recently Symantec’s pcAnywhere and Microsoft’s Remote Desktop Protocol suffered remote exploits. When these exploits became weaponized, it was a race against the clock for defenders to apply patches or block the ports before the worms could arrive. If either service had been running on a nondefault port, the race wouldn’t even begin. That’s because in the history of automated malware, malware has only ever tried the default port.

Critics of this method of defense say it’s easy for a hacker to find where the default port has been moved, and this is true. All it takes is a port scanner, like Nmap, or an application fingerprinter, like Nikto, to identify the app running on the nondefault port. In reality, though, most attacks are automated malware, which, as stated, only goes for default ports, and most hackers don’t bother to look for nondefault ports. They find too much low-hanging fruit on default ports to be bothered with the extra effort.

Years ago, as an experiment, I moved my RDP port from the default 3389 to 50471 and offered a reward to the first person to find the new port. Two people discovered the port right away, which was no surprise; since I told them what I had done, it was easy to know where to look. What blew me away is that tens of thousands of hacker wannabes, scanning my system for the new port using Nmap, didn’t realize that Nmap, if left to its own defaults, doesn’t look on nondefault ports. It proved that by doing a simple port move you significantly reduce your risk.
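Moving a service off its default port is typically a one-line configuration change. For OpenSSH, for instance, it is a single directive in sshd_config (the port number below just mirrors the RDP example above; pick your own high, unused port and update any firewall rules to match):

```
# /etc/ssh/sshd_config
# Listen on a nondefault port instead of the default 22
Port 50471
```

After editing, restart the SSH daemon from a session you won’t lose if the change goes wrong.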

Innovative security technique No. 5: Installing to custom directories

Another security-by-obscurity defense is to install applications to nondefault directories.

This one doesn’t work as well as it used to, given that most attacks happen at the application file level today, but it still has value. Like the previous security-by-obscurity recommendations, installing applications to custom directories reduces risk: automated malware almost never looks anywhere but the default directories. If malware is able to exploit your system or application, it will try to manipulate the system or application by looking in the default directories. Install your OS or application to a nonstandard directory and you break those hard-coded assumptions.

On many of my honeypots, I install the OS to nondefault folders, say, C:\Win7 instead of C:\Windows. I also create “fake” folders that mimic the real ones, as if I had installed the software and taken the defaults. When my computers get attacked, it’s easy to find complete and isolated copies of the malware hanging out in the fake C:\Windows\System32 folder.

Changing default folders doesn’t have as much bang for the buck as the other techniques mentioned here, but it fools a ton of malware, and that means reduced risk.
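The mechanism can be shown with a toy model. The paths and the “malware” lookup below are invented purely for illustration (real malware behavior varies), but they capture the point: hard-coded default paths fail on a relocated install.

```python
# Toy illustration: automated malware typically hard-codes default install
# paths. These target paths and the lookup are invented for the example.
DEFAULT_TARGETS = [r"C:\Windows\System32", r"C:\Program Files\Common Files"]

def malware_finds_target(filesystem):
    """Return the first hard-coded default path present on the 'system'."""
    for path in DEFAULT_TARGETS:
        if path in filesystem:
            return path
    return None

default_install = {r"C:\Windows\System32", r"C:\Users"}
custom_install = {r"C:\Win7\System32", r"C:\Users"}  # OS installed under C:\Win7

print(malware_finds_target(default_install))  # the default box is findable
print(malware_finds_target(custom_install))   # the relocated box is not
```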

Read Full Article Here

 

 

 

DNSChanger Doomsday Threat Fizzled–Just as It Should Have

By Jared Newman, PCWorld

Now that the feds have cut the lifeline for Internet users infected by the DNSChanger malware, we find that the result of that action wasn’t quite the “Internet doomsday” that some had predicted.

[Read: DNSChanger Malware: What’s Next?]

DNSChanger caused a panic because it was routing Internet traffic through rogue servers, which the Federal Bureau of Investigation seized and shut down in late 2011. The FBI was hosting surrogate servers to keep infected users online, but pulled the plug on Monday, forcing users to get clean or risk losing their connections.

But as of Sunday night, the FBI estimated that only 41,800 computers remained infected by DNSChanger, the Associated Press reports, and some Internet service providers are offering their own solutions to keep customers online. It’s safe to say the cutoff day has been free of catastrophes. “We’re not aware of any issues,” FBI spokeswoman Jenny Shearer told the Boston Globe.

The Warnings Worked

In light of the aftermath–or lack thereof–you might see this whole ordeal as overblown. But there’s another way to look at it: The information campaign worked.

As of February, half of all Fortune 500 companies owned computers infected with DNSChanger, and an estimated 350,000 computers around the world were still infected.

I first wrote about DNSChanger in April, but by then, the FBI’s original cutoff date had already passed. A federal judge extended the deadline from March to July because not enough people were aware of the situation.
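For readers who wanted to check their own machines, the practical test was whether the configured DNS resolvers fell inside the rogue address blocks publicized during the cleanup. A sketch of that check in Python; the ranges below are reproduced from memory of the public advisories, so verify them against an authoritative source before relying on them:

```python
import ipaddress

# Rogue DNS server ranges publicized during the DNSChanger cleanup
# (listed from memory of the public advisories; verify independently).
ROGUE_NETS = [ipaddress.ip_network(n) for n in (
    "85.255.112.0/20", "67.210.0.0/20", "93.188.160.0/21",
    "77.67.83.0/24", "213.109.64.0/20", "64.28.176.0/20",
)]

def is_rogue_dns(server):
    """True if a configured DNS server falls inside a rogue range."""
    addr = ipaddress.ip_address(server)
    return any(addr in net for net in ROGUE_NETS)

print(is_rogue_dns("85.255.113.7"))  # inside a rogue block
print(is_rogue_dns("8.8.8.8"))       # a clean resolver
```

An infected machine would show one of these addresses in its DNS settings; the fix was to point the resolver somewhere legitimate and clean the malware.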

 

Read Full Article Here

**********************************************************************************************************

Survival / Sustainability

3 SHTF NUTRITIONAL POWERHOUSES!!

Uploaded by on Feb 24, 2010

Sprouting is an easy and economical way to grow highly nutritious greens for your plate no matter where you live or what time of year it is. WHEAT, WILD RICE and SEEDS are SUPER nutritious, high-yield, easy-to-store, space-saving POWERHOUSES.

EMERGENCY WATER FILTER SYSTEM SHTF

Uploaded by on Nov 8, 2010

How to make an emergency water-filtering system for about $45.00. An easy DIY build-it-yourself project for SHTF and WROL.


 

 

This guest post is by Grizzly Hester and is an entry in our non-fiction writing contest.

In my first year as a prepper, I made an enormous misstep. My wife and I live in an apartment, and I placed our hurricane preps in an outside closet off a connected patio. Things were great until winter, when the frigid (for here) temperatures played havoc with the water containers in storage. They burst. Rusty cans. Not good. It did give me an excellent opportunity to refine our cache; a lesson learned through failure.

As Atlantic Hurricane Season 2012 gets started, I broke out the stockpile to review and share. From the outset, I should concede that Plan A is to get gone – evacuation. If the situation doesn’t appear too severe, we’ll stay and make use of our preps, if needed. That caveat being said…

Where to store:

A lot of people have superb plans for the pallets of pinto beans they plan to lay away. Fewer people consider where they’ll store them. Fewer still have a glut of space in the house to dedicate to items that will – Lord willing – go untouched for some time. (Thus the initial outdoor-storage failure.) For my kit, I knew it had to be as compact as possible to store out of the way. That also factors into its portability, should the need arise to relocate the stash or me.

Why to store:

We’re still not quite to the meat of the matter, but it’s important to understand why you’re putting away supplies. This level of storage is not meant to sustain you through the long winter months after the grid is down while you and the “missus” fend off roving hordes, zombies, or the enemy du jour. This kit is for a number of days without power while other services are largely unaffected or restored with reasonable haste.

What to store:

This is what you’ve been waiting for. I’ve geared this kit to items that my wife and I already consume (to ease rotation), items that can be easily used or prepared, and items that are fairly stable on the shelf. This kit is meant to sustain two people for at least five days with filling meals 2-3 times per day. This kit is also intended for use after all other such resources have been consumed from the house.


Read Full Article Here


***********************************************************************************************************

Activism

Tibetan sets himself alight in China protest: group

by Staff Writers
Beijing (AFP)

A young Tibetan man set himself on fire near Tibet’s capital of Lhasa on Saturday, a rights group said, the latest in a series of protests against Chinese rule.

The fate of the 22-year-old man, whose name was given as Tsewang Dorjee, was unknown, though there were reports he had died, London-based Free Tibet said in a statement on Tuesday.

Government officials could not be reached for comment.

With the latest incident, at least 41 people have set themselves on fire in Tibetan-inhabited areas of China in protest at repressive government policies, according to activists.

The rights group said authorities have tightened security in Damxung county near Lhasa following the incident, detaining witnesses and cutting off communications.

On May 27, two men set themselves on fire in front of the Jokhang temple, a renowned centre for Buddhist pilgrimage in the centre of Lhasa, in the first such incident to hit the city.

Lhasa was the scene of violent anti-Chinese government protests in 2008, which later spread to other areas inhabited by Tibetans, and authorities have kept the city under tight security since then.

Tibetans have long chafed under China’s rule over the vast Himalayan plateau, saying that Beijing has curbed religious freedoms and their culture is being eroded by an influx of Han Chinese, the country’s main ethnic group.

Beijing, however, says Tibetans enjoy religious freedom and have benefited from improved living standards brought on by China’s economic expansion.

China on Sunday started work on a 30-billion-yuan ($4.8-billion) tourism project in Lhasa, as it seeks to draw more travellers to the restive Tibet region.



************************************************************************************************************
[In accordance with Title 17 U.S.C. Section 107, this material is distributed without profit, for research and/or educational purposes. This constitutes ‘FAIR USE’ of any such copyrighted material.]