Tag Archive: California Institute of Technology


KING 5.com

Earthquake Early Warning coming to Washington

by GLENN FARLEY KING5 News


Posted on April 18, 2014 at 6:58 PM


 

SEATTLE — It’s called “earthquake early warning” – a network of seismometers, computers and software designed to work together to give people time to brace for earthquake shaking.

Scientists say to think of it like lightning and thunder. The farther you are from the lightning, the more seconds pass between seeing the flash and hearing the thunder.

If you’re sitting on top of the quake’s epicenter, there is no warning, but the warning grows longer the farther you are from where the quake starts.
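The arithmetic behind that analogy is simple: alerts travel at the speed of light, while the damaging shaking travels at the speed of seismic shear waves, a few kilometers per second. The sketch below is a rough illustration only; the wave speed and processing delay are assumed round numbers, not parameters of the actual warning system.

```python
# Rough illustration of why warning time grows with distance from the epicenter.
# The shear-wave speed and processing delay are assumed, illustrative values,
# not parameters of any real warning system.

S_WAVE_SPEED_KM_S = 3.5    # slower, damaging shear (S) waves (assumed)
PROCESSING_DELAY_S = 5.0   # time to detect the quake and push out an alert (assumed)

def estimated_warning_seconds(distance_km: float) -> float:
    """Seconds between the alert arriving and strong shaking arriving."""
    s_wave_arrival = distance_km / S_WAVE_SPEED_KM_S
    return max(0.0, s_wave_arrival - PROCESSING_DELAY_S)

for d in (0, 50, 150, 300):
    print(f"{d:>3} km from the epicenter: ~{estimated_warning_seconds(d):.0f} s of warning")
```

At the epicenter the alert arrives after the shaking has already started; a few hundred kilometers away it can arrive tens of seconds ahead.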

The University of Washington, Caltech, and the University of California at Berkeley have been working together for years to bring earthquake early warning to the West Coast.  Pieces of the system are starting to come online in the more seismically active region of Southern California.

Washington faces a risk of larger but less frequent mega-quakes off the coast, which creates different requirements, but the state should start seeing pieces of the system begin operating later this year, said state seismologist John Vidale, who also leads the Pacific Northwest Seismic Network based at the University of Washington.

“It’s about noticing earthquakes fast and telling people the shaking is on the way,” said Vidale.

 

Read More Here


Earth Watch Report  –  Space

Published on Mar 11, 2013

This visualization, produced using the Hayden Planetarium’s Digital Universe–the most comprehensive and scientifically accurate, three-dimensional map of the known universe– shows where the star HR 8799 is in relation to our solar system. Recently, a team of researchers led by the American Museum of Natural History used a suite of high-tech instrumentation and software called Project 1640 (www.amnh.org/project1640) to collect the first chemical fingerprints, or spectra, of the four red exoplanets orbiting this star. This visualization also shows other stars that are known to harbor planetary systems (stars with blue circles around them). HR 8799’s system, which is 128 light years away from Earth, is one of only a couple of these stars that have been imaged, and the only one for which spectroscopy of all the planets has been obtained. Over the next three years, the team will survey many of these other stars in the same manner in which they studied HR 8799.

Music by Gurdonark (http://ccmixter.org/files/gurdonark/2…)
using Creative Commons Attribution samples by Kaer Trouz and the Institute of Contemporary Music

New technology aims at improving exoplanet discovery

By Nix

Posted on March 18, 2013

  • The Watchers

A new advanced telescope imaging system is expected to improve detection of exoplanets obscured by their exceptionally bright host stars. Unlike the Kepler mission, which uses the transit method to find exoplanets passing in front of their stars, this system blots out the bright light of the star in order to see the planets directly. It incorporates high-tech instrumentation and software and is called Project 1640. It will enable astronomers to observe and characterize exoplanetary systems far more easily and efficiently than previous methods.

The imaging system is the first to observe the planets orbiting the star HR 8799 all at once. HR 8799 is 1.6 times as massive as our Sun and five times brighter. The star is located 128 light years from Earth and had previously been imaged, although its bright light overwhelmed earlier attempts to study the planets with spectroscopy. Ben R. Oppenheimer, an astronomer at the American Museum of Natural History and the paper’s lead author, said, “These warm, red planets are unlike any other known objects in our universe. All four planets have different spectra and all four are peculiar.” He further explains,

“It’s like taking a single picture of the Empire State Building from an airplane that reveals the height of the building but also a bump on the sidewalk next to it that is as high as a couple of bacteria. Furthermore, we’ve been able to do this over a range of wavelengths in order to make a spectrum.”
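The analogy is about dynamic range: the planets are enormously fainter than the star whose light Project 1640 suppresses. As a rough illustration, the standard conversion from an astronomical magnitude difference to a brightness ratio shows how extreme the contrast is; the magnitude difference used below is an assumed round number, not a measurement from the study.

```python
# Convert a star-minus-planet magnitude difference into a brightness ratio.
# The value of delta_m below is assumed for illustration only.

def flux_ratio(delta_magnitude: float) -> float:
    """Standard magnitude-to-flux conversion: ratio = 10**(-0.4 * delta_m)."""
    return 10 ** (-0.4 * delta_magnitude)

delta_m = 12.0  # assumed: planet 12 magnitudes fainter than its star
print(f"A planet {delta_m:.0f} magnitudes fainter is about "
      f"{1 / flux_ratio(delta_m):,.0f} times dimmer than its star")
```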

Now, with this system, the researchers were able to determine the spectra of all four planets surrounding HR 8799, from which they can infer the chemical composition of each planet’s atmosphere. Charles Beichman, executive director of the NASA Exoplanet Science Institute at the California Institute of Technology, explains more in the full article.

 

Read Full Article Here

ICE WORLD

by Staff Writers
Houston TX (SPX) Feb 13, 2013



A new Rice University-led study finds the real estate mantra “location, location, location” may also explain one of Earth’s enduring climate mysteries. The study suggests that Earth’s repeated flip-flopping between greenhouse and icehouse states over the past 500 million years may have been driven by the episodic flare-up of volcanoes at key locations where enormous amounts of carbon dioxide are poised for release into the atmosphere.

“We found that Earth’s continents serve as enormous ‘carbonate capacitors,'” said Rice’s Cin-Ty Lee, the lead author of the study in this month’s GeoSphere.

“Continents store massive amounts of carbon dioxide in sedimentary carbonates like limestone and marble, and it appears that these reservoirs are tapped from time to time by volcanoes, which release large amounts of carbon dioxide into the atmosphere.”

Lee said as much as 44 percent of carbonates by weight is carbon dioxide. Under most circumstances that carbon stays locked inside Earth’s rigid continental crust.
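That figure is straightforward stoichiometry: calcite (CaCO3), the most common carbonate mineral, gives off one CO2 molecule per formula unit when it decomposes, and CO2 accounts for roughly 44 of calcite’s ~100 grams per mole. A quick check using standard atomic masses:

```python
# CO2 mass fraction of calcite (CaCO3 -> CaO + CO2), standard atomic masses in g/mol.
Ca, C, O = 40.08, 12.01, 16.00
mass_caco3 = Ca + C + 3 * O   # ~100.09 g/mol
mass_co2 = C + 2 * O          # ~44.01 g/mol
print(f"CO2 mass fraction of calcite: {mass_co2 / mass_caco3:.0%}")  # ~44%
```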

“One process that can release carbon dioxide from these carbonates is interaction with magma,” he said. “But that rarely happens on Earth today because most volcanoes are located on island arcs, tectonic plate boundaries that don’t contain continental crust.”

Earth’s climate continually cycles between greenhouse and icehouse states, which each last on timescales of 10 million to 100 million years. Icehouse states — like the one Earth has been in for the past 50 million years — are marked by ice at the poles and periods of glacial activity.

By contrast, the warmer greenhouse states are marked by increased carbon dioxide in the atmosphere and by an ice-free surface, even at the poles. The last greenhouse period lasted about 50 million to 70 million years and spanned the late Cretaceous, when dinosaurs roamed, and the early Paleogene, when mammals began to diversify.

Lee and colleagues found that the planet’s greenhouse-icehouse oscillations are a natural consequence of plate tectonics. The research showed that tectonic activity drives an episodic flare-up of volcanoes along continental arcs, particularly during periods when oceans are forming and continents are breaking apart.

The continental arc volcanoes that arise during these periods are located on the edges of continents, and the magma that rises through the volcanoes releases enormous quantities of carbon dioxide as it passes through layers of carbonates in the continental crust.

Lee, professor of Earth science at Rice, led the four-year study, which was co-authored by three Rice faculty members and additional colleagues at the University of Tokyo, the University of British Columbia, the California Institute of Technology, Texas A&M University and Pomona College.

Lee said the study breaks with conventional theories about greenhouse and icehouse periods.

“The standard view of the greenhouse state is that you draw carbon dioxide from the deep Earth interior by a combination of more activity along the mid-ocean ridges — where tectonic plates spread — and massive breakouts of lava called ‘large igneous provinces,'” Lee said. “Though both of these would produce more carbon dioxide, it is not clear if these processes alone could sustain the atmospheric carbon dioxide that we find in the fossil record during past greenhouses.”

Lee is a petrologist and geochemist whose research interests include the formation and evolution of continents, as well as the connections between deep Earth and its oceans and atmosphere.

Lee said the conclusions in the study developed over several years, but the initial idea of the research dates to an informal chalkboard-only seminar at Rice in 2008. The talk was given by Rice oceanographer and study co-author Jerry Dickens, a paleoclimate expert; Lee and Rice geodynamicist Adrian Lenardic, another co-author, were in the audience.

 

Read Full Article Here

STELLAR CHEMISTRY

by Marcus Woo for Caltech News
Pasadena CA (SPX)


This image, taken with NASA’s Spitzer infrared space telescope, shows the mysterious galactic cloud, seen as the black object on the left. The galactic center is the bright spot on the right. Credit: NASA/Spitzer/Benjamin et al., Churchwell et al.

It’s the mystery of the curiously dense cloud, and astronomers at the California Institute of Technology (Caltech) are on the case. Near the crowded galactic center, where billowing clouds of gas and dust cloak a supermassive black hole three million times as massive as the sun (a black hole whose gravity is strong enough to grip stars whipping around it at thousands of kilometers per second), one particular cloud has baffled astronomers.

Indeed, the cloud, dubbed G0.253+0.016, defies the rules of star formation.

In infrared images of the galactic center, the cloud, which is 30 light-years long, appears as a bean-shaped silhouette against a bright backdrop of dust and gas glowing in infrared light. The cloud’s darkness means it is dense enough to block light.

According to conventional wisdom, clouds of gas that are this dense should clump up to create pockets of even denser material that collapse due to their own gravity and eventually form stars.

One such gaseous region famed for its prodigious star formation is the Orion Nebula. And yet, although the galactic-center cloud is 25 times denser than Orion, only a few stars are being born there, and even then, they are small. In fact, the Caltech astronomers say, its star-formation rate is 45 times lower than what astronomers might expect from such a dense cloud.

“It’s a very dense cloud and it doesn’t form any massive stars, which is very weird,” says Jens Kauffmann, a senior postdoctoral scholar at Caltech.

In a series of new observations, Kauffmann, along with Caltech postdoctoral scholar Thushara Pillai and Qizhou Zhang of the Harvard-Smithsonian Center for Astrophysics, have discovered why: not only does it lack the necessary clumps of denser gas, but the cloud itself is swirling so fast that it can’t settle down to collapse into stars.
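One standard way astronomers quantify “swirling so fast it can’t settle down to collapse” is the virial parameter, which compares a cloud’s kinetic energy (from turbulent and rotational motions) to its gravitational binding energy. It is offered here only as general background, not as the specific analysis used in this study:

$$\alpha_{\mathrm{vir}} \;=\; \frac{5\,\sigma_v^{2}\,R}{G\,M},$$

where $\sigma_v$ is the cloud’s velocity dispersion, $R$ its radius, and $M$ its mass. Collapse is favored roughly when $\alpha_{\mathrm{vir}} \lesssim 2$; a cloud with large internal motions has $\alpha_{\mathrm{vir}} \gg 2$ and resists forming stars even if it is very dense.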

The results, which show that star formation may be more complex than previously thought and that the presence of dense gas does not automatically imply a region where such formation occurs, may help astronomers better understand the process.

 

Read Full Article Here

Environmental

NASA Radar Penetrates Thick, Thin of Gulf Oil Spill

by Staff Writers
Pasadena CA (JPL)


NASA UAVSAR image of the Deepwater Horizon oil spill, collected June 23, 2010. The oil appears much darker than the surrounding seawater in the greyscale image. This is because the oil smoothes the sea surface and reduces its electrical conductivity, causing less radar energy to bounce back to the UAVSAR antenna. Additional processing of the data by the UAVSAR team produced the two inset color images, which reveal the variability of the oil spill’s characteristics, from thicker, concentrated emulsions (shown in reds and yellows) to minimal oil contamination (shown in greens and blues). Dark blues correspond to areas of clear seawater bordering the oil slick. Images credit: NASA/JPL-Caltech. For a larger version of this image please go here.

Researchers at NASA’s Jet Propulsion Laboratory and the California Institute of Technology in Pasadena have developed a method to use a specialized NASA 3-D imaging radar to characterize the oil in oil spills, such as the 2010 BP Deepwater Horizon spill in the Gulf of Mexico. The research can be used to improve response operations during future marine oil spills.

Caltech graduate student Brent Minchew and JPL researchers Cathleen Jones and Ben Holt analyzed NASA radar imagery collected over the main slick of the BP Deepwater Horizon oil spill on June 22 and June 23, 2010. The data were acquired by the JPL-developed Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) during the first of its three deployments over the spill area between June 2010 and July 2012.

The UAVSAR was carried in a pod mounted beneath a NASA C-20A piloted aircraft, a version of the Gulfstream III business jet, based at NASA’s Dryden Aircraft Operations Facility in Palmdale, Calif. The researchers demonstrated, for the first time, that a radar system like UAVSAR can be used to characterize the oil within a slick, distinguishing very thin films like oil sheen from more damaging thick oil emulsions.

“Our research demonstrates the tremendous potential of UAVSAR to automate the classification of oil in a slick and mitigate the effects of future oil spill tragedies,” said Jones. “Such information can help spill incident response commanders direct cleanup operations, such as the mechanical recovery of oil, to the areas of thick oil that would have the most damaging environmental impacts.”

Current visual oil classification techniques are qualitative, and depend upon the skill of the people doing the assessment and the availability of skilled observers during an emergency. Remote sensing allows larger areas to be covered in a consistent manner in a shorter amount of time. Radar can be used at night or in other low-light or poor weather conditions when visual surveys can’t be conducted.

Radar had previously been used to detect the extent of oil slicks, but not to characterize the oil within them. It had generally been assumed that radar had little to no use for this purpose. The team demonstrated that UAVSAR could be used to identify areas where thick oil had mixed with the surface seawater to form emulsions, which are mixtures of oil and seawater.

Identifying the type of oil in a spill is vital for assessing its potential harm and targeting response efforts.

For example, thin oil consists of sheens that measure from less than 0.0002 inches (0.005 millimeters) to about 0.002 inches (0.05 millimeters) thick. Sheens generally form when little oil is released, as in the initial stages of a spill, or from lightweight, volatile components of spill material. Because sheens contain little oil volume, they weather and evaporate quickly, and are of minor concern from an environmental standpoint.

Oil emulsions, on the other hand, are 0.04 inches (1 millimeter) thick, contain more oil, and persist on the ocean surface for much longer, thereby potentially having a greater environmental impact in the open sea and along the shoreline.

“Knowing the type of oil tells us a lot about the thickness of the oil in that area,” said Jones.

The researchers acquired data in June 2010 along more than 3,400 miles (5,500 kilometers) of flight lines over an area of more than 46,330 square miles (120,000 square kilometers), primarily along the Gulf Coast. They found that at the time the slick was imaged by UAVSAR, much of the surface layer of the Deepwater Horizon spill’s main slick consisted of thick oil emulsions.

UAVSAR characterizes an oil spill by detecting variations in the roughness of its surface and, for thick slicks, changes in the electrical conductivity of its surface layer.

Just as an airport runway looks smooth compared to surrounding fields, UAVSAR “sees” an oil spill at sea as a smoother (radar-dark) area against the rougher (radar-bright) ocean surface because most of the radar energy that hits the smoother surface is deflected away from the radar antenna.
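A toy example of the radar-dark versus radar-bright idea is sketched below. The decibel thresholds are invented purely for illustration, and the actual UAVSAR analysis of the spill was far more sophisticated than simple backscatter thresholding; the sketch only captures the intuition that smoother, oilier surfaces return less energy to the antenna.

```python
# Toy classifier: label pixels by how much radar energy they return.
# Thresholds are invented for illustration; real oil-spill SAR analysis
# is far more sophisticated than simple thresholding.

OPEN_WATER_DB = -10.0   # assumed: rough, conducting seawater scatters strongly
THIN_SHEEN_DB = -18.0   # assumed: an oil-smoothed surface returns less energy

def classify_pixel(sigma0_db: float) -> str:
    """Label a pixel from its normalized radar cross-section (in dB)."""
    if sigma0_db >= OPEN_WATER_DB:
        return "open water"
    if sigma0_db >= THIN_SHEEN_DB:
        return "thin sheen"
    return "thick oil / emulsion"

for value in (-8.0, -14.0, -22.0):
    print(f"{value:6.1f} dB -> {classify_pixel(value)}")
```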

UAVSAR’s high sensitivity and other capabilities enabled the team to separate thick and thin oil for the first time using a radar system.

“We knew we were going to detect the extent of the spill,” said Holt. “But we had this great new instrument, so we wanted to see how it would work in this extreme situation, and it turned out to be really unique and valuable, beyond all previous radar results for spills.”

“We studied an unprecedented event using data collected by a sophisticated instrument and were able to show that there was a lot more information contained in the data than was apparent when we began,” said Minchew. “This is a good example of how the tools of science could be used to help mitigate disasters in real time.”

UAVSAR is returning to the Gulf of Mexico area this month and will image the area around the Deepwater Horizon site to look for leaks. In the future, UAVSAR data may be combined with imaging spectroscopic data from JPL’s Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) instrument to further improve the ability to characterize oil spills under a broader range of environmental conditions.

In addition to characterizing the oil slick, UAVSAR imaged most of the U.S. Gulf of Mexico coastline, extending from the Florida Keys to Corpus Christi, Texas, with extensive inland coverage of the southern Louisiana wetlands around Barataria Bay, the terrestrial ecosystem that ultimately sustained the greatest oiling from the massive spill.

Researchers tracked the movement of the oil into coastal waterways and marshlands, monitored impact and recovery of oil-affected wetlands, and assessed how UAVSAR can support emergency responders in future disasters.

UAVSAR is also used to detect detailed Earth movements related to earthquakes, volcanoes and glaciers, as well as for soil moisture and forestry biomass studies. For more on UAVSAR, see here.

Results of this study are published this month in the Institute of Electrical and Electronics Engineers journal Transactions on Geoscience and Remote Sensing. Caltech manages JPL for NASA.

 


Related Links
Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR)
Earth Observation News – Suppliers, Technology and Application

Fools’ Gold Found to Regulate Oxygen

BLUE SKY

by Staff Writers
Rehovot, Israel (SPX)


File image.

As sulfur cycles through Earth’s atmosphere, oceans and land, it undergoes chemical changes that are often coupled to changes in other elements, such as carbon and oxygen. Although this affects the concentration of free oxygen, sulfur has traditionally been portrayed as a secondary factor in regulating atmospheric oxygen, with most of the heavy lifting done by carbon. However, new findings that appeared this week in Science suggest that sulfur’s role may have been underestimated.

Drs. Itay Halevy of the Weizmann Institute’s Environmental Science and Energy Research Department (Faculty of Chemistry), Shanan Peters of the University of Wisconsin and Woodward Fischer of the California Institute of Technology were interested in better understanding the global sulfur cycle over the last 550 million years – roughly the period in which oxygen has been at its present atmospheric level of around 20%.

They used a database developed and maintained by Peters at the University of Wisconsin, called Macrostrat, which contains detailed information on thousands of rock units in North America and beyond.

The researchers used the database to trace one of the ways in which sulfur exits ocean water into the underlying sediments – the formation of so-called sulfate evaporite minerals. These sulfur-bearing minerals, such as gypsum, settle to the bottom of shallow seas as seawater evaporates.

The team found that the formation and burial of sulfate evaporites were highly variable over the last 550 million years, due to changes in shallow sea area, the latitude of ancient continents and sea level.

More surprising to Halevy and colleagues was the discovery that only a relatively small fraction of the sulfur cycling through the oceans has exited seawater in this way. Their research showed that the formation and burial of a second sulfur-bearing mineral – pyrite – has apparently been much more important.

Pyrite is an iron-sulfur mineral (also known as fools’ gold), which forms when microbes in seafloor sediments use the sulfur dissolved in seawater to digest organic matter. The microbes take up sulfur in the form of sulfate (bound to four oxygen atoms) and release it as sulfide (with no oxygen).

Oxygen is released during this process, thus making it a source of oxygen in the air. But because this part of the sulfur cycle was thought to be minor in comparison with sulfate evaporite burial (which does not release oxygen), its effect on oxygen levels was also thought to be unimportant.

In testing various theoretical models of the sulfur cycle against the Macrostrat data, the team realized that the production and burial of pyrite has been much more significant than previously thought, accounting for more than 80% of all sulfur removed from the ocean (rather than the 30-40% in prior estimates). As opposed to the variability they saw for sulfate evaporite burial, pyrite burial has been relatively stable throughout the period.

The analysis also revealed that most of the sulfur entering the ocean washed in from the weathering of pyrite exposed on land. In other words, there is a balance between pyrite formation and burial, which releases oxygen, and the weathering of pyrite on land, which consumes it. The implication of these findings is that the sulfur cycle regulates the atmospheric concentration of oxygen more strongly than previously appreciated.
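Read as a simple steady-state budget (the notation here is introduced only to restate the fractions given above), the revised partitioning says that pyrite burial, not evaporite burial, removes most of the sulfur entering the ocean:

$$F_{\mathrm{in}} \;\approx\; F_{\mathrm{evaporite}} + F_{\mathrm{pyrite}},
\qquad
\frac{F_{\mathrm{pyrite}}}{F_{\mathrm{evaporite}} + F_{\mathrm{pyrite}}} \gtrsim 0.8
\quad (\text{versus } 0.3\text{–}0.4 \text{ in prior estimates}),$$

where $F_{\mathrm{in}}$ is the sulfur washed into the ocean (dominated by pyrite weathering, which consumes oxygen) and $F_{\mathrm{pyrite}}$ is the pyrite burial flux (which releases oxygen). Because these two oxygen-relevant fluxes are both large and nearly balanced, the sulfur cycle exerts a stronger grip on atmospheric oxygen than the older, evaporite-dominated picture implied.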

“This is the first use of Macrostrat to quantify chemical fluxes in the Earth system,” said Peters. “I met my coauthors at a lecture I gave at Caltech, and we immediately began discussing how we might apply Macrostrat to understanding biogeochemical cycling. I think this study will open the door to many more uses of Macrostrat for constraining biogeochemical cycles.”

“For me, the truly surprising result is that pyrite weathering and burial appear to be such important processes in the sulfur cycle throughout all of Earth’s history. The carbon cycle is recognized as the central hub controlling redox processes on Earth, but our work suggests that nearly as many electrons are shuttled through the sulfur cycle,” said Fischer.

Halevy: “These findings, in addition to shedding new light on the role of sulfur in regulating oxygen levels in the atmosphere, represent an important step forward in developing a quantitative, mechanistic understanding of the processes governing the global sulfur cycle.”

These findings appeared this week in Science.

 

Related Links
Weizmann Institute
The Air We Breathe at TerraDaily.com

Artificial jellyfish created in lab from rat cells

An artificial jellyfish which is able to swim with the help of beating heart muscle cells has been created by scientists.


The silicone jellyfish can mimic swimming movements thanks to muscle cells from rat hearts implanted onto it. Photo: Harvard University and Caltech
By Nick Collins, Science Correspondent

The tentacled artificial creature, made from silicone, has been dubbed “Medusoid” because of its resemblance to the snake-haired character from Greek mythology whose gaze turned people to stone.

It is able to mimic the swimming movement of a jellyfish thanks to muscle cells from rat hearts, which were implanted onto its silicone frame and grown into a pattern similar to the muscles of a real jellyfish.

By applying an electric current to a container of conducting liquid, the scientists demonstrated they could “shock” the muscles into contracting so that it began to move through the water.

The “reverse-engineering” project by researchers from the California Institute of Technology and Harvard University was published on the website of the Nature Biotechnology journal.

Janna Nawroth, lead author of the study, said that most researchers working in tissue engineering have attempted to copy tissues or organs by simply recreating their major components, regardless of what their function is and whether they could be replaced by something simpler.

She said: “A big goal of our study was to advance tissue engineering. Our idea was that we would make jellyfish functions — swimming and creating feeding currents — as our target and then build a structure based on that information.”

Her colleague Prof John Dabiri added: “I’m pleasantly surprised at how close we are getting to matching the natural biological performance, but also that we’re seeing ways in which we can probably improve on that natural performance. The process of evolution missed a lot of good solutions.”

Jellyfish use a pumping muscle to propel themselves through the water, meaning their movement is based on a mechanism similar to a human heart.

This makes them a useful model for tissue engineering, technology which could one day be used to create synthetic hearts or other organs for human patients.

Prof Kevin Kit Parker, one of the study’s authors, said: “I saw a jellyfish at the New England Aquarium, and I immediately noted both similarities and differences between how the jellyfish pumps and the human heart. The similarities help reveal what you need to do to design a bio-inspired pump.

“The jellyfish provides a design algorithm for reverse engineering an organ’s function.”

Prof Dabiri added: “A lot of work these days is done to engineer molecules, but there is much less effort to engineer organisms.

“I think this is a good glimpse into the future of re-engineering entire organisms for the purposes of advancing biomedical technology.”