Planetary scientists are working on equations to assess how habitable a given planet might be.
By Alan Boyle, Science Editor, NBC News
Planet-hunters say they've developed a relatively simple method for determining how livable a faraway world might be, and they've used the formula to identify a top candidate: a super-Earth that's 36 light-years away.
The research paper was submitted to the journal Astronomy & Astrophysics just two weeks ago, but it's quickly making the rounds among those who follow the accelerating search for planets beyond our solar system. The big reason for all the interest is that the paper points to a new prospect for the short list of potentially habitable planets: HD 85512 b, a world that's at least 3.6 times as massive as Earth, circling an orange star in the constellation Vela.
The authors — Lisa Kaltenegger of the Harvard-Smithsonian Center for Astrophysics, and Stephane Udry and Francesco Pepe of the University of Geneva — rank the extrasolar planet right up there with Gliese 581d, a prime prospect for habitability that is 20 light-years from Earth. "HD 85512 b is, with Gl 581d, the best candidate for exploring habitability to date, a planet on the edge of habitability," they say.
The paper uses HD 85512 b as a test case for a set of equations aimed at assessing how livable a particular planet might be, based on its orbital parameters, how much radiation it gets from its parent sun and the nature of its atmosphere. HD 85512 b's minimum mass and orbital parameters were published only recently, based on data from the HARPS-Upgrade GTO planet search. The world orbits a star that is significantly dimmer than our own sun, at a distance of 0.26 AU — which is within Mercury's orbit in our solar system. It makes one full orbit every 58.4 Earth days, the researchers report.
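Those orbital figures can be sanity-checked with Kepler's third law. Here's a rough back-of-envelope sketch (my own arithmetic, not from the paper): with the semimajor axis in astronomical units and the period in years, the implied stellar mass comes out in solar masses.

```python
# Kepler's third law in solar units: M_star ≈ a^3 / P^2,
# with a in AU and P in years. Input values are from the article.
a_au = 0.26                     # HD 85512 b's orbital distance
p_years = 58.4 / 365.25         # 58.4-day orbital period, in years
m_star = a_au**3 / p_years**2   # implied stellar mass, solar masses
print(f"implied stellar mass: {m_star:.2f} solar masses")
```

A result of roughly 0.7 solar masses is consistent with a star dimmer and cooler than our sun, as the article describes.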
The researchers assume that HD 85512 b is a rocky planet with an Earthlike atmosphere containing water vapor, carbon dioxide and nitrogen. If that's the case, and if more than half the planet is covered by clouds, then it "could be potentially habitable," they say.
Is there a way to resolve those "ifs"? Comparing the planet's mass with its size could tell astronomers whether its composition is more like Neptune's or Earth's. But to study its atmosphere, we're going to need a bigger telescope.
Here's how Kaltenegger explained the challenge to Skymania News: "As to whether it is really habitable, we’ll need a spectrum to tell that — direct imaging would be the ticket. With a direct imaging mission we could detect if it looks habitable. We could detect clouds if we had a big enough telescope in space."
It could be a long time before there's a telescope (or an interferometer) big enough to take on that job. But even now, Kaltenegger and her colleagues say that their research provides "a simple set of parameters which can be used for evaluating current and future planet candidates ... for their potential habitability."
How long will it take to whip up a top-ten list for extrasolar emigration? Weigh in with your comments below.
This graphic records proton collision events in the Large Hadron Collider's Compact Muon Solenoid in which four high-energy electrons (shown as red towers) are observed. The event shows characteristics expected from the decay of a Higgs boson but is also consistent with background processes.
By Alan Boyle, Science Editor, NBC News
The latest results from the Large Hadron Collider serve as a reality check for expectations that radical scientific discoveries are just around the corner. A month ago, folks were buzzing about prospects that the elusive Higgs boson might soon be found. This week, they're talking about how the Higgs boson, as well as other exotic ideas such as supersymmetry and superstring theory, might be merely a will o' the wisp.
Reservations about the imminent revolution in particle physics cropped up in the wake of last week's Lepton Photon conference in Mumbai, India. Some observers speculated that fresh results could confirm an anomalous "bump" in earlier data from the LHC's two main detectors, ATLAS and the Compact Muon Solenoid.
Such a bump could suggest the mass-energy level where the Higgs boson was lurking. Detecting the Higgs boson, also known as the "God Particle," has been the main goal of the $10 billion particle collider on the French-Swiss border. Physicists are anxious to see it because it would be the last fundamental particle predicted by the Standard Model, one of physics' most successful theories. The Higgs mechanism could explain why some particles have mass while others don't.
Bye-bye, bump?
But instead of confirming the earlier bump in their data, researchers at Europe's CERN particle physics lab reported last week that "the significance of those fluctuations has slightly decreased." That led some observers to suggest that the Higgs boson "likely doesn't exist."
In fact, it's still too early to render a verdict. "Variations up and down on significance are to be expected," Fermilab physicist Don Lincoln, author of a book on the LHC titled "The Quantum Frontier," told me in an email today. "The two conferences are only a month apart, and things don't change hugely between them."
So far, the most significant findings from the LHC are those that have virtually ruled out broad areas of the mass-energy spectrum where the Higgs might have been detected — mostly in the range between 145 billion and 466 billion electron volts, with 95 percent certainty. There's a better chance of finding the Higgs at lower masses, below 145 billion electron volts, but that's going to be a trickier challenge for the high-powered LHC.
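For readers curious where the "95 percent certainty" figure comes from: it's standard Gaussian statistics. The snippet below is a generic illustration of the sigma-to-confidence conversion, not CERN's actual analysis.

```python
import math

def confidence(sigma):
    """Two-sided Gaussian confidence level for a significance in sigma."""
    return math.erf(sigma / math.sqrt(2))

# A 95 percent exclusion corresponds to roughly 1.96 sigma; the particle
# physics "discovery" convention of 5 sigma is far stricter.
print(f"1.96 sigma -> {confidence(1.96):.4f}")
print(f"5.00 sigma -> {confidence(5.00):.7f}")
```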
Sergio Bertolucci, CERN's research director, put an optimistic spin on the non-findings, declaring that "these are exciting times for particle physics."
"Discoveries are almost assured within the next 12 months," he said. "If the Higgs exists, the LHC experiments will soon find it. If it does not, its absence will point the way to new physics."
So long, supersymmetry?
That new physics could theoretically include supersymmetry and string theory, weird concepts that propose the existence of whole classes of yet-to-be-discovered particles (or "sparticles"). Such concepts represent a departure from the Standard Model, and for that reason physicists are looking closely for any anomalies that would open the way to new physics.
For those physicists, the latest data from the LHCb detector — which is particularly sensitive to matter-antimatter anomalies in the decay of B-mesons — might represent a bit of a letdown. Researchers reported in Mumbai that their measurements were "in agreement with the Standard Model prediction," although they said "there is still room for a new physics contribution."
A spokesperson for the LHCb experiment, Tara Shears of Liverpool University, told the BBC that her team's results "put supersymmetry on the spot." Other physicists are starting to wonder whether the concept will have to be discarded in favor of other exotic ideas, such as a fifth fundamental force known as technicolor.
For physicists, non-discoveries can be as valuable as discoveries. But if CERN's big machine doesn't produce some breakthrough physics, it's likely to be more difficult to sell taxpayers and politicians on the next big machine.
Years ago, CERN theoretical physicist John Ellis told me it "might be a little bit difficult to explain to our politicians, that here they gave us 10 billion of whatever, your favorite currency unit, and we didn't find the Higgs boson." Ellis and his colleagues don't have to provide that explanation just yet, but stay tuned. A year from now, physicists will either be struggling to explain the weird phenomena they're seeing ... or struggling to explain the absence of weird phenomena.
Update for 2 p.m. ET Sept. 1: The BBC's Pallab Ghosh revives hopes for the Higgs in a report saying that the "Higgs particle could be found by Christmas." Those expectations are based on a Quantum Diaries blog posting by University of Wisconsin researcher Richard Ruiz, who notes there's a chance that the LHC will collect enough data by the end of this year to make statistical judgments about whether the Standard Model's version of the Higgs exists or not over a broad range of possible masses.
Guido Tonelli, spokesman for the LHC's Compact Muon Solenoid experiment, told the BBC, "We could discover the Standard Model version of the Higgs boson or exclude it earlier than expected. Could we discover it by Christmas? In principle, yes."
There are several caveats: First, the forecast assumes that data will keep flowing from the LHC at its current better-than-predicted rate. Second, researchers are shifting their focus to regions of the mass spectrum where the results are more difficult to interpret, and therefore physicists may require more data than they originally expected. And third, the projections apply only to the kind of Higgs particle predicted by the Standard Model. A non-standard Higgs boson could still escape the net.
Fermilab's Lincoln says updated results from the LHC are due to be announced in mid-November at the Hadron Collider Physics symposium in Paris, so the situation may become clearer at that time. Stay tuned ... maybe we'll have something more definitive by Thanksgiving.
An artist's conception of the one-pot method for transforming ammonia borane into hydrogen fuel and an illustration of how the technology, which doesn't produce greenhouse gases, is good for the planet.
By John Roach, Contributing Writer, NBC News
Hydrogen is once again starting to look like a promising green fuel of the future thanks to breakthroughs that permit the fuel to be stored and released from a chemical "tank" that is easily recharged.
The problem with hydrogen, which is easily converted to electricity in a fuel cell and is carbon-free, is that it’s a gas typically stored in high-pressure or cryogenic tanks.
If there's a wreck, the resulting explosion becomes a major safety issue, noted Travis Williams, an assistant professor of chemistry at the University of Southern California.
"There is also the problem of the best cars we can engineer would probably have a 100 mile driving range per fuel tank," he told me today.
To solve these problems, researchers, in efforts led by the U.S. Department of Energy, have focused on storing the gas in a molecule called ammonia borane, a nitrogen-boron complex that has a hydrogen storage capacity approaching 20 percent by weight.
A block of the stuff could allow a fuel cell car to go 300 miles on a single tank, a benchmark set by the DOE.
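The "approaching 20 percent" figure can be verified with simple molar-mass arithmetic. Here's a hedged sketch using standard atomic weights (note that the catalytic processes described in the article recover only part of this theoretical maximum):

```python
# Ammonia borane is NH3BH3; complete dehydrogenation would release
# three H2 per molecule. Atomic weights in g/mol.
weights = {"H": 1.008, "B": 10.811, "N": 14.007}
nh3bh3 = weights["N"] + weights["B"] + 6 * weights["H"]  # ~30.87 g/mol
h2_out = 3 * 2 * weights["H"]                            # ~6.05 g/mol
frac = h2_out / nh3bh3
print(f"hydrogen by weight: {frac:.1%}")                 # just under 20%
```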
The race at the national research labs and academia has been to figure out the best way to take ammonia borane, treat it with a chemical catalyst and get the hydrogen out so it can be used.
"It works at a mild temperature, the catalyst is reusable, it is air stable, you get a pretty good portion of the hydrogen out and it makes an innocuous byproduct which is intrinsic to the ammonia borane itself," he said.
The Los Alamos researchers envision an ammonia borane fuel tank that is used to fuel cars, then taken out and sent to a factory to be recharged.
For now, Williams said, using ammonia borane as a hydrogen fuel storage tank would work best for applications such as powering a laptop computer.
With a bit of the catalyst, he said, "it could power something the size of your laptop for a long time and be a lot less heavy than the battery because it is boron and nitrogen, not heavy metals."
This would be ideal, as an example, for military personnel out in the field, who currently carry around heavy batteries to power all their high-tech gadgets.
Getting the process running smoothly and efficiently enough to power hydrogen fuel cells in our cars is at least several years out, Williams noted.
"But I think in our lifetime this is going to become practical," he said.
"The way I see the race is there is going to be a point when we run out of petrochemicals ... we are not there yet generally, but we want to have the chemical conversion technology ready so we don't disrupt the economy when that happens."
Astronomer Joe Liske tells the story behind the Hubble Space Telescope's observations of stellar jets in a "Hubblecast" presented by the European Space Agency's Hubble team.
By Alan Boyle, Science Editor, NBC News
Astronomers are using movies created from Hubble Space Telescope imagery to track the gassy burps belched out by young stars.
They say the moving pictures have already unraveled some of the mysteries surrounding Herbig-Haro objects, which are stars that send out colorful, blobby jets of glowing gas at supersonic speeds. The phenomenon is named after astronomers George Herbig and Guillermo Haro, who studied the outflows in the 1950s. Researchers still don't fully understand how the stars unleash such jets, but the new imagery has given them a better sense of the mechanism behind them.
The time-lapse movies were assembled from high-resolution still images collected by Hubble over the course of 14 years.
"For the first time, we can actually observe how these jets interact with their surroundings by watching these time-lapse movies," Rice University's Patrick Hartigan said today in a news release. "Those interactions tell us how young stars influence the environments out of which they form. With movies like these, we can now compare observations of jets with those produced by computer simulations and laboratory experiments to see which aspects of the interactions we understand and which we don't understand."
How the jets work
Herbig-Haro jets occur during a relatively brief phase of star formation, lasting about 100,000 years. When a star is born, its gravity pulls in still more material from the disk of gas and dust that swirls around it. That's when the stuff really hits the fan: As the star spins, it flings blobs of the in-falling gas back out into space. At first, those blobs may zoom outward in a tightly focused beam due to the star's strong magnetic field. But eventually they collide with each other, creating a cosmic traffic jam.
The process stops when the disk is emptied of its excess gas and dust, leaving behind planets and bits of cosmic flotsam and jetsam. Such a scenario may well have unfolded in our own solar system 4.5 billion years ago.
Hartigan and his colleagues focused on three stars where the Herbig-Haro jets are in full swing. One star, near the Orion Nebula, has opposing jets known as HH 1 and HH 2. Another star in the southern constellation Vela expels jets that are designated HH 46 and HH 47. The third star, in Orion, has a jet called HH 34. All three stars are about 1,350 light-years from Earth.
For each star, the astronomers collected Hubble imagery at three data points between 1994 and 2008. Then they fed the still pictures into a computer program that turned the pictures into smooth animations. That made it easier to analyze how different parts of the jets interacted as they were expelled. (Check out this webpage for the animated images.)
The movies confirm that blobs of material are not ejected in a continuous stream, but are belched out sporadically — apparently as the result of material falling onto the stars. The blobs move at different speeds, and when one blob plows into another, that creates a bow shock that heats up the gas. Bow shocks also occur when the blobs slam into concentrations of interstellar gas. Regions of the jets brighten and fade as the clumps of gas warm up and cool down.
Lessons from virtual nuclear blasts
"Taken together, our results paint a picture of jets as remarkably diverse objects that undergo highly structured interactions between material within the outflow, and between the jet and the surrounding gas," Hartigan said. "This contrasts with the bulk of the existing simulations, which depict jets as smooth systems."
To improve the fidelity of the simulations, Hartigan's team turned to experts in fluid dynamics from Los Alamos National Laboratory in New Mexico, the UK Atomic Weapons Establishment in Britain and General Atomics in San Diego, Calif., as well as computer specialists from the University of Rochester in New York. Those experts on simulated thermonuclear blasts helped the astronomers understand the interactions powered by Mother Nature's thermonuclear furnaces.
The astronomers are now conducting experiments at the Omega Laser Facility, which is housed at the University of Rochester.
"Our collaboration has exploited not just large laser facilities such as Omega, but also computer simulations that were developed for research into nuclear fusion," said Paula Rosen of the UK Atomic Weapons Establishment, a co-author of the paper. "Using these experimental methods has enabled us to identify aspects of the physics that the astronomers overlooked — it is exciting to know that what we do in the laboratory here on Earth can shed light on complex phenomena in stellar jets over 1,000 light-years away."
Earth and the moon look like mere specks amid the blackness of outer space in a picture sent back by NASA's Juno probe during its trip to Jupiter. Maybe the view from 6 million miles (9.66 million kilometers) isn't as impressive as the close-ups we're accustomed to, but it does call to mind what the late astronomer Carl Sagan said about our pale blue dot almost two decades ago: "That's home. That's us. On it everyone you love, everyone you know, everyone you ever heard of, every human being who ever was, lived out their lives."
Juno's principal investigator, Scott Bolton of the Southwest Research Institute, echoed Sagan's comments in today's image advisory: "This is a remarkable sight people get to see all too rarely. This view of our planet shows how Earth looks from the outside, illustrating a special perspective of our role and place in the universe. We see a humbling yet beautiful view of ourselves."
The $1.1 billion Juno mission was launched on Aug. 5 and won't enter Jovian orbit until 2016. But this won't be the last we'll see of Juno. The spacecraft is due for a slingshot close encounter with Earth in 2013, coming as close as 300 miles (500 kilometers). Until then, Godspeed, Juno....
Lockheed Martin's proposed "Plymouth Rock" mission would target a near-Earth asteroid. Our planet and the moon are in the background of this artist's conception.
By Alan Boyle, Science Editor, NBC News
To the moon? Or to an asteroid? Both destinations have been in NASA's sights — the moon during the George W. Bush administration, and a near-Earth asteroid during the Obama administration. Now a "Global Exploration Roadmap" being drawn up by NASA and its counterparts around the world lays out a 25-year scenario for each of the two paths leading beyond Earth orbit.
Both of the paths are aimed at the same eventual destination: Mars. And some observers are suggesting the best course is to aim directly at the Red Planet, rather than starting with closer destinations.
The moon vs. asteroid debate was brought back into the spotlight during the deliberations of a panel known as the International Space Exploration Coordination Group, or ISECG. The group, which includes representatives from Britain, Canada, the European Space Agency, France, Germany, Italy, Japan, Russia, South Korea and the United States, was established as a coordination forum for space exploration back when NASA was aiming for a return to the moon by 2020.
Over the past year, the group has retooled the long-term global strategy for space exploration. "It begins with the International Space Station and expands human presence throughout the solar system, leading ultimately to human missions to explore the surface of Mars," NASA said today in a news release. "The roadmap flows from this strategy and identifies two potential pathways: 'Asteroid Next' and 'Moon Next.'"
NASA said "each pathway represents a mission scenario over a 25-year period describing a logical sequence of robotic and human missions." That scenario would be consistent with the plan that President Barack Obama laid out two years ago, with a goal of sending astronauts to a near-Earth asteroid by 2025 and pushing out to Mars and its moons by the mid-2030s. The new twist is that the moon is back on the table as the initial destination beyond Earth orbit.
An artist's conception shows NASA's Orion exploration vehicle and a lander docked in lunar orbit.
Senior space officials gave their go-ahead to the two-pathway plan today during a meeting in Kyoto, Japan, NASA said.
"NASA is confident that the release of this product, and subsequent refinements as circumstances within each space agency evolve, will facilitate the ability of space agencies to form the partnerships that will ensure robust and sustainable human exploration," said Bill Gerstenmaier, NASA's associate administrator for human exploration and operations.
Gerstenmaier is the outgoing chair of the ISECG. The incoming chairman, Yoshiyuki Hasagawa of the Japan Aerospace Exploration Agency, said the group's members were "very happy with the progress of the Global Exploration Roadmap."
NASA spokesman Michael Braukus told me that the roadmap was not yet available for public release, but space officials agreed that an initial version of the document would be issued sometime in the next few weeks. Based on viewgraph presentations prepared in advance of this week's meeting in Kyoto, both paths would eventually get to the moon as well as asteroids. It's more a question of which destination is targeted first.
One suggested strategy would start by sending a deep-space habitat to an Earth-moon gravitational balance point known as L-1. Later missions would go to the moon, as preparation for eventual Mars trips. Another scenario calls for reaching the lunar surface first. The lessons learned there would be applied to asteroid missions, and then to Mars-bound missions. A variant would focus on testing the deep-space habitat, then taking trips to the moon, then going to an asteroid, and finally flying to Mars. It's not yet clear how all these possibilities are wrapped up into the ISECG's "Asteroid Next" and "Moon Next" scenarios.
An artist's conception shows the Orion exploration vehicle and habitation modules in Martian orbit.
"We’re ready. Despite its greater distance, we are much better prepared today to send humans to Mars than we were to send men to the moon in 1961, when President Kennedy started the Apollo program - and we were there eight years later. Contrary to those seeking indefinite delay of any commitment, future-fantasy spaceships are not needed to send humans to Mars. The primary real requirement is a heavy-lift booster with a capability similar to that of the Saturn V launch vehicle employed in the 1960s. This is something we fully understand how to create.
"The issue is not money. The issue is leadership. NASA’s average Apollo-era (1961-73) budget, adjusted for inflation, was about $19 billion a year in today’s dollars, just 5 percent more than the agency’s current budget. Yet the NASA of the '60s accomplished 100 times more because it had a mission with a deadline and was forced to develop an efficient plan to achieve that mission. If NASA were given that kind of direction, we could have humans on Mars within a decade. If not, as the rudderless agency continues to drift into the coming fiscal tsunami, we may soon end up with no human spaceflight program."
Gearing up for missions to Mars would likely require a significant boost in space spending, as well as more serious efforts to solve the problems of interplanetary spaceflight, including radiation exposure and zero-G health hazards. The ISECG's deliberations are a sign that deep-space exploration is too expensive for any one country to take on by itself. But the latest reports about the roadmap suggest that the path beyond Earth orbit is not yet set in stone — which means there's still ample opportunity for you to weigh in on the debate.
Click your choice in the poll at right, and feel free to weigh in at length in the comment space below.
In related developments:
Caltech's Keck Institute of Space Studies has invited students from around the world to participate in a competition to design a mission that would send astronauts to a near-Earth asteroid and return a sample. The team exercise will bring at least 30 students to Caltech from Sept. 12 to 16. The Caltech Space Challenge was created by two Caltech students, Prakhar Mehrotra and Jon Mihaly. "We have more than 275 applications from exceptional students at 100 universities worldwide, including all the top-rated schools," Mehrotra said in a Caltech news release. "Selecting is going to be very hard."
China's Chang'e 2 spacecraft has left lunar orbit and traveled about a million miles (1.5 million kilometers) from Earth to settle into orbit around the sun-Earth gravitational balance point known as L-2. Chang'e 2's new location was announced today by China's State Administration of Science, Technology and Industry for National Defense. China's Xinhua news agency said Chang'e 2, which spent about eight months orbiting the moon, will carry out exploration activities around L-2 during the coming year. L-2 already serves as the locale for the European Space Agency's Herschel space telescope and NASA's Wilkinson Microwave Anisotropy Probe. It's also the intended destination for the James Webb Space Telescope, which is currently under construction and due for launch around 2018.
Instead of sending astronauts to an asteroid, how about bringing the asteroid to us? In the journal Research in Astronomy and Astrophysics, scientists at Tsinghua University in Beijing say "it is possible" to nudge an asteroid into temporary Earth orbit, and they provide a list of near-Earth asteroids that just might serve. One possibility is the 10-meter-wide (33-foot-wide) asteroid 2008 EA9, which could be placed in an orbit about twice as far away as the moon for study or for mining over the course of a few years. "Interesting idea," Technology Review's arXiv blog notes. "What could possibly go wrong?"
Surprise: the standard row-block method of airplane boarding isn't the most efficient way to get passengers to their seats.
In fact, it's the worst, according to a new experiment by an astrophysicist — yes, an astrophysicist, who works with Batavia-based Fermilab.
Jason Steffen tested different methods of boarding after he, like many travelers, became stalled by passengers stowing oversized carry-on bags into overhead bins.
Among the boarding types he tested: back to front, window seats then middle and aisle seats, and the oft-used block-row boarding.
Steffen hired 72 people to try out five methods on a replica airplane on a Hollywood soundstage while he timed them.
Based on the trials, he found that boarding by alternating rows was the most efficient, followed by boarding window seats first and letting passengers board randomly. All of them are faster than block rows, Steffen concluded.
Alternating rows gives passengers enough room to squeeze their luggage into bins while others find their seats.
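Steffen's result can be reproduced qualitatively with a toy simulation. The model below is my own simplified sketch, not Steffen's actual setup: passengers advance one aisle cell per time step, pause a fixed number of steps at their row to stow a bag, and seat-shuffling within a row is ignored.

```python
import random

def simulate(order, rows, stow_time=3):
    """Toy boarding model: one aisle, passengers walk one cell per step,
    pause stow_time steps at their target row, then sit (leave the aisle)."""
    aisle = [None] * rows   # aisle cell -> passenger index, or None
    target, stow = {}, {}   # per-passenger target row and stow steps left
    seated, entered, t = 0, 0, 0
    while seated < len(order):
        t += 1
        # Move the aisle queue, front-most passengers first.
        for cell in range(rows - 1, -1, -1):
            p = aisle[cell]
            if p is None:
                continue
            if cell == target[p]:
                stow[p] -= 1          # stowing luggage at own row
                if stow[p] == 0:
                    aisle[cell] = None
                    seated += 1
            elif cell + 1 < rows and aisle[cell + 1] is None:
                aisle[cell], aisle[cell + 1] = None, p
        # Next passenger steps aboard if the first aisle cell is free.
        if entered < len(order) and aisle[0] is None:
            target[entered] = order[entered]
            stow[entered] = stow_time
            aisle[0] = entered
            entered += 1
    return t

ROWS, PER_ROW = 12, 6

# Back-to-front blocks: everyone in the last row, then the next row, etc.
back_to_front = [r for r in range(ROWS - 1, -1, -1) for _ in range(PER_ROW)]

# Alternating rows (Steffen-like): waves of passengers bound for every
# other row, back to front, so spaced-out passengers stow in parallel.
alternating = [r for wave in range(2 * PER_ROW)
               for r in range(ROWS - 1 - wave % 2, -1, -2)]

random_order = back_to_front[:]
random.seed(42)
random.shuffle(random_order)

for name, order in [("back-to-front", back_to_front),
                    ("random", random_order),
                    ("alternating rows", alternating)]:
    print(f"{name:>16}: {simulate(order, ROWS)} steps")
```

Even in this crude model, the alternating-row order finishes well ahead of back-to-front blocks, because passengers spread along the aisle can stow their bags simultaneously instead of waiting in a single queue.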
The world's first carbon nanotube reinforced polyurethane wind blades were installed on a 400W 12V wind turbine generator. The technology could be scaled up, enabling the industry to build blades as long as 250 meters, according to researchers.
By John Roach, Contributing Writer, NBC News
Bigger is better … when it's also lighter and stronger, goes the thinking of engineers and materials scientists designing the next generation of blades to wring energy from the wind.
Bigger blades can get more energy from the wind, but this advantage is lost if the blade is also heavier, since more wind is needed to turn the rotor. In addition, the more the blade flexes in the wind, the more potential energy is lost.
That means the blades need to be lighter and stiffer as well as bigger.
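The underlying tension is the square-cube law. A toy illustration with made-up scale factors (not numbers from the article): swept area, and hence energy capture, grows with the square of blade length, while the mass of a geometrically scaled blade grows with the cube.

```python
# Relative power (~ length^2, swept area) vs. relative mass (~ length^3)
# for a blade scaled up without changing shape or material.
scaling = {s: {"power": s**2, "mass": s**3} for s in (1.0, 1.5, 2.0)}
for s, v in scaling.items():
    print(f"length x{s:.1f}: power x{v['power']:.2f}, mass x{v['mass']:.2f}")
```

Doubling the length quadruples the energy capture but multiplies the mass by eight, which is why lighter, stiffer materials matter more as blades grow.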
To prove a longer, lighter and stronger blade could be made, a post-doctoral researcher has created the world's first polyurethane blade reinforced with carbon nanotubes. It's only 29 inches long, but preliminary tests show it is lighter, more rigid and stronger than conventional epoxy blades.
"The main idea was to make something that I could show to people outside of the project to understand what we are doing here and why," Marcio Loos, the researcher in the macromolecular science and engineering group at Case Western Reserve University in Cleveland, Ohio, told me today.
"They are going to a larger scale already," Loos said of project partner Molded Fiber Glass Co. in Ashtabula, Ohio, which is infusing sections containing 50 layers of glass fiber with the new materials. "They are testing to see if it works and so far everything is going fine."
Fatigue tests show the reinforced polyurethane composite lasts about eight times longer than epoxy reinforced with fiberglass. The new material was also about eight times tougher in delamination fracture tests, according to the researchers.
A wind industry goal by 2020, Loos said, is to create blades that are 250 meters long. They currently top out at about 150 meters. "Hopefully, this new material can help reach that goal," he said.
Longer blades that wring more energy from the wind, however, are likely to run up against the same obstacles that routinely hobble the industry, including a lack of transmission capacity to bring the power to the people, on top of concerns that turbines harm wildlife and are an eyesore.
Just getting permission to install the prototype blade on a building at Case Western Reserve University proved a major hassle, Loos noted. They did, however, manage to get the prototype up, which can be seen in the video below.
World's First Carbon Nanotube Reinforced Polyurethane Wind Blades
A supercomputer simulation produces a virtual spiral galaxy that comes close to matching the look of our own Milky Way. Msnbc.com's Alan Boyle reports.
By Alan Boyle, Science Editor, NBC News
How long does it take to simulate the Milky Way? The answer is about nine months, if you're using a powerful supercomputer. That's how long it took for researchers at the University of California at Santa Cruz and the Institute for Theoretical Physics in Zurich to produce the first simulation of galaxy formation that approximates the look of our own Milky Way spiral.
"Previous efforts to form a massive disk galaxy like the Milky Way had failed, because the simulated galaxies ended up with huge central bulges compared to the size of the disk," Javiera Guedes said today in a news release about the project.
Guedes worked on the project during her time at UC-Santa Cruz, and is now a postdoctoral researcher at the Swiss Federal Institute of Technology. She's the first author of a paper accepted for publication in the Astrophysical Journal that describes the simulation, known as the Eris galaxy.
For 20 years, astronomers tried to come up with a simulated galaxy that approximates the look of the Milky Way and other spiral galaxies — but they fell short of the mark. Guedes and her colleagues were more successful in part because of the computer firepower at their disposal: 1.4 million processor-hours on NASA's Pleiades supercomputer, plus additional supporting simulations at UC-Santa Cruz and the Swiss National Supercomputing Center.
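The nine months of wall-clock time and the 1.4 million processor-hours are mutually consistent, as a rough back-of-envelope shows (the 30-day months and continuous running are my assumptions, not figures from the researchers):

```python
# How many cores does it take to burn 1.4 million CPU-hours in about
# nine months of wall-clock time?
cpu_hours = 1.4e6
wall_hours = 9 * 30 * 24        # ~nine months, assuming continuous running
cores = cpu_hours / wall_hours
print(f"about {cores:.0f} processors running continuously")
```

That works out to a couple of hundred processors kept busy around the clock, a plausible slice of a shared NASA supercomputer.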
"We took some risk spending a huge amount of supercomputer time to simulate a single galaxy with extra-high resolution," said UC-Santa Cruz astronomer Piero Madau, one of the paper's co-authors.
The effort used a software platform known as Gasoline to trace the motions of more than 60 million particles, representing galactic gas as well as dark matter, over the course of more than 13 billion years.
Annotated animation of Eris galaxy from University of Zurich.
Madau said developing a realistic simulation of star formation was another key to Eris' success.
"Star formation in real galaxies occurs in a clustered fashion, and to reproduce that out of a cosmological simulation is hard," he said. "This is the first simulation that is able to resolve the high-density clouds of gas where star formation occurs, and the result is a Milky Way type of galaxy with a small bulge and a big disk."
The recipe for the Eris galaxy limited star formation to the high-density regions of the galactic disk, which resulted in a more realistic distribution of stars. Within the high-density regions, supernova explosions powered an outflow of gas from the inner part of the galaxy, keeping the central bulge from getting too big.
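The recipe described above amounts to a density-threshold rule: only gas in the densest clumps is allowed to turn into stars. As a rough illustration — with made-up threshold and efficiency values, not the actual Eris parameters — the rule might look like this:

```python
import random

# Toy sketch of a density-threshold star formation rule: gas particles
# may only become stars in high-density regions, and even then only
# with some probability per timestep. Values are illustrative.
DENSITY_THRESHOLD = 5.0   # hydrogen atoms per cm^3 (assumed)
EFFICIENCY = 0.1          # conversion probability per timestep (assumed)

def form_stars(gas_particles, rng=random.random):
    """Split particles into (still-gas, new-star) lists for one timestep."""
    gas, stars = [], []
    for p in gas_particles:
        # Only gas above the density threshold is eligible for conversion.
        if p["density"] > DENSITY_THRESHOLD and rng() < EFFICIENCY:
            stars.append(p)
        else:
            gas.append(p)
    return gas, stars

# Deterministic rng for demonstration: the dense particle forms a star,
# the diffuse one stays gas.
gas, stars = form_stars([{"density": 0.2}, {"density": 8.0}], rng=lambda: 0.0)
```

Raising the threshold confines star formation to the dense disk clouds, which is what gave Eris its realistic small bulge and big disk.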
The point of the exercise wasn't merely to come up with a pretty animation. The virtual conditions for Eris' creation are consistent with the theory that galaxy-scale structures coalesced within a cosmic web dominated by cold dark matter. Gravity drew primordial clumps of dark matter together into bigger clumps, and the "ordinary" matter that makes up stars and galaxies fell into those dark-matter clumps — giving rise to visible galaxies embedded in halos of invisible dark matter.
Cosmologists contend that the universe consists of 4.6 percent ordinary matter, 23.3 percent dark matter and 72.1 percent dark energy. But the fact that astronomers found it difficult to produce galaxies like the Milky Way using that formula led some to question the prevailing cosmological model of the universe. The Eris galaxy simulation "shows that the cold dark matter scenario, where dark matter provides the scaffolding for galaxy formation, is able to generate realistic disk-dominated galaxies," Madau said.
The research team's effort may be a tour de force for supercomputing, but don't confuse the virtual Eris with the real-life Milky Way. Even though Eris is an incredibly high-resolution simulation, its 60 million particles of gas and dark matter pale in comparison with the Milky Way's hundreds of billions of stars.
Swiss researcher Lucio Mayer discusses the galaxy formation simulation with interviewer Michele De Lorenzi.
Extra credit: Eris is named after the Greek goddess of discord, in recognition of the decades of discordant debate that have surrounded the scenarios for forming spiral galaxies, according to a description of the project on the HPC-CH weblog. Guedes' website includes a quote from the Iliad: "The soldiers fought like wolves while Eris, the Lady of Sorrow, watched with pleasure." The simulated galaxy isn't the first astronomy-related object to bear that discordant name: Eris is also the name given to the dwarf planet that caused so much trouble for Pluto.
In addition to Madau and Guedes, co-authors of the paper include Simone Callegari and Lucio Mayer of the Institute for Theoretical Physics in Zurich. This research was funded by NASA, the U.S. National Science Foundation, the Swiss National Science Foundation and an ARCS Foundation fellowship to Guedes.
Hurricane Irene wasn't as bad as predicted, and now some are asking whether the storm was over-hyped. NBC's Peter Alexander takes a closer look and The Weather Channel's Jim Cantore and Bryan Norcross share their insight.
By Alan Boyle, Science Editor, NBC News
Did forecasters, policymakers and media types overhype Hurricane Irene? It's not just a meteorological question: The debate over whether the outlook for damage was overhyped, or hyped just right, touches upon issues of risk perception and even the climate change debate. Like most natural disasters, Irene's deadly sweep over the U.S. East Coast has left behind some important lessons for researchers as well as regular folks.
Here are some of the lessons that Monday-morning commentators are chewing over:
What was right and wrong about storm prediction? The computer models, and the meteorologists who wielded them, put in a "gold medal" performance when it came to predicting Irene's track — but there was much more uncertainty about the intensity of the storm. That's typical for tropical storms, said Frank Marks, director of the Atlantic Oceanographic and Meteorological Laboratory's Hurricane Research Division. "Irene really exemplified the issues that we've been trying to tackle," he told me.
Hurricanes typically follow a pattern in which an outer ring of storms will tighten up to replace an inner ring surrounding the hurricane's eye, intensifying the storm system in the process. In Irene's case, that pattern (known as eyewall replacement) was interrupted, and the storm didn't gather as much strength as most of the models suggested. "Some of the models did represent it well," Marks said, but there wasn't enough confidence in those models to change the storm forecast.
Researchers have been working to reduce the error rate for hurricane track and intensity forecasts through the Hurricane Forecast Improvement Project, with the goal of a 50 percent reduction from 2008 levels by 2018. The University of Washington's Cliff Mass, an expert on weather modeling, said Irene showed that much more progress still has to be made on predicting a storm's intensity.
"The classic is good forecast for track, bad forecast for intensity," he told me. "Let's face it: This happens all the time. ... To get the intensity right, you have to be able to predict the inner workings of the storm, and that's what we don't do well yet."
But Mass said "we didn't even need the models" to know that Irene would become less intense as it moved up the coast, through the increasingly cool waters of the Atlantic. In fact, Mass contends in a blog post today that "there is really no reliable evidence of hurricane-force winds at any time the storm was approaching North Carolina or moving up the East Coast."
He argued that the National Weather Service should have downgraded the storm much more quickly than it did. "There's a tendency to be conservative," he told me. "We have to learn to be more nimble."
This GOES-13 satellite movie shows Hurricane Irene lashing the Mid-Atlantic region between Aug. 26 and Aug. 28. Credit: NASA/NOAA GOES Project, Dennis Chesters
Did forecasters overhype the storm? In his blog posting, Mass addresses the hype surrounding Irene: "Considering the tendency for media to hype storms, it is crucial for meteorologists to stick to the exact story and not overwarn in the hope of encouraging people to take effective action. If the storm was known not to be a hurricane earlier, might the mayor of NY have held off closing the city down, thus saving billions of dollars?"
Marks said that the storm was assessed based on readings taken from above as well as on the surface, and that the National Oceanic and Atmospheric Administration followed the standard procedures for those assessments. But he acknowledged that Irene was a tough storm to classify, in part because of its breadth. "From Cape Cod all the way inland to Pennsylvania — just think about the energy," he said. "It's really the energy of the storm, it's not the peak wind."
He said spin control isn't part of NOAA's mission. "We provide as much information as we can, based on what we know," Marks said. "What the public and decision makers do with that information is something that's out of our purview."
Marks acknowledged that some of the reports made the storm sound scarier than it really was. "If you looked at those scenarios that the media was getting ... the disaster scenario was extreme. That was for a major hurricane coming straight at them, not a weakening storm coming up the coast," he said.
How much was lost in translation? So was this a case of journalists and policymakers making too much of the storm? Maybe so, said David Ropeik, a consultant on risk perception, Big Think blogger and author of the book "How Risky Is It, Really?" But maybe that's not so bad.
"Yes, the information the media presented was wrapped up in breathless alarmism," Ropeik, a former msnbc.com contributor, told me. "But we forget two things: First, surveys show that the public knows that about the media. And second, under all the alarmism was really important information that helped people stay safe: storm track timing, tips for preparedness, evacuation routes. It was alarmist in voice, but an informative tool. And that probably helped more than it hurt. ... There was no panic, there was no hysteria."
Ropeik said government officials also did the right thing: "In my opinion, they were overly precautionary, but most people want them to do that. One can only measure the accuracy of their precaution in hindsight, and you don't want to err on the wrong side. ... The evacuation, the closing of the subways, you don't want to make a mistake on that in the wrong direction."
There were political considerations, to be sure. Just ask New York City Mayor Michael Bloomberg, who faced harsh criticism over the lack of preparedness for last winter's snowstorms — or former President George W. Bush, who was similarly criticized in the wake of Hurricane Katrina almost exactly six years ago.
But beyond the politics, the storm's toll — more than 30 dead, plus an estimated $7 billion in property damage — clearly demonstrates that Irene was more than just hype.
"I daresay the people who are saying there was overreaction are not those who are still without power, or who suffered property losses, or who lost loved ones," Ropeik said. "Risk is a matter of perception. It depends on who you ask."
Some commentators worry that hyping hurricanes will lead folks to disregard future warnings as a case of "crying wolf," but Ropeik said the public response to the warnings about Irene "puts the lie to that."
"Other storms have been hyped, and have not panned out, and yet people still took reasonable precautions this time," he said. "The 'cry-wolf' thing didn't happen."
Do more big storms lie ahead? The concerns about Irene's effects could hint at the shape of climate debates to come.
Research published last year in the journal Nature Geoscience suggested that global warming was likely to produce fewer but stronger tropical storms. This year, a study in the journal Science came to a similar conclusion.
Such projections have sparked strong debate, as most claims about climate effects have done. It's impossible to link any single event, such as Irene or Katrina, to long-term climate trends. But in a posting to his Desmog Blog, science writer Chris Mooney argues that Hurricane Irene should get people thinking about what lies ahead:
"... Irene focuses our attention on our serious vulnerability, and we need to seize that moment — because too often our default position is to act like nothing bad is going to happen.
"There are several places in the United States, besides New Orleans, where a strong hurricane landfall could be absolutely devastating. These include the Florida Keys, the Miami-Fort Lauderdale area, Tampa Bay/St. Petersburg and Houston/Galveston. But they also include some East Coast locations, and chief among these is New York/Long Island. ...
"So what are our major coastal cities doing to protect themselves? That's the question we should all be asking right now."
What questions are you asking? Share them as a comment below, and we'll see if we can get a discussion going.
Update for 5:30 p.m. ET: One of the first Irene-related research projects to come to light focuses on whether big storms could actually counteract the effects of greenhouse-gas emissions.
Scientists at the Stroud Water Research Center and the University of Delaware are sampling the storm runoff at sites along creeks in Delaware to measure how much carbon is being transported. In a news release, the National Science Foundation says the project could reveal how much of a role soil erosion plays in sequestering carbon to prevent it from re-entering the global carbon cycle.
"The bigger the storm, the greater the disproportionate load, so you might have a single 100-year storm event move 25 percent of the material for an entire decade," said Anthony Aufdemkampe, a scientist at the Stroud Water Research Center. "This is important, because fresh waters and the carbon they transport play a major role in the global cycling of greenhouse gases."
In this file photo, crowds gather for the inauguration of President Barack Obama Tuesday, Jan. 20, 2009. New computer software is able to tap into the wisdom of crowds to get tasks done.
By John Roach, Contributing Writer, NBC News
Computers may eventually outsmart human intelligence, but for now they're just finally getting smart enough to ask humans for help.
That's the basic idea behind MobileWorks, a startup that is weaving crowdsourcing capability into computer software. Crowdsourcing is the concept of putting out a question to your social network to help solve a problem.
In MobileWorks' case, software sends tasks to a hand-picked crowd — mostly workers recruited from the developing world, including the slums of India and Pakistan. Many work with a mobile phone. The company says these workers are getting high-tech experience and a "fair wage."
"Much of the criticism that has been leveled at online digital work is that it becomes kind of sweatshop labor," Anand Kulkarni, a cofounder and CEO of MobileWorks, told me today. "Our goal was to start with a livable wage and work forward to construct an effective crowdsourcing system."
And what's that wage? Workers in India on a mobile phone earn about U.S. $0.50 per hour; those with a laptop computer make $1.50.
"These are workers who are earning about $2 per day before joining our systems, so, in a way, what we are paying is enough to make a strong positive impact on their lives," Kulkarni said.
Tasks these workers accomplish include transcribing audio recordings, digitizing handwritten notes and scouring the Internet for contact information of potential job recruits. Many take just a minute or two to complete, which is part of the plan.
The cost to the user of the system is on the order of pennies per task.
To maintain client confidentiality, each task is broken up into tiny bits and distributed to the workforce. When the bits of work are completed, the software stitches them back together and delivers the completed task to the user.
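The split-and-stitch idea can be sketched in a few lines: cut a job into small, indexed pieces, farm each piece out, then reassemble the results in order. The function names here are hypothetical illustrations, not the MobileWorks API:

```python
def split_task(text, chunk_size=20):
    """Break a job into small, indexed micro-tasks, so no single
    worker ever sees the whole document."""
    return [(i, text[i:i + chunk_size])
            for i in range(0, len(text), chunk_size)]

def stitch(results):
    """Reassemble worker results using their original indices,
    regardless of the order in which they come back."""
    return "".join(chunk for _, chunk in sorted(results))

chunks = split_task("handwritten note to be transcribed by the crowd")
# In the real system each chunk would go to a different worker;
# here we simply echo each chunk back as its own "result."
results = [(i, chunk) for i, chunk in chunks]
assert stitch(results) == "handwritten note to be transcribed by the crowd"
```

Because each worker sees only a small, context-free fragment, no individual can reconstruct the client's document — which is the confidentiality point the company emphasizes.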
The concept is similar to Amazon Mechanical Turk, where tasks are solved by a crowd of anonymous workers, though MobileWorks says their hand-picked crowd is faster and more accurate.
And since the workers are handpicked, MobileWorks can rouse them with a quick text message, making sure workers are at the ready when there is work to be done.
"The ability to spin up workers when you need them is very powerful," Michael Bernstein, a crowdsourcing researcher at MIT's Computer Science and Artificial Intelligence Laboratory who has developed an application to tap into the Mechanical Turk service, told Technology Review.
"On Mechanical Turk your tasks can just stall because not enough people chose to work on them."
Tulane has applied for a patent for a method to produce the biofuel butanol from organic material, a process developed by associate professor David Mullin, right, postdoctoral fellow Harshad Velankar, center, and undergraduate student Hailee Rask.
By John Roach, Contributing Writer, NBC News
The Internet is delivering a slow death to newspapers, but many of us still have piles of the stuff around the house — and a microbe called TU-103 can convert it to butanol, a biofuel that is nearly as energy-dense as unleaded gasoline.
"This is a bacterium that we isolated straight out of nature," David Mullin, a cell and molecular biologist at Tulane University in New Orleans, told me today.
In fact, it was isolated from a truckload of feces he and colleagues collected from grass-eating animals at the Audubon Zoo in New Orleans, figuring their intestinal tracts likely harbored a naturally occurring microbe that had the qualities they sought.
They were looking for a microbe that produces butanol from cellulose — a woody, fibrous material in plants — rather than more expensive sugars and starches, as well as one that does this in the presence of oxygen.
All other known naturally-occurring butanol-producing microbes do so in the absence of oxygen — that is, oxygen kills them, which makes growing them, transporting them, and working with them rather tedious, Mullin noted.
"Among the many butanol-producing strains that we managed to isolate, we found one that can grow in the presence of air and expresses a cellulase (an enzyme) that can use cellulose to make butanol," he said.
The microbe is a strain of Clostridium called TU-103. To prove it can convert cellulose to butanol, they wet, shredded and blended a bunch of newspaper (which is essentially cellulose) into a soupy stew and then added in the bacteria. "They started making butanol from newspaper," Mullin said.
"This stuff has been worked out by natural selection, by nature, not the human mind," he said. "[We are] starting with an organism that already does something really efficiently; we don't have to pull in genes from other bacteria and figure out how to get them expressed."
However, TU-103 could see some genetic enhancements down the line. Mullin's team is sequencing the genome and identifying the genes responsible for producing butanol from cellulose in the presence of oxygen. The team could then increase the activity of those genes to make the microbe more efficient.
Green fuel
Research teams are racing to scale up cost-effective production of butanol, which is widely considered a superior alternative to other biofuels, such as ethanol.
"If you drain the gasoline out of your gas tank and replace it with butanol, you can start your engine," Mullin said. "If you add ethanol to your fuel tank, no matter how many times you turn the key, it would never turn over; it doesn't have enough energy to run your engine."
That's why ethanol is blended with gasoline, usually up to about 10 percent, and not pumped straight into the gas tank, he added. What's more, most ethanol today is produced from corn and other food crops, which is a morally sticky proposition in a world that routinely struggles with food shortages.
Mullin said he's shown that TU-103 converts seedy cotton that would otherwise go to waste into butanol, and will soon try it on bagasse, the fibrous material left over from a sugarcane processing plant in Louisiana.
Newspapers, he noted, were a proof of concept — "something people can easily get their hands on and that they can easily wrap their minds around."
In this file photo, a remote-controlled robot called "Packbot", which has capabilities including manoeuvring through buildings, taking images, and measuring radiation levels, is pictured by another "packbot" in Tokyo Electric Power Co.'s crippled Fukushima Daiichi Nuclear Power Plant No.2 reactor building in Fukushima.
By John Roach, Contributing Writer, NBC News
Deleted blog posts by a lead robot controller at the crippled Fukushima Dai-ichi nuclear power plant have been resurfaced by a professional technology association.
The posts, by a worker known only by the initials S.H., are a window on the work environment at the plant and raise questions about whether Tokyo Electric Power, the plant's owner, is providing its workers with enough robots and resources to do their work efficiently.
The robot diaries, as the posts are known, also provide candid details on what it's like to be working with disaster response robots. They were considered "must read material for companies and researchers developing robots for emergency situations," notes IEEE Spectrum, which resurfaced the diaries.
For example, the posts show the difficulty of controlling the robots while wearing five pairs of gloves and bulky goggles, which means the controls and interface need to be made easier to operate than they already are, according to IEEE Spectrum.
As S.H.'s blog attracted more and more attention, however, the posts related to the robots were taken down in early July, IEEE noted, and not much later the entire blog was gone.
"It's unclear whether TEPCO or S.H.'s supervisors demanded that the material be removed. Efforts to reach S.H. were unsuccessful," writes Erico Guizzo, the author of the IEEE report.
Guizzo, however, copied the blog before the posts disappeared, along with a series of YouTube videos showing training exercises with the robots at the plant. He translated the posts into English and published them on IEEE Spectrum's website, along with snippets from the training videos.
Training of robot operators at Japan's Fukushima Dai-ichi nuclear power plant.
"The material offers important lessons about the Fukushima disaster — lessons that roboticists and others should heed if we want to be better prepared for tomorrow’s calamities. TEPCO has also been criticized for not being transparent, and these posts provide more information for Japanese citizens to decide whether the company and their government are doing a proper job," he writes.
Portions of the posts are also available via Google cache and from another Japanese researcher here.
To read the diaries and see pictures, check out IEEE's reporting. You can also check out the video below, in which our Wilson Rothman discusses the deployment of robots to Japan with iRobot vice president Tim Trainer.
Researchers are developing prototype shoes that harvest energy as their wearers walk around. The energy captured is enough to power portable gadgets such as cell phones.
By John Roach, Contributing Writer, NBC News
Going on a power walk could soon do more than blow off steam; it could recharge your cell phone and other portable electronics, according to engineers working on a new way to harvest the mechanical energy in the human gait.
The concept is called reverse electrowetting. It uses a micro-fluidic device consisting of thousands of micro-droplets that move past a novel nanotechnology-based thin film. This motion of the droplets is converted into an electrical current.
"The normal way of using the harvester would be to couple it with a tiny, rechargeable battery not unlike the ones which we have in cell phones," Tom Krupenkin, an engineering researcher at the University of Wisconsin-Madison, explained to me on Tuesday.
"That thing would accumulate the energy you generate while walking and would provide you this energy whenever you need it."
He and colleague Ashley Taylor reveal the details of how this works this week in the journal Nature Communications, and they are now working to commercialize the technology in electricity-harvesting shoes through their company Instep Nanopower.
Power walkers
The market for the technology, according to Krupenkin, is huge. Potential customers range from military personnel who now carry 20 pounds of batteries in the field to keep their gadgets running to people in developing countries who have inadequate access to electrical grids for recharging their cell phones.
These users would likely just plug their devices into a tiny USB port in their shoes, happy enough to have a charge that they are not too bothered by wires snaking around their bodies. In countries like the U.S., though, wires coming out of footwear could be a fashion faux pas that's a step too far.
"I wouldn't like that idea myself," Krupenkin noted.
To get around the problem, he and Taylor envision equipping their footwear with tiny mobile hotspots that are powered by the electricity-generating shoes. The hotspot doesn’t charge your phone, but rather allows you to keep your phone in a low power state using technology such as Bluetooth.
The lion's share of cell phone battery drain, Krupenkin noted, comes from transmitting and receiving data over long-range RF communications. The mobile hotspot takes over this task, allowing the phone to use the power-sipping technology instead.
Relieved of having to communicate over long distances, "the battery of that device would last much, much longer," Krupenkin said. "Like a cell phone battery that lasts literally for a month instead of a day or two."
Technology improvement?
The idea of harvesting energy from moving bodies isn't new, though most attempts, such as vibrating plates or piezoelectric materials, produce only a few milliwatts of power. A person wearing the electricity-harvesting shoes, the researchers say, could generate up to 20 watts of electrical power.
Other ideas, such as a backpack that generates electricity as it bounces up and down while a person walks, produce several watts, but "you have to carry a very heavy backpack," Krupenkin said.
"Our technology is you have to carry your own weight anyway and your own weight is very substantial … but unlike the backpack, it is not extra weight, it is the weight that you have."
In concept, he noted, the shoes would not be noticeably different from what people wear around today. Of course, he added, taking a concept from the lab to the street is a bigger challenge than just inventing a new technology and proving that it works.
"Footwear is a very complex blend of art, engineering, biology, and things like that," Krupenkin said. "Creating good footwear is tricky and requires special knowledge and experience."
The researchers are hoping to pair their power harvesting technology with a footwear company that has this expertise. That means people itching to get their hands on these shoes will have to wait at least a couple of years.
People stand in a square outside the courthouse after an earthquake was felt in New York Tuesday, causing buildings to be evacuated. The Pentagon, the U.S. Capitol and Union Station in the nation's capital were all evacuated after the 5.9-magnitude quake, which was shallow with its epicenter only 0.6 miles (one kilometer) underground.
By John Roach, Contributing Writer, NBC News
A magnitude-5.8 earthquake in Virginia on Tuesday afternoon was felt across the U.S. East Coast, shaking offices and nerves from Washington, D.C., to New York City and as far south as Chapel Hill, North Carolina. There were even reports of shaking as far west as Columbus, Ohio, and out on Martha's Vineyard.
That's a whole lot of shaking for what amounts to a medium-sized quake. The reason for its reach, according to the U.S. Geological Survey, is the geology of the East Coast.
"The reason an earthquake in the high 5s is felt so far away is that it occurred in an area … where the bedrock is solid, it's not really fractured or broken up by faults the way it would be, say, in California," Peter Powers, a geophysicist with the survey, told me today.
The seismic energy in areas with stable bedrock — "stable continental craton" in geophysics speak — can travel much farther than it can when broken up by young faults.
"Seismic energy attenuates very slowly on the East Coast," Powers said. "On the West Coast it attenuates much more rapidly because the bedrock is fractured and faulted and much more variable in its composition than on the East Coast."
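The contrast Powers describes is often captured in textbooks with an anelastic attenuation law — offered here as a standard sketch, not as the USGS's own model:

```latex
A(r) \;\approx\; A_0 \, r^{-\gamma} \, e^{-\pi f r / (Q v)}
```

Here $A_0$ is the initial wave amplitude, $r^{-\gamma}$ accounts for geometric spreading with distance $r$, $f$ is the wave frequency, $v$ is the wave speed, and $Q$ is the rock's "quality factor." High-$Q$ rock — like the old, unfractured bedrock of the East Coast — dissipates little energy per wave cycle, so the exponential term decays slowly and shaking carries for hundreds of miles; the fractured, low-$Q$ crust of the West Coast damps the same wave much more quickly.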
Seismologist Dr. Lucy Jones joins Brian Williams from the United States Geological Survey headquarters in Pasadena, Calif.
The survey notes that earthquakes in the central and eastern U.S. are typically felt over a much broader region than those in the western U.S. — sometimes an area as much as ten times larger than that of a similar-magnitude earthquake on the West Coast.
"A magnitude 5.5 eastern U.S. earthquake usually can be felt as far as 300 miles from where it occurred, and sometimes causes damage as far away as 25 miles," the agency notes on its website.
Early reports indicated the depth of the earthquake was also quite shallow — just 3.7 miles down in the Earth's crust. Powers said this figure would likely be revised deeper as the data is further analyzed, but depth here isn't much of a factor in the shaking.
Today's East Coast earthquake, Powers said, was large enough to be felt over a large area whether its depth is ultimately determined to be 3 miles or 15 miles. "At that point, the depth really isn't much of a factor."
Depth can be a factor
In other regions, however, the depth of an earthquake can make a difference.
Generally, the closer an earthquake's rupture is to the surface, the stronger the shaking and the more damage it causes, regardless of its size.
Conversely, when earthquakes rupture deeper — and they can occur as much as 500 miles down — more of their energy is lost as the seismic waves travel to the surface.
That's one reason why some relatively strong earthquakes, originating deep in the crust, cause little damage on the surface whereas some seemingly small earthquakes can cause massive damage.
The magnitude-7.0 earthquake in Haiti in January 2010 was 8.1 miles deep. That relatively shallow depth, combined with subpar construction, caused massive damage there.
A pair of minor earthquakes in Spain this May ruptured just over half a mile below the ground, causing several deaths and damaging buildings in a part of the world with a tame seismic history.
The magnitude-9.0 earthquake that hit Japan on March 11 began at a depth of 19.9 miles. Though devastating, it could have been worse had it been even shallower — and had Japan's infrastructure not been among the most earthquake-ready in the world.
NBC's Tom Costello reports from the earthquake epicenter in Mineral, Va., where the roof of the town hall collapsed in the quake.
A 5.8-magnitude earthquake in central Virginia was felt across much of the East Coast on Tuesday, causing light damage and forcing thousands of people to evacuate buildings in New York, Washington and other cities. NBC's Lester Holt reports.
Sugar-filled galls on Southern beech trees in the Patagonia region of South America are attractive to a species of yeast, Saccharomyces eubayanus, that somehow made it to Europe where it fused with S. cerevisiae to form lager yeast, the microbe responsible for fermenting ice cold lager beer.
By John Roach, Contributing Writer, NBC News
Ice cold beer: In these dog days of summer, few things are better. So, let's raise a glass and toast Saccharomyces eubayanus, a newly discovered yeast that helped make cold-fermented lager a runaway success.
The yeast, in the wild, thrives in ball-shaped lumps of sugar that form on beech trees in the Patagonia region of South America. Its discovery appears to solve the mystery of how lager yeast formed. Until now, scientists knew only about the origins of ale yeast, which makes up just half of the lager yeast genome.
Yeasts are microscopic fungi that feast on sugar, converting it to carbon dioxide and alcohol via the process of fermentation. Ale yeast, S. cerevisiae, has been doing this throughout the history of beer, which stretches back to at least 6,000 B.C. in Mesopotamia, the cradle of civilization.
But ale yeast does its magic at relatively warm temperatures. In the 15th century, Bavarians started cold-fermenting beer in caves, a process known as lagering.
"The ale strains were probably poorly adapted to growing in that environment and that opened up an opportunity when S. eubayanus came on the scene," Chris Todd Hittinger, a professor of genetics at the University of Wisconsin-Madison, explained to me.
This graphic shows the journey S. eubayanus likely took as trade between South America and Europe ramped up in the 1500s.
Once in Europe, the Patagonia yeast fused with S. cerevisiae. "By forming a hybrid, you get an immediate temperature shift in its preference and that would have provided an immediate advantage to the process the Bavarians were using to make beer," Hittinger said.
He and his colleagues used genetic sequencing techniques in a search across five continents for this wild yeast species. Their quest was to complete the puzzle, in part so that genetic engineers and brewers can create ever more efficient strains of lager yeast and, potentially, a better beer.
That, in turn, could be a boost in the highly-competitive brewing industry. Lager, according to the researchers, is a $250 billion a year business.
Credit for the discovery of this wild yeast goes to Diego Libkind of the Institute for Biodiversity and Environment Research in Bariloche, Argentina, who was interested in the ball-shaped lumps of sugar that form on beech trees there.
The lumps, called galls, are an immune response to invasion by a fungus. They are abundant in the spring, and indigenous people such as the Mapuche traditionally use them as a slightly sweet topping for salads, Libkind explained to me in an email.
In addition, "the Aborigines used to let the galls spontaneously ferment in water and made an alcoholic beverage called by the generic name chicha," he said. "Most probably, S. eubayanus was in charge of that fermentation."
Today, Libkind and his colleagues are working with a microbrewery in Argentina to diversify their products with S. eubayanus and "artificially forced S. eubayanus/S. cerevisiae hybrids," he said.
Details on these products are under wraps, he said, though because brewers' yeast underwent thousands of years of domestication, he doubts the wild yeast would make for a good beer.
But the idea of brewing a beer with S. eubayanus might be worth pursuing, Hittinger said. "It would unlikely be a particularly great beer straight out of the beech forest, but I suspect it would be passable."
Update, Aug. 23, 2:00 pm PT:
Patrick McGovern, a world expert on the history of alcoholic beverages and director of the biomolecular archaeology lab at the University of Pennsylvania Museum, sent me an email today with his thoughts on the yeast discovery.
Assuming the genetics work is correct, he said he is "troubled by how this newly discovered wild yeast strain made it into Bavaria in the 1500s."
For one, he noted, Germans, and especially Bavarians, were not involved in the European exploration of Patagonia at the time. So, if the yeast somehow hitched a ride back to Europe via trade with the English, Spanish, and Portuguese, how did it get to Bavaria?
"Perhaps, some Patagonian beech was used to make a wine barrel that was then transported to Bavaria and subsequently inoculated a batch of beer there?" he asked. "Seems unlikely."
He said a more likely scenario is that galls in the oak forests of southern Germany also harbored S. eubayanus, at least until it was outcompeted by the more ubiquitous S. cerevisiae.
"If true, then the use of European oak in making beer barrels and especially processing vats, which could harbor the yeast, might better explain the Bavarian 'discovery' of lager in the 1500s," he said.
Nevertheless, he added, history and archaeology are full of surprises.
"Nowhere is this more true than of the seemingly miraculous process of fermentation and the key role of alcohol in human culture and life itself on this planet," he said.
"This article has begun to unravel the complicated heritage and life history of the fermentation yeasts, and will hopefully stimulate more research to see whether the Patagonian hypothesis proves correct."
The right book can open up a whole new world of scientific information.
By Alan Boyle, Science Editor, NBC News
You don't have to turn your brain off for vacation: A summer break is the perfect time to open a book and let your imagination fly through exotic scientific realms.
The best science books for summer reading are those that take you to another place or time — locales that offer adventure and fun while providing insights into how our cosmos works. And because we're talking about vacation reading, they shouldn't be too weighty or worrisome. There's a time and a place for books like "Superbug," or "Annoying," or "Quantum" — but that may be after you come back from the beach.
Here are 10 recently published books that capture the mix I have in mind for summertime reading: a little science, a little travel, and little or no math required:
"Boltzmann's Tomb: Travels in Search of Science": Miami University geophysicist Bill Green revisits the scenes of his science, ranging from his boyhood hometown of Pittsburgh to his favorite stomping grounds in Antarctica. Kirkus Reviews: "Green is an exquisite writer, and his fierce focus and mastery of style are reminiscent of the biologist and essayist Lewis Thomas."
"On the Origin of Tepees: The Evolution of Ideas (and Ourselves)": English science writer Jonnie Hughes takes a novel approach to the subject of cultural evolution by following Darwinian principles as he travels across the American West. Publishers Weekly: "This ambitious book braids together studies in biology, psychology, history, linguistics, geology and philosophy into an impressively succinct and readable taxonomy of human culture."
"Space: From Earth to the Edge of the Universe": This coffee-table book isn't the kind of thing you'd take to the beach, but on a rainy day, it's nice to have it waiting for you on a bookshelf at the cabin. Like most DK books, it's chock-full of pictures and illustrations. The Coalition for Space Exploration's Leonard David says it provides "a picture-perfect look at the origins of human space exploits, current status, and the unknown unknowns awaiting discovery and investigation within the universe at large."
This post couldn't come at a better time, because I'll be taking a few days off myself. So what'll I be reading? "Turn Right at Machu Picchu" is definitely on my list, as "The Lost City of Z," "1491" and "Cahokia" were in past years. But for starters, I'm engrossed in something completely different: "A Dance With Dragons," the latest installment of the "Song of Ice and Fire" fantasy series from George R.R. Martin, the guy who's been called "The American Tolkien."
What are you reading? Feel free to pass along your recommendations as a comment below. I'll pick out some of the recommendations to feature when I'm back in the office, a little more than a week from now. If I really like your recommendation, I just might send you a free book — maybe one of the volumes on the list above, or maybe a signed copy of my own book, "The Case for Pluto." Until then, here's wishing you happy reading and relaxation!
A study that reviews a host of sci-fi scenarios for contact with extraterrestrials stirred up such a ruckus today that NASA had to step in and distance itself from the research. The controversy focuses on the idea that E.T. could well decide that we're a threat to interstellar order, and therefore we have to be stopped before we spread.
The report itself, published in the journal Acta Astronautica, covers ground that's familiar to dedicated fans of E.T. lore. For example, the premise of the 1951 sci-fi classic "The Day the Earth Stood Still" is that universalist-minded aliens see our civilization as so rooted in violence that it's better to snuff us out than let us ruin the neighborhood. (The 2008 remake, starring Keanu Reeves, recycled that idea with an environmental theme.)
Then there's the "Hitchhiker's Guide to the Galaxy" scenario, in which Earth is destroyed merely to make way for a new stretch of intergalactic infrastructure.
"At the heart of these scenarios is the possibility that intrinsic value may be more efficiently produced in our absence," the researchers write.
The most familiar sci-fi scenario is the one in which the aliens are as selfish and territorial as we are, and want to wipe us out or enslave us and take our stuff. Think "War of the Worlds" or "Independence Day." In such cases, the researchers note that there's the potential for big payoffs ... if we prevail.
"Humanity benefits not only from the major moral victory of having defeated a daunting rival but also from the opportunity to reverse-engineer ETI [extraterrestrial intelligence] technology," they write. Indeed, New York Times columnist Paul Krugman joked last weekend that a fake alien invasion might be just the thing to spark an economic turnaround.
The researchers touch on more benign scenarios as well — for example, the "Star Trek" scenario, in which helpful aliens welcome us into the United Federation of Planets because we're all basically good guys (as opposed to those evil Klingons, until they become good guys, too). And then there's something like the "E.T." scenario, in which the aliens mostly just want to stay out of our way.
The 33-page study reflects at length on the potential risks.
"The possibility of harmful contact with ETI suggests that we may use some caution for METI [sending messages to extraterrestrial intelligence]," the researchers write. "Given that we have already altered our environment in ways that may be viewed as unethical by universalist ETI, it may be prudent to avoid sending any message that shows evidence of our negative environmental impact. The chemical composition of Earth's atmosphere over recent time may be a poor choice for a message because it would show a rapid accumulation of carbon dioxide from human activity. Likewise, any message that indicates widespread loss of biodiversity or rapid rates of expansion may be dangerous if received by such universalist ETI."
In short, let's keep our environmental bad habits on the down low, so as not to get the sad-Keanu E.T.'s on our case.
So what's the big deal? Well, one of the authors of the paper, Shawn Domagal-Goldman, happens to be a postdoctoral researcher working at NASA Headquarters — and that highly tenuous connection to the world's most influential space agency sparked a huge wave of scare headlines. It started with The Guardian's story, and rolled onto The Drudge Report's webpage with a headline reading "NASA REPORT: Aliens may destroy humanity to protect other civlizations..." Another variant was this one: "NASA: Aliens May Destroy Humanity Over Greenhouse Gases."
Eventually, NASA had to send out a Twitter update saying "Yes, @drudge and @guardiannews are mistaken about an 'alien' report. It's not NASA research. Ask the report's author...." The space agency followed up later with two more tweets, emphasizing that it was not involved in the study and saying that Fox News and CNN "have it wrong."
In each case, NASA linked to a lengthy clarification and apology from Domagal-Goldman, who made clear that the study was not a "NASA report," that no NASA funding was expended on it, and that he spent none of his working hours on writing the paper. He said his two co-authors, Seth Baum and Jacob Haqq-Misra of Pennsylvania State University, "put in the vast majority of work on it."
"It was just a fun paper written by a few friends, one of whom happens to have a NASA affiliation," Domagal-Goldman wrote.
He admitted that including the NASA affiliation turned out to be a "horrible mistake":
"I did so because that is my current academic affiliation. But when I did so I did not realize the full implications that has. I'm deeply sorry for that, but it was a mistake born out of carelessness and inexperience and nothing more. I will do what I can to rectify this, including distributing this post to the Guardian, Drudge and NASA Watch. Please help me spread this post to the other places you may see the article inaccurately attributed to NASA.
"One last thing: I stand by the analysis in the paper. Is such a scenario likely? I don't think so. But it's one of a myriad of possible (albeit unlikely) scenarios, and the point of the paper was to review them. But remember — and this is key — it's me standing for the paper ... not the full weight of the National Aeronautics and Space Administration. For anything I have done to mis-convey that to those covering the story, to the public, or to the fine employees of NASA, I apologize."
This isn't the first case where the NASA connection has become entangled in scientific speculation. In March, the space agency took great pains to distance itself from NASA researcher Richard Hoover's claims to have found evidence of outer-space organisms in meteorites.
In Domagal-Goldman's case, the substance was far less controversial. As I've tried to point out above, the views expressed in the paper aren't that far off from the typical science-blog fare. I'm willing to bet a goodly sum of quatloos that Domagal-Goldman will go on to have a fine career in science ... and also that this won't be NASA's last P.R. kerfuffle over E.T.
"Alligator oil can be used as a potential biodiesel feed stock and given that this feedstock is traditionally a waste product, its use should result in reduced processing costs," they conclude in the paper.
The gator biodiesel was similar to biodiesel from soybeans, the main source of the 700 million gallons of biodiesel produced in the United States in 2008.
The use of alligator fat instead of soybeans could help stem the diversion of food crops to fuel, which seems like a net positive in a world facing food shortages.
As for the alligator industry, the reptiles are raised and harvested for their skin and meat. The skin ends up in fashionable wallets, boots and belts, while the meat appears on menus.
"They say it's very good," Bajpai told the New York Times. "I don't know. I'm a vegetarian."
According to the newspaper, the 15 million pounds of alligator fat could amount to 1.25 million gallons of fuel with an energy content that is 91 percent as great as petroleum diesel's. The cost of processing would be about $2.40 a gallon, not including the cost of transporting the presumably free fat to the plant.
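For readers who want to check the arithmetic, here's a rough back-of-the-envelope sketch using only the figures quoted in the article (the yield, energy ratio and cost numbers come from the report; the calculations are just unit conversions):

```python
# Back-of-the-envelope arithmetic using the article's figures.
fat_pounds = 15_000_000        # annual alligator fat, pounds
fuel_gallons = 1_250_000       # estimated biodiesel yield, gallons
energy_ratio = 0.91            # energy content relative to petroleum diesel
cost_per_gallon = 2.40         # processing cost in dollars (transport excluded)

yield_per_pound = fuel_gallons / fat_pounds        # ~0.083 gallons per pound of fat
diesel_equiv = fuel_gallons * energy_ratio         # ~1.14 million diesel-equivalent gallons
processing_cost = fuel_gallons * cost_per_gallon   # ~$3.0 million total processing cost

print(f"{yield_per_pound:.3f} gal/lb, "
      f"{diesel_equiv:,.0f} diesel-equivalent gal, "
      f"${processing_cost:,.0f} processing")
```

In other words, each pound of fat yields only about a twelfth of a gallon, and the 1.25 million gallons would carry the energy of roughly 1.14 million gallons of conventional diesel.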
Alligator fat derived biodiesel would join other alternative sources of oil such as leftover fryer grease from restaurants and sewage. To learn more about those sources, check out the stories below.
While facial recognition programs are increasingly being used for various purposes — including identifying London rioters — researchers are going one step further and designing programs that will be able to read human expressions.
The Atlantic's Rebecca Rosen reports on a new software program that seems able to match human perceptions of facial expressions, tested in one case on photos of celebrities.
In the video below, the "threatening" expression comes through loud and clear, as do expressions of surprise and wonder. The video is one in a series on scantodorov's YouTube channel showing the program in action.
The scientists experimented with different degrees of expressions, to see how sensitive the program could be in capturing and interpreting facial expressions.
And they also explain why it's important:
In a world characterized by an ever growing amount of interactive artifacts, it is important to develop better human-centric systems that incorporate human communicative behaviors. Natural interaction with machines, one that mimics interactions between humans, is hence an important research goal for computer science that converges with similar interests from other disciplines such as social psychology. The understanding of the social value of objects, including faces, requires the development of engaging interactive systems that act in socially meaningful ways. For this purpose, analysis of facial images has become a major research topic with clear multidisciplinary implications.
The published paper reports the results, including the finding that "Dominant," "Threatening" and "Mean" expressions "exhibit the highest and most consistent scores for all classification rules with accuracy higher than 77 percent for all the classifiers."
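The paper's actual classifiers aren't detailed here, but the general recipe — turn a face into a handful of numbers, then score those numbers against labeled examples — can be sketched with a toy nearest-centroid classifier. Everything below is illustrative: the three-number "feature vectors" (think brow angle, mouth curvature, eye openness) are made up, and a real system would derive its features from facial landmarks in images.

```python
# Toy nearest-centroid classifier over made-up "face feature" vectors.
# Purely illustrative; not the method from the paper.

TRAINING = {
    "threatening": [(0.9, -0.8, 0.2), (0.8, -0.7, 0.3)],
    "surprised":   [(0.1,  0.2, 0.9), (0.2,  0.3, 0.8)],
    "neutral":     [(0.0,  0.0, 0.5), (0.1, -0.1, 0.5)],
}

def centroid(vectors):
    """Average the training vectors for one expression label."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

CENTROIDS = {label: centroid(vs) for label, vs in TRAINING.items()}

def classify(features):
    """Return the label whose centroid is closest in squared Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(CENTROIDS, key=lambda label: dist2(features, CENTROIDS[label]))

print(classify((0.85, -0.75, 0.25)))  # -> threatening
print(classify((0.15, 0.25, 0.85)))   # -> surprised
```

Accuracy figures like the paper's 77 percent would come from running such a classifier over a held-out test set and counting how often the predicted label matches the human-assigned one.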
While we're still a long way from a computer being able to act upon the expression that flashes across a face, it is one step closer to a kind of artificial intelligence that until now has been solely the domain of sci-fi movies and television.
GOP presidential hopeful Jon Huntsman went after Texas Gov. Rick Perry's claim that climate scientists were manipulating data, tweeting today: "To be clear. I believe in evolution and trust scientists on global warming. Call me crazy." Some Republicans might do just that. On the other hand, it looks as if Huntsman has added about 4,000 Twitter followers today, so maybe he's not so crazy after all. I've updated my roundup of GOP views on science-related issues to include Huntsman's perspective.
Crew members participating in the Mars500 simulated mission to the Red Planet strike a pose in their mock spaceship while wearing red-tinted glasses.
By Alan Boyle, Science Editor, NBC News
A top space official says Europe and Russia will follow up on their simulated 520-day mission to Mars with a real flight to Mars and back — although there's not yet any time frame set for the mission.
The pledge came on Wednesday from Jean-Jacques Dordain, head of the European Space Agency, during his visit to Russia's MAKS air show near Moscow. He said ESA and the Russian Federal Space Agency would "carry out the first flight to Mars together," according to a report from the RIA Novosti news agency.
Dordain was quoted as saying that the Mars500 simulation was a factor in preparations for a human mission to the Red Planet. Mars500's six crew members, all male, have been cooped up for 14 months inside an isolation chamber at Russia's Institute of Biomedical Problems. This week, the European-Russian-Chinese crew passed the 437-day milestone set by Russian cosmonaut Valery Polyakov aboard Russia's Mir space station in 1995. Polyakov holds the record for the longest continuous time spent in space, and if the Mars500 sextet had actually been in space, they would now be the champs.
Mission planners consider 500 days or so to be the most realistic time frame for a round trip to Mars, given the orbital mechanics involved in making the trip. The Mars500 experiment went through a simulated Red Planet landing in February, and the crew is due to come out of isolation at the end of the mission in November.
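The ~500-day figure squares with simple orbital mechanics. As a hedged sketch — assuming circular, coplanar orbits, which is a textbook simplification rather than a real mission design — the minimum-energy (Hohmann) transfer time between Earth's and Mars' orbits works out like this:

```python
import math

AU = 1.496e11        # astronomical unit, meters
MU_SUN = 1.327e20    # sun's gravitational parameter, m^3/s^2
R_EARTH = 1.0 * AU   # Earth's orbital radius (circular approximation)
R_MARS = 1.524 * AU  # Mars' orbital radius (circular approximation)

# The transfer ellipse touches both orbits; its semi-major axis is their average.
a = (R_EARTH + R_MARS) / 2

# Transfer time is half the ellipse's orbital period (Kepler's third law).
one_way_days = math.pi * math.sqrt(a**3 / MU_SUN) / 86400

print(f"one-way Hohmann transfer: {one_way_days:.0f} days")
print(f"two transfer legs alone:  {2 * one_way_days:.0f} days")
```

The two legs alone come to roughly 260 days each, or a bit over 500 days round trip before counting any stay at Mars — which is why short-stay mission profiles, and the Mars500 simulation's 520 days, land in that range. Real trajectories also have to wait for the planets to line up for the return, so actual designs trade stay time against alignment.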
An actual mission to Mars would face many more hardships, including a prolonged period of reduced gravity as well as the potential for exposure to space radiation. There'd be lots of other logistical challenges, such as generating power on Mars (probably with a mini-nuclear reactor) and having enough food and water to sustain the crew. NASA's current vision for space exploration calls for sending a crew to Mars and its moons in the mid-2030s, and the first trips would likely involve just going there and back without landing on the planet itself.
The Voice of Russia website quoted Igor Lisov, an analyst for the Moscow-based journal Novosti Kosmonavtiki (Cosmonautic News), as saying that any mission to the Red Planet would have to be an international venture with participation from Russia and ESA.
"If they decide to implement an emergency program, the mission may be carried out in 10 years," Lisov said. "If it is an ordinary one, then it will take 20 years. This is a long period of time."
Who do you think will take on a human mission to Mars? And when will it happen? Cast your vote, and feel free to weigh in with your comments below.
The California-based company is increasingly in the news because of its role as the first private-sector successor to the just-completed space shuttle program. Just this week, SpaceX confirmed that it had reached an agreement in principle with NASA to launch its next Dragon space capsule atop its Falcon 9 rocket on Nov. 30, carrying cargo to the International Space Station.
The original plan called for one test flight to approach the station without berthing, and for another to go all the way to the hookup. As long ago as last December, however, company founder and CEO Elon Musk said he hoped to combine those two tests into one initial resupply mission. Pending a final safety review, NASA is willing to go ahead with SpaceX's plan — which also calls for the Falcon 9's second stage to deploy two Orbcomm OG2 telecom satellites after the Dragon heads off for the station.
Computer animation shows the launch of a SpaceX Dragon spacecraft, berthing at the International Space Station, and return to Earth. Courtesy NASA.
The blend of commercial and NASA business is a hallmark of the "new space" approach to spaceflight: Development costs are covered by revenue from multiple clients, rather than having the government pay the entire bill for a project.
For now, NASA is SpaceX's prime customer: SpaceX's current manifest anticipates flying four resupply missions to the space station during 2012, which will call for a stepped-up production rate. It's been almost nine months since the company's last launch, which involved a surprisingly successful initial test of the full Falcon/Dragon system. In an exclusive interview this month, Musk acknowledged that "things always take a little more time than we think," but maintained that "we're arguably better than average as far as our schedules are concerned."
"We have built four rockets this year," Musk told me as we sat in his corner cubicle at SpaceX's headquarters in Hawthorne, Calif. "Last year we built two rockets, next year we'll build eight rockets. So our production rate is increasing quite rapidly."
Leah Thompson / AP
SpaceX CEO Elon Musk attends last month's groundbreaking ceremony at Vandenberg Air Force Base, with a launch pad and a picture of the Falcon Heavy rocket serving as a backdrop.
SpaceX is one of several companies in line for NASA's business — not only to fly cargo to the station, but eventually to fly astronauts as well. NASA has set aside nearly $270 million to support the development of the Dragon and spaceships offered by three other companies (Blue Origin, Boeing and Sierra Nevada Corp.) as vehicles for station-bound astronauts. The Dragon is the only one of the four proposed spaceships that's already been in space.
"At least for the next several years, we are the main thing that is flying to space from the United States," Musk noted. "And we're the principal means of resupplying the space station, and the only means of bringing cargo back from the space station. And then hopefully in about three years, we'll be transporting astronauts."
So how does it feel to have the burden of the post-shuttle era on your shoulders? "I get less nervous with each passing flight," Musk answered. And there are many more flights to come.
Another base ... in Texas? Musk has already said that SpaceX is thinking about establishing an additional base for launching Falcon rockets, to supplement its facilities at Cape Canaveral Air Force Station in Florida and the pad that's currently being renovated at Vandenberg Air Force Base in California. The Vandenberg pad is planned as the home base for SpaceX's Falcon Heavy rocket, which is designed to go after the Air Force's satellite launch business.
Last month, local officials in Texas hinted that SpaceX was ready to invest up to $50 million in the Gulf Coast Regional Spaceport, south of Houston. Musk told me that he hadn't yet decided where the third base would be located, but he made it sound as if he was firmly set on expanding operations. He also explained why an extra space base was on SpaceX's agenda:
"We have our main launch facility, which is Cape Canaveral in Florida. Then we are in the process of developing our second launch facility, which is Vandenberg in California. And we do intend to develop a third launch facility. Texas is one of the possible states. But we're also looking at a number of other locations: Puerto Rico, potentially another location in Florida, potentially Hawaii. And there are a few other locations that could work. So we're trying to make the right decision for the long term.
"The third launch site would open early, in perhaps three or four years. So we want to make sure we make the right decision. But we do think we need three launch sites in order to handle all of the launch demand that we have been able to get. ...
"It would be a purely commercial launch site, whereas Cape Canaveral and Vandenberg are actually Air Force bases — in the case of Cape Canaveral, it's sort of a joint NASA-Air Force activity. So it makes sense to have NASA and Defense Department launches occur from Cape Canaveral and Vandenberg, but then probably shift most of our commercial launches to a purely commercial launch site that's really aimed at being the best customer for a commercial launch provider. Just as there are Air Force bases and commercial airports ... there's some logic to separation."
So at a time when a lot of folks are wondering whether America's aerospace industry is heading toward atrophy, Musk is bullish about his company's future. SpaceX's work force has already risen to 1,500 employees, and that's just one company. Other new players in the spaceflight industry, such as Sierra Nevada Corp. and AdamWorks, are talking about expansion as well.
In the coming weeks, we'll be presenting a package of videos and stories about the future of spaceflight as part of msnbc.com's "Future of Technology" special report. What you're reading today is just a little taste from my wide-ranging interview with Musk. We also talked about his Red Planet ambitions, his perspectives on electric cars and other technological frontiers, and how he manages to wedge in a personal life as well. Stay tuned for much more to come, not only from Musk, but also from other leading figures in the spaceflight revolution.
Technicians prepare to remove one of the space shuttle Atlantis' three main engines from the orbiter's aft section on Aug. 18, using a highly modified fork lift in Orbiter Processing Facility Bay 2 at NASA's Kennedy Space Center in Florida. The engines will be stowed for study or future use.
Bruce Weaver / AFP - Getty Images
A wider view shows the space shuttle Atlantis inside the Orbiter Processing Facility.
IBM today unveiled a computer chip designed to emulate the brain's ability for perception, action and cognition. This chip in particular is pretty good at the game Pong, the company says.
By John Roach, Contributing Writer, NBC News
Computer chips with worm-like intelligence were unveiled today by researchers at IBM, a breakthrough, they say, on the road to creating computers that function like the human brain.
For now, achieving the goal of human-like intelligence in a computer with the size and power needs of our brains is a long way off, Dharmendra Modha, the researcher leading the project, told me, but the chips he held as we spoke were proof that a "new generation" of computers is in the offing.
"It is IBM's first cognitive computer core that brings together computation in the form of neurons, memory in the form of synapses and communication in the form of axons," he said.
Such chips, he said, could form the basis of computers that are able to monitor real-time traffic-light cameras, notice an anomaly and dispatch an ambulance in time to save lives.
Other potential applications include lining the ocean with sensors for everything from temperature, humidity and wave height to acoustics and turbidity. The computer would constantly monitor all that data and detect patterns such as rogue waves that could interrupt shipping or a tsunami that could wipe out coastal villages.
A glove instrumented with sight, smell, temperature and other sensors and put on the hand of produce handlers at the grocery store could identify fruits and veggies that are contaminated, again saving lives.
The chips do this by integrating memory and processing, unlike today's computers, which separate the functions. It's a difference, he said, between growing food in one part of the world then eating it in another and a farmers market where you buy and eat locally grown food.
The silicon cores unveiled today aren't at the level of the human brain yet. They have 256 neuron-like nodes. One has what the company calls 262,144 programmable synapses, the other contains 65,536 synapses. They can drive a car through a simple maze and reconfigure a triangle from just a fragment, Modha said.
The chips can also play Pong, the 1970s arcade game. "It might beat you, I don't promise, but it might," Modha said.
The next step is to take these tiny brain-like circuits and weave them into a system that eventually has 10 billion neurons and 100 trillion synapses that consume just 1 kilowatt of power and occupy the same volume as a shoebox.
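IBM hasn't published the core's internals in this article, but the neuron/synapse/axon vocabulary maps onto a standard "leaky integrate-and-fire" model from computational neuroscience, which can be sketched in a few lines. To be clear, the parameters and random weights below are arbitrary illustrations, not IBM's design; only the 256-neuron count comes from the article.

```python
import random

random.seed(1)

N = 256          # neurons per core, as described in the article
THRESHOLD = 1.0  # arbitrary firing threshold (illustrative)
LEAK = 0.9       # fraction of membrane potential retained each tick

# Synapse matrix: weights[i][j] is the strength of the connection i -> j.
weights = [[random.uniform(0, 0.02) for _ in range(N)] for _ in range(N)]
potential = [0.0] * N  # membrane potential of each neuron

def tick(external_input):
    """One time step: neurons over threshold fire and reset,
    then every neuron leaks and integrates input plus incoming spikes."""
    spikes = [i for i, p in enumerate(potential) if p >= THRESHOLD]
    for j in spikes:
        potential[j] = 0.0                 # fired neurons reset
    for i in range(N):
        potential[i] = potential[i] * LEAK + external_input[i]
        for j in spikes:
            potential[i] += weights[j][i]  # spikes arrive along "axons"
    return spikes
```

Drive one neuron with a steady input and it charges up, fires, resets and fires again — the "memory" lives in the synapse weights and the "computation" in the neurons, side by side, which is the integration of memory and processing Modha's farmers-market analogy describes.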
The technology, Modha added, is a departure from the way computers have evolved over the past 100 years, an evolution that allowed the development of machines such as the question-answering maverick Watson, which won a well-publicized game of "Jeopardy" earlier this year.
"Watson represents the epitome of artificial intelligence today, I would say," Modha said. "And we are trying to emulate the brain. They are yin and yang, salt and pepper, they may work together and complement each other, but they are not the same."
The project has drawn keen interest from the U.S. government. As IBM unveiled the chips, it also announced $21 million in new funding from DARPA for the project, which is named SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics).
The goal of the project is a system that not only analyzes complex information from multiple sensory modalities at once, but also dynamically rewires itself as it interacts with its environment — all while rivaling the human brain's compact size and low power usage.
We are still years away from such a computer, Modha said, and when it arrives it is likely to complement other computers and humans, not outsmart us or our machines.
"The human brain is so darn bloody awesome it shames me," he said. "The thing is, today's computers are so darn bloody bad in terms of power, space and functionality ... you pick a problem with so much room to play that you improve it slightly, you look like a hero."