The Space Simulator: Modeling The Universe On A Budget


Los Alamos NM (SPX) Jun 23, 2004
For the past several years, a team of University of California astrophysicists working at Los Alamos National Laboratory has been using a cluster of roughly 300 computer processors to model some of the most intriguing aspects of the Universe.

Called the Space Simulator, this de facto supercomputer has not only proven itself to be one of the fastest supercomputers in the world, but has also demonstrated that modeling and simulation of complex phenomena, from supernovae to cosmology, can be done on a fairly economical basis. According to Michael Warren, one of the Space Simulator's three principal developers, "Our goal was to acquire a computer that would deliver the highest performance possible on the astrophysics simulations we wanted to run, while remaining within the modest budget that we were allotted. Building the Space Simulator turned out to be an excellent choice."

The Space Simulator is a 294-node Beowulf cluster with a theoretical peak performance just below 1.5 teraflops, or trillions of floating point operations per second. Each Space Simulator processing node looks more like a computer you would find at home than one at a supercomputer center, consisting of a Pentium 4 processor, 1 gigabyte of 333 MHz SDRAM, an 80 gigabyte hard drive and a gigabit Ethernet card.

Each individual node cost less than $1,000 and the entire system cost under $500,000. The cluster achieved a Linpack performance of 665.1 gigaflops on 288 processors in October 2002, making it the 85th fastest computer in the world, according to the 20th TOP500 list.

A gigaflop is a billion floating-point operations per second. Since 2002, the Space Simulator has moved down to #344 on the most recent TOP500 list as faster computers are built, but Warren and his colleagues are not worried.
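These figures are consistent with simple back-of-the-envelope arithmetic. The sketch below is illustrative only: it assumes 2.53 GHz Pentium 4 processors delivering two double-precision floating-point operations per cycle through SSE2, numbers not stated in the article.

```python
# Back-of-the-envelope check of the cluster's quoted performance figures.
# Assumption (not stated in the article): 2.53 GHz Pentium 4 nodes, each
# issuing 2 double-precision flops per cycle via SSE2.

nodes_total   = 294        # nodes in the full cluster
nodes_linpack = 288        # nodes used for the October 2002 Linpack run
clock_hz      = 2.53e9     # assumed CPU clock
flops_per_cyc = 2          # assumed double-precision throughput per cycle

peak_total   = nodes_total * clock_hz * flops_per_cyc    # theoretical peak
peak_linpack = nodes_linpack * clock_hz * flops_per_cyc  # peak of 288 nodes
measured     = 665.1e9                                   # reported Linpack result

print(f"Theoretical peak:   {peak_total / 1e12:.2f} teraflops")  # ~1.49
print(f"Linpack efficiency: {measured / peak_linpack:.0%}")      # ~46%
```

An efficiency of roughly 46 percent on Linpack is plausible for a commodity cluster of that era, where the Ethernet interconnect rather than the processors tends to be the limiting factor.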

They built the Space Simulator to do specific astrophysics research, not to compete with other computers. It was never designed to compete with the Laboratory's massive supercomputers and, in fact, is not scalable enough to do so.

The Space Simulator has been used almost continuously for theoretical astrophysics simulations since it was built, and has spent much of the past year calculating the evolution of the Universe.

The first results of that work were recently presented at a research conference in Italy by Los Alamos postdoctoral research associate Luis Teodoro. Further analysis of the simulations, in collaboration with Princeton University professor Uros Seljak, will soon be published in the prestigious journal Monthly Notices of the Royal Astronomical Society.

In addition to simulating the structure and evolution of the Universe, the Space Simulator has been used to study the explosions of massive stars and to help understand the X-ray emission from the center of our galaxy.

The Space Simulator is actually the Laboratory's third-generation Beowulf cluster. The first was Loki, which was constructed in 1996 from sixteen 200 MHz Pentium Pro processors. Loki was followed by the Avalon cluster, which consisted of 144 Alpha processors.

The Space Simulator follows the same basic architecture as these previous Beowulf machines, but is the first to use Gigabit Ethernet as the network fabric, and requires significantly less space than a cluster using typical computers.

The Space Simulator runs parallel N-body algorithms, which were originally designed for astrophysical applications involving gravitational interactions, but have since been used to model more complex particle systems.
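The production simulations rely on parallel, tree-based approximations, but the underlying calculation, the gravitational pull of every particle on every other, can be illustrated with a minimal direct-summation sketch. The Python/NumPy code below is a hypothetical illustration, not the Laboratory's code; the particle count, time step and constants are arbitrary.

```python
import numpy as np

def gravitational_accelerations(pos, mass, softening=1e-2, G=1.0):
    """Direct-summation gravity: acceleration on each particle from all others.

    This O(N^2) loop is the calculation that parallel N-body codes
    approximate efficiently for millions of particles.
    """
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        dx = pos - pos[i]                            # vectors to every other particle
        r2 = np.sum(dx * dx, axis=1) + softening**2  # softened squared distances
        inv_r3 = r2 ** -1.5
        inv_r3[i] = 0.0                              # no self-interaction
        acc[i] = G * np.sum((mass * inv_r3)[:, None] * dx, axis=0)
    return acc

# Illustrative use: a few hundred random particles advanced one leapfrog step.
rng = np.random.default_rng(0)
pos = rng.standard_normal((300, 3))
vel = np.zeros((300, 3))
mass = np.full(300, 1.0 / 300)
dt = 1e-3

acc = gravitational_accelerations(pos, mass)
vel += 0.5 * dt * acc                                       # kick
pos += dt * vel                                             # drift
vel += 0.5 * dt * gravitational_accelerations(pos, mass)    # kick
```

A parallel treecode replaces the O(N^2) inner loop with a hierarchical approximation, which is what makes simulations of millions of particles tractable on a cluster of this size.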

Related Links
Los Alamos National Laboratory