Energy News  
LHC Computing Centres Join Forces For Global Grid Challenge

In 2007, CERN will switch on its Large Hadron Collider (LHC) - the world's largest particle accelerator. The LHC will allow scientists to penetrate further into the structure of matter and recreate the conditions prevailing in the early universe, just after the Big Bang. But the four experiments at the LHC will produce more data than any previous coordinated human endeavour - 10 petabytes each year, equivalent to a stack of CDs twice the height of Mount Everest. Careful analysis of all of this complex data will be required to search for some of the most elusive constituents of current physics, such as the Higgs particle and supersymmetry. Rather than deal with this data on expensive supercomputers, based at a few institutions and in high demand, the LHC will use distributed computing. More than 100,000 PCs, spread across one hundred institutions around the world, will allow scientists from different countries to access the data, analyse it and work together in international collaborations.
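The stack-of-CDs comparison is easy to verify with back-of-envelope arithmetic. A minimal sketch, assuming a nominal 700 MB capacity and 1.2 mm thickness per disc (figures not stated in the article) and Everest at roughly 8.8 km:

```python
# Back-of-envelope check: 10 petabytes/year expressed as a stack of CDs.
# Assumed values (not from the article): 700 MB and 1.2 mm per CD.
PETABYTE = 10**15            # bytes
CD_CAPACITY = 700 * 10**6    # bytes per disc
CD_THICKNESS_MM = 1.2        # millimetres per disc in a stack

cds_per_year = 10 * PETABYTE / CD_CAPACITY
stack_height_m = cds_per_year * CD_THICKNESS_MM / 1000

print(f"{cds_per_year:,.0f} CDs, stack ~{stack_height_m / 1000:.1f} km high")
```

Under these assumptions the stack comes out at roughly 17 km - about twice the height of Everest, consistent with the article's claim.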

Geneva, Switzerland (SPX) Apr 27, 2005
Tuesday, in a significant milestone for scientific grid computing, eight major computing centres successfully completed a challenge to sustain a continuous data flow of 600 megabytes per second (MB/s) on average for 10 days from CERN in Geneva, Switzerland to seven sites in Europe and the US.

The total amount of data transmitted during this challenge - 500 terabytes - would take about 250 years to download using a typical 512 kilobit per second household broadband connection.
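Both transfer figures can likewise be checked directly - a sketch using only the rates quoted in the article (600 MB/s sustained for 10 days; 500 TB over a 512 kbit/s link):

```python
# Sanity-check the service-challenge figures.
MB = 10**6
TB = 10**12
SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = 365.25 * SECONDS_PER_DAY

# 600 MB/s sustained for 10 days should land near the quoted 500 TB.
total_bytes = 600 * MB * 10 * SECONDS_PER_DAY
print(f"10 days at 600 MB/s ~ {total_bytes / TB:.0f} TB")

# Downloading 500 TB over a 512 kilobit-per-second broadband line.
broadband_bps = 512_000
download_seconds = 500 * TB * 8 / broadband_bps
print(f"500 TB at 512 kbit/s ~ {download_seconds / SECONDS_PER_YEAR:.0f} years")
```

The first figure works out to about 518 TB and the second to just under 250 years, matching the article's round numbers.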

This exercise was part of a series of service challenges designed to test the global computing infrastructure for the Large Hadron Collider (LHC) currently being built at CERN to study the fundamental properties of subatomic particles and forces.

The service challenge participants included Brookhaven National Laboratory and Fermi National Accelerator Laboratory (Fermilab) in the US, Forschungszentrum Karlsruhe in Germany, CCIN2P3 in France, INFN-CNAF in Italy, SARA/NIKHEF in the Netherlands and Rutherford Appleton Laboratory in the UK.

"This service challenge is a key step on the way to managing the torrents of data anticipated from the LHC," said Jamie Shiers, manager of the service challenges at CERN.

"When the LHC starts operating in 2007, it will be the most data-intensive physics instrument on the planet, producing more than 1500 megabytes of data every second for over a decade."

The goal of LHC computing is to use a world-wide grid infrastructure of computing centres to provide sufficient computational, storage and network resources to fully exploit the scientific potential of the four major LHC experiments: ALICE, ATLAS, CMS and LHCb.

The infrastructure relies on several national and regional science grids. The service challenge used resources from the LHC Computing Grid (LCG) project, the Enabling Grids for E-SciencE (EGEE) project, Grid3/Open Science Grid (OSG), INFNGrid and GridPP.

LHC scientists designed a series of service challenges to ramp up to the level of computing capacity, reliability and ease of use that will be required by the worldwide community of over 6000 scientists working on the LHC experiments.

During LHC operation, the major computing centres involved in the Grid infrastructure will collectively store the data from all four LHC experiments.

Scientists working at over two hundred other computing facilities in universities and research laboratories around the globe, where much of the data analysis will be carried out, will access the data via the Grid.

Fermilab Computing Division head Vicky White welcomed the results of the service challenge.

"High energy physicists have been transmitting large amounts of data around the world for years," White said.

"But this has usually been in relatively brief bursts and between two sites. Sustaining such high rates of data for days on end to multiple sites is a breakthrough, and augurs well for achieving the ultimate goals of LHC computing."

NIKHEF physicist and Grid Deployment Board chairman Kors Bos concurred.

"The challenge here is not just the inherently distributed nature of the Grid infrastructure for the LHC," Bos said, "but also the need to get large numbers of institutes and individuals, all with existing commitments, to work together on an incredibly aggressive timescale."

The current service challenge is the second in a series of four leading up to LHC operations in 2007.

It exceeded expectations by sustaining roughly one-third of the ultimate data rate from the LHC, and reaching peak rates of over 800 MB/s.

This success was facilitated by the underlying high-speed networks, including DFN, GARR, GEANT, ESnet, LHCnet, NetherLight, Renater, and UKLight.

The next service challenge, due to start in the summer, will extend to many other computing centres and aim at a three-month period of stable operations.

That challenge will allow many of the scientists involved to test their computing models for handling and analyzing the data from the LHC experiments.


Related Links
LHC Computing Grid project
Enabling Grids for E-SciencE
Grid3
GridPP
INFNGrid
Open Science Grid



The content herein, unless otherwise known to be public domain, is Copyright 1995-2006 SpaceDaily. AFP and UPI wire stories are copyright Agence France-Presse and United Press International. ESA Portal reports are copyright European Space Agency. All NASA sourced material is public domain. Additional copyrights may apply in whole or part to other bona fide parties. Advertising does not imply endorsement, agreement or approval of any opinions, statements or information provided by SpaceDaily on any Web page published or hosted by SpaceDaily.