Saturday, May 21, 2005

A Torrent of CosmoParticle Data at the LHC

There is an interesting article in this week's New Scientist about the computing challenge posed by the absurdly large amounts of data that will be generated at the Large Hadron Collider (LHC). The LHC, which will turn on in 2007, will be the world's largest machine, colliding protons at extremely high energies and providing unparalleled insights into the fundamental nature of matter and its interactions.

There are compelling theoretical reasons to think that our current theories of particle physics will require modification at precisely the energies that the LHC will probe. These current theories are the electroweak theory, which unifies electromagnetism with the weak nuclear force, and quantum chromodynamics (QCD), which describes the strong nuclear force. Because the LHC collides protons together, the particles that cascade out after a collision are the result of a host of electroweak and QCD interactions. The QCD part is particularly complicated, and figuring out which collisions are potentially interesting, and so should be recorded by computer (a software task that particle experimentalists refer to as triggering), is formidable. The numbers - 15 million Gigabytes of data in year one alone - are truly staggering.
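The logic of triggering can be illustrated with a toy sketch: simulate a large batch of events, keep only those passing an energy cut, and see how drastically the recorded fraction shrinks. The threshold, energy units, and event model below are entirely made up for illustration; they are not real LHC trigger parameters.

```python
import random

random.seed(0)

# Toy model: each "event" carries a single energy value drawn from a
# steeply falling (exponential) distribution, mimicking the fact that
# high-energy collisions are rare. All numbers are illustrative.
MEAN_ENERGY = 10.0   # hypothetical, arbitrary units
THRESHOLD = 50.0     # hypothetical trigger cut, same units

def trigger(event_energy):
    """Record an event only if its energy exceeds the threshold."""
    return event_energy > THRESHOLD

events = [random.expovariate(1 / MEAN_ENERGY) for _ in range(100_000)]
kept = [e for e in events if trigger(e)]

print(f"recorded {len(kept)} of {len(events)} events "
      f"({len(kept) / len(events):.4%})")
```

Even this crude cut discards well over 99% of events; the real trigger system makes a similar kind of decision, but in multiple hardware and software stages and under severe real-time constraints.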

As the New Scientist article comments:
"The torrent of information gushing forth from the LHC each year will be enough to fill a stack of CDs three times as high as Mount Everest. To make sense of it will require some 100,000 of today's most powerful PCs, so it is little wonder that CERN - the European centre for particle physics near Geneva that is building the collider - is co-opting a worldwide "grid" of computers to help store and analyse the data."
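The Everest comparison is easy to sanity-check with back-of-the-envelope arithmetic. Assuming a 700 MB CD, a 1.2 mm disc thickness, and Everest at 8.849 km (my assumptions, not figures from the article):

```python
# Back-of-the-envelope check of the "three Everests of CDs" claim.
data_gb = 15e6        # 15 million Gigabytes per year (from the article)
cd_gb = 0.7           # assumed CD capacity, 700 MB
cd_thickness_mm = 1.2 # assumed thickness of a bare disc
everest_km = 8.849    # height of Mount Everest

n_cds = data_gb / cd_gb                    # ~21 million CDs
stack_km = n_cds * cd_thickness_mm / 1e6   # mm -> km
ratio = stack_km / everest_km

print(f"{n_cds:.3g} CDs, stack {stack_km:.1f} km high "
      f"= {ratio:.1f} Everests")
```

The stack comes out at roughly 26 km, or about three Everests, so the article's comparison holds up.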
This computing task is one that most physicists find easy to forget. Those of us not directly involved in the experiment tend to focus only on the particle physics, overlooking the massive engineering, design and computing effort required to make the machine work.

It's not just particle physicists who care about all this. Orange Quark readers will know my view that particle physics and cosmology are now inseparable, and, indeed, I've commented before on the role that a future International Linear Collider might play in our understanding of the cosmos. The LHC is a necessary precursor to such an endeavor, potentially discovering the Higgs boson (explaining the origin of mass) and very possibly yielding evidence for supersymmetry, extra dimensions, or some other physics beyond the standard model. Any of these discoveries would have profound cosmological implications, perhaps for our understanding of dark matter and of the origin of the asymmetry between matter and antimatter (although the New Scientist article makes it sound a little too much like cosmology is the primary reason for building colliders).

But the list of people expectantly watching the LHC doesn't end with cosmologists. New Scientist quotes François Grey, an IT spokesman for CERN, and reports that
"CERN will need to ensure that no one team or institute hogs the grid. Physicists will probably barter computer time for now, but the system could later work on a pay-as-you-go basis. 'Industry is very interested to see how we handle this,' says Grey. 'A large grid could be very exciting for commercial business, but they need to know what the business model would be.'"
For me, a sufficient reason to build a machine like the LHC is the breathtaking possibility of understanding more about our universe at its smallest and its largest scales. However, it doesn't hurt that there are other sectors of society that see tangible benefits beyond the fundamental discoveries.