- Published on 17 February 2011
The need to store, distribute and analyze the 15 million gigabytes of data generated annually by the Large Hadron Collider (LHC) at CERN has driven the development of innovative software tools. Under CERN's coordination, leading IT teams have tested and validated cutting-edge software technologies designed to operate, on an unprecedented scale, distributed computing and data storage infrastructures based on a worldwide network of hundreds of computing centers. These achievements have allowed several thousand scientists in hundreds of research institutes and universities around the world to participate in the LHC experiments and to access, in real time, a huge amount of experimental data, equivalent to more than 1.7 million dual-layer DVDs a year.
This first EPJ Plus focus point addresses these topical issues in software technology for the high-energy physics computing community. It presents high-quality peer-reviewed papers, written by internationally recognized scientists, covering the most significant results achieved to date.
Eugenio Nappi, guest editor