Large Hadron Collider

Posted on Jun 28, 2013 in Our Awesome Universe

Big Data in the Large Hadron Collider

Situated underneath the border of France and Switzerland, the Large Hadron Collider (LHC) is the world’s largest particle collider, consisting of a 27 km ring of superconducting magnets. It was built by the European Organization for Nuclear Research and began operating in 2008, with two goals: to recreate the conditions at the beginning of the universe, the Big Bang, and to confirm or rule out the existence of the Higgs boson, the particle associated with the Higgs field that gives other particles in the universe their mass (CERN).


In order to accomplish this daunting feat, the accelerator smashes together intense beams of particles traveling at close to the speed of light. The energy released in these collisions can recreate conditions similar to those a fraction of a second after the Big Bang, when the free-moving matter particles that form the structure of our universe first emerged. Within the LHC, the more intense the energy created by the collisions, the more closely scientists can reproduce those early conditions. The roughly 600 million collisions every second create an enormous amount of data (roughly 15 petabytes yearly), which is gathered by detectors placed around the circumference of the collider. With such an immense amount of data, the question arises: how can one analyze it all in order to come to plausible conclusions?
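As a quick sanity check on those rates, the short Python sketch below converts the figures quoted above into an average data budget per collision and a sustained write rate. It uses only the numbers from this paragraph plus elementary arithmetic, so treat it as an illustration rather than an official CERN calculation.

```python
# Back-of-envelope check on the figures quoted above (600 million
# collisions per second, roughly 15 petabytes recorded per year).
# Only the quoted numbers are used; everything else is plain arithmetic.

COLLISIONS_PER_SECOND = 600e6        # quoted collision rate
PETABYTES_PER_YEAR = 15              # quoted recorded data volume
SECONDS_PER_YEAR = 365 * 24 * 3600

bytes_per_year = PETABYTES_PER_YEAR * 1e15
collisions_per_year = COLLISIONS_PER_SECOND * SECONDS_PER_YEAR

# Average recorded data per collision if the 15 PB were spread over every
# collision -- in practice only a tiny fraction of events is ever kept.
avg_bytes_per_collision = bytes_per_year / collisions_per_year
print(f"~{avg_bytes_per_collision:.2f} bytes per collision on average")

# Sustained write rate implied by 15 PB per year.
megabytes_per_second = bytes_per_year / SECONDS_PER_YEAR / 1e6
print(f"~{megabytes_per_second:.0f} MB/s sustained, averaged over a year")
```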

Big Data refers to collected data sets so large that they are extremely difficult to analyze. Gantz and Reinsel (2012) note that the International Data Corporation defines Big Data technologies as:

“a new generation of technologies and architectures, designed to economically extract value from very large volumes of a wide variety of data by enabling high-velocity capture, discovery, and/or analysis” (p. 9).

Furthermore, there are three main characteristics of Big Data: the data itself, the analytics of the data, and the presentation of the results of the analytics.

Our own man-made digital universe is made up of all kinds of data, from the trillions of text messages we send daily to information about our spending habits to streaming videos. Unfortunately, this vast amount of data is difficult to sift through and interpret in order to come to any conclusions. At the Large Hadron Collider, these huge chunks of data are too cumbersome for conventional database software to process. Instead, the LHC relies on dedicated computing infrastructure, described below.

Big Data and the LHC

The LHC uses ATLAS, one of its largest particle detectors, which is capable of recording over 10 petabytes of data yearly. If it were to keep all of the data generated by the LHC’s collisions, it would fill the equivalent of 100,000 CDs per second (CERN). “As the amount of data continues to grow we have to keep trying to optimize the rate at which we can process the data,” said Bob Jones of CERN.

While this is a commendable feat, CERN must still find effective ways to handle big data. Fortunately, it has developed the Worldwide LHC Computing Grid (WLCG) for this purpose. The WLCG is a network of more than 150 computing centers around the world: it stores the data and then distributes it around the globe for analysis.
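To give a feel for what distributing the data means in practice, here is a toy Python sketch that fans hypothetical datasets out across a handful of computing sites in round-robin fashion. The site names, dataset identifiers, and placement policy are all invented for illustration; the real WLCG uses a far more sophisticated, tiered system.

```python
# Toy illustration of spreading datasets across computing centres, in the
# spirit of the WLCG described above. Site names, dataset IDs, and the
# round-robin policy are invented; this is not how the real grid works.

from collections import defaultdict

sites = ["Site-A", "Site-B", "Site-C", "Site-D", "Site-E"]   # hypothetical centres
datasets = [f"run_{i:04d}" for i in range(20)]                # hypothetical dataset IDs

assignments = defaultdict(list)
for i, dataset in enumerate(datasets):
    # Round-robin placement: spread datasets evenly so no single centre
    # has to store or analyse everything.
    assignments[sites[i % len(sites)]].append(dataset)

for site, held in assignments.items():
    print(f"{site}: {len(held)} datasets -> {', '.join(held)}")
```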

The raw data gathered by the LHC is analyzed using batch processing. Unfortunately, batch processing is slow, and scientists will need to access very large amounts of this data far more rapidly in order to build the robust analyses that may lead them toward the moment the universe began.

CERN scientists are therefore watching the continuing expansion of cloud computing, which is revolutionizing computer storage technologies. The spread of Big Data technology has given the digital universe something like a tangible geography: cloud computing links data centers around the world to billions of distributed devices, so this profound amount of information need not be stored in a centralized location in order to be analyzed. In this time of technological advances, processing big data is becoming more feasible, even though it must still be handled by large teams. As the technology continues to grow, the discoveries at the LHC can be quantified, experimental facilities such as the LHC will continue to be built, and other complex puzzles in our universe can be explored using cloud computing and the resources of expanding technologies.
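To make the batch processing mentioned above concrete, the hypothetical Python sketch below runs a simple selection over event files in fixed-size batches; results only appear after an entire batch has been read, which is exactly the latency described here. The file layout, event format, and energy cut are invented for illustration and are not CERN's actual pipeline.

```python
# Hypothetical sketch of batch-style processing of collision events.
# File names, the one-JSON-event-per-line format, and the 100 GeV cut
# are invented for illustration; this is not CERN's analysis code.

import glob
import json

def process_batch(paths):
    """Count events in a batch of files that pass a simple energy cut."""
    selected = 0
    for path in paths:
        with open(path) as f:
            for line in f:                                  # one JSON event per line
                event = json.loads(line)
                if event.get("energy_gev", 0.0) > 100.0:    # toy selection cut
                    selected += 1
    return selected

# Batch processing: jobs run over fixed chunks of files, so no result is
# available until a whole chunk has been read end to end.
all_files = sorted(glob.glob("events/run_*.jsonl"))
batch_size = 100
for i in range(0, len(all_files), batch_size):
    batch = all_files[i:i + batch_size]
    print(f"batch {i // batch_size}: {process_batch(batch)} events selected")
```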
