
Physics

High Energies

The tools Brazilian researchers are using to take part in the gigantic study of subatomic particles

Among the grandiose, complex and costly scientific experiments currently under way, alongside space exploration and the genome projects, is the study of the smallest particles that make up every kind of matter. Paradoxical as it may seem, the quarks and leptons that make up protons, neutrons, atoms, indeed everything that exists in the Universe, demand huge instruments known as particle accelerators, as well as advanced computer systems with high capacity for transmitting and storing data. Only genuine international cooperation, of the kind involved in the construction of the space station or in the transcription of genes, can set the scene for obtaining new knowledge about the formation and interaction of these particles. This collaboration also counts on the participation of two Brazilian research groups, one from São Paulo State University (Unesp) and the other from the State University of Rio de Janeiro (Uerj).

They are completing a complex computer network that will bring together the equivalent of 380 computers working in concert, over the fastest internet connection in the country. All of this so that they can be linked to the world's two largest and most important particle accelerator laboratories: Fermilab, short for Fermi National Accelerator Laboratory, located close to Chicago, in the United States, and CERN, the European Organization for Nuclear Research, headquartered in Geneva, Switzerland. The experiments should generate 100 million gigabytes (GB) of data over the next ten years, equivalent to the total capacity of 2.5 million hard discs of 40 GB each, the size most widely used in current computers.
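
As a rough check, the disc comparison works out as follows (a quick sketch using only the figures quoted above):

```python
# Back-of-the-envelope check of the data-volume comparison above.
total_data_gb = 100_000_000   # 100 million gigabytes over ten years
disc_gb = 40                  # the 40 GB hard disc cited in the text

print(f"{total_data_gb / disc_gb:,.0f} discs")   # 2,500,000 -> 2.5 million
```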

Latest generation
The São Paulo Regional Analysis Center (Sprace) has already been set up in the city of São Paulo. It contains 114 central processing units (CPUs), or data processors, and is being completed, with funding from FAPESP, with a further 64 CPUs, for a total equivalent to 178 latest-generation CPUs all working in parallel. The equipment is being installed at the Physics Institute of the University of São Paulo (USP), in the state capital, under an agreement with Unesp, which places professor Sérgio Ferraz Novaes, of the Theoretical Physics Institute (IFT), as the project's coordinator. Novaes, who worked at Fermilab for two years, from 2000 to 2002, leads a team of four researchers: Eduardo de Moraes Gregores, Sérgio Morais Lietti and Pedro Galli Mercadante, from the IFT, linked to the Young Investigators program financed by FAPESP, and Rogério Luiz Iope, a post-graduate student at USP's Polytechnic School. For data storage, the Sprace team has discs with a capacity of 12 terabytes (TB), equivalent to 12,000 gigabytes, or more than 17,000 CDs.
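
A quick tally of the figures above, assuming the 700 MB CDs the article itself uses further on:

```python
# Tallying the Sprace figures given above.
cpus = 114 + 64
print(cpus)                                # 178 CPUs working in parallel

storage_gb = 12_000                        # 12 TB of disc space
cd_gb = 0.7                                # one 700 MB CD
print(f"{storage_gb / cd_gb:,.0f} CDs")    # ~17,143, i.e. more than 17,000
```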

In Rio de Janeiro, 200 CPUs, with 7 TB of disc space, were installed under the supervision of professor Alberto Santoro, of Uerj, together with 20 researchers from the Brazilian Center for Physics Research (CBPF) and the Federal University of Rio de Janeiro (UFRJ), in collaboration with the Federal University of Bahia (UFBA), the Federal University of Rio Grande do Sul (UFRGS) and the Celso Suckow da Fonseca Federal Center for Technological Education (Cefet/RJ). The project is funded by the Funding Authority for Studies and Projects (Finep) of the Ministry of Science and Technology (MCT). Santoro, a veteran in this type of research, has worked with the two major particle accelerator laboratories for more than twenty years. In 1995, he was part of the team at Fermilab that discovered the top quark, the last of the six quarks predicted by the theory that describes the elementary particles and their interactions.

Sprace and the group from Rio de Janeiro form the High Energy Physics (HEP) Grid Brazil, which is currently working on experiments at both CERN and Fermilab. The two institutions recently adopted one of the great innovations that informatics produced in the 1990s: the grid system, a computing model that is becoming an ever more common tool for processing scientific data. In the grid concept, several computers are connected at the same location, forming groupings, also called clusters, which can in turn be linked to other groups of computers located in a nearby building or on the other side of the planet.
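
A minimal sketch of this idea, with illustrative cluster names and capacities, might look like the following: a job simply runs on whichever cluster has enough idle processors, wherever that cluster happens to be.

```python
# A minimal sketch of the grid idea described above; cluster names and
# capacities are illustrative, not the real configuration.

class Cluster:
    def __init__(self, name, cpus):
        self.name = name
        self.free_cpus = cpus

class Grid:
    def __init__(self, clusters):
        self.clusters = clusters

    def submit(self, job, cpus_needed):
        # Take the first cluster with enough free capacity at this moment.
        for cluster in self.clusters:
            if cluster.free_cpus >= cpus_needed:
                cluster.free_cpus -= cpus_needed
                return f"{job} -> {cluster.name}"
        return f"{job} queued: no free capacity"

grid = Grid([Cluster("Sprace (São Paulo)", 178), Cluster("Uerj (Rio)", 200)])
print(grid.submit("reconstruct collision events", 150))  # -> Sprace
print(grid.submit("simulate detector response", 100))    # -> Uerj
```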

In the United States, the National Science Foundation (NSF) will invest US$ 150 million over the next few years to link the country's entire scientific and engineering community in the form of a grid. Called TeraGrid, the system offers a series of hardware and software resources that are beginning to be used in decoding genomes and proteins, diagnosing illnesses and forecasting weather and earthquakes. In Europe, the German government announced in September an investment of 17 million euros in a national grid-based infrastructure. The D-Grid network will allow researchers across the country to carry out complex scientific experiments at a distance. High-energy physics, which studies the particles produced in accelerators, is included in this group, alongside Earth observation, astronomy, medical research and engineering applications. The grid system should thus supersede the concept of the supercomputer, an expensive piece of equipment with little flexibility to increase or decrease its processing capacity. In a grid, one simply adds or removes one or more computers.

“In the grid system everything works in an automatic and transparent manner, with tasks being directed to whichever clusters have free processing capacity at any given moment”, explains Novaes. Everything runs on open source software, so that each group can also contribute to improving the system. The high-energy physics grid to be used at CERN, when the new accelerator opens in 2007, has a hierarchical architecture. It starts from a central unit, called Tier 0, located in the laboratory building, from which the data will be distributed via high-speed networks to various national level-one centers (Tier 1). From each Tier 1, the data will be distributed to the level-two centers (Tier 2) directly associated with it, and from there to level-three centers. HEPGrid Brazil falls into the level-two category, i.e. Tier 2. “The evolution of our work, together with further investment, will permit, within a few years, the transformation of our group into a Tier 1”, professor Santoro believes.
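
The tier hierarchy can be pictured as a simple tree down which data cascades; in the sketch below, the attachment of HEPGrid Brazil to a particular Tier 1 is purely illustrative.

```python
# A toy model of the Tier 0/1/2 hierarchy described above. Which Tier 1
# HEPGrid Brazil attaches to is hypothetical here.

class TierCenter:
    def __init__(self, name, level):
        self.name = name
        self.level = level
        self.children = []

    def attach(self, child):
        self.children.append(child)
        return child

    def distribute(self, dataset):
        # Each center receives the data and forwards it one level down.
        print(f"Tier {self.level} {self.name} <- {dataset}")
        for child in self.children:
            child.distribute(dataset)

tier0 = TierCenter("CERN", 0)
tier1 = tier0.attach(TierCenter("a national Tier 1 center", 1))
tier1.attach(TierCenter("HEPGrid Brazil", 2))
tier0.distribute("LHC collision data")
```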

Essential speed
The connection with the particle accelerator laboratories demands excellent communication among the various groups spread around the world. To this end, transmission is done entirely through optical fiber cables. In the laboratories of both USP and Uerj, data will be received and sent via optical fiber, reaching even the United States by way of underwater cables. After the inauguration of the new CERN accelerator, the Large Hadron Collider (LHC), transmission of at least 2.5 gigabits per second (Gbps) is forecast for the next few years. “Compared with the commercial broadband connection of 256 kilobits per second (Kbps), we can say that the researchers will be transmitting at a speed 10,000 times faster, or that the quantity of data that takes one second over the CERN link would take some three hours to send over common broadband”, calculates Luis Fernandez Lopez, coordinator of FAPESP's Information Technology in the Development of the Advanced Internet (Tidia) program.
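
Lopez's comparison can be verified with the figures quoted in the text:

```python
# Checking the speed comparison quoted above.
cern_link_bps = 2.5e9       # 2.5 Gbps research link
broadband_bps = 256e3       # 256 Kbps commercial broadband

ratio = cern_link_bps / broadband_bps
print(f"{ratio:,.0f} times faster")   # ~9,766, roughly the quoted 10,000

hours = ratio / 3600                  # 1 s of link traffic replayed on broadband
print(f"{hours:.1f} hours")           # ~2.7, i.e. "some three hours"
```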

Today, all the communication leaving the Sprace laboratory travels as laser light through optical fibers encapsulated in underwater cables running to Miami. Sprace currently operates at 622 megabits per second (Mbps), although it is already equipped to connect at 2.5 Gbps. “Shortly, with the start of LHC operations, this transmission speed will be essential for the continuity of research in this area”, says Novaes.

Future demands have led FAPESP, which maintains both the São Paulo Academic Network (Ansp) and Tidia, to sign an agreement with the NSF to finance the Western Hemisphere Research and Educational Network (WHREN) project – a network that will link researchers across the whole American continent – which should come into operation in December of this year through an optical cable linking São Paulo, Miami and New York at 2.5 Gbps. This connection will serve research needs in high-energy physics as well as other laboratories. For WHREN, FAPESP put up US$ 1 million and the NSF matched the funding. The link with New York will open the way to speeds of 10 Gbps within the United States and a connection with Europe at 40 Gbps. In Rio de Janeiro, professor Santoro's group already has a 1 Gbps transmission link to São Paulo through an experimental network maintained by the National Education and Research Network (RNP) of the MCT and also funded by the Carlos Chagas Filho Foundation for Research Support of the State of Rio de Janeiro (Faperj). The RNP is also studying an external connection of 10 Gbps under an agreement with the Clara network, the Latin American Cooperation of Advanced Networks. By the end of the year, a link at the same speed between São Paulo and Rio de Janeiro will benefit all of these research institutions.

The whole structure that makes possible the search for knowledge about the most intimate parts of matter begins to function when two particles, two protons for example, are made to collide inside an accelerator. It is as if two objects were accelerated to high speed inside a metal ring and met and annihilated each other in view of a detector filled with sensors that work much like photographic cameras. The destruction produces a pile of fragments, in this case particles. What the researchers do is analyze the type of the particles produced, the curvature of their trajectories and the energy they deposit in the detectors.

In the world of particles there are behaviors that are “strange” compared with our macroscopic world, such as the fact that some are made up of even smaller particles and can transform themselves into other types of particle, something unthinkable in everyday experience. One of the researchers' problems is that some particles are rare, which is why the experiments require billions of collisions to be studied. “In circular accelerators the protons used in the experiments are accelerated by radio frequency and by superconducting electromagnets installed around the ring”, explains professor Novaes. “The particles ride the crests of electromagnetic waves just like surfers”, adds Santoro.

The researchers’ work is to identify both the rare and the more common particles in the information captured by the sensors after the collisions. “We’ve received from Fermilab and CERN a set of data to identify the particles, carry out the analysis of interactions and present conclusions”, explains professor Novaes. There are even particles predicted by theory and as yet undetected, such as the Higgs boson, a particle that could be responsible for the mass of all the others. Another possibility is the experimental verification of alternative models, such as the supersymmetric models, with new particles that have been given names like gluinos, squarks and sleptons, or the models that predict the existence of extra dimensions.

All these efforts have resulted in research into the formation and interaction of particles, but they have also contributed to the understanding of the Universe and the stars, and to a series of insights that have been turned into everyday technologies. One of them was the creation of the World Wide Web, the well-known www. It was precisely to make interaction easier among the researchers working with particles in various countries that the CERN researcher Tim Berners-Lee created the web system, in which clicking on a link, for example, is enough to access information. Scientists had previously exchanged information by computer, but in a far more cumbersome way, as Santoro, one of the first Brazilians to use the www system, points out.

Images from positrons
In the area of medicine, the study of high-energy physics has led to improvements in the treatment of tumors by particle beams and to positron emission tomography (PET), which is based on the emission of positrons (or anti-electrons, particles with the same mass as the electron but with a positive charge) to offer high-definition images of the interior of the human body. The technological advance of integrated circuits for data acquisition and processing, and the use of optical fibers, which would later become widespread in telecommunications, also owe much to the study of particles.

The gains in scientific knowledge, along with the incentive they give to technological innovation, have made particle accelerators expensive experiments. The running of Fermilab alone consumes US$ 1 million per day, supplied by the US Department of Energy and administered by an association of universities. Until the European LHC comes into operation, the Tevatron at Fermilab remains the largest particle accelerator in existence. Its collision ring has a circumference of 6.3 kilometers (km) and a radius of 1 km. The Brazilian group works with data from one of its two detectors, Dzero. Structured like a five-floor building and some twenty meters long, it weighs more than 5,000 tons and has more than 800,000 electronic readout channels. Besides the United States and Brazil, the scientific collaboration brings together 18 countries, among them Canada, the United Kingdom, Argentina, South Korea, China, France, Russia, the Netherlands and Germany. In all, there are 83 institutions, 36 of them in the United States, bringing together 664 physicists, almost half of them in North America.

The era of the Exabyte
With the start of LHC operations in 2007, the number of researchers involved in this area around the world should rise. In just one of the four LHC detectors, the Compact Muon Solenoid (CMS), which will produce every second a mountain of data equivalent to 10,000 copies of the Encyclopaedia Britannica, more than 2,000 researchers from 165 institutions in 36 participating countries are already at work. CERN's researchers will operate with the equivalent of 50,000 interlinked computers to process the information that will be generated. Over a period of five to eight years, the laboratory will usher in the era of the exabyte (EB), producing 1 EB of digital data, or 1 quintillion bytes. If it were possible to store this fabulous crop on CDs, which have a capacity of 700 megabytes (MB), we would need close to 1.4 billion discs. By another comparison, this 1 EB would be equivalent to 20% of all the information generated in the world during 2002, on the internet and in magazines, newspapers, books and films.
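
The CD comparison again checks out numerically:

```python
# The exabyte-to-CD comparison above, checked with the article's figures.
exabyte = 1e18                  # 1 EB = 1 quintillion bytes
cd = 700e6                      # 700 MB per CD

print(f"{exabyte / cd / 1e9:.2f} billion CDs")   # ~1.43, close to 1.4 billion
```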

The CMS is the detector with which the Brazilian researchers linked to HEPGrid Brazil will be working. Three other detectors, Atlas, Alice and LHCb, will also have the collaboration of Brazilian researchers, from the CBPF, UFRJ and USP. Within its 27-km circumference, CERN's LHC will produce collisions with seven times more energy than those of Fermilab's Tevatron: the North American accelerator operates at 2 teraelectron volts (TeV), that is, 2 trillion electron volts, while the LHC will work at 14 TeV. As a text on the CERN site states: “A TeV is a unit of energy used in particle physics. 1 TeV is about the energy of motion of a flying mosquito. What makes the LHC so extraordinary is that it squeezes energy into a space about a million times smaller than a mosquito.”
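
For readers who want the mosquito comparison in everyday units, a short conversion using the standard value 1 eV = 1.602e-19 joules:

```python
# Converting the collision energies above into joules.
EV_TO_J = 1.602e-19            # one electron volt in joules

tev_in_ev = 1e12               # electron volts per TeV
print(f"1 TeV  = {tev_in_ev * EV_TO_J:.1e} J")        # ~1.6e-7 J, a flying mosquito
print(f"14 TeV = {14 * tev_in_ev * EV_TO_J:.1e} J")   # the LHC's collision energy
print(14 / 2)                                         # 7x the Tevatron's 2 TeV
```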

The projects
1. Experimental physics by way of ring collisions: SP-Race and HEP Grid-Brazil (nº 03/04519-9); Modality: Thematic Project; Coordinator: Sérgio Ferraz Novaes – Unesp; Investment: R$ 709,342.00 (FAPESP)
2. Experimental physics of high energies: the Fermilab Dzero experiment and the CERN CMS experiment (nº 04/06708-6); Modality: Young Investigators in Emerging Centers Program; Coordinator: Eduardo de Moraes Gregores – Unesp; Investment: R$ 73,963.15 (FAPESP)
