Physicists who study how atomic particles form and are organized have leapt ahead of experts in other fields and embraced a new way of doing science: they work on major shared problems using powerful computers located in many cities around the world. These computers are linked so that they function as a single machine, on a wider and more integrated scale than has so far been achieved in studies of genomes and proteins. This pioneering attitude was perhaps not deliberate. “This didn’t occur because we wanted it to, but because we needed it,” says Sergio Ferraz Novaes, a professor at the Physics Institute of Paulista State University (Unesp). “It cannot take us 50 years to analyze the data produced in a single year of work.”
Novaes coordinates the São Paulo State branch of an international computer network that filters and organizes the results of atomic collisions in particle accelerators on a scale so huge that no single computer could handle it alone. Thanks to the São Paulo Regional Analysis Center (Sprace), built with a R$ 710 thousand grant from FAPESP and housing processing capacity equal to that of almost one hundred computers, physicists from São Paulo have taken part since 2004 in analyzing the properties of millions of particles that are born or die when they collide at extremely high speeds in the tunnels of Fermilab, in the United States. Now two teams of physicists – one from São Paulo and another from Rio de Janeiro, coordinated by Alberto Santoro – are fine-tuning their machines and catching their breath to take part in an even bigger adventure: collecting the information expected, as of next year, to arrive in even larger volumes from the Large Hadron Collider (LHC), the world’s biggest particle accelerator, which involves 10 thousand physicists and engineers from approximately 50 countries (see Pesquisa FAPESP nº 147, May 2008).
In addition to generating substantial scientific production – in any one year, dozens of articles published in journals, not to mention countless sleepless nights spent in front of the computer – the experience of working with colleagues from all over the world on machines that run day and night inspired the implementation of an even bigger structure at Unesp, with 368 computers able to perform an impressive 33.3 trillion calculations per second. The machines of this R$ 4.4 million network, financed by the federal government, are due to be installed from July onward, as they arrive, and will take up an entire floor of the new Unesp building in São Paulo City’s Barra Funda neighborhood. The facility will include a training center and will house the operations team. Little by little, connections with computers at dozens of universities in the United States, Europe, China and Australia will begin to take shape, as all of these places have already adopted similar work strategies. Thus, strictly speaking, nothing will keep a team from the Unesp campus on Ilha Solteira from asking colleagues at Harvard whether they have spare computing capacity to help solve a problem that has overloaded the local machines. “If we want to keep up with what’s going on in the rest of the world,” says Novaes, “we cannot be provincial or narrow-minded.”
This is what e-Science is all about: it does not matter where you are or which computers are refining and examining the data of your valuable experiment. Coined in 1999 to describe a project that would take shape in England the following year, the term e-Science covers scientific activities that depend on huge storage and information-processing capacity, such as particle physics, though other fields of research can benefit from it as well. In the book Da internet ao grid – A globalização do processamento (From the Internet to the grid – the globalization of processing), Novaes and Eduardo Gregores bet on this expansion: “We can expect that, just like the Internet, the grid’s applications will go way beyond anything we can imagine at present.” The Unesp computer network, incidentally, is expected to explore other worlds – from the formation of tumors to the superconductivity of ceramics. In the United States, computer grids support ambitious projects, ranging from urgent problems to the search for new cancer treatments. The objective of one of these grids – the National Virtual Observatory – is simply to put into the computer all the information ever collected on millions of stars and galaxies in the skies. e-Science could go beyond this and help solve global problems, according to the editorial in the March 15 issue of Nature, which proposes that governments work together to build supercomputers that can provide more reliable weather forecasts and discover how to prevent the catastrophes that climate change may bring.
Perhaps more than experts in any other field, particle physicists today depend on networks of powerful computers as much as taxi drivers depend on the Global Positioning System (GPS). Otherwise it would be impossible to analyze all the available data – just as it would be impossible to find addresses quickly in an unknown city. Over the next five years, the four detectors of the LHC are expected to generate a volume of information equivalent to 1.4 billion CDs. If these were piled up, “without the packaging,” says Novaes, they would form a tower four thousand times higher than the Sugar Loaf, Rio de Janeiro’s most famous landmark. “This huge volume of information is itself a problem that dictates the need for new work concepts,” says Novaes. Luckily, in recent years data transmission speed has increased at a faster pace than processing speed, leading to a new form of computer organization, referred to as a grid, in which distant machines work as one. In addition to software and hardware, there is now middleware, which distributes tasks by singling out machines that are not busy.
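Novaes’s comparison can be checked with a back-of-the-envelope calculation. The disc thickness (1.2 mm) and the height of Sugar Loaf (about 396 m) are standard figures assumed here, not given in the text:

```python
# Back-of-the-envelope check of the CD-tower comparison.
# Assumptions (not from the article): a CD is ~1.2 mm thick,
# and Sugar Loaf rises ~396 m above Guanabara Bay.
cds = 1.4e9                 # CDs' worth of LHC data over five years
cd_thickness_m = 1.2e-3     # one unpackaged disc, in meters
sugar_loaf_m = 396          # approximate height of Sugar Loaf

tower_m = cds * cd_thickness_m     # height of the stacked discs
ratio = tower_m / sugar_loaf_m     # how many Sugar Loafs tall

print(f"tower: {tower_m / 1000:.0f} km, about {ratio:.0f} Sugar Loafs")
```

The stack comes out at roughly 1,680 km, a little over four thousand Sugar Loafs, which matches the figure quoted in the article.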
There is a hierarchy among the machines. Information on particles that are broken apart or formed on impact must come out of the detectors surrounding the LHC’s 27-kilometer circular tunnel, located 100 meters underground, and first reach the computers of the European Nuclear Research Center (CERN) in Geneva, Switzerland. The data from the Compact Muon Solenoid (CMS), the LHC detector on which the teams from Rio and São Paulo work, then travels to computer centers called Tier-1, spread over eight countries, and on to another 23 computer groups around the world – from Brazil to Pakistan – that make up the Tier-2 centers. “We’re doing all right,” Novaes pointed out, comparing the performance of the Brazilian grid with that of his colleagues’ machines in China, Italy, England and the United States. The entire group has participated in data transmission simulations, with visible progress: the operating capacity of the machines rose from 20% in 2006 to 50% in 2007, and the objective now is to reach 100% of what will be needed when the LHC goes into operation. The difficulties are also greater. Novaes became familiar with the new problems that can interrupt data transmission when he skimmed the 350 e-mails that had arrived on the eve of a holiday in late May, during yet another simulated transmission of data from the LHC. “Everybody communicates with everybody else,” he says. “Collaboration is crucial now, because if one fails, all fail.”
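The tiered flow just described can be pictured with a toy sketch. The site names below are illustrative only; the real CMS topology has Tier-1 centers in eight countries feeding 23 Tier-2 groups, Sprace among them:

```python
# Toy sketch of the Tier-0 -> Tier-1 -> Tier-2 fan-out described above.
# Site names are illustrative, not the actual CMS grid configuration.
def fan_out(topology):
    """Return every site that receives a copy of a dataset, starting
    at CERN (Tier-0) and cascading down through the tiers."""
    sites = ["CERN (Tier-0)"]
    for tier1, tier2_list in topology.items():
        sites.append(tier1)          # each Tier-1 pulls data from CERN
        sites.extend(tier2_list)     # each Tier-2 pulls from its Tier-1
    return sites

topology = {
    "Fermilab (Tier-1)": ["Sprace (Tier-2)", "UERJ (Tier-2)"],
    "RAL (Tier-1)": ["Imperial (Tier-2)"],
}
print(fan_out(topology))
```

The point of the hierarchy is that no single link from CERN must carry the full load: each Tier-1 redistributes to its own Tier-2 centers, which is also why a failure at one level can ripple outward – “if one fails, all fail.”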
Physicists built this global research environment and the enormous caverns of the LHC to find experimental evidence of an atomic particle that so far exists only in theory, the Higgs boson. Bosons are particles that transmit forces or hold other particles together; Higgs is the surname of the British physicist who predicted this particle in 1964. If it is indeed identified, the Higgs boson may explain why the elementary particles of matter have such different masses (the mass of a neutron, one of the particles that form the atomic nucleus, is roughly 1,800 times that of the electron orbiting the nucleus).
So many people and so much work are explained by the fact that the Higgs boson might be the missing link needed to complete the atomic particle puzzle. At the beginning of the last century, only one particle – the electron – was known. Evidence soon appeared of the existence of the atomic nucleus, with bigger particles; by the early 1950s, physicists had already identified dozens of them. “It was chaos,” says Novaes. “The particles were not organized.” Little by little, the physicists unraveled the forces that bind particles and atoms, but this was still not enough. When the particle accelerators began to function and revealed even deeper levels of matter, the physicists realized that the entire particle zoo of the 1950s could be organized by means of just three particles – the up, down and strange quarks. Three more quarks – charm, bottom and top – were discovered soon afterwards. These six quarks, combined into quark-antiquark pairs or into trios, make up all the particles subject to one of the basic forces of nature – the strong interaction that holds the atomic nucleus together. New particles kept appearing with strange names, unfamiliar to most people, such as kaon, eta, chi, lambda, sigma and J-psi. However, these were no longer hundreds of independent particles, but merely rearrangements of the same basic elements. Categories now exist beyond isolated particles: the protons and neutrons that form the atomic nucleus, for example, belong to the family called hadrons (from the Greek for strong or bulky). The nucleus itself lost its hypothetical tranquility and turned out to be a stormy environment, with clouds of particles appearing and disappearing all the time around protons and neutrons.
The LHC may also shed some light on dimensions beyond the known ones (three spatial – length, width and height – and one temporal); nobody has yet proven that such extra dimensions do not exist, and some physicists need them to sustain their theories. Nevertheless, Novaes thinks there is more. “I hope that different things will stem from the LHC, things that can lead us to other challenges,” he says. “It might be that what emerges is entirely new, with no links to current theoretical proposals.” As the results are unpredictable, things even more important than a new explanation of the Universe might surface. In 1990, British computer scientist Tim Berners-Lee created a computer language to make life easier for the people working at the European Nuclear Research Center (CERN), never dreaming that his invention – hypertext, the basis of the World Wide Web – would be crucial for the expansion of the Internet.
Physicists have been chasing the Higgs boson for many years. Novaes himself, as a master’s student back in 1979, studied one of the mechanisms that produce this particle through collisions between protons. “What was a problem then is still on the agenda, which shows how hard it has been for particle physics to progress over the last three decades.” But e-Science is expected to help solve this conundrum. “e-Science is open and fast and represents another way of doing science,” says Novaes. “We have to think of another way of being daring.”