
Computing

Challenges for the future

IBM develops a chip that gives new life to silicon and studies techniques that will allow computers to make decisions

One of the giants of the information technology industry, IBM, has recently issued a challenge to the scientific community, including Brazil's, and to its competitors: it wants to develop machines that behave like the human autonomic nervous system, which regulates basic bodily functions such as breathing, blood pressure, and heartbeat without any conscious intervention. "It's time to design and build computer systems capable of self-administration, adjusting themselves to variable conditions and mobilizing resources to carry out more efficiently the workload we place upon them," wrote Paul Horn, senior vice president of IBM Research, in a manifesto on autonomic computing that began circulating in the United States in October of last year.

At the same time as it organizes research toward this innovation, the company is also attending to concrete, down-to-earth problems: it forecasts, for the end of this year, the launch of the fastest chip in the world, with a processor speed above 100 gigahertz (GHz). Today, desktop computers run a little above 1 GHz. The chip, still at the prototype stage, is based on a new approach to silicon-germanium (SiGe) technology, which had already produced a processor launched last year. The new technology should guarantee the survival of the silicon chip, which, as specialists in the field warn, is approaching the limit of its usefulness. Research into replacing silicon over the next few decades involves various types of materials; one candidate poised to take its place is gold nanowires, as described in the cover story of Pesquisa FAPESP Nº 72, of February of this year.

"Until SiGe came along, I had imagined that silicon's days were numbered," observes Fábio Gandour, manager of New Technologies at IBM Brazil. "The increase in processor speed had made the circuits so dense that their temperature was approaching the material's melting point. The addition of germanium favors the alignment of the silicon atoms' structure without harming its conductivity," explains Gandour. According to him, combining the two elements yields a speed increase on the order of 35%, with a similar reduction in the heat produced; as a result, the electricity consumed by cooling systems also falls. IBM has been researching SiGe technology since 1989 and has named its most recent evolution SiGe 8HP, an evolution that will certainly contribute to the long-awaited autonomic nervous system of computers.

But to what extent autonomic computing can give intelligence to automated systems is a question whose answer depends largely on the meaning given to the word "intelligence". The company emphasizes that this is not about giving machines the capacity to think. After all, the autonomic nervous system (ANS), or vegetative system, is not a human privilege; nor, by a long shot, does that mean its workings are simple. In humans, the ANS anticipates dangerous or urgent situations before consciousness registers an actual emergency.

IBM has reorganized its entire research division, 3,200 professionals with an annual budget of around US$5 billion, around the goal of giving autonomy to information technology systems, and is supporting academic projects that might contribute to its success. "We are interested in talking with Brazilian universities and institutes that have been working on hardware, software, and network technologies compatible with the concept of autonomic computing," announces Gandour, who presented the model to an audience made up mostly of the company's Infrastructure Business clients, at a seminar held in São Paulo in February. Brazil was the third country to learn about the subject in detail; until then, it had been presented at only three events: two in the United States and one in Germany.

Famous partnerships
"We know that developing autonomic computing will be hard, and we do not intend to carry it out on our own," explains Gandour. IBM, which has already secured Microsoft's and Sun's support for the initiative, is working with several universities in a series of areas allied to the concept. The University of California at Berkeley, for example, has joined the OceanStore project, conceived according to the principles of introspective computing, which intends to give information technology systems the capacity to adapt and to keep operating in the event of server errors or system crashes. In association with Stanford University, the company is taking part in a recovery-oriented computing project, which aims to build computers capable of recovering on their own when problems occur.

The University of Bologna, in Italy, is studying peer-to-peer systems, which are characterized by decentralized control and by highly dynamic, large-scale operational environments. Such systems can be regarded as complex systems of the kind typically studied by the biological and social sciences.
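To make "decentralized control" concrete, here is a minimal sketch, in Python, of the gossip-style dissemination that characterizes peer-to-peer systems: each node knows only a few neighbors and forwards what it learns to a random sample of them, so information spreads with no central coordinator. The node structure, fan-out, and network size are illustrative assumptions, not details of the Bologna work.

```python
import random

class Node:
    """A peer that holds only a partial view of the network."""
    def __init__(self, name):
        self.name = name
        self.neighbors = []   # the few peers this node knows about
        self.known = set()    # messages this node has already seen

    def receive(self, message):
        """Accept a message and gossip it onward to a random subset of peers."""
        if message in self.known:
            return            # already seen: the epidemic stops here
        self.known.add(message)
        # Forward to at most 2 randomly chosen neighbors; local decisions,
        # global spread, no central authority involved.
        for peer in random.sample(self.neighbors, min(2, len(self.neighbors))):
            peer.receive(message)

# Build a small random network of 20 nodes, each knowing 4 others.
nodes = [Node(f"n{i}") for i in range(20)]
for node in nodes:
    node.neighbors = random.sample([n for n in nodes if n is not node], 4)

# Inject one message at a single node and let it spread.
nodes[0].receive("update-42")
reached = sum(1 for n in nodes if "update-42" in n.known)
print(f"{reached}/20 nodes received the update with no central coordinator")
```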

In Brazil, IBM, which some time ago suspended equipment donations to universities, has been moving closer to the academic world since it set up its new technologies department in October 2000. The company currently has an agreement with the Pontifical Catholic University of Rio de Janeiro (PUC-RJ) for exchanging information on new technologies, holding weekly meetings with the university's undergraduate and graduate students. "We intend to form new alliances in São Paulo and in other states," says Gandour.

The concept of autonomic computing began to take shape after the results of a survey commissioned by IBM in 1999. The study looked at the demand for skilled labor in Information Technology (IT) and reached an alarming conclusion. "We discovered that, if the complexity of systems keeps growing at the same rate as over the last twenty years, within the next decade the demand for specialists could approach 200 million people, close to the total population of the United States," says Gandour.

Self-administering the workload
From the point of view of the demands placed on autonomic systems, the first requirement is the Socratic maxim: know thyself. Only by knowing itself can a system administer itself, reacting to increases or decreases in workload and, ultimately, avoiding and even repairing localized errors. The second critical point is the ability systems must have to configure and reconfigure themselves, adapting to changes in the environment. This may require what might be called software cloning: creating multiple images of programs, such as an operating system, and reallocating memory, storage, and communication bandwidth according to the needs that arise, as sketched below.
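As an illustration of this self-configuration idea, the sketch below shows a toy "autonomic manager" that observes a fluctuating workload and clones or retires worker images to match it. The class names, the per-clone capacity, and the simulated load are assumptions made for the example, not IBM's actual design.

```python
import random

class Worker:
    """Stand-in for a cloned program image (e.g., a service instance)."""
    CAPACITY = 100  # requests per tick that one clone can absorb (assumed)

class AutonomicManager:
    def __init__(self):
        self.workers = [Worker()]   # start with a single image

    def observe_load(self):
        # A real system would read this from instrumentation;
        # here we simulate a fluctuating request rate.
        return random.randint(50, 600)

    def reconfigure(self, load):
        """Clone under pressure, retire idle clones when load drops."""
        needed = max(1, -(-load // Worker.CAPACITY))  # ceiling division
        while len(self.workers) < needed:
            self.workers.append(Worker())             # software cloning
        while len(self.workers) > needed:
            self.workers.pop()                        # release resources
        return needed

manager = AutonomicManager()
for tick in range(5):
    load = manager.observe_load()
    n = manager.reconfigure(load)
    print(f"tick {tick}: load={load:3d} -> {n} worker clone(s)")
```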

Among the important characteristics of autonomic systems is the capacity to forecast and correct errors, falling back on redundant or underused elements to keep operations running while identifying the root cause of the problem. "In the beginning, autonomic systems will be able to self-repair by following rules created by specialists," forecasts Horn. "However, as soon as we manage to build in more intelligence, they will begin to discover new rules on their own, rules that lead them to use redundant or additional resources for their recovery and to comply with their main purpose: achieving the goals specified by the users."
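A minimal sketch of the rule-based self-repair Horn describes might look like the following: rules written by specialists map an observed symptom to a recovery action, and the system falls back on a redundant standby to stay in service. The components, the health probe, and the single rule are hypothetical illustrations, not code from IBM.

```python
def probe(component):
    """Health check; a real system would actually ping the component."""
    return component["healthy"]

# Rules created by specialists: symptom -> recovery action.
RULES = {
    "primary_down": lambda state: state.update(active=state["standby"]),
}

system = {
    "primary": {"name": "db-primary", "healthy": False},  # simulated fault
    "standby": {"name": "db-standby", "healthy": True},   # redundant element
}
system["active"] = system["primary"]

# The self-repair loop: detect the symptom, apply the matching rule,
# and keep the main goal (staying in service) intact.
if not probe(system["active"]):
    print(f"fault detected in {system['active']['name']}; applying rule 'primary_down'")
    RULES["primary_down"](system)

print(f"service now running on {system['active']['name']}")
```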
