MARCOS GARUTI

Sectors of the academic community received with criticism the changes made to the Qualis System, the tool used by Capes, the Coordinating Office for the Training of Personnel with Higher Education, to classify the periodicals in which graduate programs publish their scientific production. Whereas the preceding categorization divided the vehicles according to their circulation (local, national and international) and their quality (A, B and C), the new scale has eight levels (A1, A2, B1 to B5 and C), with level C carrying zero weight. The quality of production will now be measured primarily by the impact factor (FI) of the periodicals, regardless of their circulation scope. The FI, used as an evaluation tool since the 1960s, aims to measure the scientific impact of a publication by counting how often its articles are cited in other published material.
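The two-year impact factor the article refers to is a simple ratio: citations received in a given year to a journal's articles from the two preceding years, divided by the number of citable items published in those two years. A minimal sketch, using hypothetical counts (real figures come from the ISI-Thomson citation database mentioned later in the article):

```python
def impact_factor(citations_this_year, citable_items_prev_two_years):
    """Two-year impact factor: citations in year Y to articles published
    in years Y-1 and Y-2, divided by the citable items from Y-1 and Y-2."""
    return citations_this_year / citable_items_prev_two_years

# Hypothetical journal: 300 citations in 2009 to its 2007-2008 articles,
# which numbered 150 citable items in that window.
fi = impact_factor(300, 150)
print(round(fi, 2))  # 2.0
```

This is why, as the critics below note, a journal with irregular periodicity or a small, specialized readership can end up with a low FI regardless of the quality of individual articles.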
The common element in the criticisms is the exaggerated weight ascribed to the impact factor on the Capes classification and its effects upon poorly evaluated publications – it is foreseeable that they will start being avoided by researchers, which will make it even harder for them to become consolidated. The strongest reaction came from the chemistry area. In November 2008, the Forum of Coordinators of Postgraduate Programs in Chemistry met in the city of Ribeirão Preto (inner-state São Paulo), with an agenda that included discussion of the evaluation of the courses. After two days of debates, a motion was released that considered the new Capes classification “inadequate.” In its first issue of 2009, the journal Química Nova, of the Brazilian Chemistry Society, backed the forum’s position. An editorial signed by two researchers in the area who are among the most productive in Brazil, Fernando Galembeck, from Campinas State University (Unicamp), and Jailson Bittencourt de Andrade, from the Federal University of Bahia (UFBA), while emphasizing the importance of the evaluation, voiced discontent with the changes made to Qualis.
According to the editorial, the exaggerated value ascribed to the visibility and international reach of periodicals, as measured by the FI, encourages artificial differentiation among publications in different sub-areas of the same subject. “If one considers a specific field such as chemistry, the comparability of periodicals based on the impact factor (FI) has serious biases. For instance, the periodical Inorganic Chemistry is unlikely to have a higher FI than Analytical Chemistry, which does not mean that there is a lack of excellent articles in the two disciplines. It is simply that the visibility of analytical methodological innovation in other disciplines is greater than that of inorganic chemistry,” wrote Galembeck and Andrade.
According to Andrade, Capes should make its evaluation scheme more sophisticated. “The Brazilian evaluation system has been very important in raising the quality of postgraduate work and I’m entirely in favor of it,” states the professor. “But the impact factor is not an absolute measure of quality. One must create a more complex system, which ascribes a greater weight to other indicators, such as, for instance, what happens to those who complete postgraduate programs. There are programs with PhD students who are unable to pass the entrance exams to become government university professors. This indicator says more about quality than the number of papers published,” he states.
Lívio Amaral, the Capes evaluation director, says that the change was required because the scale previously used had lost its capacity to discriminate among postgraduate programs. “One must always stress that the evaluation of the graduate programs takes into account a number of indicators and that Qualis Periodicals applies to only one of them, the one that concerns intellectual production,” says Amaral. He argues that the stratification was carried out by each of the 47 coordinating areas of Capes and that not all of them work with the FI. He also reminds us that this measure is assigned through procedures independent of the agency. “As we know, for a given journal to have an impact factor, it must be indexed and analyzed in the ISI-Thomson system. Once an impact factor is ascribed, it is one and the same, whether in Brazil or in any other country in the world,” he stated.
The Química Nova editorial reverberated among the other so-called hard sciences. In Boletim da Sociedade Brasileira de Física, Silvio Salinas, a professor at the Physics Institute of the University of São Paulo, endorsed the criticism. “Frankly, I think we have reached a dangerous limit, one that puts at risk the very concept of evaluation. For example, what does it mean to classify Physical Review D as A2 and Physical Review B as B1? Are the experts on particles, by any chance, better than the experts on solids?” asks the professor. “I think it is time to think about this numerology, which little by little is gaining strength in all areas, driven by the facilities that IT provides and by a vague notion of globalization. Is numerology really going to replace peer reviews?” says Salinas.
The USP professor touches upon a key point, namely, the increasing use of bibliometric criteria as an evaluation tool. They have been replacing more complex systems, such as peer review, in which experts from the academic community assess the quality of a given piece of work, as FAPESP does when it evaluates the research projects it finances. This discussion, however, is international. The United Kingdom, for instance, plans to replace its Research Assessment Exercise (RAE), which periodically evaluates the quality of its universities’ research, drives the distribution of funding, and is essentially based on peer review. A simpler, cheaper system is expected to take its place, one that will rely more on bibliometric criteria, such as the impact factor, but that will nonetheless maintain a good measure of peer review (see Pesquisa FAPESP issue 156). The new system is still being designed and is sparking controversy in the British academic community. At the beginning of this year, the editors of international publications in the humanities and the social sciences released a manifesto against a European Union proposal that sought to classify them into three scales, according to their impact factor and dissemination. The scale idea was eventually abandoned (see Pesquisa FAPESP issue 157).
Rogério Meneghini, the scientific coordinator of the SciELO Brazil electronic library, explains that the increasing use of bibliometric indicators is due to the need to assess scientific production that is growing rapidly. “The impact factor is a suitable tool for evaluating the academic production of postgraduate groups, because even the upward or downward deviations that might arise in the evaluation of each researcher are balanced out when one looks at the entire group,” he states. Livio Amaral, from Capes, adds: “Using Qualis Periodicals is totally inappropriate for evaluating researchers. It is designed to analyze postgraduate programs, not individual researchers.”
A major risk of the Qualis change, according to the critics of this measure, is the rejection of important publications that were poorly evaluated for reasons that range from irregular periodicity to a shortage of resources. It is reasonable to assume that they will be less sought-after by leading researchers and that they will fall into a downward spiral with a loss in prestige. Zoologist Miguel Trefaut Rodrigues, a professor at USP’s Biosciences Institute, provides some examples in his field. “I can’t accept the idea that traditional publications that have high penetration and international respect, such as Arquivos do Museu Nacional, Boletim do Museu Nacional, Arquivos de Zoologia, Boletim do Museu Goeldi and certain new publications with top quality editorial staff are listed as B5, alongside leaflets and dissemination rags produced by amateurs,” said Trefaut. He recalls that some of these publications do not have an impact factor because their publication is irregular, but they have had and continue to have great importance for Brazilian scientific development.
Irregular periodicity, he explains, is due largely to a shortage of funds, but these publications did inform the international community about the progress of knowledge, covering things such as the description of new species, a stage that no longer exists in developed nations whose fauna is entirely known. “It reflects a complete lack of vision of the reality of Brazilian science. We are not in the developed world. If today we have excellent zoological libraries with full collections of foreign periodicals at the USP Zoology Museum, it is because our own journals served as the trading currency for exchanges with international publications. This, and the high regard they still enjoy, reveals how important they have been for the progress of science in Brazil. And we’re going to throw this in the rubbish?”
Science sociologist Lea Velho, a professor at Campinas State University, highlights the impact that the loss of prestige in such publications might entail. “In most fields of knowledge, there is a range of paradigms and publications about them – a Marxist economist does not publish papers in periodicals with a neoclassical orientation, for instance. The risk is eliminating the voice of minority paradigms and demanding that everything be published in the so-called mainstream vehicles. This is crazy for science,” she states. According to her, the Capes criteria particularly affect multidisciplinary areas. “The discourse is that multidisciplinarity is the way to go, because it deals with the problems of real life. But researchers in such fields publish in periodicals that are new and not quoted much, because their communities are still small and embryonic. The Capes bias is notably disciplinary: whoever wants a good evaluation will have to publish in consolidated fields,” states Lea Velho. She raises another drawback: research of purely regional interest will also lose out. “There is already a tendency in agricultural research to focus on themes of international interest, to the detriment of solving domestic agricultural problems, which had always been Brazil’s focus in this field.”
Lack of criteria
Rogério Meneghini criticizes the lack of criteria in the Capes evaluation. “Most of these publications have received financial aid from the federal government in the last few years, such as the CNPq public notices. The change in the evaluation undermines this investment,” he states. According to him, in many developed countries the impact of publications hinges on incentives for them to improve. This is occurring in Brazil by means of SciELO, which is financed by FAPESP and which encourages periodicals to adopt strict quality standards. It has also enhanced their visibility, thanks to an open access system. “The SciELO publications have benefited enormously from this stimulus,” states Meneghini.
In a letter to the Capes president, Jorge Guimarães, dated late May, 78 botany and zoology researchers complained that the periodicals in their specialization had been demoted within the Qualis system, despite all the efforts made to improve their quality. “In this sense, we are pointing to the SciELO system as an alternative path for Capes to use in regard to domestic publications. SciELO has been assessing and monitoring Brazilian scientific journals for years,” says the manifesto, sent to Capes by Rodney Ramiro Cavichioli, president of the Brazilian Zoology Society. Hussam Zaher, a professor at USP’s Biosciences Institute and one of the coordinators of the manifesto, says that Capes, by adopting an evaluation system that favors one particular parameter, ends up deciding alone on behalf of a whole range of other players, such as the Ministry of Science and Technology. “The situation is very complex and cannot be managed simplistically. We shouldn’t reject the evaluations; what we ask is that the system be properly thought out, properly articulated,” says the professor, who is the editor of the journals of USP’s Zoology Museum. “I’d like to see a system that qualified our journals and helped us become competitive. This system is going to drive publications into extinction. I don’t think it’s healthy to diminish the possibilities of expression of the Brazilian scientific community,” he stated.
However, according to Rogério Meneghini, there are countries that have a different strategy. He mentions Sweden and the Netherlands, which worry little about maintaining their own publications but have a clear strategy of placing editors on the committees of international publications. “Those who sit on such committees have decision-making power, and this is converted into the acceptance of good articles published in the country. But the number of Brazilian editors in international publications is still very small,” states Meneghini.