Criteria for getting into the club

Dissertation investigates the methodology behind academic rankings and explains the performance of Brazilian universities

A doctoral dissertation defended in 2015 at the University of São Paulo (USP) assembled data and information that help explain why Brazil performs relatively modestly in world rankings of universities. Conducted by Solange Maria dos Santos, coordinator of production and publication at the Scientific Electronic Library Online (SciELO), the study analyzed a decade of Brazilian scientific production (2003-2012) and scrutinized the methodology adopted by six university rankings in order to understand, for example, certain discrepancies in the number of Brazilian institutions classified among the world’s best: one ranking places only two Brazilian universities in the club, while others count as many as 22. Santos also explores an apparent paradox: if Brazil performs well in subject-specific rankings, in fields like medicine and agronomy, why isn’t this reflected in the overall rankings?

According to Santos, the selection criteria adopted by the rankings keep more Brazilian universities from appearing on the lists. “One of the parameters is the volume of scholarship indexed by international databases. As a result, larger universities, which present strong indicators in research and teaching, have greater chances of being classified. The rankings select a limited number of institutions – most often, the top 500 – out of a universe of over 16,000 universities worldwide,” says the researcher, who defended her dissertation at the USP School of Communications and Arts (ECA) and did part of her research at Charles III University of Madrid, in Spain.

Interestingly, the weight of indexed scientific production also helps explain why more Brazilian universities appear on the rosters today than 10 years ago. Brazil invested in the professionalization of its academic journals, through initiatives like the SciELO electronic library, and succeeded in boosting the number of Brazilian journals indexed by international databases in the mid-2000s. For example, the number of Brazilian periodicals indexed by the Web of Science leapt from 26 in 2006 to 103 in 2008. “Indicators began to consider a larger set of papers, and more Brazilian universities became visible to the rankings,” Santos says.

This is noticeable in China’s Academic Ranking of World Universities (ARWU). When this tool was created in 2003, only USP, the University of Campinas (Unicamp), São Paulo State University (Unesp), and the Federal University of Rio de Janeiro (UFRJ) stood among the top 500 worldwide. In 2007, the Federal University of Minas Gerais (UFMG) made it onto the list, and in 2008 it was joined by the Federal University of Rio Grande do Sul (UFRGS). The ARWU is based on more objective metrics, such as publications and citations, number of highly cited researchers, alumni and faculty who have received a Nobel Prize or Fields Medal, and percentage of full-time faculty.

The Times Higher Education, a British ranking known by the acronym THE, places only two Brazilian universities among the world’s 500 best: USP and Unicamp. UFRJ also won a spot in 2008 and 2009 but did not keep it. Some of the criteria employed by the British ranking have a subjective bias: one-third of the points come from an academic reputation survey of researchers in 133 countries. Its point system also takes into account citations, the number of foreign faculty members and students, and research budget.

In her study, Santos found that changes in ranking methodology are often responsible for sudden fluctuations in university performance. “I’m leery when a headline says a university climbed or dropped 100 places in a ranking. No institution changes that much from one year to the next,” she argues. One case of a shift in methodology involved the Quacquarelli Symonds (QS) World University Rankings. Starting in 2010, QS began using the Scopus database, which is maintained by the publisher Elsevier and encompasses more Latin American journals than the previously used Thomson Reuters Web of Science. Because a portion of the points is linked to citations of faculty members, the number of Brazilian institutions classified among the leading 1,000 universities jumped from six in 2010 to 22 in 2013. In this ranking, 40% of the points reflect a survey of academic reputation, while another 10% reflect a survey among employers of university alumni. The surveys change their pools of respondents from time to time, triggering fluctuations in results.
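To see why a change in the underlying database can swing a university’s position so sharply, a composite score of this kind can be sketched as a weighted sum of indicators. In the sketch below, only the 40% academic-reputation and 10% employer-survey weights come from the article; the remaining weight, the indicator names, and the institution data are hypothetical placeholders:

```python
# Sketch of a weighted composite score, as used by rankings like QS.
# Only the 0.40 (academic reputation) and 0.10 (employer survey) weights
# are reported in the article; the rest is a hypothetical placeholder.
WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_survey": 0.10,
    "citations_per_faculty": 0.50,  # stand-in for all remaining indicators
}

def composite_score(indicators: dict) -> float:
    """Weighted sum of indicator scores, each normalized to a 0-100 scale."""
    return sum(WEIGHTS[key] * indicators[key] for key in WEIGHTS)

# Switching to a database with broader Latin American coverage raises only
# the citation indicator, yet the overall score moves substantially.
before = {"academic_reputation": 60.0, "employer_survey": 55.0,
          "citations_per_faculty": 40.0}
after = dict(before, citations_per_faculty=55.0)

print(round(composite_score(before), 2))  # 49.5
print(round(composite_score(after), 2))   # 57.0
```

Because ranks are relative, a jump like this for many institutions at once is enough to reshuffle dozens of positions even when teaching and research at each university are unchanged.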

This same phenomenon was detected by Carlos Marshal França in his master’s thesis, defended in 2015. França, professor of business administration at the Pontifical Catholic University of Campinas, compared three national university rankings produced by Ibero-American newspapers: El Mercurio, from Chile; El Mundo, from Spain; and Folha de S.Paulo, from Brazil (Ranking Universitário Folha, or RUF). He found that each had its own way of gathering data. While the Chilean newspaper relies on public information sources and bibliometric indicators, the Spanish paper uses surveys that poll institutions and faculty members. The Brazilian daily combines public data with interviews of faculty members and professionals in their fields and has made changes intended to refine its methodology. The Chilean ranking presented the most stable results, with occasional yearly variations of only one or two rungs on the ladder. Among the top 20 universities listed by the RUF, some moved as many as seven slots from one year to the next.

Loose interpretation
According to Samile Vanz, professor at the UFRGS School of Library Science, the key contribution of Santos’s study is that it traces what each of the rankings measures. “These rankings are often interpreted very loosely, without understanding what they mean,” she says. In the opinion of Vanz, who is now studying these performance-measuring tools as part of her post-doctoral work at the same Spanish university where Santos completed part of her doctorate, Brazilian production is still underrated. “I’ve observed that a number of the rankings that use the Web of Science database as a reference do not take into account all of the journal collections included in the base. They very often pick out two or three main collections and omit, for example, the SciELO Citation Index, which holds a good share of Brazil’s production,” she says. Vanz points out that rankings are not neutral tools: “Very often, the companies that conduct the surveys sell services related to the data, and the listed institutions use them in their marketing strategies.”

According to Santos, it is hard for a ranking to measure all dimensions of academic quality. “The rankings measure what can be measured, not what they would like to measure,” she states. Objective indicators like scientific production, citations, and awarded researchers can be employed to compare institutions worldwide, but metrics like academic reputation or the quality of human resource training pose challenges. “Rankings still can’t do a good job of gauging the quality of teaching, a university’s regional engagement, or its impact on society,” she offers as examples. Each ranking has its own methodology. The National Taiwan University Ranking, or NTU, grades universities on the basis of research indicators like h-index, number of highly cited papers, and number of papers published in high-impact journals. The Leiden Ranking – compiled by Leiden University, in the Netherlands – uses indicators that capture the number of publications and citations, especially those that measure high-impact science and collaboration with researchers abroad and with industry.

Santos spent the greater part of her research time analyzing 10 years of Brazilian scientific production included in international databases, by field. Her first finding was that Brazilian universities do not score very high because they generally produce low-impact science. In 2003, 37.5% of Brazilian journals were in the upper 25%, which encompasses the most-cited journals in their respective fields. By 2012, Brazil’s share had fallen to 28.8%. The number of Brazilian journals in the bottom 25% – that is, with the lowest impact – grew 137% during the same period. But Santos detected areas of excellence in her analysis. Clinical Medicine was foremost among these, thanks to a large community of researchers that was responsible for 20.83% of all of Brazil’s scientific production from 2003 to 2012, according to data compiled by the researcher. USP alone accounted for almost one-third of this volume. In the Times Higher Education subject ranking for 2014, USP came in 79th in Clinical, Pre-clinical, and Health Sciences and 92nd in Life Sciences. Overall, the university ranked in the 201-225 bracket.

Three other fields where Brazilian research stood out were Physics, Geosciences, and Space Sciences. Much like Medicine, these are fields where Brazilian researchers display a good ability to publish in high-impact journals. “In these fields, Brazilian researchers engage in international cooperation with high-level groups. But since this production is relatively small, it isn’t strong enough to boost the universities in overall rankings,” Santos says. Another notable area is Agricultural Sciences, responsible for 9.62% of Brazil’s production, although this scholarship is not concentrated in high-impact journals. “Universities devoted to this area – like the Federal University of Viçosa – distinguish themselves in subject rankings because of their production in the agricultural sciences,” she says. In the QS subject ranking, some Brazilian schools excel in Arts and Humanities. USP and Unicamp rank among the world’s top 100 universities in Philosophy, Sociology, and History (see Pesquisa FAPESP Issue nº 186).

Rogério Mugnaini, professor at ECA-USP, highlights one effect of the rankings: they reaffirm the influence of a set of Anglo-Saxon universities, based on criteria that do not always make sense for Brazilian institutions. One example is the weight that some rankings place on classes taught in English, something that in Brazil is often seen as making higher education more elitist. “The most prestigious institutions tend to reinforce this instrument, which ratifies their original position of dominance,” says Mugnaini, who is working to develop indicators of Brazilian scientific production based on the online Lattes Resume platform, which contains dissertations, books, and documents that are not usually indexed (see Pesquisa FAPESP Issue nº 233). In the opinion of Vanz, more thoroughgoing research is needed on rankings, along with proposals for new yardsticks that are relevant to scientific communities outside the so-called central countries. She notes, however, that ignoring rankings is not an option. “They serve as a benchmark for the circulation of foreign students and researchers, and they’re important to the strategy of internationalizing our universities,” she says.