
Scientific production

The race to measure impact

Brazilian universities are increasingly investing in offices dedicated to managing and measuring performance

In order to better meet the data requirements of oversight bodies and national and international ranking organizations, an increasing number of Brazilian universities are creating committees or offices dedicated to collecting and analyzing data on university performance. A recent example is the Office for Strategic Institutional Data created in September by the Federal University of São Paulo (UNIFESP). “Information is scattered across different departments within the university. Our mission is to organize this data and develop better metrics to evaluate teaching and research performance,” explains Juliana Garcia Cespedes, associate dean for planning at UNIFESP.

The university, along with other federal education and research institutions, is required to provide the Ministry of Education (MEC) and the Federal Audit Court (TCU) with a comprehensive and detailed set of information, including budget performance, cost per student, the number of students who graduate within the time limits for their programs, and a “faculty qualification index.” The data is compiled under the supervision of information technology officer Lidiane Cristina da Silva and has already paid dividends for the university, underlining the importance of expanding and better structuring its data collection efforts.

Last April, the London-based magazine Times Higher Education (THE) launched a new academic ranking model that recognizes universities’ social impact. Besides teaching and research indicators, the new ranking measures university performance against some of the United Nations (UN) Sustainable Development Goals, such as reduced inequalities, sustainable institutions, and impact on urban development. Many major universities in Brazil and around the world have been left out of the ranking primarily because they are unable to provide data on aspects such as the amount of waste they recycle or the initiatives they have in place to combat violence.

But UNIFESP decided to take up the challenge and was rewarded with a place among the top 200 universities in the world in this ranking, alongside two other Brazilian universities—the Federal University of ABC (UFABC) and the Federal University of Ceará (UFC). “We were able to quickly gather the data required for the ranking thanks to the expertise of our information technology and planning departments,” says dean Soraya Smaili. Having a woman as dean earned the university extra points for reducing inequalities, while its tradition of providing medical care to communities, dating back to its days as the São Paulo School of Medicine, also boosted its performance.

Academic performance indicators support managers in decision-making and help universities to report on their results to society—which is ultimately the source of university funding. “Analyzing our indicators helps us to design better metrics for evaluating teaching and research performance in different fields of knowledge. It also helps universities to reflect on their relationship with the public,” says Aluísio Cotrim Segurado, a professor at the University of São Paulo School of Medicine (FM-USP) and head of the institution’s Office for Academic Performance Indicators (EGIDA). The office, created in 2018, is dedicated to improving data collection and interpretation in support of planning.

EGIDA also liaises with organizations publishing academic rankings. The importance of intensifying this engagement became evident, says Segurado, when USP found that its performance in British consultancy QS’s world rankings for academic and employer reputation had been affected by the university’s failure to update its list of researchers named to respond to the survey. “Situations like these demonstrate how important these offices are for universities,” says Segurado. Another challenge is designing new research metrics for areas where bibliometric indicators—such as the number of published papers and citation indexes—are not ideal, as is the case in the humanities. “We need to develop evaluation indicators that are meaningful for fields such as the arts, music, and cinema,” he says.

Inputs from universities’ performance metrics offices are helping FAPESP to update its Science, Technology, and Innovation (ST&I) Indicators for São Paulo, which it has compiled and reported since the late 1990s. The Foundation is working with university metrics offices and committees to collect data on research and development spending, sources of funding, and the number and profiles of researchers working at universities, research institutes, and companies in the state. “These offices have been a godsend. Before, universities lacked a single point of contact for information. Now it is easier to get data that is organized and consistent,” explains economist Sinésio Pires Ferreira, head of indicator studies at FAPESP.

Initiatives similar to those at USP and UNIFESP have been implemented at other public universities in Brazil. The Strategic Planning and Management Department at UFC, in Fortaleza, is currently developing new strategies for analyzing and monitoring metrics for academic performance. “The data we have collected and analyzed so far is helping us to design more effective institutional development plans focused on transparent management,” says coordinator Roberta Queirós Viana Maia.

USP, UNICAMP, and UNESP are working together to create metrics depicting the economic, social, and cultural impact of their activities

In March 2017, São Paulo State University (UNESP) created a committee of administration, information science, and metrics experts to assess the institution’s performance in several university rankings, including the THE World University Rankings and the Center for Science and Technology Studies ranking produced at Leiden University, in the Netherlands. “We can now monitor our standings, interpret the data, and determine how they can be acted upon,” explains José Augusto Chaves Guimarães, a professor at the Information Science Department at the UNESP campus in Marília.

During its monitoring efforts, the UNESP committee found that many researchers were publishing papers in Portuguese, limiting their international impact. It was also difficult to provide ranking organizations with a comprehensive picture of the institution’s research, as the English translation of the university’s name had not been standardized in scientific papers. The committee identified more than 90 different ways the university’s acronym had been rendered in English.

To address these issues, UNESP launched a program called Propetips to encourage researchers to publish in English and in international journals, and to use the correct English translation of the university’s name. “There are other obvious dos and don’ts, such as not publishing in predatory, low-quality journals,” explains Guimarães. Several steps had already been taken before the committee was created. One was requiring all researchers to register for an Open Researcher and Contributor ID (ORCID), a persistent digital identifier that prevents researchers from being confused with others who share the same name.

University metrics offices in Brazil have drawn inspiration from successful experiences at Harvard University, the University of Miami, and the University of Cambridge, where data collection and analysis inform strategies and policies that help to maintain or capture additional research funding. These offices also track the career paths of undergraduate and graduate alumni, supplying ranking organizations with a diverse set of information.

But as Brazilian universities step up their efforts to measure ranking performance, Marisa Beppu, who served as associate dean for university development at the University of Campinas (UNICAMP) between 2017 and 2019, cautions that institutions should not be subservient to rankings. “University rankings should be used as tools for calibrating teaching and research activities,” she says.

“Universities need to critically analyze the rankings but also go beyond these data to include metrics that are meaningful in the Brazilian setting,” says Milena Serafim, a staff member at UNICAMP’s Coordination Office, the body running the university’s institutional data management initiatives under the leadership of deputy dean Teresa Atvars. Serafim and Atvars note that UNICAMP has worked over the years to develop metrics that improve its governance and aid decision-making on administrative matters and on the careers of professors and researchers. “The university has also invested in improving engagement with ranking organizations in order to develop metrics that are also meaningful for society,” says Beppu.

UNICAMP’s impact on society is manifold, and is especially significant in areas such as intellectual property production and protection, technology transfer, and entrepreneurship. Last month the university announced that in 2019 it had spun out 114 new “daughter companies,” including ventures created by students, alumni, and other individuals linked to the institution, businesses built around a technology licensed by the university, and firms that graduated from the university’s technological incubator. In all, 717 daughter companies are currently in operation, generating a total of R$7.9 billion in revenue this year.

At USP, the EGIDA program runs a comprehensive impact assessment based on searches of external databases. Information compiled from the Lattes platform is used to estimate the impact USP has by training professors who go on to teach undergraduate and graduate courses at other universities in Brazil. “At some universities, more than half of professors received their training at USP,” says Segurado.

The three state universities in São Paulo have developed a suite of metrics to assess economic, social, and cultural impact, as part of a program led by Jacques Marcovitch of the School of Economics, Management, and Accounting at USP (FEA-USP). The goal is to create indicators to track the extent to which these universities are fulfilling their missions. “There were already ongoing efforts to compile data for international rankings. We are taking advantage of the momentum to design indicators of university impact on economic development, innovation, community well-being, and other outcomes of publicly funded investment,” says Renato Pedrosa, a professor at UNICAMP and a member of the research team led by Marcovitch. The plan is for universities to generate data using similar and comparable methodologies, says Pedrosa, who also heads the FAPESP Science, Technology, and Innovation Indicators program. “Professor Marcovitch’s program did exactly what FAPESP expects from a public policy research project: it identified a challenge, assembled a team of trained researchers, and secured buy-in from an organization interested in using the research results, in this case the Council of Deans of State Universities in São Paulo (CRUESP),” says FAPESP Science Director Carlos Henrique de Brito Cruz.

One of the initiatives undertaken as part of the project was joining U-Multirank, a ranking system launched in 2014 by a European consortium led by the Center for Higher Education Policy Studies at Universiteit Twente. U-Multirank publishes multifaceted performance charts that allow universities to identify their strengths and areas needing improvement. “This initiative should help us to better assess the performance of São Paulo’s universities and benchmark them against other universities that have similar or symbiotic attributes, possibly leading to strategic partnerships,” says Segurado.
