
Evaluation

What each does best

USP Chemistry Institute department develops method to measure faculty performance

At one department of the Chemistry Institute at the University of São Paulo (IQ-USP), over 70 professors recently underwent an evaluation process based on criteria including scientific production, ability to raise funds for research, student advising, post-doctoral supervision, and participation on examining committees, as well as any teaching, administrative, cultural, or extension activities in which they engaged over the past five years. The initiative resulted in a kind of ranking (see chart), in which each professor employed by the institute’s Department of Fundamental Chemistry is identified only by a four-letter code. Their names are not disclosed to the general public – although, within the institution, the relative positions of the professors are reasonably well known. Over the past 10 years, the evaluation has been performed four times. After the most recent one, based on data from 2009 to 2014, the nine faculty members with the lowest scores were invited to meet with professor and department head Mauro Bertotti. Each of them had earned a general score of about 2 points. The average for the institute as a whole was just over 6 points, with the top performer on the list earning more than 15 points (see chart). “We emphasized that there was no intent to punish anyone and that the purpose of the meeting was to find out how the department could help them improve their performance,” says Bertotti, who divided the task into two meetings, one with four professors and the other with the remaining five.

Although some of the professors had reservations about the criteria used in the evaluation, most attempted to justify their subpar scores. According to Bertotti, one professor argued that he was having a hard time recruiting students to work at his laboratory because his field does not appeal to young investigators. “We offered to help him attract students and suggested he join other groups, to avoid working alone,” says the department head. Longtime faculty members bemoaned the difficulties of competing with younger professors, perceived as highly aggressive in their pursuit of scientific productivity and research funding. “They complain that the rules of the game have changed and that younger professors gain an advantage because they feel more comfortable in this new environment,” says Bertotti, adding that a few of the oldest professors in the Department of Fundamental Chemistry view publishing in prestigious journals as less of a priority than ensuring that students get an excellent education.

The experiment has generated few concrete effects so far. One example: in a discussion about the reallocation of Chemistry Institute research facilities, a low-ranked professor agreed to relinquish the physical space of his laboratory after admitting that he no longer conducted research. But the impact of the evaluation cannot be fully assessed based solely on palpable results, since it was established from the outset that no one would be rewarded or punished. The initiative is significant because its methodology was designed and negotiated by the professors themselves over the past 10 years, aiming to improve the performance of the group – the kind of effort seldom seen at Brazilian universities. “This is a self-regulation instrument, used by professors to position themselves in relation to their peers,” says Bertotti. “It creates evaluation parameters for faculty activities, weighted according to widely discussed criteria that are defined by the community and are compatible with the institutional mission, with the ultimate goal of systematizing information, producing diagnoses, and recognizing the value of human and financial resources. It is society that supports the public universities, and so they are obliged to demonstrate that these investments are being managed responsibly,” he emphasizes.

According to Guilherme Andrade Marson, a professor at IQ-USP and member of the Faculty Performance Evaluation Commission, the debate on evaluation criteria forced the members of the department to reflect on the nature of their work and what they perceive as a standard of excellence. “The very existence of the evaluation process is a big step forward. The instrument shows what each professor does best and was designed to improve the quality of what we do, not to start a witch hunt,” he says. The effort was collective, but far from consensual. “The department board’s discussions about the criteria had to be resolved by vote,” says Bertotti, referring to a lack of consensus on the use of certain indicators, such as journal impact factors and course evaluations prepared by students. “The important thing is that the idea that evaluation is important prevailed.” Another effect that should be considered is the impact on the morale of younger investigators. “It’s stimulating to work in a department that values merit,” says Pedro Cury Camargo, 33, professor at IQ-USP since 2011.

The start of the evaluation process dates back to the early 2000s, when the Department of Fundamental Chemistry was headed by Henrique Eisi Toma, who used the internet to disclose the number of indexed publications produced by each professor, in addition to other information. “There was some discomfort about measuring the department’s performance based on number of publications, as was done in USP’s annual reports, which indicated a worse performance than the average obtained by the professors in another, comparable department,” says Professor Ivano Gebhardt Rolf Gutz, deputy head of the department at the time. “Back then, the Lattes Platform was just getting started and it was difficult to gather information about each professor’s scientific production and other activities.” An attempt was made to integrate the data collected by the department’s graduate studies office for submission to Capes with the information collected by the Chemistry Unit and department libraries. An e-mail address was even created for researchers to send in their information. But the strategy failed. “We were seeing some heterogeneity in the department, with some professors focusing more on teaching than on research,” Gutz recalls.

Initial version
In 2004, Ivano Gutz became the head of the Department of Fundamental Chemistry and decided to implement what would become the initial version of the evaluation process, using a comprehensive set of indicators and already covering the performance categories used today (see chart). The idea was to assign points to every type of activity performed by the professors, and to find a way to compile the numbers so that professors would gain recognition even if they were not top performers in research. Demonstrating good teaching skills, actively serving on committees, advising undergraduate and graduate students, and showing exceptional performance in extension activities would also be valued. “At the time, we agreed that the data would not be used as grounds for drastic measures. There were a lot of questions about the weighting of each indicator, but no one was against gathering the information,” says Gutz.

The most controversial issues were toned down. Instead of using the impact factors of scientific journals as a weighting criterion to evaluate each professor’s publications, the department decided to use the square root of these impact factors. This was done to avoid creating a chasm between the scores earned by researchers able to publish in high-impact periodicals and those whose works appeared in less prestigious journals. For articles published in educational journals or magazines – which are typically cited less often – the classification used instead of the impact factor was the ranking assigned to the periodical by the Qualis evaluation system of Capes.
The resulting value also had to be divided by the square root of the number of co-authors who work at the department, with each of them credited the same share of the score.
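
In numbers, that early publication rule might look like the sketch below; the function name, the floor of one co-author, and the rounding are assumptions, since the article does not reproduce the actual spreadsheet formulas.

from math import sqrt

def publication_score(impact_factor: float, dept_coauthors: int) -> float:
    """Points credited to each departmental co-author for one paper.

    Hypothetical reconstruction of the early rule described above:
    the square root of the journal's impact factor, divided by the
    square root of the number of co-authors from the department.
    """
    return sqrt(impact_factor) / sqrt(max(dept_coauthors, 1))

# A paper in a journal with impact factor 9.0, written by three departmental
# co-authors, credits each of them with sqrt(9)/sqrt(3), roughly 1.73 points.
print(round(publication_score(9.0, 3), 2))  # 1.73

Under a rule of this kind, a journal with an impact factor of 16 counts only four times as much as one with an impact factor of 1, rather than sixteen times, which is the “chasm” the department was trying to avoid.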

The scoring spreadsheets took into account the indicators pertaining to the past three years (today, they cover the previous five years of work). The score awarded for hours allocated to teaching was increased by a bonus when the course was rated favorably by students, taught at night, or had a high number of students enrolled per semester. The more funding a professor was able to obtain for research projects, the more points he accumulated. Administrative activities that entailed additional responsibilities – such as heading or directing a department – were also awarded more points than participation on committees that did not generate extra work. The scoring spreadsheets were designed by Ivano Gutz himself, assisted by a secretary. Meanwhile, in hallway conversations at IQ-USP, the methodology was jokingly dubbed the ‘G-index’ – in reference to the H-index, a well-known measure of scientific productivity and citation impact. “But the G-index is better because it is comprehensive and takes a researcher’s recent performance into account,” says Gutz. “The H-index is cumulative and puts more senior researchers at an advantage.” The results announced in 2006 and 2007 generated some discomfort, but were accepted by most. “There was the classic resistance from professors who felt threatened by the results. Since we showed that we did not intend to punish anyone, it gradually subsided,” Gutz emphasizes. Such opposition is understandable: in 1988, a list produced by the USP Office of the Dean for internal use, containing the names of researchers who had not published any scientific work between 1985 and 1986, was leaked to the daily newspaper Folha de S.Paulo. This so-called “list of unproductives” generated lasting trauma and controversy.

Refined criteria
When Ivano Gutz left in 2008, the department put the process on hold after two evaluations had been completed. “Subsequent department heads delegated the data gathering process to secretaries. Because they had other responsibilities as well, the initiative lay dormant,” says Mauro Bertotti, who resumed the evaluations when he was appointed head of the department in 2012. Many of the criteria have been refined, a process that is still ongoing. In the case of scientific papers, weighting is now determined directly by the impact factor of a journal, not its square root. “We argue that paper publishing should be given a weight proportional to the quality of the journal,” says Bertotti. In the fundraising category, in addition to giving more weight to larger grants, the evaluation now awards bonus points to project coordinators relative to other project members. The points awarded in each category are added up separately and normalized by the department median, then combined according to the weights attributed to each activity. This prevents one professor’s outstanding performance in a given category from eclipsing the performance of the others.
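
A rough illustration of that aggregation step is sketched below; the category names, weights, and numbers are invented for the example, since the article does not publish the department’s actual weighting table.

from statistics import median

def overall_score(raw_points: dict[str, float],
                  dept_raw_points: dict[str, list[float]],
                  weights: dict[str, float]) -> float:
    """Combine one professor's category scores into a single number.

    Hypothetical sketch: raw points in each category are normalized by
    the department median for that category, then combined using
    agreed-upon weights, so no single category dominates the total.
    """
    total = 0.0
    for category, raw in raw_points.items():
        dept_median = median(dept_raw_points[category])
        normalized = raw / dept_median if dept_median > 0 else 0.0
        total += weights.get(category, 0.0) * normalized
    return total

# Invented example with three categories and three professors per category.
weights = {"research": 0.5, "teaching": 0.3, "extension": 0.2}
dept = {"research": [2.0, 6.0, 15.0],
        "teaching": [3.0, 4.0, 5.0],
        "extension": [0.5, 1.0, 2.0]}
print(round(overall_score({"research": 6.0, "teaching": 4.0, "extension": 1.0},
                          dept, weights), 2))  # 1.0: exactly the median profile

In a scheme like this, a professor whose raw research points are double everyone else’s still contributes only through the weighted, median-relative value, which is the effect the department says it was after.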

One of the precautions taken was to professionalize the data gathering process. An intern was hired to handle this task, which includes approaching individual professors and encouraging them to report every type of activity that is covered in the evaluation. The intern is also responsible for publishing a monthly bulletin of the faculty’s activities, covering everything from participation on examining committees to media interviews. “The goal is to recognize the importance of everything the professor does,” emphasizes Bertotti.

Ongoing discussions about organizational reforms at USP include a proposal that would allow departments to revoke the full-time status of underperforming professors and assign them a part-time schedule. Evaluations like the ones carried out by the Department of Fundamental Chemistry could serve as an objective reference for this type of decision. “We don’t know whether this change will be approved,” says Gutz.

The creators of the evaluation process are worried that the criteria adopted are not always aligned with the guidelines for career advancement at USP. “We can argue that a professor should be recognized for excellent teaching skills or participating in extension activities, but that will not be enough to take him to the top of the career path,” says Professor Silvia Serrano, who chairs the evaluation commission. Gutz observes that the initiative could be used as a basis for determining whether a researcher will be allowed to provide consultancy services to companies, for example. “A few years ago, the Brazilian Chemical Industry Association, interested in the successful innovation model used in South Korea, invited a professor and research center manager to speak at a conference. He also came to the Chemistry Institute to talk about how this process worked in his country. In South Korea, top-performing professors are free to work as consultants for companies. Those who fail to satisfactorily handle their academic duties are warned and encouraged for two years or so. If they continue to do badly on evaluations, they can be dismissed.” Gutz believes that the evaluation process shows that it’s possible – but not essential – for faculty members to perform well in every activity in which they are involved. “We have professors who manage to do excellent work in several areas. It’s only fair that they should serve as an inspiration for the others,” he says.
