
Education

Acquisition of subject knowledge ranks first

Study detects substantial gains in subject knowledge and slight gains in general knowledge among seniors in 19 undergraduate programs

Stuart Kinlough/Getty Images

A study that analyzed the performance of 484,410 students who took the standardized National Student Performance Exam (ENADE) from 2008 through 2010 observed a slight improvement in general knowledge and a significant increase in subject knowledge among college students graduating from 19 programs in the fields of science, technology, engineering, and mathematics (STEM), the humanities, and the biological sciences. Socioeconomic status and type of institution (public or private) had no relevant effect on students’ average scores in any of the analyzed fields, according to the study, which came out in the science journal Higher Education on November 23, 2015. “The good news is that most graduating students seem to have gained some knowledge in relation to freshman performance on the ENADE, especially specific knowledge directly tied to their chosen career,” says Jacques Wainer, professor at the University of Campinas Institute of Computing (IC-Unicamp) and one of the authors of the study. “But this doesn’t mean that every course is good or that students’ scores were good.”

Wainer conducted the study in collaboration with Tatiana Melguizo, a specialist in the economics of higher education at the University of Southern California’s Rossier School of Education in Los Angeles. The researchers used public data available on the ENADE website to compare the performance of those who were about to graduate with that of first-year students in the same courses, whose performance on the test serves as a benchmark for judging how much outgoing graduates learn at college. The comparison was based on Cohen’s d, an index often used in this type of study. Cohen’s d is the difference between the average score of final-year students and that of freshmen, divided by the standard deviation of the scores, which yields a standardized measure of how far apart the two groups are.

Standard deviation is a measure of statistical dispersion: it indicates how much values vary around an average or an expected value. The concept is similar to the idea of a margin of error, so often mentioned when the results of election polls are announced in the news. If the standard deviation is low, the scores lie close to the average. When it is high, student scores are spread over a wider range of values, some lying closer to the average and others farther away. In Wainer and Melguizo’s study, the greater the standardized difference as calculated using Cohen’s d, the better the performance of seniors compared to freshmen. A Cohen’s d equal to 2 means that about 98% of graduating students scored higher than the average incoming student. If the index drops to 1 or 0.5, about 84% or 69% of graduating students, respectively, achieved this performance.
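For readers who want to see the arithmetic behind these figures, the sketch below shows one common way of computing Cohen’s d (using the pooled standard deviation of the two groups, an assumption, since the article does not spell out which variant the authors used) and how a given value of d translates into the share of seniors scoring above the average freshman under a normality assumption. The function names and data layout are illustrative, not taken from the study.

```python
from math import erf, sqrt
from statistics import mean, stdev

def cohens_d(senior_scores, freshman_scores):
    """Standardized mean difference between seniors and freshmen.

    Uses the pooled standard deviation of the two groups (a common
    definition; the study's exact variant is not specified in the article).
    """
    n1, n2 = len(senior_scores), len(freshman_scores)
    s1, s2 = stdev(senior_scores), stdev(freshman_scores)
    pooled_sd = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (mean(senior_scores) - mean(freshman_scores)) / pooled_sd

def share_above_freshman_mean(d):
    """Fraction of seniors scoring above the average freshman,
    assuming roughly normal score distributions: Phi(d)."""
    return 0.5 * (1 + erf(d / sqrt(2)))

print(round(share_above_freshman_mean(2.0), 2))   # ~0.98
print(round(share_above_freshman_mean(1.0), 2))   # ~0.84
print(round(share_above_freshman_mean(0.5), 2))   # ~0.69
```

Running the last three lines reproduces the approximate 98%, 84%, and 69% figures quoted above.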

Using this criterion, the study detected quite modest improvement in the general knowledge acquired by outgoing graduates over the course of their studies. In this regard, the greatest progress was shown by graduating pharmacy majors (Cohen’s d of 0.3), while the least was displayed by those completing their medical studies, who attained an index of 0.03, that is, practically zero. Measurements of knowledge specific to each subject field found the sharpest gains among outgoing medical students (Cohen’s d of 2) and the smallest among majors in social communication (0.39). The five courses with the biggest strides in subject knowledge all fall within the biological sciences: medicine, followed by dentistry (1.55), occupational therapy (1.34), nutritional science (1.12), and nursing (0.85). “It was encouraging to find relatively marked improvement in tests of subject knowledge vis-à-vis tests of general knowledge,” says Melguizo (see the table, which shows the increases in subject knowledge calculated for each course).

Students taking the ENADE: programs are assessed every three years (Rafael Hupsel/Folhapress)

Academically adrift
A study by sociologists Richard Arum and Josipa Roksa, of New York University and the University of Virginia, respectively, resulted in the book Academically Adrift: Limited Learning on College Campuses, released in late 2010. In it, the two U.S. researchers concluded that 45% of the 2,300 students at 24 universities who took a standardized test after two years of higher education showed no significant improvement in skills such as writing, critical thinking, and complex reasoning. “I think this study’s findings are problematic and it’s not appropriate to compare them with our study,” says Melguizo. “Arum and Roksa tried to measure students’ gains in general rather than subject knowledge.”

Developed by Brazil’s Ministry of Education (MEC) in 2004 as part of the National System of Higher Education Evaluation (SINAES), the ENADE – popularly referred to as “The Big Test” – is given to both incoming and outgoing students. They have four hours to write two essays and respond to eight multiple-choice questions on general knowledge, which are the same across all majors. They also write three essays and answer twenty-seven multiple-choice questions on subject knowledge, and these differ by major. General knowledge accounts for 25% of the final score and subject knowledge for 75%. ENADE assesses each group of courses or programs every three years. STEM students took “The Big Test” in 2008, while those majoring in the human sciences had their turn the following year and those studying one of the biological sciences, in 2010. Wainer and Melguizo’s study does not cover all the courses that were evaluated in those three years. They chose 19 of these: engineering, physics, chemistry, mathematics, computer science, architecture, economics, law, accounting, business administration, communication, tourism, nutritional science, nursing, medicine, occupational therapy, pharmacy, dentistry, and physical education. The two researchers worked with data on students enrolled in 10,041 programs at both public and private universities.
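As a minimal illustration of the weighting just described, the hypothetical helper below combines the two ENADE components into a final score. It assumes both components are already expressed on the same scale (say, 0 to 100), a detail the article does not specify.

```python
def enade_final_score(general, subject):
    """Illustrative weighting of the two ENADE components:
    general knowledge counts for 25% of the final score and
    subject knowledge for 75%. Assumes both are on the same scale."""
    return 0.25 * general + 0.75 * subject

# Example: 60 on general knowledge and 80 on subject knowledge
print(enade_final_score(60, 80))  # 75.0
```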

The study adopted some statistical methods and corrective procedures to minimize the distortions inherent to the samples of incoming and outgoing students who took the ENADE. For example, students who handed in blank tests were excluded. This kind of boycott of the exam is a common form of protest among students in some public university courses. Although the ENADE is supposedly mandatory for graduating students, not taking the test or showing up and handing in blank exams causes these students little or no actual harm.
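A minimal sketch of that first cleaning step might look like the following; the file name and column names are hypothetical, since the study’s actual data layout is not described in the article.

```python
import pandas as pd

# Hypothetical layout: one row per student, with a flag for blank exams
# and a column identifying incoming vs. graduating test-takers.
scores = pd.read_csv("enade_scores.csv")
answered = scores[~scores["blank_exam"]]            # drop blank (boycotted) exams
freshmen = answered[answered["status"] == "incoming"]
seniors = answered[answered["status"] == "graduating"]
```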

The authors of the study also used a method for correcting the average score of outgoing students downward. “There’s a tendency for weaker students or those with some kind of problem to drop out halfway through,” says Wainer. “So the ones who graduate are the best students from a group that was initially more heterogeneous. This tends to inflate the scores of the graduates.” Conversely, there is greater diversity among freshmen, with a mixture of good, average, and poor students, and this affects their average performance. Other factors suggest that ENADE results should be taken with a grain of salt, as Wainer and Melguizo admit. Easy tests on the specific content of a given course tend to yield similar results for incoming and outgoing students. By pushing all the scores up, they make it harder to measure whether graduating students had any gains in knowledge.
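The article does not say which correction the authors applied, but the sketch below illustrates the general idea of such a downward adjustment under a simple, explicitly assumed model: scores in the entering cohort are roughly normal, and the weakest fraction of that cohort drops out before graduation, so the observed senior mean reflects only the upper part of the original distribution. Everything here, including the attrition rate, is an assumption for illustration, not the authors’ method.

```python
from scipy.stats import norm

def attrition_adjusted_mean(observed_mean, full_cohort_sd, dropout_rate):
    """Illustrative survivorship-bias correction (NOT the study's actual
    procedure). Assumes scores in the entering cohort are roughly normal
    and that the weakest `dropout_rate` fraction left before graduating,
    so the observed senior mean overstates the full-cohort mean."""
    z = norm.ppf(dropout_rate)                       # truncation point in SD units
    inflation = full_cohort_sd * norm.pdf(z) / (1 - dropout_rate)
    return observed_mean - inflation

# Example: observed senior mean of 70, full-cohort SD of 10, 30% attrition
print(round(attrition_adjusted_mean(70, 10, 0.30), 1))  # ~65.0
```

With these illustrative numbers, an observed senior mean of 70 would be adjusted down to roughly 65 before being compared with the freshman average.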

Fostering critical thinking
Robert Verhine, a specialist in educational assessment and educational policies at the Federal University of Bahia (UFBA), says that Wainer and Melguizo’s study is interesting, particularly because little research has been done using ENADE data. “But the findings were obvious; they were to be expected. It’s natural that there would be bigger gains in subject knowledge than in general knowledge,” states Verhine, who was formerly chair of the National Committee for the Evaluation of Higher Education (CONAES). “People generally go to college to acquire subject knowledge.”

Renato Pedrosa, coordinator of Unicamp’s Higher Education Research Laboratory (LEES), agrees that the improvement detected by the study was to be expected. “The question is knowing what exactly is meant by these numbers that show a relative gain in knowledge. That’s impossible to ascertain, because MEC has never established a relation between scores or grades and levels of acquired knowledge or skills,” observes Pedrosa, who, in collaboration with physicist Marcelo Knobel, also of Unicamp, has studied the performance of engineering and medical students on the ENADE. “Without this, the assessment only produces a ranking of courses, without any criteria or profiles regarding proficiency or quality, which limits the use of course assessment results.”

When the ENADE records the performance of students enrolled in a given program, it assigns the program a relative rather than absolute score, ranking it at one of five levels: the best programs earn a 5; the second-best, 4; and so on, with 1 representing the lowest rank. So even if a program receives a 5 and therefore ranks at the top, it may be better than all the others but still not good. If the students in most programs perform very poorly on the test, the percentage of correct answers needed for a program to score 5 can be relatively low, around 50%.
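The official MEC formula for assigning these levels is more involved, but the minimal sketch below conveys what a purely relative five-level scale means in practice: programs are banded by quintile of their average score, so a 5 only certifies being among the best fifth, not being good in absolute terms. The quintile-based banding rule here is an assumption for illustration only.

```python
from statistics import quantiles

def relative_bands(program_means):
    """Minimal sketch of a relative five-level scale (NOT the official
    MEC formula): programs are split into quintiles of their average
    score, so a '5' means 'among the best fifth', not 'good' in any
    absolute sense."""
    cuts = quantiles(program_means, n=5)              # four quintile cut points
    def band(score):
        return 1 + sum(score > cut for cut in cuts)   # 1 (lowest) .. 5 (highest)
    return {m: band(m) for m in program_means}

# Even if every program scores poorly, the top quintile still earns a 5.
print(relative_bands([35, 40, 42, 48, 50, 52, 55, 58, 60, 65]))
```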

According to Knobel, the findings from Wainer and Melguizo’s study seem consistent and should encourage further research using data from the ENADE and also from the National High School Exam (ENEM). But Knobel voices a warning about the profile of higher education in Brazil. “Brazilian universities are still very concerned with providing technical and subject content in their courses but not much with stimulating skills that are vital in the 21st century.” In his opinion, it would also be important for teaching institutions to endeavor to promote more general skills, like critical thinking and teamwork. “These facets are valued at U.S. universities and that is the trend worldwide,” says Knobel.

Scientific article
MELGUIZO, T. and WAINER, J. Toward a set of measures of student learning outcomes in higher education: Evidence from Brazil. Higher Education. Nov. 23, 2015.
