Four projects associated with fields as diverse as robotics, digital archeology, Earth observation and agrometeorology were selected in the first round of the FAPESP Research Program in eScience. The initiative, launched in 2013, seeks to encourage bold, novel and unconventional approaches to cutting-edge research, involving multidisciplinary collaboration between computer scientists and researchers in other domains. One of the selected proposals seeks to equip Brazil with an information system on land use. Led by Gilberto Câmara Neto of the National Institute for Space Research (INPE), the thematic project “e-Sensing: Big Earth observation data analytics for land use and land cover change information” will link together INPE teams that work with remote sensing, agriculture, forest monitoring, databases and geoinformatics.
Earth observation satellites are the only source of data that provides a continuous and consistent record of our planet. Data from these satellites are available free of charge in the repositories of space agencies and research centers such as INPE and NASA. Although the satellites produce large amounts of data, only a small portion is actually used for research and operations, because most of the methods used by Earth observation scientists are unable to handle large datasets. To meet this challenge, the e-Sensing project will develop a new type of knowledge platform for analyzing Earth observation data.
“It is an interdisciplinary project that brings together computer science competence with several other domains,” says Câmara Neto, who coordinates the research and development team in geoinformatics and environmental modeling at INPE. “We want to develop methods that allow us to tap massive quantities of data in order to detect deforestation, changes in land use, and agricultural expansion,” he says. He notes that Brazil lacks information on land use. “We have good systems for monitoring deforestation in Amazônia and mapping sugarcane production areas in Central-South Brazil. However, Brazil does not currently have the capacity to produce updated information about its own land use. We need information about important biomes like the Cerrado, Caatinga and Pampa. Much of what we know we learn only through the Agricultural Census, taken every 10 years,” he says.
The methodologies could have applications in managing other scientific data. “The challenge of organizing and extracting information from a massive volume of data, generated absurdly quickly, is common to several fields of knowledge, such as astronomy, genetics and bioinformatics,” Câmara Neto says. In the case of satellite imaging, the changes have been extraordinary. “Ten years ago, INPE would sell a single satellite image for R$2,000. Today, it provides thousands of images free of charge. The challenge is to take scientists to the archives rather than taking the archives to scientists, while at the same time creating a virtual environment in which to organize and analyze this information.”
The term eScience sums up the challenge of collaboration between computer science and other domains in organizing, classifying, visualizing and facilitating access to the massive volume of data constantly being generated in all fields of research, in order to obtain new knowledge and conduct broad and original analyses. Thus, eScience is often considered a synonym for data-intensive science, in other words, science developed from mining massive volumes of data. The program introduced by FAPESP seeks to integrate groups involved in computer science research with scientists from fields ranging from the agricultural to the social sciences. The computing aspects cover a broad spectrum, from interfaces and algorithms to computer modeling and data infrastructure. “We’re satisfied with the results of the first call for proposals because of the range and quality of the projects, both those submitted and those selected,” says Claudia Bauzer Medeiros, professor at the Institute of Computing of the University of Campinas (Unicamp), deputy coordinator of the FAPESP Area Panel on Special Programs and coordinator of the eScience program.
Twenty-five proposals were submitted under the call, a significant number for a program launched only recently. The selection process was complicated. Each project was evaluated by at least two specialists – one to analyze the innovative aspect of the proposal in the field of computer science, the other to do the same analysis in relation to the project’s potential contribution to the other field of knowledge involved. Only projects that received two recommendations were selected. “Often what happened is that the project would be recommended by one of the advisors, but not the other,” Medeiros says. In some cases, a project involving several fields of knowledge required consultation with as many as four advisors to ensure a balanced assessment. “We called upon advisors from Europe, North America, Asia and even Australia.” One new feature of the call for proposals was the required submission of a Data Management Plan describing how each project intended to manage, protect, preserve and disseminate the data it generated. “The goal is to allow other researchers to re-use the data, besides ensuring that the results of a scientific study based on the data can be reproduced transparently by other interested parties,” says Medeiros.
Another selected proposal seeks to combine archeology and computer science. Led by Marcelo Knörich Zuffo, professor and researcher at the Interdisciplinary Center in Interactive Technologies at the USP Polytechnic School, the project “Cyber-archaeology – virtual reality and e-Science meets archaeology” involves Brazilian and international partners in the fields of engineering, computer science and archeology, and aims to apply digital techniques to exploring archeological excavations, using cloud computing to store large volumes of data. The project’s main goal is to scan archeological sites and transform them into 3D images using high-resolution cameras, a 3D scanner and even unmanned aerial vehicles known as drones. “The excavation of an archeological site is basically a destructive process. When we excavate, the site is destroyed. Progressive scanning of the excavation generates a large volume of data, on the order of terabytes for each site. Our goal is to create interactive virtual reality tools that allow study of the sites in immersive environments and analysis of aspects that are not always apparent in on-site study,” Zuffo says. The initiative calls for a partnership with two researchers from Duke University, Maurizio Forte and Regis Kopper, responsible for scanning the Çatalhöyük site in Turkey. It also involves researchers Maria Isabel D’Agostino Fleming and Astolfo Araújo from the Polytechnic School and the Museum of Archeology and Ethnology of USP. “There are significant challenges. Three-dimensional processing using cloud computing is not an easy task,” Zuffo says.
The project “Attitude and heading reference system based on recursive robust Kalman filter implemented in FPGA” intends to create low-cost sensors that help monitor the positioning of moving vehicles. It provides for the specification of new algorithms based on a mathematical technique developed by the Hungarian-born mathematician Rudolf Kálmán in 1960. Validation of the sensors will be carried out in a truck that moves in autonomous mode, without a driver. The expectation is that the research will lead to a patent. “We have techniques known as robust filters to improve the performance of these sensors. We have been able to reduce the cost of these units by using algorithms created especially for this purpose,” says project leader Marco Henrique Terra, professor at the São Carlos School of Engineering of USP. Creation of the sensors involves a data management challenge, because monitoring a moving truck necessitates managing large amounts of information. “We need to access the vehicle position, taking into account the distance that it needs to maintain from stationary or mobile obstacles around it, determine where it will go, and process all of this to make an appropriate decision. The volume of data involved in reproducing an actual situation in which the vehicle travels from one place to another is enormous,” Terra explains. Five professors from USP and one from the Federal University of São Carlos are involved in the initiative. Some of them took part in a project that created a prototype driverless car (see Pesquisa FAPESP Issue no. 213). The sensors for trucks are more complex.
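The robust filters Terra describes build on the standard Kalman recursion. As a purely illustrative sketch – not the project’s algorithm or its FPGA implementation – the fragment below fuses a hypothetical gyroscope rate signal with noisy compass headings in a one-dimensional linear Kalman filter; the time step and noise variances are assumed values.

```python
import numpy as np

def kalman_heading(gyro_rates, compass_meas, dt=0.1, q=0.01, r=0.5):
    """Fuse gyroscope rate and compass heading with a 1-D Kalman filter.

    State: heading angle (degrees). The gyro rate drives the prediction
    step; the noisy compass reading corrects it. q and r are assumed
    process and measurement noise variances.
    """
    x, p = compass_meas[0], 1.0          # initial state and covariance
    estimates = []
    for rate, z in zip(gyro_rates, compass_meas):
        # Predict: integrate the gyro rate over one time step
        x = x + rate * dt
        p = p + q
        # Update: blend in the compass measurement
        k = p / (p + r)                  # Kalman gain
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates
```

Even in this toy form, the filter averages away measurement noise while tracking motion reported by the gyro; robust variants additionally guard against model uncertainty.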
Finally, the project “AgroComputing.net – digital infrastructure and novel computational methods for analyzing and mining climate and remote sensing large databases to improve agricultural monitoring and forecasting” seeks to establish a computer platform for agrometeorologists, supplied with climate and remote sensing data from various sources (satellites, sensors and weather stations) in an organized, easy-to-use manner. “The proposal is to build a digital infrastructure that brings together climate time series from weather stations, climate change scenario models and remote sensors, coupled with computer science models based on data mining, in order to improve agricultural monitoring and crop forecasts,” says Luciana Alvim Santos Romani, a researcher at Embrapa Informática Agropecuária, in Campinas, who is heading up the project. The climate data are part of the Agritempo system, a result of Embrapa’s partnership with the Center for Meteorological and Climate Research Applied to Agriculture at Unicamp.
“This new project will also allow us to use statistical and computational methods to generate a complete database, improving the data already available in Agritempo and giving rise to new agrometeorological databases. With this massive dataset available through cloud computing, agrometeorologists will be able to improve their analyses. The project is expected to help in analyzing agricultural scenario models that take climate change into account,” she says. The new platform will be fed by data from regional and global climate models from INPE’s Center for Weather Forecasting and Climate Studies, and by satellite image time series such as those from the NOAA and GOES satellites in the United States. “We’re trying to develop low-complexity algorithms that handle large volumes of data quickly. It makes little sense to have an accurate method that takes an inordinate amount of time to run. For computer scientists, the advantage lies in addressing the complex problems. For agrometeorologists, the benefit is working with large volumes of data more effectively and automatically,” Romani says.
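A staple of the kind of low-complexity time-series mining Romani describes is similarity search over long climate or vegetation-index series. The sketch below is a generic, illustrative example – not the project’s actual method – that z-normalizes windows so matching ignores offset and scale, then scans a long series for the segment most similar to a query pattern.

```python
import numpy as np

def znorm(x):
    """Z-normalize a window so matching ignores offset and scale."""
    s = x.std()
    return (x - x.mean()) / s if s > 0 else x - x.mean()

def best_match(series, query):
    """Return the offset of the window of `series` most similar to `query`.

    A naive O(n*m) scan using z-normalized Euclidean distance --
    illustrative of time-series similarity search in general, not an
    algorithm from the AgroComputing.net project.
    """
    q = znorm(np.asarray(query, dtype=float))
    s = np.asarray(series, dtype=float)
    m = len(q)
    dists = [np.linalg.norm(znorm(s[i:i + m]) - q)
             for i in range(len(s) - m + 1)]
    return int(np.argmin(dists))
```

In practice, the efficiency concern Romani raises is exactly about replacing such brute-force scans with lower-complexity indexing and pruning schemes when the series number in the millions of points.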