

The hole was right alongside

New strategy for analyzing information reveals an off-center black hole in Galaxy M 94

Assembly of photos by Miguel Boyayan (magic cube) and NASA/HST

In the middle of the constellation Canes Venatici, at this time of year visible in the Northern Hemisphere after nightfall, there is a spiral galaxy similar to the Milky Way which for decades has aroused the curiosity of astronomers and astrophysicists. Identified by the French astronomer Pierre Méchain in 1781 and catalogued by his master, Charles Messier, under the number 94, this galaxy, known as M 94, looks like most spiral galaxies. It is only 15 million light years from Earth and is home to tens of billions of stars in a central spherical region (the nucleus) and tens of billions more in a flattened disc of gas and dust. Occupying an area smaller than the Solar System, the most central region of this galaxy emits a type of light different from that produced by the stars. Such a concentrated glow in so restricted a space normally indicates the presence of a gigantic black hole, which continually sucks up material from the stars and from the gas and dust clouds that surround it. Its luminosity comes from the movement of the material that is about to be absorbed: close to the hole it spins at such high speeds that part of it is transformed into energy and escapes into space in the form of electromagnetic radiation, from the weakest kinds, like radio waves and visible light, to the most energetic, like gamma rays.

Over the last few decades various research groups in Brazil and abroad have been delving into the depths of this galaxy, also known as NGC 4736, with the most powerful telescopes available, but without locating the black hole they expected to find. Some astrophysicists even proposed other mechanisms to explain the origin of so much luminosity, such as the collision of ultra-fast winds or the transfer of energy from the stars to gas clouds (photoionization). But recent evidence continued to indicate that black holes must lie at the origin of most galaxies, serving as a kind of scaffolding on which they are structured.

After almost three years analyzing images obtained with one of the biggest optical telescopes on Earth, Gemini North, which has an 8.1-meter-diameter mirror and is installed on Mauna Kea, in Hawaii, Brazilian astrophysicist João Steiner finally obtained unquestionable proof that M 94 is home to a voracious black hole, one of the closest to the Solar System. But to everybody's surprise, including Steiner's, it was not where the researchers believed it should be.

With a mass millions of times greater than the Sun's, all concentrated in a reduced space, black holes exert a very strong gravitational pull on the nearest stars and may even consume those that get too close. Neighboring stars, virtually attached to them by the force of gravity, help attract more distant stars, and so on, as if black holes were colossal magnets that structure the galaxy. For this reason it was imagined that they sit at the center of galaxies. But this was not what Steiner and his team saw. In M 94 the black hole is not at the center but a little away from it (some 10 light years) towards the perimeter. “It was so obvious that it ought to be found in the center of the galaxy that nobody ever imagined it could be anywhere else,” comments Steiner, a professor at the Institute of Astronomy, Geophysics and Atmospheric Sciences of the University of São Paulo (IAG-USP).

The findings of Steiner’s group are not only due to the magnifying power of the images from Gemini, a telescope which he himself helped build and to which Brazilian researchers have access for approximately 20 nights of observation a year. They result mainly from an information analysis strategy perfected by the astrophysicist from USP and his team over the last two years and presented in an article published this month in the Monthly Notices of the Royal Astronomical Society. In collaboration with astrophysicists Roberto Menezes and Tiago Ricci, from USP, and Alexandre Oliveira, from the University of Vale do Paraíba, in the state of São Paulo, Steiner adapted a statistical method used in other areas of science (principal components analysis) and used it to filter the enormous quantity of data generated by a powerful recent astronomical observation technique, integral field spectroscopy.

In integral field spectroscopy the image of an area of the sky, equivalent to the tip of a pencil seen from a meter away, is focused on a set of microscopic lenses connected by optical fibers to a powerful spectrograph. This apparatus disperses the light into different energy levels of the electromagnetic spectrum.

M 94: a galaxy with an atypical black hole (image: Sloan Digital Sky Survey)

Light filter
In the case of Gemini North, the light captured from a star or galaxy converges on 500 micro-lenses, which together fit on the surface of a ten-cent coin. Each micro-lens receives the light from a different point of this image and separates it into 6,000 energy levels, which indicate the quantity and variety of chemical elements found in that region of space. Identifying the chemical composition of a particular region is important because, strictly speaking, everything that exists in the Universe, from the stars to living beings, is formed from different combinations of 116 chemical elements, most of which originated inside stars.

Integral field spectroscopy, however, generates an absurdly huge volume of data, millions of times larger than the amount of data obtained from the strategies for investigating the skies that led to advances in astronomy in the last century. So the problem stopped being one of how to obtain information and became one of what to do with so much of it – integral field spectroscopy from Gemini produces 30 million pieces of data for each image. “It was impossible to interpret all this information and most was simply discarded”, explains Steiner.

Until the 1990s knowledge about the planets, the stars and the galaxies progressed driven by two techniques used separately: observation, with telescopes whose magnifying power was hundreds of times greater than that of those used by Galileo at the beginning of the 17th century, and analysis of the light from celestial objects with a spectrograph, an instrument developed by the German physicist Robert Bunsen in the mid-19th century. More sophisticated equipment allowed the two techniques to be united, initially supplying researchers with information about the light spectrum – and consequently about the chemical composition – of a single point of each image.

An astrophysicist who, in addition to the shape, wanted to know at the very least the chemical composition and stellar population of a galaxy like M 94 needed to measure the spectrum at different points. It was a slow and laborious process, like trying to find out the temperature of the water in a lake by plunging a thermometer into it at various points. With improvements in spectrography it became possible to obtain, in one go, the energy data along an entire imaginary line slicing through the observed object – and now, with integral field spectroscopy, across its whole surface.

The information obtained by this form of spectrography is generally represented by a three-dimensional graph with perpendicular axes in the shape of a cube, which is why it is known by specialists as a data cube. The graph is similar to those in which the three spatial dimensions (width, height and depth) of a room in a house are represented. In data cubes constructed with information from astronomical images, however, only two dimensions are spatial (height and width), since the images obtained from telescopes are two-dimensional. The third dimension, which corresponds to depth, is normally represented by energy levels (spectrum). “The problem with data cubes generated using this technique has been to evaluate the absurd amount of information in such a way as to be able to extract some physical meaning from it,” comments astrophysicist Keith Taylor, from the Anglo-Australian Observatory in Epping, Australia, one of the pioneers in the use of data cubes in astronomy.
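The cube described above can be pictured as a three-dimensional array: two spatial axes and one spectral axis. A minimal sketch in Python with NumPy follows; the 25 × 20 spatial layout and the random values are assumptions for illustration only – the source says only that Gemini's unit has 500 micro-lenses, each yielding 6,000 energy levels.

```python
import numpy as np

# A data cube: two spatial axes (height, width) and one spectral axis.
# The 25 x 20 layout is hypothetical; only 500 spaxels x 6,000 levels
# comes from the article's description of Gemini North.
ny, nx, n_levels = 25, 20, 6000
rng = np.random.default_rng(0)
cube = rng.random((ny, nx, n_levels))

# Fixing a spatial point yields the full spectrum measured there...
spectrum = cube[12, 7, :]        # shape: (6000,)

# ...while fixing an energy level yields a monochromatic image.
image = cube[:, :, 3000]         # shape: (25, 20)

print(cube.size)                 # 3,000,000 values in a single cube
```

Slicing the same array along different axes is exactly what lets astronomers switch between "one spectrum per point" and "one image per energy level" views of the same observation.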

It was 2007 when Steiner, with images from Gemini in hand and refusing to accept the lack of a mathematical tool that would allow him to use the mountain of data he had accumulated, went in search of a solution. He tested various alternatives and noted that principal components analysis might be useful. “This statistical tool looks for associations between data that are not always clearly related and allows the redundancies that are common in integral field spectra of a galaxy to be eliminated,” explains astrophysicist Roberto Cid Fernandes, from the Federal University of Santa Catarina (UFSC). “By eliminating what is unnecessary, principal components analysis makes it possible to use the minimum amount of data to represent the phenomenon with the maximum realism possible,” adds Fernandes, another of Steiner’s collaborators, who had previously looked unsuccessfully for the black hole in M 94 and proposed an alternative explanation for the brightness of the galaxy’s central region.

Mathematical trick
“When analyzing data distributed in various dimensions this statistical tool locates first those that concentrate the greatest amount of information, then those that bring together the second largest group, and so on,” says astrophysicist Laerte Sodré Júnior, from USP, a specialist in the application of principal components analysis to astronomy. It is as if a survey of the books in a house indicated that the collection is best represented first by the books in the library, then by those on the living-room bookshelf, and thirdly by the small pile beside the bed. In short, it is a strategy for reorganizing data by amount and relevance.

Miguel Boyayan

The statistical tool alone, however, does not resolve the difficulties imposed by data cube analysis. So Steiner, Menezes, Ricci and Oliveira developed a mathematical procedure that emphasizes the weak features of astronomical images. “This improvement resulted in a powerful way of extracting information from the data cube,” Steiner relates. He is even betting that this approach will move beyond astrophysics and become useful in other areas of science, which, despite being different, often structure information in a similar fashion.

According to Steiner, the first ten images are sufficient to recover 99.9% of the information contained in the data cube, which in the case of galaxy M 94 contains 6,000 images. The approach also helps select and regroup the data of interest, removing what is not, as if through successive filters. To arrive at M 94’s black hole, Steiner’s group eliminated the first data group, which represented all the stars, and then the information about the gas and dust. Only then did they manage to observe it. “The evidence that this black hole really exists was never convincing,” comments Fernandes, from UFSC. “As the signal it emits is very weak, traditional methods were unable to find it.”
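The filtering Steiner describes – removing the dominant components so that a faint residual signal stands out – can be sketched with synthetic data. Everything here is an assumption for illustration: a bright "stellar" component everywhere and a single weak feature hidden in a few spaxels, a stand-in for the faint source that traditional methods missed.

```python
import numpy as np

rng = np.random.default_rng(2)
n_obs, n_levels = 400, 64

# Synthetic cube flattened to (spaxels x energy levels): a bright
# 'stellar' component everywhere plus a faint feature in 5 spaxels.
stellar = rng.normal(10.0, 1.0, (n_obs, 1)) * np.ones((1, n_levels))
faint = np.zeros((n_obs, n_levels))
faint[:5, 30] = 0.5                      # the weak, localized signal
data = stellar + faint + rng.normal(0.0, 0.01, (n_obs, n_levels))

# PCA via singular value decomposition of the centered data.
centered = data - data.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# 'Filter': discard the first component (the dominant stellar light)
# and rebuild the residual from everything that remains.
S[0] = 0.0
residual = U @ np.diag(S) @ Vt

# The faint feature, buried under the stellar glare in the raw data,
# now dominates the residual at its spaxels.
print(residual[0, 30], residual[200, 30])
```

Subtracting one component after another in this way mimics the successive filters of the article: first the stars, then the gas and dust, until only the weak signal of interest remains.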

This strategy is somewhat different from that normally adopted in astrophysics and in other areas of science. Generally, the researcher formulates a question and uses the methods available to look for the answer. In this approach, says Steiner, the answer is given without the question having been posed. “What’s complicated is knowing how to interpret the results that the technique throws up”, adds Fernandes. They did not even ask if there was a black hole in M 94. They simply found it, hidden where no one thought of looking, in a similar way to the one they observed in another galaxy, M 58, or NGC 4579, located in the Virgo constellation.

In a recently concluded piece of stellar archeology, Steiner and Fernandes suggest an explanation for the M 94 black hole being found where it should not be: M 94, which formed 12 billion years ago, when the Universe was in its infancy, collided 2 billion years ago with a smaller galaxy. This encounter of cosmic proportions displaced the black hole from its original position. “When it reaches equilibrium,” says Steiner, “it will go back to the place where it should be, in the center of the galaxy, even if this takes a million years.”

The projects
1. Differentiation of models for Liners (06/05203-3); Modality: Master’s degree scholarship; Tutor: João Steiner – IAG/USP; Grant holder: Roberto Bertoldo Menezes.
2. Analysis of principal components from a sample of close Seyfert galaxies (05/03323-9); Modality: Master’s degree scholarship; Tutor: João Steiner – IAG/USP; Grant holder: Tiago Vecchi Ricci.

Scientific article
STEINER, J. E. et al. PCA Tomography: how to extract information from data cubes. Monthly Notices of the Royal Astronomical Society. v. 395, May 2009.