The Milky Way is a zombie. No, not really, it doesn’t go around eating other galaxies’ brains. But it did “die” once, before flaring back to life. That’s what a Japanese scientist has ascertained after peering into the chemical make-up of our galaxy’s stars. In a large section of the Milky Way, the stars can be divided into two distinct populations based on their chemical compositions.
The first group is more abundant in what are known as α elements – oxygen, magnesium, silicon, sulphur, calcium and titanium. The second is poorer in α elements and markedly richer in iron. The existence of two such distinct populations implies that something different was happening during their formation. But the precise mechanism behind the split was unclear.
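To make the distinction concrete, here is a minimal, purely illustrative Python sketch of how stars could be sorted into the two groups by their α-to-iron abundance ratio. The threshold and the example measurements are hypothetical placeholders, not values from the study.

```python
# Illustrative sketch only: sorting stars into the two populations described
# above by their abundance ratios. The threshold and the example measurements
# are hypothetical, not taken from Noguchi's study.

def classify_star(alpha_to_iron: float, threshold: float = 0.2) -> str:
    """Label a star by its [alpha/Fe] ratio (log-scaled, relative to the Sun).

    Stars above the threshold are alpha-rich (the first group); stars below it
    are relatively iron-rich (the second group).
    """
    return ("alpha-rich (first population)"
            if alpha_to_iron >= threshold
            else "iron-rich (second population)")

# Hypothetical measurements: (star name, [alpha/Fe])
stars = [("Star A", 0.35), ("Star B", 0.05), ("Star C", 0.28), ("Sun", 0.00)]

for name, alpha_fe in stars:
    print(f"{name}: [alpha/Fe] = {alpha_fe:+.2f} -> {classify_star(alpha_fe)}")
```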
Astronomer Masafumi Noguchi of Tohoku University believes his modelling provides the answer. The two populations represent two distinct periods of star formation, separated by a quiescent, or “dormant”, period in which no stars formed. Drawing on the theory of cold flow galactic accretion proposed back in 2006, Noguchi has modelled the evolution of the Milky Way over a 10-billion-year period. The cold flow model was originally suggested for much larger galaxies, proposing that massive galaxies form their stars in two stages.
Because of the chemical composition dichotomy of its stars, Noguchi believes this also applies to the Milky Way. That’s because the chemical composition of stars depends on the gases from which they form. And, in the early Universe, certain elements – such as the heavier metals – hadn’t yet arrived on the scene, since they are forged inside stars and only spread once those stars have gone supernova.
In the first stage, according to Noguchi’s model, the galaxy accretes cold gas from outside. This gas coalesces to form the first generation of stars. After about 10 million years, a relatively short timescale in cosmic terms, some of those stars died in Type II supernovae, spreading α elements throughout the galaxy, where they were incorporated into new stars.
But, according to the model, it all went a bit belly-up after about 3 billion years.
“When shock waves appeared and heated the gas to high temperatures 7 billion years ago, the gas stopped flowing into the galaxy and stars ceased to form,” a release from Tohoku University says.
During a hiatus of about 2 billion years, a second round of supernovae took place – the much longer-timescale Type Ia supernovae, which typically occur after a stellar lifespan of about 1 billion years. It’s in these supernovae that iron is forged and spewed out into the interstellar medium. When the gas cooled enough to start forming stars again – about 5 billion years ago – those stars had a much higher proportion of iron than the earlier generation.
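Pulling those round numbers together, here is a small, hedged Python sketch of the timeline the article describes. The boundaries (roughly 10, 7 and 5 billion years ago) come straight from the figures quoted above, but the function itself is only an illustration, not part of Noguchi’s modelling.

```python
# Rough timeline sketch using only the round numbers quoted in the article:
# the modelled period starts ~10 billion years ago, the gas is shock-heated
# ~7 billion years ago, and star formation resumes ~5 billion years ago.
# The boundaries are approximate illustrations, not model outputs.

def epoch(billion_years_ago: float) -> str:
    """Return the phase of the modelled Milky Way at a given look-back time."""
    if billion_years_ago > 10:
        return "before the modelled period"
    if billion_years_ago > 7:
        return "first star-forming stage: cold gas inflow, Type II supernovae spread alpha elements"
    if billion_years_ago > 5:
        return "dormant stage: gas shock-heated, Type Ia supernovae add iron to the interstellar medium"
    return "second star-forming stage: cooled gas forms iron-richer stars"

for t in (9.0, 6.0, 4.0):
    print(f"{t:.1f} billion years ago -> {epoch(t)}")
```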
That second generation includes our Sun, which is about 4.6 billion years old.

Noguchi’s model is consistent with recent research on our nearest large galactic neighbour, Andromeda, which is thought to be in the same size class as the Milky Way. In 2017, a team of researchers published a paper showing that Andromeda’s star formation also occurred in two stages, with a relatively quiescent period in between.
If the model holds up, it may mean that galaxy evolution models need to be revised – that, while smaller dwarf galaxies experience continuous star formation, a “dead” period may be the norm for massive ones. If future observations confirm it, who’s up for renaming our galaxy Frankenstein?