
World’s most powerful supercomputers achieve most accurate black hole accretion simulation ever

The team used two of the world’s most powerful supercomputers, each capable of performing a quintillion operations per second.

Representative image. NASA

In a breakthrough, a team of computational astrophysicists has developed the most comprehensive model of luminous black hole accretion to date.

The team used two of the world’s most powerful supercomputers, Frontier and Aurora, to calculate the flow of matter into black holes.

The new achievement builds on decades of black hole research and provides a new method for understanding the cosmic giants at the heart of countless galaxies.

The most accurate black hole simulation

The team, led by researchers from the Institute for Advanced Study and the Flatiron Institute’s Center for Computational Astrophysics, claims their model provides the most accurate simulation of black hole accretion to date. They published their findings in a new paper in The Astrophysical Journal.

“This is the first time we’ve been able to see what happens when the most important physical processes in black hole accretion are included accurately,” study lead author Lizhong Zhang explained in a press statement.

“These systems are extremely nonlinear—any over-simplifying assumption can completely change the outcome,” Zhang continued. “What’s most exciting is that our simulations now reproduce remarkably consistent behaviors across black hole systems seen in the sky, from ultraluminous X-ray sources to X-ray binaries. In a sense, we’ve managed to ‘observe’ these systems not through a telescope, but through a computer.”

Black hole accretion is the process by which a black hole grows by capturing surrounding matter. As this matter spirals inward, it forms a swirling, superheated disk called an accretion disk. The glowing ring of hot gas around a black hole was famously imaged for the first time by the Event Horizon Telescope team in 2019.
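For context, the scale of “luminous” accretion is conventionally set by the Eddington luminosity, at which outward radiation pressure on infalling gas balances gravity (a textbook formula, not a result of the new paper):

\[
L_{\rm Edd} = \frac{4\pi G M m_p c}{\sigma_T} \approx 1.3\times10^{38}\,\left(\frac{M}{M_\odot}\right)\ \mathrm{erg\ s^{-1}},
\]

where $m_p$ is the proton mass and $\sigma_T$ the Thomson scattering cross-section. The ultraluminous X-ray sources Zhang mentions shine near or above this limit, which is one reason their accretion flows are so difficult to model.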

Black holes vary greatly in size. One of the largest candidates ever identified, Phoenix A, has an estimated mass of roughly 100 billion times that of the Sun.

Because of these mind-bending scales and the complexity of simulating black hole accretion, previous models have relied on shortcuts that simplify how radiation flows through the infalling gas. “Previous methods used approximations that treat radiation as a sort of fluid, which does not reflect its actual behavior,” explained Zhang.
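To see what is lost in that shortcut, here is a minimal sketch in standard flat-space notation (the team’s actual method works in full general relativity). Exact transport evolves the specific intensity $I(\mathbf{x},\hat{n},t)$ along every direction $\hat{n}$:

\[
\frac{1}{c}\frac{\partial I}{\partial t} + \hat{n}\cdot\nabla I = \eta - \chi I,
\]

where $\eta$ and $\chi$ are the emission and extinction coefficients. Fluid-like methods instead keep only the angular moments of $I$, the radiation energy density $E$ and flux $\mathbf{F}$:

\[
\frac{\partial E}{\partial t} + \nabla\cdot\mathbf{F} = -c\,G^{0},
\qquad
\frac{1}{c^{2}}\frac{\partial \mathbf{F}}{\partial t} + \nabla\cdot\mathsf{P} = -\mathbf{G},
\]

where $(G^{0},\mathbf{G})$ couples the radiation to the gas and the pressure tensor $\mathsf{P}$ must be supplied by a closure relation. That guessed closure is the “fluid” assumption; solving the transfer equation for $I$ directly removes it.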

Simulating stellar-mass black holes

The team behind the new model used insights gathered over decades to develop new algorithms, which allowed them to tackle the problem without such approximations. “Ours is the only algorithm that exists at the moment that provides a solution by treating radiation as it really is in general relativity,” Zhang said.

The team’s paper focused specifically on stellar-mass black holes, those with roughly 10 times the mass of the Sun, which are small compared to their supermassive cousins. Unlike supermassive black holes, stellar-mass black holes change on human timescales of hours and even minutes, making them ideal for charting how these systems evolve.
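The reason timescales shrink with mass is simple dimensional analysis (standard physics, not specific to the paper): the natural gravitational timescale of a black hole is

\[
t_g = \frac{GM}{c^{3}} \approx 4.9\times10^{-6}\,\mathrm{s}\times\left(\frac{M}{M_\odot}\right),
\]

so a 10-solar-mass black hole has $t_g \approx 5\times10^{-5}$ s, while a $10^{8}$-solar-mass one has $t_g \approx 490$ s. A disk process spanning the same number of $t_g$ therefore plays out ten million times faster around the stellar-mass object, compressing its evolution into minutes and hours rather than millennia.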

The scientists found that, as it spirals toward a stellar-mass black hole, matter settles into a turbulent, radiation-intense disk that drives powerful winds and can also launch jets. The team found that their model fit the spectrum from the limited observational data for these types of black holes remarkably well.

For their research, Zhang and his team used the supercomputers Frontier and Aurora, housed at Oak Ridge National Laboratory and Argonne National Laboratory, respectively. These “exascale” machines are among the most powerful supercomputers in the world, each capable of performing more than a quintillion (10^18) operations per second.

“What makes this project unique is, on the one hand, the time and effort it has taken to develop the applied mathematics and software capable of modeling these complex systems, and, on the other hand, having a very large allocation on the world’s largest supercomputers to perform these calculations,” explained James Stone, Professor in the Institute for Advanced Study’s School of Natural Sciences, and co-author on the paper. “Now the task is to understand all the science that is coming out of it.”


