
AI to help study first images from James Webb Space Telescope

To find dark matter and early galaxies, Morpheus could be The One

Scientists around the world are gearing up to study the first images taken by the James Webb Space Telescope, which are to be released on July 12.

Some astronomers will be running machine-learning algorithms on the data to detect and classify galaxies in deep space at a level of detail never seen before. Brant Robertson, an astrophysics professor at the University of California, Santa Cruz, in the US, believes the telescope's snaps will lead to breakthroughs that will help us better understand how the universe formed some 13.7 billion years ago.

"The JWST data is exciting because it gives us an unprecedented window on the infrared universe, with a resolution that we've only dreamed about until now," he told The Register. Robertson helped develop Morpheus, a machine-learning model trained to pore over pixels and pick out blurry blob-shaped objects from the deep abyss of space and determine whether these structures are galaxies or not, and if so, of what type.

The software will be used as part of the COSMOS-Webb program, the largest and most ambitious project the telescope will undertake in its first year. Robertson and a team of nearly 50 researchers will survey half a million galaxies across a patch of the sky, hunting for the oldest, fully evolved galaxies to study how dark matter evolved over time as these structures began hosting stars; Morpheus will automate that classification work.

A composite of separate exposures taken from 2003 to 2012 with the Hubble Space Telescope. Image Credit: NASA/ESA ...

Robertson and his colleagues have updated Morpheus to adapt to data from the JWST. "We have now integrated attention methods that allow for larger regions of images to be classified at a time, which resulted in a speed-up of roughly a factor of a hundred. The newer Morpheus can classify larger images faster and more reliably than before," he told us.
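As a hedged illustration of the idea, and not the real Morpheus implementation, the sketch below shows how an attention layer lets a model weigh many image patches from a large cutout in one pass, instead of classifying small windows one at a time. Patch size, embedding width, and head count are arbitrary choices for the example.

```python
# Illustrative sketch only: an attention layer over image patches, so that a model
# can reason about a large region of an image in a single pass. The patch size,
# embedding width, and number of heads here are assumptions, not Morpheus's values.
import tensorflow as tf

PATCH = 16   # pixels per patch side
EMBED = 64   # embedding width per patch
HEADS = 4

def patchify(images: tf.Tensor) -> tf.Tensor:
    """Split (batch, H, W, C) images into a sequence of flattened patches."""
    patches = tf.image.extract_patches(
        images,
        sizes=[1, PATCH, PATCH, 1],
        strides=[1, PATCH, PATCH, 1],
        rates=[1, 1, 1, 1],
        padding="VALID",
    )
    batch = tf.shape(images)[0]
    return tf.reshape(patches, (batch, -1, patches.shape[-1]))

images = tf.random.normal((2, 256, 256, 4))               # two fake 4-band cutouts
tokens = tf.keras.layers.Dense(EMBED)(patchify(images))   # embed each patch
attended = tf.keras.layers.MultiHeadAttention(num_heads=HEADS, key_dim=EMBED)(
    tokens, tokens)                                       # every patch attends to every other
print(attended.shape)                                     # (2, 256, 64): 256 patches per image
```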

The latest version of the software has new image processing capabilities too, such as deblending, which can separate out astronomical objects that appear to overlap in the sky, he explained.
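To show what deblending means in practice, here is a short example using the open-source photutils package. The article doesn't say Morpheus relies on photutils; this is simply a common way to do it: two overlapping fake sources are first detected as a single segment, then split apart.

```python
# A sketch of what "deblending" means in practice, using photutils (not necessarily
# what Morpheus uses internally): overlapping sources that land in one detected
# segment are split into separately labelled objects.
import numpy as np
from photutils.segmentation import detect_threshold, detect_sources, deblend_sources

# Fake image: two Gaussian blobs close enough to overlap on the sky, plus noise.
yy, xx = np.mgrid[0:101, 0:101]
def blob(x0, y0):
    return np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * 4.0 ** 2))
image = 100 * blob(45, 50) + 80 * blob(58, 50) + np.random.normal(0, 1, (101, 101))

threshold = detect_threshold(image, nsigma=3.0)           # per-pixel detection threshold
segments = detect_sources(image, threshold, npixels=10)   # the two blobs merge into one segment
deblended = deblend_sources(image, segments, npixels=10, nlevels=32, contrast=0.001)

print(f"{segments.nlabels} segment(s) before deblending, {deblended.nlabels} after")
```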

These abilities will come in handy as the JWST provides a wider and deeper view of the universe than ever before, with each image containing more structures than could feasibly be studied manually by eye. Morpheus was initially trained on 7,600 galaxy images snapped by NASA's Hubble Space Telescope, and Robertson reckons it'll have to be retrained to better adapt to data from the JWST.

"We will try to apply Morpheus as is on the JWST data without retraining first, and check performance for objects in regions of the sky where both Hubble and JWST data exist," he told us.

"It's likely we will need to retrain Morpheus based on the JWST data given that JWST data are redder, extend over a wider range of wavelengths, and point spread function - basically what a star looks like through the telescope optics – differs from Hubble."

Morpheus will run on UC Santa Cruz's supercomputer Lux, which is armed with 80 CPU-only compute nodes, each containing two 20-core Intel Cascade Lake Xeon processors, and 28 GPU-only nodes containing two Nvidia V100 GPUs each. "Once the data is in hand, running Morpheus on all the JWST images will only take a few days at most on Lux," Robertson said.
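The job parallelises naturally: each large mosaic can be cut into tiles and the tiles farmed out to workers, which on Lux would map to GPU nodes. The toy sketch below uses plain Python processes and a placeholder classifier just to show the shape of that workflow; the tile size, worker count, and dummy classify() function are arbitrary assumptions.

```python
# Toy sketch of tile-parallel inference: split a large mosaic into tiles and
# process the tiles concurrently. Real runs would dispatch to GPU nodes; here
# the "workers" are ordinary processes and classify() is a placeholder.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

TILE = 512

def classify(tile: np.ndarray) -> np.ndarray:
    """Placeholder for per-tile model inference."""
    return np.zeros(tile.shape[:2], dtype=np.uint8)

def tiles(mosaic: np.ndarray):
    for y in range(0, mosaic.shape[0], TILE):
        for x in range(0, mosaic.shape[1], TILE):
            yield mosaic[y:y + TILE, x:x + TILE]

if __name__ == "__main__":
    mosaic = np.random.rand(4096, 4096, 4).astype("float32")   # fake mosaic
    with ProcessPoolExecutor(max_workers=8) as pool:
        results = list(pool.map(classify, tiles(mosaic)))
    print(f"classified {len(results)} tiles")
```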

The long-awaited ten-billion-dollar telescope was finally launched on Christmas Day last year after repeated delays. Ground control spent months perfectly aligning its complex 18-mirror system before the instrument began detecting its first photons in February. ®
