
Mass Estimation of Planck Galaxy Clusters using Deep Learning

Galaxy cluster masses can be inferred indirectly from measurements in the X-ray band, from the Sunyaev-Zeldovich (SZ) effect signal, or from optical observations. Unfortunately, all of these estimates are affected by some bias. Here, we provide an independent estimate of the masses of the clusters in the Planck PSZ2 catalogue using a machine-learning method. We train a Convolutional Neural Network (CNN) model on mock SZ observations from The Three Hundred (the300) hydrodynamic simulations and use it to infer cluster masses from the real maps of the Planck clusters. The advantage of the CNN is that it makes no a priori assumption about the symmetry of the cluster’s gas distribution and no additional hypothesis about the cluster’s physical state. We compare the cluster masses from the CNN model with those derived by Planck and conclude that the presence of a mass bias is compatible with the simulation results.
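As a rough illustration of the kind of model described above, the sketch below shows a minimal CNN regressor that maps an SZ (Compton-y) image of a cluster to a single scalar mass estimate. The layer sizes, the assumed 64x64 map resolution, the log-mass target, and the training details are illustrative assumptions for this sketch only, not the architecture or setup used in the paper.

```python
# Minimal sketch (PyTorch) of a CNN that regresses a cluster mass from an SZ map.
# All hyperparameters here are assumptions for illustration.
import torch
import torch.nn as nn

class SZMassCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Three convolution + pooling stages extract features from the y-map.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Fully connected head outputs one number per cluster (e.g. log10 of the mass).
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, y_map):
        return self.regressor(self.features(y_map))

if __name__ == "__main__":
    model = SZMassCNN()
    # Stand-ins for a batch of mock Compton-y maps and hypothetical log10(M) targets.
    fake_maps = torch.randn(4, 1, 64, 64)
    fake_logmass = torch.rand(4, 1) + 14.0
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss = nn.functional.mse_loss(model(fake_maps), fake_logmass)
    loss.backward()
    optimizer.step()
    print(f"one training step done, MSE loss = {loss.item():.3f}")
```

In this setup the network is trained only on simulated maps with known masses; applying it to the real Planck maps then yields mass estimates that are independent of any hydrostatic-equilibrium or symmetry assumption.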

Daniel de Andres, Weiguang Cui, Florian Ruppin, Marco De Petris, Gustavo Yepes, Ichraf Lahouli, Gianmarco Aversano, Romain Dupuis, and Mahmoud Jarraya, Mass Estimation of Planck Galaxy Clusters using Deep Learning, EPJ Web of Conferences, January 2022, 257, 00013.

DOI: https://doi.org/10.1051/epjconf/202225700013

Click here to access the paper.

Related Posts

Investigating a Feature Unlearning Bias Mitigation Technique for Cancer-type Bias in AutoPet Dataset

We proposed a feature unlearning technique to reduce cancer-type bias, which improved segmentation accuracy while promoting fairness across sub-groups, even with limited data.
Read More

Muppet: A Modular and Constructive Decomposition for Perturbation-based Explanation Methods

Explainable AI has recently received growing attention, driven by an increasing awareness of the need for transparent and accountable AI. In this paper, we propose a novel methodology that decomposes any state-of-the-art perturbation-based explainability approach into four blocks. In addition, we provide Muppet, an open-source Python library for explainable AI.
Read More