As modern astrophysical surveys deliver an unprecedented amount of data, from the imaging of hundreds of millions of distant galaxies to the mapping of cosmic radiation fields at ultra-high resolution, conventional data analysis methods are reaching their limits in both computational complexity and optimality. Deep Learning has rapidly been adopted by the astronomical community as a promising way of exploiting these forthcoming big-data datasets and of extracting the physical principles that underlie these complex observations. This has led to an unprecedented exponential growth of publications: in the last year alone, about 500 astrophysics papers mentioned deep learning or neural networks in their abstract. Yet, many of these works remain at an exploratory level and have not been translated into real scientific breakthroughs.
The goal of this ICML 2022 workshop is to bring together Machine Learning researchers and domain experts in the field of Astrophysics to discuss the key open issues which hamper the use of Deep Learning for scientific discovery.
An important aspect of the success of Machine Learning in Astrophysics is the creation of a two-way interdisciplinary dialog in which concrete data-analysis challenges can spur the development of dedicated Machine Learning tools. This workshop is designed to facilitate this dialog and will include a mix of interdisciplinary invited talks and panel discussions, providing an opportunity for ICML audiences to connect their research interests to concrete and outstanding scientific challenges.
We welcome in particular contributions that target, or report on, the following non-exhaustive list of open problems:
Contributions on these topics do not necessarily need to be Astrophysics-focused; work on relevant ML methodology, or on similar considerations in other scientific fields, is also welcome.
All times are in Eastern Time. Please visit the ICML Workshop Page for the live schedule (requires registration).
| Session | Details |
| --- | --- |
| Introduction | Welcome and Introduction of the Workshop |
| Keynote | Jakob Macke: Simulation-based inference and the places it takes us |
| Break | Morning Coffee Break |
| Spotlight | Aritra Ghosh: GaMPEN: An ML Framework for Estimating Galaxy Morphological Parameters and Quantifying Uncertainty |
| Spotlight | Ioana Ciuca: Unsupervised Learning for Stellar Spectra with Deep Normalizing Flows |
| Spotlight | Siddharth Mishra-Sharma: Strong Lensing Source Reconstruction Using Continuous Neural Fields |
| Keynote | Katherine Bouman: Capturing the First Portrait of Our Milky Way's Black Hole & Beyond |
| Keynote | Dustin Tran: Uncertainty Quantification in Deep Learning |
| Spotlight | Chirag Modi: Reconstructing the Universe with Variational self-Boosted Sampling |
| Spotlight | Yuchen Dang: TNT: Vision Transformer for Turbulence Simulations |
| Break | Afternoon Coffee Break |
| Keynote | Soledad Villar: Equivariant machine learning, structured like classical physics |
| Spotlight | Kwok Sun Tang: Galaxy Merger Reconstruction with Equivariant Graph Normalizing Flows |
| Spotlight | Denise Lanzieri: Hybrid Physical-Neural ODEs for Fast N-body Simulations |
| Spotlight | Tri Nguyen: Uncovering dark matter density profiles in dwarf galaxies with graph neural networks |
| Panel Discussion | Enabling Scientific Discoveries with ML |
| Poster Session | Main Poster Session |
| Title | Authors |
| --- | --- |
| Pixelated Reconstruction of Gravitational Lenses using Recurrent Inference Machines | Adam, Alexandre* |
| Unsupervised Learning for Stellar Spectra with Deep Normalizing Flows | Ciuca, Ioana*; Ting, Yuan-Sen |
| TNT: Vision Transformer for Turbulence Simulations | Dang, Yuchen*; Hu, Zheyuan; Cranmer, Miles; Eickenberg, Michael; Ho, Shirley |
| Calibrated Predictive Distributions for Photometric Redshifts | Dey, Biprateep*; Zhao, David; Andrews, Brett; Newman, Jeff; Izbicki, Rafael; Lee, Ann |
| Full-Sky Gravitational Lensing Simulations Using Generative Adversarial Networks | Fiedorowicz, Pier*; Rozo, Eduardo; Boruah, Supranta; Coulton, William; Ho, Shirley; Fabbian, Giulio |
| GaMPEN: An ML Framework for Estimating Galaxy Morphological Parameters and Quantifying Uncertainty | Ghosh, Aritra*; Urry, C. Megan; Rau, Amrit; Perreault-Levasseur, Laurence; Cranmer, Miles; Schawinski, Kevin; Stark, Dominic; Tian, Chuan; Ofman, Ryan |
| SIMBIG: Likelihood-Free Inference of Galaxy Clustering | Hahn, ChangHoon*; Abidi, Muntazir; Eickenberg, Michael; Ho, Shirley; Lemos, Pablo; Massara, Elena; Moradinezhad Dizgah, Azadeh; Régaldo-Saint Blancard, Bruno |
| Accelerated Galaxy SED Modeling using Amortized Neural Posterior Estimation | Hahn, ChangHoon*; Melchior, Peter M |
| Scalable Bayesian Inference for Detection and Deblending in Astronomical Images | Hansen, Derek L*; Mendoza, Ismael; Liu, Runjing; Pang, Ziteng; Zhao, Zhe; Avestruz, Camille; Regier, Jeffrey |
| Galaxies on graph neural networks: towards robust synthetic galaxy catalogs with deep generative models | Jagvaral, Yesukhei*; Mandelbaum, Rachel; Lanusse, Francois; Ravanbakhsh, Siamak; Singh, Sukhdeep; Campbell, Duncan |
| Learning Galaxy Properties from Merger Trees | Jespersen, Christian K*; Cranmer, Miles; Melchior, Peter M; Ho, Shirley; Somerville, Rachel; Gabrielpillai, Austen |
| Probabilistic Dalek - Emulator framework with probabilistic prediction for supernova tomography | Kerzendorf, Wolfgang E*; Chen, Nutan; van der Smagt, Patrick |
| Hybrid Physical-Neural ODEs for Fast N-body Simulations | Lanzieri, Denise*; Lanusse, Francois; Starck, Jean-Luc |
| Population-Level Inference of Strong Gravitational Lenses with Neural Network-Based Selection Correction | Legin, Ronan*; Stone, Connor J; Hezaveh, Yashar; Perreault-Levasseur, Laurence |
| Robust Simulation-Based Inference in Cosmology with Bayesian Neural Networks | Lemos, Pablo*; Cranmer, Miles; Abidi, Muntazir; Hahn, ChangHoon; Eickenberg, Michael; Massara, Elena; Yallup, David; Ho, Shirley |
| DeepBench: A library for simulating benchmark datasets for scientific analysis | Lewis, Ashia; Voetberg, Margaret*; Nord, Brian; Jones, Craig; Hložek, Renée; Ciprijanovic, Aleksandra; Perdue, Gabriel Nathan |
| On Estimating ROC Arc Length and Lower Bounding Maximal AUC for Imbalanced Classification | Liu, Song* |
| Autoencoding Galaxy Spectra | Melchior, Peter M*; Hahn, ChangHoon; Liang, Yan |
| Strong Lensing Source Reconstruction Using Continuous Neural Fields | Mishra-Sharma, Siddharth*; Yang, Ge |
| Reconstructing the Universe with Variational self-Boosted Sampling | Modi, Chirag*; Li, Yin; Blei, David |
| Bayesian Neural Networks for classification tasks in the Rubin big data era | Moller, Anais*; Main de Boissiere, Thibault |
| Don't Pay Attention to the Noise: Learning Self-supervised Representations of Light Curves with a Denoising Time Series Transformer | Morvan, Mario*; Nikolaou, Nikolaos; Yip, Kai; Waldmann, Ingo |
| Uncovering dark matter density profiles in dwarf galaxies with graph neural networks | Nguyen, Tri*; Mishra-Sharma, Siddharth; Necib, Lina |
| Astroconformer: Inferring Surface Gravity of Stars from Stellar Light Curves with Transformer | Pan, Jiashu*; Ting, Yuan-Sen; Yu, Jie |
| Learnable wavelet neural networks for cosmological inference | Pedersen, Chris*; Ho, Shirley; Eickenberg, Michael |
| A Convolutional Neural Network for Supernova Time-Series Classification | Qu, Helen* |
| Estimating Cosmological Constraints from Galaxy Cluster Abundance using Simulation-Based Inference | Reza, Moonzarin*; Zhang, Yuanyuan; Nord, Brian; Poh, Jason; Ciprijanovic, Aleksandra; Strigari, Louis |
| Fast Estimation of Physical Galaxy Properties using Simulation-Based Inference | Robeyns, Maxime H*; Walmsley, Mike; Fotopoulou, Sotiria; Aitchison, Laurence |
| Learning useful representations for radio astronomy "in the wild" with contrastive learning | Slijepcevic, Inigo V*; Scaife, Anna; Walmsley, Mike; Bowles, Micah R |
| An Unsupervised Learning Approach for Quasar Continuum Prediction | Sun, Zechang*; Ting, Yuan-Sen; Cai, Zheng |
| Galaxy Merger Reconstruction with Equivariant Graph Normalizing Flows | Tang, Kwok Sun*; Ting, Yuan-Sen |
| Reduced Order Model for Chemical Kinetics: A case study with Primordial Chemical Network | Tang, Kwok Sun*; Turk, Matthew |
| Inferring Structural Parameters of Low-Surface-Brightness-Galaxies with Uncertainty Quantification using Bayesian Neural Networks | Tanoglidis, Dimitrios*; Drlica-Wagner, Alex; Ciprijanovic, Aleksandra |
| LINNA: Likelihood Inference Neural Network Accelerator | To, Chun-Hao*; Rozo, Eduardo |
| Toward Galaxy Foundation Models with Hybrid Contrastive Learning | Walmsley, Mike*; Slijepcevic, Inigo; Bowles, Micah R; Scaife, Anna |
| Automated discovery of interpretable gravitational-wave population models | Wong, Kaze*; Cranmer, Miles |
| Neural Posterior Estimation with Differentiable Simulator | Zeghal, Justine*; Lanusse, Francois; Boucaud, Alexandre; Remy, Benjamin; Aubourg, Eric |
| Parameter Estimation in Realistic Binary Microlensing Light Curves with Neural Controlled Differential Equation | Zhao, Haimeng*; Zhu, Wei |
We invite all contributions in connection with the theme of the workshop as described here, from the fields of Astrophysics and Machine Learning, but also from other scientific fields facing similar challenges.
Original contributions and early-stage research are encouraged. Contributions presenting results that are recently published or currently under review are also welcome. The workshop will not have formal proceedings, but accepted submissions will be linked on the workshop webpage.
Submissions, in the form of extended abstracts, need to adhere to the ICML 2022 format (LaTeX style files), be anonymized, and be no longer than 4 pages (excluding references). After double-blind review, a limited set of submissions will be selected for contributed talks, and a wider set of submissions will be selected for poster presentations.
Please submit your anonymized extended abstract through CMT at https://cmt3.research.microsoft.com/ML4Astro2022 before May 23rd, 23:59 AOE.
ICML 2022 is currently planned as an in-person event. As such, this workshop is currently assuming a hybrid format, with physical poster sessions and in-person speakers, but with support for virtual elements to facilitate the participation of people unable to travel. We encourage all interested participants (regardless of their ability to physically travel to ICML) to submit an extended abstract.
Registration for ICML workshops is handled through the main ICML conference registration here. The workshop will guarantee ICML registration for participants with accepted contributions.
Inquiries regarding the workshop can be directed to firstname.lastname@example.org
To prepare your poster, please use the following dimensions:
In addition to preparing the physical poster, please follow the instructions at https://wiki.eventhosts.cc/en/reference/posteruploads to upload the camera-ready version of your poster to the ICML website. You will be able to submit your poster at this link: https://icml.cc/PosterUpload
Scientific Organizing Committee for the ICML 2022 Machine Learning for Astrophysics workshop: