Machine Learning for Astrophysics

Workshop at the Thirty-ninth International Conference on Machine Learning (ICML 2022), July 22nd, Baltimore, MD

Rationale

As modern astrophysical surveys deliver an unprecedented amount of data, from the imaging of hundreds of millions of distant galaxies to the mapping of cosmic radiation fields at ultra-high resolution, conventional data analysis methods are reaching their limits in both computational complexity and optimality. Deep Learning has rapidly been adopted by the astronomical community as a promising way of exploiting these forthcoming big datasets and of extracting the physical principles that underlie these complex observations. This has led to an exponential growth of publications: in the last year alone, about 500 astrophysics papers mentioned deep learning or neural networks in their abstract. Yet many of these works remain at an exploratory level and have not been translated into real scientific breakthroughs.

The goal of this ICML 2022 workshop is to bring together Machine Learning researchers and domain experts in the field of Astrophysics to discuss the key open issues which hamper the use of Deep Learning for scientific discovery.

An important aspect of the success of Machine Learning in Astrophysics is creating a two-way interdisciplinary dialog in which concrete data-analysis challenges can spur the development of dedicated Machine Learning tools. This workshop is designed to facilitate this dialog and will include a mix of interdisciplinary invited talks and panel discussions, providing an opportunity for ICML audiences to connect their research interests to concrete and outstanding scientific challenges.

We welcome in particular contributions that target, or report on, the following non-exhaustive list of open problems:

  • Efficient high-dimensional likelihood-based and simulation-based inference
  • Robustness to covariate shifts and model misspecification
  • Anomaly and outlier detection, search for rare signals with ML
  • Methods for accurate uncertainty quantification
  • Methods for improved interpretability of models
  • (Astro)-physics informed models, models which preserve symmetries and equivariances
  • Deep Learning for accelerating numerical simulations
  • Benchmarking and deployment of ML models for large-scale data analysis

Contributions on these topics need not be Astrophysics-focused; works on relevant ML methodology, or on similar considerations in other scientific fields, are also welcome.
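As a concrete toy illustration of the first topic above: simulation-based inference replaces an explicit likelihood with a forward simulator and infers parameters from simulated-versus-observed comparisons. The sketch below uses rejection ABC (Approximate Bayesian Computation), one of the simplest simulation-based inference schemes; the Gaussian toy model and all names are illustrative assumptions, not part of any workshop material.

```python
# Toy simulation-based inference via rejection ABC: the "likelihood" is
# never written down; only a simulator is needed. Illustrative example.
import random

random.seed(0)

def simulator(theta, n=50):
    """Forward model: n draws from a Gaussian with unknown mean theta."""
    return [random.gauss(theta, 1.0) for _ in range(n)]

def summary(data):
    """Summary statistic used to compare simulations to data."""
    return sum(data) / len(data)

# "Observed" data generated at a known ground-truth parameter.
true_theta = 2.0
observed = simulator(true_theta)
obs_stat = summary(observed)

# Rejection ABC: draw theta from the prior, simulate, and keep draws
# whose summary statistic lands within a tolerance of the observed one.
accepted = []
for _ in range(20000):
    theta = random.uniform(-5.0, 5.0)  # uniform prior on theta
    if abs(summary(simulator(theta)) - obs_stat) < 0.1:
        accepted.append(theta)

posterior_mean = sum(accepted) / len(accepted)
```

The accepted draws approximate the posterior over theta; modern neural approaches (e.g. neural posterior estimation, featured in several accepted contributions) replace this rejection step with a learned density estimator, but the underlying idea is the same.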


Program

Confirmed Invited Speakers and Panelists


Josh Bloom
UC Berkeley

Katie Bouman
Caltech

Daniela Huppenkothen
SRON


Jakob Macke
Tübingen University

Laurence Perreault-Levasseur
University of Montreal

Dustin Tran
Google


George Stein
UC Berkeley

Soledad Villar
Johns Hopkins University

Workshop Schedule

All times are in Eastern Time. Please visit the ICML Workshop Page for the live schedule (requires registration).

8:45-9:00 Introduction Welcome and Introduction to the Workshop
9:00-10:00 Keynote Jakob Macke: Simulation-based inference and the places it takes us
10:00-10:30 Break Morning Coffee Break
10:30-10:45 Spotlight Aritra Ghosh: GaMPEN: An ML Framework for Estimating Galaxy Morphological Parameters and Quantifying Uncertainty
10:45-11:00 Spotlight Ioana Ciuca: Unsupervised Learning for Stellar Spectra with Deep Normalizing Flows
11:00-11:15 Spotlight Siddharth Mishra-Sharma: Strong Lensing Source Reconstruction Using Continuous Neural Fields
11:15-12:15 Keynote Katherine Bouman: Capturing the First Portrait of Our Milky Way's Black Hole & Beyond
12:15-13:30 Break Lunch Break
13:30-14:30 Keynote Dustin Tran: Uncertainty Quantification in Deep Learning
14:30-14:45 Spotlight Chirag Modi: Reconstructing the Universe with Variational self-Boosted Sampling
14:45-15:00 Spotlight Yuchen Dang: TNT: Vision Transformer for Turbulence Simulations
15:00-15:30 Break Afternoon Coffee Break
15:30-16:30 Keynote Soledad Villar: Equivariant machine learning, structured like classical physics
16:30-16:45 Spotlight Kwok Sun Tang: Galaxy Merger Reconstruction with Equivariant Graph Normalizing Flows
16:45-17:00 Spotlight Denise Lanzieri: Hybrid Physical-Neural ODEs for Fast N-body Simulations
17:00-17:15 Spotlight Tri Nguyen: Uncovering dark matter density profiles in dwarf galaxies with graph neural networks
17:30-18:30 Panel Discussion Enabling Scientific Discoveries with ML
18:30-20:00 Poster Session Main Poster Session


Accepted Contributions

Pixelated Reconstruction of Gravitational Lenses using Recurrent Inference Machines Adam, Alexandre*
Unsupervised Learning for Stellar Spectra with Deep Normalizing Flows Ciuca, Ioana*; Ting, Yuan-Sen
TNT: Vision Transformer for Turbulence Simulations Dang, Yuchen*; Hu, Zheyuan; Cranmer, Miles; Eickenberg, Michael; Ho, Shirley
Calibrated Predictive Distributions for Photometric Redshifts Dey, Biprateep*; Zhao, David; Andrews, Brett; Newman, Jeff; Izbicki, Rafael; Lee, Ann
Full-Sky Gravitational Lensing Simulations Using Generative Adversarial Networks Fiedorowicz, Pier*; Rozo, Eduardo; Boruah, Supranta; Coulton, William; Ho, Shirley; Fabbian, Giulio
GaMPEN: An ML Framework for Estimating Galaxy Morphological Parameters and Quantifying Uncertainty Ghosh, Aritra*; Urry, C. Megan; Rau, Amrit; Perreault-Levasseur, Laurence; Cranmer, Miles; Schawinski, Kevin; Stark, Dominic; Tian, Chuan; Ofman, Ryan
SIMBIG: Likelihood-Free Inference of Galaxy Clustering Hahn, ChangHoon*; Abidi, Muntazir; Eickenberg, Michael; Ho, Shirley; Lemos, Pablo; Massara, Elena; Moradinezhad Dizgah, Azadeh; Régaldo-Saint Blancard, Bruno
Accelerated Galaxy SED Modeling using Amortized Neural Posterior Estimation Hahn, ChangHoon*; Melchior, Peter M
Scalable Bayesian Inference for Detection and Deblending in Astronomical Images Hansen, Derek L*; Mendoza, Ismael; Liu, Runjing; Pang, Ziteng; Zhao, Zhe; Avestruz, Camille; Regier, Jeffrey
Galaxies on graph neural networks: towards robust synthetic galaxy catalogs with deep generative models Jagvaral, Yesukhei*; Mandelbaum, Rachel; Lanusse, Francois; Ravanbakhsh, Siamak; Singh, Sukhdeep; Campbell, Duncan
Learning Galaxy Properties from Merger Trees Jespersen, Christian K*; Cranmer, Miles; Melchior, Peter M; Ho, Shirley; Somerville, Rachel; Gabrielpillai, Austen
Probabilistic Dalek - Emulator framework with probabilistic prediction for supernova tomography Kerzendorf, Wolfgang E*; Chen, Nutan; van der Smagt, Patrick
Hybrid Physical-Neural ODEs for Fast N-body Simulations Lanzieri, Denise*; Lanusse, Francois; Starck, Jean-Luc
Population-Level Inference of Strong Gravitational Lenses with Neural Network-Based Selection Correction Legin, Ronan*; Stone, Connor J; Hezaveh, Yashar; Perreault-Levasseur, Laurence
Robust Simulation-Based Inference in Cosmology with Bayesian Neural Networks Lemos, Pablo*; Cranmer, Miles; Abidi, Muntazir; Hahn, Chang Hoon; Eickenberg, Michael; Massara, Elena; Yallup, David; Ho, Shirley
DeepBench: A library for simulating benchmark datasets for scientific analysis Lewis, Ashia; Voetberg, Margaret*; Nord, Brian; Jones, Craig; Hložek, Renée; Ciprijanovic, Aleksandra; Perdue, Gabriel Nathan
On Estimating ROC Arc Length and Lower Bounding Maximal AUC for Imbalanced Classification Liu, Song*
Autoencoding Galaxy Spectra Melchior, Peter M*; Hahn, ChangHoon; Liang, Yan
Strong Lensing Source Reconstruction Using Continuous Neural Fields Mishra-Sharma, Siddharth*; Yang, Ge
Reconstructing the Universe with Variational self-Boosted Sampling Modi, Chirag*; Li, Yin; Blei, David
Bayesian Neural Networks for classification tasks in the Rubin big data era Moller, Anais*; Main de Boissiere, Thibault
Don't Pay Attention to the Noise: Learning Self-supervised Representations of Light Curves with a Denoising Time Series Transformer Morvan, Mario*; Nikolaou, Nikolaos; Yip, Kai; Waldmann, Ingo
Uncovering dark matter density profiles in dwarf galaxies with graph neural networks Nguyen, Tri*; Mishra-Sharma, Siddharth; Necib, Lina
Astroconformer: Inferring Surface Gravity of Stars from Stellar Light Curves with Transformer Pan, Jiashu*; Ting, Yuan-Sen; Yu, Jie
Learnable wavelet neural networks for cosmological inference Pedersen, Chris*; Ho, Shirley; Eickenberg, Michael
A Convolutional Neural Network for Supernova Time-Series Classification Qu, Helen*
Estimating Cosmological Constraints from Galaxy Cluster Abundance using Simulation-Based Inference Reza, Moonzarin*; Zhang, Yuanyuan; Nord, Brian; Poh, Jason; Ciprijanovic, Aleksandra; Strigari, Louis
Fast Estimation of Physical Galaxy Properties using Simulation-Based Inference Robeyns, Maxime H*; Walmsley, Mike; Fotopoulou, Sotiria; Aitchison, Laurence
Learning useful representations for radio astronomy “in the wild” with contrastive learning Slijepcevic, Inigo V*; Scaife, Anna; Walmsley, Mike; Bowles, Micah R
An Unsupervised Learning Approach for Quasar Continuum Prediction Sun, Zechang*; Ting, Yuan-Sen; Cai, Zheng
Galaxy Merger Reconstruction with Equivariant Graph Normalizing Flows Tang, Kwok Sun*; Ting, Yuan-Sen
Reduced Order Model for Chemical Kinetics: A case study with Primordial Chemical Network Tang, Kwok Sun*; Turk, Matthew
Inferring Structural Parameters of Low-Surface-Brightness-Galaxies with Uncertainty Quantification using Bayesian Neural Networks Tanoglidis, Dimitrios*; Drlica-Wagner, Alex; Ciprijanovic, Aleksandra
LINNA: Likelihood Inference Neural Network Accelerator To, Chun-Hao*; Rozo, Eduardo
Toward Galaxy Foundation Models with Hybrid Contrastive Learning Walmsley, Mike*; Slijepcevic, Inigo; Bowles, Micah R; Scaife, Anna
Automated discovery of interpretable gravitational-wave population models Wong, Kaze*; Cranmer, Miles
Neural Posterior Estimation with Differentiable Simulator Zeghal, Justine*; Lanusse, Francois; Boucaud, Alexandre; Remy, Benjamin; Aubourg, Eric
Parameter Estimation in Realistic Binary Microlensing Light Curves with Neural Controlled Differential Equation Zhao, Haimeng*; Zhu, Wei


Call for Abstracts

We invite all contributions in connection with the theme of the workshop as described here, from the fields of Astrophysics and Machine Learning, but also from other scientific fields facing similar challenges.

Original contributions and early research works are encouraged. Contributions presenting recently published results, or results currently under review, are also welcome. The workshop will not have formal proceedings, but accepted submissions will be linked on the workshop webpage.

Submissions, in the form of extended abstracts, need to adhere to the ICML 2022 format (LaTeX style files), be anonymized, and be no longer than 4 pages (excluding references). After double-blind review, a limited set of submissions will be selected for contributed talks, and a wider set of submissions will be selected for poster presentations.

Please submit your anonymized extended abstract through CMT at https://cmt3.research.microsoft.com/ML4Astro2022 before May 23rd, 23:59 AOE.



Logistics and FAQs

ICML 2022 is currently planned as an in-person event. As such, this workshop is planned in a hybrid format, with physical poster sessions and in-person speakers, but with support for virtual elements to facilitate the participation of people unable to travel. We encourage all interested participants (regardless of their ability to physically travel to ICML) to submit an extended abstract.

Registration for ICML workshops is handled through the main ICML conference registration here. The workshop can guarantee ICML registration for participants with accepted contributions.

Inquiries regarding the workshop can be directed to icml2022ml4astro@gmail.com


Important Dates

  • Submission deadline: May 23rd
  • Author Notification: June 10th
  • Slideslive upload deadline for online talks: July 1st (SlidesLive will only guarantee your recording will be available in time for the conference if you respect the official July 1st deadline, so we highly encourage you to submit it by then.)
  • Camera-ready paper deadline: July 10th
  • Camera-ready poster deadline: July 15th (see instructions below)
  • Workshop date: July 22nd


Instructions for Posters

To prepare your poster, please use the following dimensions:

  • Recommended size: 24”W x 36”H, 61 x 90 cm (A1 portrait)
  • Maximum size: 48”W x 36”H, 122 x 90 cm (A0 landscape)
  • Please use lightweight paper for printing; poster boards may not be available for workshops, but you will be able to tape your poster to the wall using provided tape.

In addition to the physical poster, please follow the instructions at https://wiki.eventhosts.cc/en/reference/posteruploads to upload the camera-ready version of your poster to the ICML website. You will be able to submit your poster at this link: https://icml.cc/PosterUpload



SOC

Scientific Organizing Committee for the ICML 2022 Machine Learning for Astrophysics workshop:


Francois Lanusse
CNRS (Co-Chair)

Marc Huertas-Company
IAC (Co-Chair)

Vanessa Boehm
UC Berkeley


Brice Ménard
Johns Hopkins University

J. Xavier Prochaska
UC Santa Cruz

Uros Seljak
UC Berkeley


Francisco Villaescusa-Navarro
Simons Foundation

Ashley Villar
Pennsylvania State University