As modern astrophysical surveys deliver an unprecedented amount of data, from the imaging of hundreds of millions of distant galaxies to the mapping of cosmic radiation fields at ultra-high resolution, conventional data-analysis methods are reaching their limits in both computational cost and statistical optimality. Deep Learning has rapidly been adopted by the astronomical community as a promising way to exploit these forthcoming big datasets and to extract the physical principles that underlie these complex observations. This has led to a rapid, exponential growth of publications combining Machine Learning and astrophysics. Yet many of these works remain exploratory and have not been translated into real scientific breakthroughs.
Following a successful initial iteration of this workshop at ICML 2022, our continued goal for this workshop series is to bring together Machine Learning researchers and domain experts in the field of Astrophysics to discuss the key open issues that hamper the use of Deep Learning for scientific discovery, and to present high-quality, cutting-edge work at the intersection of Machine Learning and astrophysics.
An important ingredient in the success of Machine Learning in Astrophysics is a two-way interdisciplinary dialogue in which concrete data-analysis challenges spur the development of dedicated Machine Learning tools; this workshop aims to facilitate that dialogue. We expect the workshop to appeal to ICML attendees as an opportunity to connect their research interests to concrete and outstanding scientific challenges.
We welcome in particular submissions that target or report on the following non-exhaustive list of problems:
We encourage both submissions on these topics with an astrophysics focus and more methodologically oriented works with potential applications in the physical sciences.
All times are in Hawaii Time. Please visit the ICML Workshop Page for the live schedule (requires registration).
Introduction | Welcome and Introduction of the Workshop
Keynote | Chelsea Finn: Detecting and Adapting to Distribution Shift
Spotlight | Vidhi Ramesh: Shared Stochastic Gaussian process Decoders: A Probabilistic Generative model for Quasar Spectra
Spotlight | Yitian Sun: Disentangling gamma-ray observations of the Galactic Center using differentiable probabilistic programming
Break | Morning Coffee Break
Keynote | Anna Scaife: Foundation Models for Radio Astronomy
Spotlight | Guillermo Cabrera-Vives: Positional Encodings for Light Curve Transformers: Playing with Positions and Attention
Spotlight | Alice Desmons: Detecting Tidal Features using Self-Supervised Learning
Spotlight | Jonas Wildberger: Flow Matching for Scalable Simulation-Based Inference
Spotlight | Eve Campeau-Poirier: Time Delay Cosmography with a Neural Ratio Estimator
Break | Lunch Break
Keynote | Dmitry Duev: Astrophysics Meets MLOps
Spotlight | Carolina Cuesta: Diffusion generative modeling for galaxy surveys: emulating clustering for inference at the field level
Spotlight | Adrian Bayer: Field-Level Inference with Microcanonical Langevin Monte Carlo
Spotlight | Matt Sampson: Spotting Hallucinations in Inverse Problems with Data-Driven Priors
Keynote | Ross Taylor: Teaching LLMs to Reason
Poster Session | Main Poster Session
Panel Discussion | How will new technologies such as foundation models, generative models, and LLMs change the way we make scientific discoveries? Panelists: Megan Ansdell, Yashar Hezaveh, David W. Hogg, Peter Melchior, Irina Rish, Yuan-Sen Ting
Cosmological Data Compression and Inference with Self-Supervised Machine Learning | Akhmetzhanova, Aizhan*; Mishra-Sharma, Siddharth; Dvorkin, Cora |
Bayesian Uncertainty Quantification in High-dimensional Stellar Magnetic Field Models | Andersson, Jennifer R*; Kochukhov, Oleg ; Zhao, Zheng; Sjölund, Jens |
Field-Level Inference with Microcanonical Langevin Monte Carlo | Bayer, Adrian*; Seljak, Uros; Modi, Chirag |
Graph Representation of the Magnetic Field Topology in High-Fidelity Plasma Simulations for Machine Learning Applications | Bouri, Ioanna*; Franssila, Fanni; Alho, Markku; Cozzani, Giulia; Zaitsev, Ivan; Palmroth, Minna; Roos, Teemu |
Domain Adaptation via Minimax Entropy for Real/Bogus Classification of Astronomical Alerts | Cabrera-Vives, Guillermo*; Bolívar, César Andrés; Förster, Francisco; Muñoz Arancibia, Alejandra M.; Pérez-Carrasco, Manuel; Reyes, Esteban Dirk
Time Delay Cosmography with a Neural Ratio Estimator | Campeau-Poirier, Ève*; Perreault-Levasseur, Laurence; Coogan, Adam; Hezaveh, Yashar |
A Comparative Study on Generative Models for High Resolution Solar Observation Imaging | Cherti, Mehdi*; Czernik, Alexander; Kesselheim, Stefan; Effenberger, Frederic; Jitsev, Jenia |
Harnessing the Power of Adversarial Prompting and Large Language Models for Robust Hypothesis Generation in Astronomy | Ciuca, Ioana*; Ting, Yuan-Sen; Kruk, Sandor; Iyer, Kartheik |
Diffusion generative modeling for galaxy surveys: emulating clustering for inference at the field level | Cuesta, Carolina*; Mishra-Sharma, Siddharth |
Multiscale Flow for Robust and Optimal Cosmological Analysis | Dai, Biwei*; Seljak, Uros |
Detecting Tidal Features using Self-Supervised Learning | Desmons, Alice*; Brough, Sarah; Lanusse, Francois |
Multi-fidelity Emulator for Cosmological Large Scale 21 cm Lightcone Images: a Few-shot Transfer Learning Approach with GAN | Diao, Kangning*; Mao, Yi |
SimBIG: Galaxy Clustering beyond the Power Spectrum | Hahn, ChangHoon*; Lemos, Pablo; Régaldo-Saint Blancard, Bruno; Parker, Liam H; Eickenberg, Michael; Ho, Shirley; Hou, Jiamin; Massara, Elena; Modi, Chirag; Moradinezhad Dizgah, Azadeh; Spergel, David
Cosmology with Galaxy Photometry Alone | Hahn, ChangHoon*; Melchior, Peter M; Villaescusa-Navarro, Francisco; Teyssier, Romain |
Shared Stochastic Gaussian process Decoders: A Probabilistic Generative model for Quasar Spectra | Lalchand, Vidhi*; Eilers, Anna-Christina
Closing the stellar labels gap: An unsupervised, generative model for Gaia BP/RP spectra | Laroche, Alexander L*; Speagle, Joshua S |
Towards Unbiased Gravitational-Wave Parameter Estimation using Score-Based Likelihood Characterization | Legin, Ronan*; Wong, Kaze; Isi, Maximiliano; Adam, Alexandre; Perreault-Levasseur, Laurence; Hezaveh, Yashar |
SimBIG: Field-level Simulation-based Inference of Large-scale Structure | Lemos, Pablo*; Parker, Liam H; Hahn, ChangHoon; Ho, Shirley; Eickenberg, Michael; Hou, Jiamin; Massara, Elena; Modi, Chirag; Moradinezhad Dizgah, Azadeh; Régaldo-Saint Blancard, Bruno; Spergel, David
Using Multiple Vector Channels Improves $E(n)$-Equivariant Graph Neural Networks | Levy, Daniel*; Kaba, Sékou-Oumar; Gonzales, Carmelo; Miret, Santiago; Ravanbakhsh, Siamak |
Population-Level Inference for Galaxy Properties from Broadband Photometry | Li, Jiaxuan*; Melchior, Peter M; Hahn, ChangHoon; Huang, Song |
A Hierarchy of Normalizing Flows for Modelling the Galaxy-Halo Relationship | Lovell, Christopher C*; Hassan, Sultan; Villaescusa-Navarro, Francisco; Genel, Shy; Hahn, ChangHoon; Angles-Alcazar, Daniel; Kwon, James; de Santi, Natali; Iyer, Kartheik; Fabbian, Giulio; Bryan, Greg |
PPDONet: Deep Operator Networks for Fast Prediction of Steady-State Solutions in Disk-Planet Systems | Mao, Shunyuan*; Dong, Ruobing; Lu, Lu; Yi, Kwang Moo; Wang, Sifan; Perdikaris, Paris |
Positional Encodings for Light Curve Transformers: Playing with Positions and Attention | Moreno-Cartagena, Daniel A; Cabrera-Vives, Guillermo*; Protopapas, Pavlos; Donoso, Cristobal R; Pérez-Carrasco, Manuel Ignacio; Cádiz-Leyton, Martina A |
Neural Astrophysical Wind Models | Nguyen, Dustin* |
FLORAH: A generative model for halo assembly histories | Nguyen, Tri*; Modi, Chirag; Somerville, Rachel; Yung, Aaron |
Multi-Class Deep SVDD: Anomaly Detection Approach in Astronomy with Distinct Inlier Categories | Pérez-Carrasco, Manuel Ignacio*; Cabrera-Vives, Guillermo; Hernandez-García, Lorena; Förster, Francisco; Sanchez-Saez, Paula; Muñoz-Arancibia, Alejandra; Astorga, Nicolás; Bauer, Franz; Bayo, Amelia; Cádiz-Leyton, Martina A; Catelan, Márcio; Estevez, Pablo |
BTSbot: A Multi-input Convolutional Neural Network to Automate and Expedite Bright Transient Identification for the Zwicky Transient Facility | Rehemtulla, Nabeel*; Miller, Adam; Coughlin, Michael; Jegou Du Laz, Theophile |
Toward a Spectral Foundation Model: An Attention-Based Approach with Domain-Inspired Fine-Tuning and Wavelength Parameterization | Różański, Tomasz; Ting, Yuan-Sen*; Jablonska, Maja |
Spotting Hallucinations in Inverse Problems with Data-Driven Priors | Sampson, Matt L*; Melchior, Peter M |
Evaluating Summary Statistics with Mutual Information for Cosmological Inference | Sui, Ce*; Zhao, Xiaosheng; Jing, Tao; Mao, Yi |
Disentangling gamma-ray observations of the Galactic Center using differentiable probabilistic programming | Sun, Yitian*; Mishra-Sharma, Siddharth; Slatyer, Tracy R; Wu, Yuqing |
Weisfeiler-Lehman Graph Kernel Method: A New Approach to Weak Chemical Tagging | Ting, Yuan-Sen*; Sharma, Bhavesh |
A Novel Application of Conditional Normalizing Flows: Stellar Age Inference with Gyrochronology | Van-Lane, Phil*; Speagle, Joshua S; Douglas, Stephanie |
Flow Matching for Scalable Simulation-Based Inference | Wildberger, Jonas Bernhard*; Dax, Maximilian; Buchholz, Simon; Green, Stephen R; Macke, Jakob; Schölkopf, Bernhard
Learning the galaxy-environment connection with graph neural networks | Wu, John F*; Jespersen, Christian |
Diffusion Models for Probabilistic Deconvolution of Galaxy Images | Xue, Zhiwei; Li, Yuhang; Patel, Yash P*; Regier, Jeffrey |
A cross-modal adversarial learning method for estimating photometric redshift of quasars | Zhang, Chen*; Zhang, Yanxia; Jiang, Bin; Qu, Meixia; Wang, Wenyu |
nbi: the Astronomer's Package for Neural Posterior Estimation | Zhang, Keming*; Bloom, Joshua; Hernitschek, Nina |
Stellar Spectra Fitting with Amortized Neural Posterior Estimation and nbi | Zhang, Keming*; Jayasinghe, Tharindu; Bloom, Joshua |
3D ScatterNet: Inference from 21 cm Light-cones | Zhao, Xiaosheng*; Zuo, Shifan; Mao, Yi |
We invite all contributions in connection with the theme of the workshop as described here.
An important selection criterion will be the novelty of the work, in terms of both novel methodologies and novel applications. Applying standard, established deep learning techniques to a new astrophysical data set is not considered a novel application in this context.
Original contributions and early-stage research are encouraged. Contributions presenting results that were recently published or are currently under review are also welcome. The workshop will not have formal proceedings, but accepted submissions will be linked on the workshop webpage.
Submissions, in the form of extended abstracts, need to adhere to the ICML 2023 format (LaTeX style files), be anonymized, and be no longer than 4 pages (excluding references). After double-blind review, a limited set of submissions will be selected for contributed talks, and a wider set of submissions will be selected for poster presentations.
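For reference, a minimal LaTeX skeleton along the lines of the ICML 2023 example paper is sketched below. This is an untested sketch, not the official template: the title, author, affiliation, and email are placeholders, and the exact macro set should be checked against the style files distributed by ICML.

```latex
% Minimal skeleton assuming the standard icml2023.sty macros;
% all names, affiliations, and emails below are placeholders.
\documentclass{article}
\usepackage{icml2023}  % omitting the [accepted] option keeps the author block anonymized

% Short running title shown in the page headers
\icmltitlerunning{Short Running Title}

\begin{document}

\twocolumn[
\icmltitle{Title of the Extended Abstract}
\begin{icmlauthorlist}
  \icmlauthor{Anonymous Authors}{anon}
\end{icmlauthorlist}
\icmlaffiliation{anon}{Anonymous Institution}
\icmlcorrespondingauthor{Anonymous Authors}{anon@example.org}
\icmlkeywords{Machine Learning, Astrophysics}
\vskip 0.3in
]
\printAffiliationsAndNotice{}

% Main text: at most 4 pages, excluding references.
Body of the extended abstract.

\end{document}
```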
Please submit your anonymized extended abstract through CMT at https://cmt3.research.microsoft.com/ML4Astro2023 before May 25th (extended from May 19th), 23:59 AOE.
Our goal is to ensure that every extended abstract receives at least two independent reviews in a double-blind process. Because we aim for high-quality, constructive reviews, we do not want to ask each volunteer to review many papers, which means we need a large pool of reviewers.
We are therefore always looking for volunteers to help us review workshop submissions. If you are interested in serving as a reviewer, please let us know through this form before May 25th. Depending on reviewing needs, we may then contact you with further details and to confirm your availability.
ICML 2023 is currently planned as an in-person event. This workshop therefore assumes a hybrid format, with in-person poster sessions and speakers, complemented by virtual elements to facilitate the participation of people unable to travel. We encourage all interested participants, regardless of their ability to physically travel to ICML, to submit an extended abstract.
Registration for ICML workshops is handled through the main ICML conference registration here.
Inquiries regarding the workshop can be directed to icml2023ml4astro@gmail.com.
All dates are in AOE (Anywhere on Earth).
To prepare your poster, please use the following dimensions:
Scientific Organizing Committee for the ICML 2023 Machine Learning for Astrophysics workshop: