Earthly Machine Learning
Author: Amirpasha
© Amirpasha
Description
“Earthly Machine Learning (EML)” offers AI-generated insights into cutting-edge machine learning research in weather and climate sciences. Powered by Google NotebookLM, each episode distils the essence of a standout paper, helping you decide if it’s worth a deeper look. Stay updated on the ML innovations shaping our understanding of Earth.
It may contain hallucinations.
44 Episodes
Artificial Intelligence for Atmospheric Sciences: A Research Roadmap
Citation: Zaidan, M. A., Motlagh, N. H., Nurmi, P., Hussein, T., Kulmala, M., Petäjä, T., & Tarkoma, S. (2025). Artificial Intelligence for Atmospheric Sciences: A Research Roadmap.
• Revolutionizing Environmental Monitoring: The paper illustrates how AI is transforming atmospheric sciences by bridging the gap between computer science and environmental research. It details how AI processes massive datasets generated by diverse sources—including satellite imagery, ground-based research stations, and low-cost IoT sensors—to improve our understanding of air quality, extreme weather events, and climate change.
• Optimizing Infrastructure and Prediction: Current AI applications are already enhancing operational meteorology and Earth system modeling. By utilizing techniques like deep learning and neural networks, researchers can automate sensor calibration, detect anomalies in real time, and simulate complex climate scenarios with greater speed and efficiency than traditional physical models allow.
• A Roadmap for Future Hardware: To handle the escalating demand for data, the authors propose a hardware roadmap that includes self-sustaining and biodegradable sensor networks, CubeSat constellations for high-resolution monitoring, and the adoption of cutting-edge computing paradigms like quantum, neuromorphic, and DNA-based molecular computing.
• Next-Generation AI Methodologies: The paper argues for the adoption of advanced AI techniques such as Foundation Models and Generative AI (including Digital Twins of Earth) to predict complex atmospheric phenomena. Crucially, it emphasizes the need for Explainable AI (XAI) and Physics-Informed Machine Learning to solve the "black box" problem, ensuring that AI predictions abide by physical laws and are transparent enough for scientists and policymakers to trust.
• From Data to Action: Beyond observation, the research highlights the shift toward actionable insights. This includes automated feedback loops (such as smart HVAC systems responding to air quality data), the integration of citizen science to augment data collection, and the establishment of robust ethical frameworks to manage data privacy and governance in global monitoring networks.
Differentiable and accelerated spherical harmonic and Wigner transforms
Matthew A. Price, Jason D. McEwen, Journal of Computational Physics (2024)
* This work introduces novel algorithmic structures for the **accelerated and differentiable computation** of generalized Fourier transforms on the sphere ($S^2$) and the rotation group ($SO(3)$), specifically spherical harmonic and Wigner transforms.
* A key component is a **recursive algorithm for Wigner d-functions** designed to be stable to high harmonic degrees and extremely parallelizable, making the algorithms well suited to high-throughput computing on modern hardware accelerators such as GPUs.
* The transforms support efficient computation of gradients, which is critical for machine learning and other differentiable programming tasks, achieved through a **hybrid automatic and manual differentiation approach** that avoids the memory overhead associated with full automatic differentiation (see the sketch below).
* Implemented in the open-source **S2FFT** software (within the JAX differentiable programming framework), the algorithms support various sampling schemes, including equiangular samplings that admit exact spherical harmonic transforms.
* Benchmarking results demonstrate **up to a 400-fold acceleration** compared to alternative C codes, and the transforms exhibit **very close to optimal linear scaling** when distributed over multiple GPUs, yielding an unprecedented effective linear time complexity ($O(L)$) given sufficient computational resources.
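To make the hybrid automatic/manual differentiation idea above concrete, here is a minimal JAX sketch: the forward map is a dense stand-in for a spherical harmonic transform rather than S2FFT's recursion-based implementation, and `jax.custom_vjp` supplies a hand-written adjoint so the backward pass stores no large intermediates. Names and shapes are illustrative assumptions, not S2FFT's API.

```python
import jax
import jax.numpy as jnp
import numpy as np

L = 32
# Dense stand-in for a forward spherical harmonic transform (signal samples -> coefficients).
# S2FFT itself evaluates this map via stable Wigner-d recursions, not a stored matrix.
A = jnp.asarray(np.random.default_rng(0).standard_normal((L * L, 2 * L * L)))

@jax.custom_vjp
def forward(f):
    return A @ f

def forward_fwd(f):
    # Hybrid differentiation: save nothing large; the backward pass reuses the operator itself.
    return forward(f), None

def forward_bwd(_, g):
    # Manual VJP of a linear map: apply the adjoint (transpose) to the incoming cotangent.
    return (A.T @ g,)

forward.defvjp(forward_fwd, forward_bwd)

f = jnp.ones(2 * L * L)
grad = jax.grad(lambda x: jnp.sum(forward(x) ** 2))(f)
print(grad.shape)  # (2048,)
```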
Score-based diffusion nowcasting of GOES imagery
Randy J. Chase, Katherine Haynes, Lander Ver Hoef, Imme Ebert-Uphoff (Cooperative Institute for Research in the Atmosphere and Electrical and Computer Engineering, Colorado State University, Fort Collins, CO)
* The research explored score-based diffusion models to perform short-term forecasts (nowcasting) of GOES geostationary infrared satellite imagery (zero to three hours). This newer machine learning methodology combats the issue of **blurry forecasts** often produced by earlier neural network types, enabling the generation of clearer and more realistic-looking forecasts.
* The **residual correction diffusion model (CorrDiff)** proved to be the best-performing model, quantitatively outperforming all other tested diffusion models, a traditional Mean Squared Error trained U-Net, and a persistence forecast by one to two kelvin on root mean squared error.
* The diffusion models demonstrated sophisticated predictive capabilities, showing the ability to not only advect existing clouds but also to **generate and decay clouds**, including initiating convection, despite being initialized with only the past 20 minutes of satellite imagery.
* A key benefit of the diffusion framework is the capacity for **out-of-the-box ensemble generation**, which enhances pixel-based metrics and provides useful uncertainty quantification, with the spread of the ensemble generally correlating well with the forecast error (see the example below).
* However, the diffusion models are computationally intensive: the Diff and CorrDiff models take approximately five days to train on specialized hardware and about 10 minutes to generate a 10-member, three-hour forecast, compared to just 10 seconds for the baseline U-Net forecast.
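As a small illustration of the spread-error relationship mentioned in the ensemble bullet above, the following sketch computes ensemble-mean RMSE and ensemble spread for a toy 10-member nowcast; all fields and magnitudes are synthetic stand-ins, not results from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: a 10-member nowcast ensemble and the verifying field (64x64 pixels).
truth = 260.0 + rng.standard_normal((64, 64))
ensemble = truth + 2.0 * rng.standard_normal((10, 64, 64))

ens_mean = ensemble.mean(axis=0)
rmse = np.sqrt(np.mean((ens_mean - truth) ** 2))     # error of the ensemble mean (K)
spread = np.mean(ensemble.std(axis=0, ddof=1))       # mean ensemble spread (K)

# A well-calibrated ensemble has spread of roughly the same magnitude as its error.
print(f"RMSE = {rmse:.2f} K, spread = {spread:.2f} K")
```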
FuXi-Ocean: A Global Ocean Forecasting System with Sub-Daily Resolution
Qiusheng Huang, Yuan Niu, Xiaohui Zhong, Anboyu Guo, Lei Chen, Dianjun Zhang, Xuefeng Zhang, Hao Li
* **First Data-Driven Sub-Daily Global Forecast:** FuXi-Ocean is the first deep learning-based global ocean forecasting model to achieve six-hour temporal resolution at an eddy-resolving 1/12° spatial resolution, with vertical coverage extending down to 1500 meters. This capability addresses a crucial need for high-frequency predictions that traditional numerical models struggle to deliver efficiently.
* **Adaptive Temporal Modeling Innovation:** A key component of the model is the **Mixture-of-Time (MoT) module**, which adaptively integrates predictions from multiple temporal contexts based on variable-specific reliability (a minimal sketch of this weighting idea follows below). This mechanism is crucial for accommodating the diverse temporal dynamics of different ocean variables (e.g., fast-changing surface variables vs. slowly evolving deep-ocean processes) and effectively mitigates the accumulation of forecast errors in sequential prediction.
* **Superior Performance and Efficiency:** The model demonstrates superior skill in predicting key variables (temperature, salinity, and currents) compared to state-of-the-art operational numerical forecasting systems (such as HYCOM, BLK, and FOAM) at sub-daily intervals. Furthermore, it achieves this high performance with remarkable data efficiency, requiring only approximately 9 years of training data and relying solely on ocean variables (T, S, U, V, SSH) as input, without external data dependencies such as atmospheric forcing.
* **High-Impact Applications:** By providing accurate, high-resolution, sub-daily forecasts, FuXi-Ocean creates critical opportunities for maritime operations, including improved navigation, search and rescue, oil spill trajectory tracking, and enhanced marine resource management, particularly due to its comprehensive vertical coverage (0–1500 m).
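A minimal sketch of the weighting idea behind the Mixture-of-Time module described above, assuming a simple per-variable softmax over candidate temporal contexts; the real module's architecture and conditioning are more involved.

```python
import numpy as np

def mixture_of_time(preds, logits):
    """Blend candidate predictions from different temporal contexts.

    preds:  (n_contexts, n_vars, ...) candidate forecasts for each variable
    logits: (n_contexts, n_vars)      learned, variable-specific reliability scores
    """
    w = np.exp(logits - logits.max(axis=0))   # softmax over contexts, per variable
    w = w / w.sum(axis=0)
    return np.einsum("cv,cv...->v...", w, preds)

rng = np.random.default_rng(1)
preds = rng.standard_normal((3, 5, 16, 16))   # 3 temporal contexts, 5 ocean variables, 16x16 grid
logits = rng.standard_normal((3, 5))
print(mixture_of_time(preds, logits).shape)   # (5, 16, 16)
```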
Beyond the Training Data: Confidence-Guided Mixing of Parameterizations in a Hybrid AI-Climate Model
By Helge Heuer, Tom Beucler, Mierk Schwabe, Julien Savre, Manuel Schlund, and Veronika Eyring
* This paper presents a **successful proof-of-concept for transferring a machine learning (ML) convection parameterization**—trained on the ClimSim dataset—to the ICON-A climate model. The resulting hybrid ML-physics model achieved stable and accurate simulations in long-term AMIP-style runs lasting at least 20 years.
* A core innovation is the **confidence-guided mixing scheme**, which allows the neural network (NN) to predict its own error. When the NN's predicted confidence is low (e.g., in moist, unstable regimes or high-variability areas), its prediction is mixed with the conventional Tiedtke convection scheme (a minimal sketch follows below). This mechanism improves reliability, prevents unphysical outputs by detecting potential extrapolation beyond the training domain, and makes the hybrid model tunable against observations.
* The scheme's robustness and accuracy were further enhanced through the **use of a physics-informed loss function**—which encourages adherence to conservation laws such as enthalpy and mass conservation—and **noise-augmented training**. These techniques mitigate stability issues commonly faced by ML parameterizations and significantly improve physical consistency compared to purely data-driven models.
* In evaluation against observational data, several hybrid configurations **outperformed the default Tiedtke scheme**, demonstrating improved precipitation statistics and a better representation of global climate variables. The confidence-guided approach fundamentally changed the model's behavior, with the ML component contributing approximately 67% of the convective tendencies on average.
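A minimal sketch of the confidence-guided mixing described above: the network's self-predicted error is mapped to a weight that blends the ML tendency with the conventional scheme's tendency. The exponential weighting and `error_scale` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def confidence_guided_blend(nn_tendency, conv_tendency, predicted_error, error_scale=1.0):
    """Blend an ML convection tendency with a conventional scheme's tendency.

    Where the network's self-predicted error is large (low confidence), the weight
    shifts toward the conventional scheme; `error_scale` is a hypothetical tuning knob.
    """
    confidence = np.exp(-predicted_error / error_scale)   # in (0, 1]
    return confidence * nn_tendency + (1.0 - confidence) * conv_tendency

rng = np.random.default_rng(2)
nn_dT = rng.standard_normal(60)          # ML-predicted heating tendency on 60 levels
tiedtke_dT = rng.standard_normal(60)     # conventional-scheme tendency
pred_err = np.abs(rng.standard_normal(60))
print(confidence_guided_blend(nn_dT, tiedtke_dT, pred_err).shape)  # (60,)
```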
Climate in a Bottle: Towards a Generative Foundation Model for the Kilometer-Scale Global Atmosphere
By Noah D. Brenowitz, Tao Ge, Akshay Subramaniam, Peter Manshausen, Aayush Gupta, David M. Hall, Morteza Mardani, Arash Vahdat, Karthik Kashinath, Michael S. Pritchard (NVIDIA)
* The paper introduces **Climate in a Bottle (cBottle)**, a generative diffusion-based AI framework capable of synthesizing full global atmospheric states at an unprecedented 5 km resolution (over 12.5 million pixels per sample). Unlike prevailing auto-regressive paradigms, cBottle samples directly from the full distribution of atmospheric states without requiring a previous time step, thereby avoiding issues like drift and instability inherent to time-stepping models.
* cBottle utilizes a **two-stage cascaded diffusion approach**: a global coarse-resolution generator conditioned on minimal climate-controlling inputs (such as monthly sea surface temperature and solar position), followed by a patch-based 16x super-resolution module.
* The model demonstrates **foundational versatility** by being trained jointly on multiple data modalities, including ERA5 reanalysis and ICON global cloud-resolving simulations. This enables various zero-shot applications such as climate downscaling, channel infilling for missing or corrupted variables, bias correction between datasets, and translation between these modalities.
* cBottle proposes a new form of **interactive climate modeling** through guided diffusion. By training a classifier alongside the generator, users can steer the model to conditionally generate physically plausible **extreme weather events, such as tropical cyclones**, at specified locations on demand, circumventing the need to sift through petabytes of output to find rare events (see the sketch below).
* The model exhibits **high climate faithfulness** across a battery of tests, including reproducing diurnal-to-seasonal variability, large-scale modes of variability (such as the Northern Annular Mode), and tropical cyclone statistics. Furthermore, it achieves **extreme distillation** by encapsulating massive datasets into a few GB of neural network weights, offering a 256x compression ratio per channel.
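For intuition about the guided-diffusion mechanism mentioned above, here is a generic classifier-guidance sketch using a toy Langevin sampler and stand-in networks; it illustrates the recipe (unconditional score plus a scaled classifier gradient), not cBottle's actual sampler or trained models.

```python
import numpy as np

rng = np.random.default_rng(0)

def guided_langevin_step(x, score_model, classifier_grad, step=0.01, scale=2.0):
    """One Langevin update with classifier guidance: the unconditional score is nudged by
    the gradient of a classifier so samples drift toward states containing a requested
    event (e.g. a tropical cyclone at a chosen location). Toy parameters throughout."""
    g = score_model(x) + scale * classifier_grad(x)
    return x + step * g + np.sqrt(2 * step) * rng.standard_normal(x.shape)

score_model = lambda x: -x                    # score of a standard Gaussian "prior"
classifier_grad = lambda x: np.ones_like(x)   # toy guidance pushing the field upward

x = np.zeros((8, 8))
for _ in range(500):
    x = guided_langevin_step(x, score_model, classifier_grad)
print(round(float(x.mean()), 2))              # drifts toward the guided mean (about 2)
```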
Probabilistic measures afford fair comparisons of AIWP and NWP model output
Tilmann Gneiting, Tobias Biegert, Kristof Kraus, Eva-Maria Walz, Alexander I. Jordan, Sebastian Lerch (June 10, 2025)
• Introduction of a New Fair Comparison Metric: The paper introduces the Potential Continuous Ranked Probability Score (PC), a new measure designed to allow fair and meaningful comparisons between single-valued output from data-driven Artificial Intelligence based Weather Prediction (AIWP) models and physics-based Numerical Weather Prediction (NWP) models. This approach addresses concerns that traditional loss functions (like RMSE) may unfairly favor AIWP models, which often optimize their training using these metrics.
• Methodology Based on Probabilistic Postprocessing: PC is calculated by applying the same statistical postprocessing technique—specifically Isotonic Distributional Regression (IDR), also known as Easy Uncertainty Quantification (EasyUQ)—to the deterministic output of both AIWP and NWP models. PC is then defined as the mean Continuous Ranked Probability Score (CRPS) of these newly generated probabilistic forecasts (the defining formulas are shown below).
• Measure of Potential Skill and Invariance: PC quantifies potential predictive performance. A key property of PC is that it is invariant under strictly increasing transformations of the model output, treating both forecasts equally and facilitating comparisons where the pre-specification of a loss function might otherwise place competitors on unequal footing.
• AIWP Outperformance and Operational Proxy: When applied to WeatherBench 2 data, the PC measure demonstrated that the data-driven GraphCast model outperforms the leading physics-based ECMWF high-resolution (HRES) model. Furthermore, the PC measure for the HRES model was found to align exceptionally well with the mean CRPS of the operational ECMWF ensemble, confirming that PC serves as a reliable proxy for the performance of real-time operational probabilistic products.
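For reference, the CRPS of a predictive distribution $F$ at an observation $y$, and the potential CRPS obtained by scoring the IDR/EasyUQ-postprocessed forecasts, can be written as (generic notation, not the paper's exact symbols):

$$\mathrm{CRPS}(F, y) = \int_{-\infty}^{\infty} \big( F(z) - \mathbb{1}\{z \ge y\} \big)^2 \, dz, \qquad \mathrm{PC} = \frac{1}{n} \sum_{i=1}^{n} \mathrm{CRPS}\big(\hat{F}_{\mathrm{IDR}}(\cdot \mid x_i),\, y_i\big),$$

where $\hat{F}_{\mathrm{IDR}}(\cdot \mid x_i)$ is the probabilistic forecast that IDR assigns to the deterministic model output $x_i$.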
Jigsaw: Training Multi-Billion-Parameter AI Weather Models With Optimized Model Parallelism
Authors: Deifilia Kieckhefen, Markus Götz, Lars H. Heyen, Achim Streit, and Charlotte Debus (Karlsruhe Institute of Technology, Helmholtz AI)
• The paper introduces WeatherMixer (WM), a multi-layer perceptron (MLP)-based architecture designed for atmospheric forecasting, which serves as a competitive alternative to Transformer-based models. WM's workload scales linearly with input size, addressing the scaling challenges and quadratic computational complexity of the self-attention mechanism in Transformers when dealing with gigabyte-sized atmospheric data.
• A novel parallelization scheme called Jigsaw parallelism is proposed, combining domain parallelism and tensor parallelism to efficiently train multi-billion-parameter models. Jigsaw is optimized for large input data by fully sharding the data, model parameters, and optimizer states across devices, eliminating memory redundancy.
• Jigsaw effectively mitigates hardware bottlenecks, particularly the I/O-bandwidth limitations frequently encountered in training large scientific AI models. Thanks to its partitioned data loading (domain parallelism), the scheme achieves superscalar weak scaling in I/O-bandwidth-limited systems and exceeds state-of-the-art performance in strong scaling in computation–communication-limited systems. Training was successfully scaled up to 256 GPUs, reaching peak performances of 9 and 11 PFLOPs.
• Beyond hardware efficiency, Jigsaw improves predictive performance: partitioning the model across more GPUs (model parallelism) instead of relying solely on data parallelism naturally enforces smaller global batch sizes, which empirically helps mitigate the problematic large-batch effects observed in AI weather models, leading to lower loss values (see the arithmetic below).
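The batch-size point above comes down to simple arithmetic: for a fixed number of GPUs, raising the model-parallel degree lowers the number of data-parallel replicas and hence the global batch size. A toy illustration (numbers are hypothetical, not the paper's configuration):

```python
def global_batch_size(n_gpus, model_parallel_degree, per_replica_batch):
    """Each model replica spans `model_parallel_degree` GPUs, so the number of
    data-parallel replicas (and with it the global batch size) shrinks accordingly.
    Illustrative only; Jigsaw's actual sharding combines domain and tensor parallelism."""
    n_replicas = n_gpus // model_parallel_degree
    return n_replicas * per_replica_batch

# 256 GPUs: pure data parallelism vs. 8-way model parallelism
print(global_batch_size(256, 1, 1))   # 256
print(global_batch_size(256, 8, 1))   # 32 -> smaller global batch, as noted above
```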
XiChen: An observation-scalable fully AI-driven global weather forecasting system with 4D variational knowledge
Authors: Wuxin Wang, Weicheng Ni, Lilan Huang, Tao Han, Ben Fei, Shuo Ma, Taikang Yuan, Yanlai Zhao, Kefeng Deng, Xiaoyong Li, Boheng Duan, Lei Bai, Kaijun Ren
• XiChen is the first observation-scalable, fully AI-driven global weather forecasting system. Its entire pipeline, from data assimilation (DA) to 10-day medium-range forecasting, can be completed within only 17 seconds on a single A100 GPU, an acceleration exceeding 400-fold compared to the computational time required by operational Numerical Weather Prediction (NWP) systems.
• The system is built on a foundation model that is initially pre-trained for weather forecasting and subsequently fine-tuned to function as both the observation operators and the DA models. Crucially, the integration of four-dimensional variational (4DVar) knowledge ensures that XiChen's DA and medium-range forecasting accuracy rivals that of operational NWP systems (the standard 4DVar cost function is shown below).
• XiChen demonstrates high scalability and robustness by employing a cascaded sequential DA framework to assimilate both conventional observations (GDAS prepbufr) and raw satellite observations (AMSU-A and MHS). This design allows new observations to be integrated in the future simply by fine-tuning the respective observation operators and DA model components, which is critical for operational deployment.
• In terms of performance, XiChen achieves a skillful weather forecasting lead time exceeding 8.25 days (ACC of Z500 > 0.6). This is comparable to the Global Forecast System (GFS) and substantially surpasses other end-to-end AI-based global weather forecasting systems, such as Aardvark (less than 8 days) and GraphDOP (about 5 days).
• A dual DA framework operationalizes XiChen as a continuous forecasting system. It utilizes separate 12-hour and 3-hour data assimilation windows (DAW) to circumvent the multi-hour latency characteristic of high-resolution systems (such as IFS HRES), thereby enabling real-time acquisition of medium-range forecast products.
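For context, the strong-constraint 4DVar cost function whose structure the learned DA components emulate is, in generic notation (XiChen's learned formulation is not reproduced here):

$$J(\mathbf{x}_0) = \tfrac{1}{2}(\mathbf{x}_0 - \mathbf{x}_b)^{\top}\mathbf{B}^{-1}(\mathbf{x}_0 - \mathbf{x}_b) + \tfrac{1}{2}\sum_{i=0}^{N}\big(\mathcal{H}_i(\mathcal{M}_{0\to i}(\mathbf{x}_0)) - \mathbf{y}_i\big)^{\top}\mathbf{R}_i^{-1}\big(\mathcal{H}_i(\mathcal{M}_{0\to i}(\mathbf{x}_0)) - \mathbf{y}_i\big),$$

where $\mathbf{x}_b$ is the background state, $\mathbf{B}$ and $\mathbf{R}_i$ are the background and observation error covariances, $\mathcal{M}_{0\to i}$ propagates the state across the assimilation window, and $\mathcal{H}_i$ maps model states to the observations $\mathbf{y}_i$ (the roles played by XiChen's fine-tuned forecast model and observation operators).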
A data-to-forecast machine learning system for global weather
Xiuyu Sun et al. (2025). A data-to-forecast machine learning system for global weather. Nature Communications. https://doi.org/10.1038/s41467-025-62024-1
• FuXi Weather is introduced as a groundbreaking end-to-end machine learning system for global weather forecasting. It autonomously performs data assimilation and forecasting in a 6-hour cycle, directly processing raw multi-satellite observations, and notably, it is the first such system to demonstrate continuous cycling operation over a full one-year period.
• The system exhibits superior forecast accuracy in observation-sparse regions, outperforming traditional high-resolution forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF HRES) beyond day one in areas like central Africa and northern South America, despite utilizing substantially fewer observations.
• Globally, FuXi Weather delivers 10-day forecast performance comparable to ECMWF HRES, generating reliable forecasts at 0.25° resolution and extending the skillful lead times for a number of key meteorological variables.
• FuXi Weather offers a cost-effective and physically consistent alternative to traditional Numerical Weather Prediction (NWP) systems. Its computational efficiency and reduced complexity are valuable for improving operational forecasts and enhancing climate resilience in regions with limited land-based observational infrastructure.
• This development challenges the prevailing view that standalone machine learning-based weather forecasting systems are not viable for operational use, marking a significant step forward in the application of AI to real-world weather prediction.
FourCastNet 3: A geometric approach to probabilistic machine-learning weather forecasting at scale
Boris Bonev, Thorsten Kurth, Ankur Mahesh, Mauro Bisson, Jean Kossaifi, Karthik Kashinath, Anima Anandkumar, William D. Collins, Michael S. Pritchard, and Alexander Keller
• FourCastNet 3 (FCN3) introduces a pioneering geometric machine learning approach for probabilistic ensemble weather forecasting. It is designed to respect spherical geometry and to accurately model the spatially correlated probabilistic nature of weather, resulting in stable spectra and realistic dynamics across multiple scales. The architecture is a purely convolutional neural network tailored to spherical geometry.
• FCN3 achieves superior forecasting accuracy and speed, surpassing leading conventional ensemble models and rivaling the best diffusion-based ML methods. It produces forecasts 8 to 60 times faster than these approaches; for instance, a 60-day global forecast at 0.25°, 6-hourly resolution is generated in under 4 minutes on a single GPU.
• The model demonstrates exceptional physical fidelity and long-term stability, maintaining excellent probabilistic calibration and realistic spectra even at extended lead times of up to 60 days. This mitigates issues like blurring and the build-up of small-scale noise that challenge other machine learning models, paving the way for physically faithful data-driven probabilistic weather models.
• A novel training paradigm that combines model and data parallelism enables scalable and efficient operations, allowing large-scale training on 1024 GPUs and more. All key components, including training and inference code, are fully open source, providing transparent and reproducible tools for meteorological forecasting and atmospheric science research.
🧠 Abstract:
Climate change is increasing the frequency and severity of disasters, demanding more effective Early Warning Systems (EWS). While current systems face hurdles in forecasting, communication, and decision-making, this episode examines how integrated Artificial Intelligence (AI) can revolutionize risk detection and response.
📌 Bullet points summary:
- Current EWS struggle with forecasting accuracy, impact prediction across diverse contexts, and effective communication with affected communities.
- Integrated AI and Foundation Models (FMs) enhance EWS by improving forecast precision, offering impact-specific alerts, and utilizing diverse data sources—from weather to social media.
- Foundation Models for geospatial and meteorological data, combined with natural language processing, pave the way for user-adaptive, intuitive warning systems, including chatbots and realistic visualizations.
- Ensuring equity and effectiveness in AI-driven EWS requires addressing data bias, robustness, ownership issues, and power dynamics—guided by FATES principles and supported by open-source tools, global cooperation, and digital inclusivity.
💡 The Big Idea:
Integrated AI holds the key to transforming climate early warning—from hazard alerts to adaptive, inclusive, and impact-driven systems that empower communities worldwide.
📖 Citation:
Reichstein, Markus, et al. "Early warning of complex climate risk with integrated artificial intelligence." Nature Communications 16.1 (2025): 2564. https://doi.org/10.1038/s41467-025-57640-w
🧠 Abstract:
Machine Learning (ML) is increasingly influential in weather and climate prediction. Recent advances have led to fully data-driven ML models that often claim to outperform traditional physics-based systems. This episode evaluates forecasts from three leading ML models—Pangu-Weather, FourCastNet, and GraphCast—focusing on their accuracy and physical realism.
📌 Bullet points summary:
- ML models like Pangu-Weather, FourCastNet, and GraphCast fail to capture sub-synoptic and mesoscale phenomena with adequate fidelity, producing forecasts that become overly smooth over time.
- Their energy spectra diverge significantly from traditional models and reanalysis data, leading to poor representation of features below 300–400 km scales.
- They lack an accurate representation of key physical balances in the atmosphere, such as geostrophic wind balance and the divergent-rotational wind ratio, affecting the realism of weather diagnostics (see the diagnostic sketch below).
- Though computationally efficient and strong in certain metrics, these models should be seen as forecast refiners rather than full-fledged atmospheric simulators or "digital twins," as they still rely heavily on traditional models for training and input.
💡 The Big Idea:
While ML models mark a significant advancement, their current limitations highlight the indispensable role of physical principles and traditional modeling in weather prediction.
📖 Citation:
Bonavita, Massimo. "On some limitations of current machine learning weather prediction models." Geophysical Research Letters 51.12 (2024): e2023GL107377. https://doi.org/10.1029/2023GL107377
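As an example of the balance diagnostics referenced above, the following numpy sketch derives geostrophic winds from a geopotential field on a regular latitude-longitude grid; comparing them with a model's actual winds is one way to test geostrophic balance. The grid, test field, and masking threshold are illustrative assumptions.

```python
import numpy as np

def geostrophic_wind(phi, lat, lon):
    """Geostrophic wind from geopotential: u_g = -(1/f) dPhi/dy, v_g = (1/f) dPhi/dx,
    with f = 2*Omega*sin(lat). phi: (nlat, nlon) in m^2 s^-2; lat, lon in degrees."""
    a, omega = 6.371e6, 7.292e-5
    latr = np.deg2rad(lat)
    f = 2 * omega * np.sin(latr)[:, None]
    dy = a * np.gradient(latr)                                      # metres per grid step
    dx = a * np.cos(latr)[:, None] * np.gradient(np.deg2rad(lon))   # metres per grid step
    dphi_dy = np.gradient(phi, axis=0) / dy[:, None]
    dphi_dx = np.gradient(phi, axis=1) / dx
    f = np.where(np.abs(f) < 1e-5, np.nan, f)                       # mask near the equator
    return -dphi_dy / f, dphi_dx / f

lat = np.linspace(-87.5, 87.5, 36)
lon = np.linspace(0.0, 357.5, 144)
# Synthetic 500 hPa-like geopotential with a zonal wavenumber-3 pattern
phi = 9.81 * (5500 + 100 * np.sin(np.deg2rad(lat))[:, None] * np.cos(3 * np.deg2rad(lon)))
ug, vg = geostrophic_wind(phi, lat, lon)
print(np.nanmax(np.abs(ug)), np.nanmax(np.abs(vg)))
```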
🌍 Abstract:
Artificial intelligence (AI) is transforming Earth system science, especially in modeling and understanding extreme weather and climate events. This episode explores how AI tackles the challenges of analyzing rare, high-impact phenomena using limited, noisy data—and the push to make AI models more transparent, interpretable, and actionable.
📌 Bullet points summary:
🌪️ AI is revolutionizing how we model, detect, and forecast extreme climate events like floods, droughts, wildfires, and heatwaves, and plays a growing role in attribution and risk assessment.
⚠️ Key challenges include limited data, lack of annotations, and the complexity of defining extremes, all of which demand robust, flexible AI approaches that perform well under novel conditions.
🧠 Trustworthy AI is critical for safety-related decisions, requiring transparency, interpretability (XAI), causal inference, and uncertainty quantification.
📢 The “last mile” focuses on operational use and risk communication, ensuring AI outputs are accessible, fair, and actionable in early warning systems and public alerts.
🤝 Cross-disciplinary collaboration is vital—linking AI developers, climate scientists, field experts, and policymakers to build practical and ethical AI tools that serve real-world needs.
💡 Big idea:
AI holds powerful promise for extreme climate analysis—but only if it's built to be trustworthy, explainable, and operationally useful in the face of uncertainty.
📚 Citation:
Camps-Valls, Gustau, et al. "Artificial intelligence for modeling and understanding extreme weather and climate events." Nature Communications 16.1 (2025): 1919. https://doi.org/10.1038/s41467-025-56573-8
🎙️ Abstract:
Recent progress in data-driven weather forecasting has surpassed traditional physics-based systems. Yet the common use of mean squared error (MSE) loss functions introduces a “double penalty,” smoothing out fine-scale structures. This episode discusses a simple, parameter-free fix to this issue by modifying the loss to disentangle decorrelation errors from spectral amplitude errors.
🔍 Bullet points summary:
🌪️ Data-driven weather models like GraphCast often produce overly smooth outputs due to MSE loss, limiting resolution and underestimating extremes.
⚙️ The proposed Adjusted Mean Squared Error (AMSE) loss function addresses this by separating decorrelation and amplitude errors, improving spectrum fidelity (a standard decomposition illustrating the idea is shown below).
📈 Fine-tuning GraphCast with AMSE boosts resolution dramatically (from 1,250 km to 160 km), enhances ensemble spread, and sharpens forecasts of cyclones and surface winds.
🔬 This shows deterministic forecasts can remain sharp and realistic without explicitly modeling ensemble uncertainty.
💡 Big idea:
Redefining the loss function in data-driven weather forecasting can drastically sharpen predictions and enhance realism—without adding complexity or parameters.
📚 Citation:
https://doi.org/10.48550/arXiv.2501.19374
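As background for the amplitude/decorrelation split (this is the standard MSE decomposition, not the paper's exact AMSE definition), the MSE between forecast $f$ and observation $o$ separates into bias, amplitude, and decorrelation terms:

$$\mathrm{MSE} = (\bar{f} - \bar{o})^2 + (\sigma_f - \sigma_o)^2 + 2\,\sigma_f\,\sigma_o\,(1 - r),$$

where $\sigma_f$ and $\sigma_o$ are the forecast and observed standard deviations and $r$ is their correlation. Minimizing plain MSE lets a model trade amplitude for correlation, which produces smoothing; keeping the $(\sigma_f-\sigma_o)^2$ amplitude term distinct from the $2\sigma_f\sigma_o(1-r)$ decorrelation term penalizes that trade, the kind of separation the AMSE applies to spectral amplitudes.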
🌍 Abstract:
Projecting climate change is a generalization problem: we extrapolate the recent past using physical models across past, present, and future climates. Current climate models require representations of processes that occur at scales smaller than the model grid size, which remain the main source of projection uncertainty. Recent machine learning (ML) algorithms offer promise for improving these process representations but often extrapolate poorly outside their training climates. To bridge this gap, the authors propose a “climate-invariant” ML framework, incorporating knowledge of climate processes into ML algorithms, and show that this approach enhances generalization across different climate regimes.
📌 Key Points:
- Highlights how ML models in climate science struggle to generalize beyond their training data, limiting their utility in future climate projections.
- Introduces a "climate-invariant" ML framework, embedding physical climate process knowledge into ML models through feature transformations of input and output data (see the example below).
- Demonstrates that neural networks with climate-invariant design generalize better across diverse climate conditions in three atmospheric models, outperforming raw-data ML approaches.
- Utilizes explainable AI methods to show that the climate-informed mappings learned by neural networks are more spatially local, improving both interpretability and data efficiency.
💡 The Big Idea:
Combining machine learning with physical insights through a climate-invariant approach enables models that not only learn from data but also respect the underlying physics—paving the way for more reliable and generalizable climate projections.
📖 Citation:
Beucler, Tom, et al. "Climate-invariant machine learning." Science Advances 10.6 (2024): eadj7250. DOI: 10.1126/sciadv.adj7250
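One concrete example of such a feature transformation, in the spirit of the framework above, is rescaling specific humidity into relative humidity so the input occupies a similar range in colder and warmer climates. A minimal sketch using the Tetens saturation-vapour-pressure approximation; the paper's exact transformations differ in detail.

```python
import numpy as np

def specific_to_relative_humidity(q, T, p):
    """Rescale specific humidity q [kg/kg] into relative humidity, one example of a
    physically motivated, climate-invariant input transformation. T in kelvin, p in Pa."""
    es = 610.94 * np.exp(17.625 * (T - 273.15) / (T - 30.11))   # Tetens, over liquid water [Pa]
    qs = 0.622 * es / (p - 0.378 * es)                          # saturation specific humidity
    return q / qs

print(specific_to_relative_humidity(q=0.010, T=293.15, p=90000.0))  # about 0.61 for this example
```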
🎙️ Episode 25: ClimaX: A foundation model for weather and climate
DOI: https://doi.org/10.48550/arXiv.2301.10343
🌀 Abstract:
Most cutting-edge approaches for weather and climate modeling rely on physics-informed numerical models to simulate the atmosphere's complex dynamics. These methods, while accurate, are often computationally demanding, especially at high spatial and temporal resolutions. In contrast, recent machine learning methods seek to learn data-driven mappings directly from curated climate datasets but often lack flexibility and generalization. ClimaX introduces a versatile and generalizable deep learning model for weather and climate science, capable of learning from diverse, heterogeneous datasets that cover various variables, time spans, and physical contexts.
📌 Bullet points summary:
- ClimaX is a flexible foundation model for weather and climate, overcoming the rigidity of physics-based models and the narrow focus of traditional ML approaches by training on heterogeneous datasets.
- The model utilizes a Transformer-based architecture with novel variable tokenization and aggregation mechanisms, allowing it to handle diverse climate data efficiently (see the sketch below).
- Pre-trained via a self-supervised randomized forecasting objective on CMIP6-derived datasets, ClimaX learns intricate inter-variable relationships, enhancing its adaptability to various forecasting tasks.
- It demonstrates strong, often state-of-the-art performance across tasks like multi-scale weather forecasting, climate projections (ClimateBench), and downscaling — sometimes outperforming even operational systems like IFS.
- The study highlights ClimaX's scalability, showing performance gains with more pretraining data and higher resolutions, underscoring its potential for future developments with increased data and compute resources.
💡 Big idea:
ClimaX represents a shift toward foundation models in climate science, offering a single, adaptable architecture capable of generalizing across a wide array of weather and climate modeling tasks — setting the stage for more efficient, data-driven climate research.
📖 Citation:
Nguyen, Tung, et al. "ClimaX: A foundation model for weather and climate." arXiv preprint arXiv:2301.10343 (2023).
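A minimal sketch of the variable tokenization and aggregation idea mentioned above: each variable's patches get their own embedding, and an attention-style weighting collapses the variable dimension so the token sequence does not grow with the number of inputs. Shapes and the single learned query are illustrative assumptions, not ClimaX's exact architecture.

```python
import numpy as np

def variable_tokenize_aggregate(fields, embed_W, query):
    """fields:  (V, N, P) -- V variables, N patches, P pixels per patch
       embed_W: (V, P, D) -- one projection per variable
       query:   (D,)      -- learned aggregation query
       returns: (N, D) tokens after aggregating over variables."""
    tokens = np.einsum("vnp,vpd->vnd", fields, embed_W)      # variable-specific embedding
    att = np.einsum("vnd,d->vn", tokens, query)              # attention logits per variable
    att = np.exp(att - att.max(axis=0))
    att = att / att.sum(axis=0)                              # softmax over the variable axis
    return np.einsum("vn,vnd->nd", att, tokens)              # aggregate over variables

rng = np.random.default_rng(5)
V, N, P, D = 4, 64, 16, 32            # 4 variables, 64 patches, 16 px/patch, embed dim 32
out = variable_tokenize_aggregate(rng.standard_normal((V, N, P)),
                                  rng.standard_normal((V, P, D)),
                                  rng.standard_normal(D))
print(out.shape)                       # (64, 32)
```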
🎙️ Episode 24: AI-empowered Next-Generation Multiscale Climate Modelling for Mitigation and Adaptation
🔗 DOI: https://doi.org/10.1038/s41561-024-01527-w
🌐 Abstract
Despite decades of progress, Earth system models (ESMs) still face significant gaps in accuracy and uncertainty, largely due to challenges in representing small-scale or poorly understood processes. This episode explores a transformative vision for next-generation climate modeling—one that embeds AI across multiple scales to enhance resolution, improve model fidelity, and better inform climate mitigation and adaptation strategies.
📌 Bullet points summary
- Existing ESMs struggle with inaccuracies in climate projections due to subgrid-scale and unknown process limitations.
- A new approach is proposed that blends AI with multiscale modeling, combining fine-resolution simulations with coarser hybrid models that capture key Earth system feedbacks.
- This strategy is built on four pillars:
  - Higher resolution via advanced computing
  - Physics-aware machine learning to enhance hybrid models
  - Systematic use of Earth observations to constrain models
  - Modernized scientific infrastructure to operationalize insights
- It aims to deliver faster, more actionable climate data to support urgent policy needs for both mitigation and adaptation.
- It envisions hybrid ESMs and interactive Earth digital twins, where AI helps simulate processes more realistically and supports climate decision-making at scale.
💡 The Big Idea
Integrating AI into climate models across scales is not just an upgrade—it's a shift towards smarter, faster, and more adaptive climate science, essential for responding to the climate crisis with precision and urgency.
📖 Citation
Eyring, Veronika, et al. "AI-empowered next-generation multiscale climate modelling for mitigation and adaptation." Nature Geoscience 17.10 (2024): 963–971.
🎙️ Episode 23: FourCastNet – Accelerating Global High-Resolution Weather Forecasting Using Adaptive Fourier Neural Operators
🔗 DOI: https://doi.org/10.1145/3592979.3593412
🌍 Abstract
As climate change intensifies extreme weather events, traditional numerical weather prediction (NWP) struggles to keep pace due to computational limits. This episode explores FourCastNet, a deep learning Earth system emulator that delivers high-resolution, medium-range global forecasts at unprecedented speed—up to five orders of magnitude faster than NWP—while maintaining near state-of-the-art accuracy.
📌 Bullet points summary
- FourCastNet outpaces traditional NWP with forecasts that are not only faster by several orders of magnitude but also comparably accurate, thanks to its data-driven deep learning approach.
- Powered by Adaptive Fourier Neural Operators (AFNO), the model efficiently handles high-resolution data, leveraging spectral convolutions, model/data parallelism, and performance optimizations like CUDA graphs and JIT compilation (see the sketch below).
- It scales excellently across supercomputers such as Selene, Perlmutter, and JUWELS Booster, reaching 140.8 petaFLOPS and enabling rapid training and large-scale ensemble forecasts.
- The work addresses long-standing challenges in weather and climate modeling, including limits in resolution, complexity, and throughput, paving the way for emulating fine-scale Earth system processes.
- It enables "Interactivity at Scale"—supporting digital Earth twins and empowering users to explore future climate scenarios interactively, aiding science, policy, and public understanding.
💡 The Big Idea
FourCastNet revolutionizes weather forecasting by merging the power of deep learning and spectral methods, unlocking interactive, ultra-fast, and high-fidelity Earth system simulations for a changing world.
📖 Citation
Kurth, Thorsten, et al. "FourCastNet: Accelerating global high-resolution weather forecasting using adaptive Fourier neural operators." Proceedings of the Platform for Advanced Scientific Computing Conference. 2023.
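A heavily simplified numpy sketch of the Fourier-domain token mixing at the heart of the AFNO mentioned above: FFT the spatial field, apply a small per-mode MLP to a truncated set of modes, inverse FFT. The real AFNO uses block-diagonal complex weights, soft-thresholding, and residual connections; weights and sizes here are illustrative.

```python
import numpy as np

def afno_mixing(x, w1, b1, w2, b2, keep_modes=16):
    """x: (H, W, C) feature map; w1: (C, C_hidden); w2: (C_hidden, C)."""
    X = np.fft.rfft2(x, axes=(0, 1))                  # (H, W//2+1, C), complex spectrum
    Xt = X[:keep_modes, :keep_modes]                  # keep only low-frequency modes
    # Per-mode MLP applied to real and imaginary parts (simplification of AFNO's mixing)
    h = np.maximum(Xt.real @ w1 + b1, 0) + 1j * np.maximum(Xt.imag @ w1 + b1, 0)
    X[:keep_modes, :keep_modes] = h @ w2 + b2
    return np.fft.irfft2(X, s=x.shape[:2], axes=(0, 1))

rng = np.random.default_rng(3)
x = rng.standard_normal((32, 32, 8))
w1, b1 = rng.standard_normal((8, 16)), np.zeros(16)
w2, b2 = rng.standard_normal((16, 8)), np.zeros(8)
print(afno_mixing(x, w1, b1, w2, b2).shape)           # (32, 32, 8)
```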
🎙️ Episode 22: Knowledge-guided machine learning can improve carbon cycle quantification in agroecosystems
🔗 DOI: https://doi.org/10.1038/s41467-023-43860-5
🧠 Abstract
Improving the accuracy and scalability of carbon cycle quantification in agroecosystems is essential for climate mitigation and sustainable agriculture. This episode discusses a new Knowledge-Guided Machine Learning (KGML) framework that integrates process-based models, high-resolution remote sensing, and machine learning to address key limitations in conventional approaches.
📌 Bullet points summary
- Introduces KGML-ag-Carbon, a hybrid model combining process-based simulation (ecosys), remote sensing, and ML to improve carbon cycle modeling in agroecosystems.
- Outperforms traditional models in capturing spatial and temporal carbon dynamics across the U.S. Corn Belt, especially under data-scarce conditions.
- Delivers high-resolution (250 m, daily) estimates for critical carbon metrics such as GPP, Ra, Rh, NEE, and crop yield, with field-level precision.
- Benefits from pre-training with synthetic data, remote sensing assimilation, and a hierarchical architecture with knowledge-guided loss functions for better accuracy and interpretability (see the sketch below).
- Shows promise for broader applications including nutrient cycle modeling, large-scale carbon assessment, and scenario testing under various management and climate conditions.
💡 The Big Idea
KGML-ag-Carbon represents a leap in modeling agroecosystem carbon cycles, blending scientific knowledge with data-driven insights to unlock precision and scalability in climate-smart agriculture.
📖 Citation
Liu, Licheng, et al. "Knowledge-guided machine learning can improve carbon cycle quantification in agroecosystems." Nature Communications 15.1 (2024): 357.
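A minimal sketch of what a knowledge-guided loss term can look like, using the carbon-balance identity NEE = (Ra + Rh) - GPP (one common sign convention) as the constraint; the paper's hierarchical model and exact constraints are more elaborate, and `lam` is a hypothetical weight.

```python
import numpy as np

def knowledge_guided_loss(pred, obs, lam=0.1):
    """Ordinary regression loss plus a penalty for violating the carbon-balance identity
    NEE = (Ra + Rh) - GPP. pred/obs: dicts keyed by 'GPP', 'Ra', 'Rh', 'NEE'."""
    data = np.mean([np.mean((pred[k] - obs[k]) ** 2) for k in pred])
    balance = np.mean((pred["NEE"] - (pred["Ra"] + pred["Rh"] - pred["GPP"])) ** 2)
    return data + lam * balance

rng = np.random.default_rng(4)
obs = {k: rng.standard_normal(100) for k in ("GPP", "Ra", "Rh", "NEE")}
pred = {k: v + 0.1 * rng.standard_normal(100) for k, v in obs.items()}
print(knowledge_guided_loss(pred, obs))
```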




