Learning Bayesian Statistics

Are you a researcher or data scientist / analyst / ninja? Do you want to learn Bayesian inference, stay up to date, or simply understand what Bayesian inference is?

Then this podcast is for you! You'll hear from researchers and practitioners of all fields about how they use Bayesian statistics, and how in turn YOU can apply these methods in your modeling workflow.

When I started learning Bayesian methods, I really wished there were a podcast out there that could introduce me to the methods, the projects and the people who make all that possible.

So I created "Learning Bayesian Statistics", where you'll get to hear how Bayesian statistics are used to detect dark matter in outer space, forecast elections or understand how diseases spread and can ultimately be stopped. But this show is not only about successes -- it's also about failures, because that's how we learn best.

So you'll often hear the guests talking about what *didn't* work in their projects, why, and how they overcame these challenges. Because, in the end, we're all lifelong learners!

My name is Alex Andorra (https://alexandorra.github.io/), by the way. By day, I'm a senior data scientist. By night, I don't (yet) fight crime, but I'm an open-source enthusiast and core contributor to the Python packages PyMC (https://docs.pymc.io/) and ArviZ (https://arviz-devs.github.io/arviz/). I also love Nutella, but I don't like talking about it -- I prefer eating it.

So, whether you want to learn Bayesian statistics or hear about the latest libraries, books and applications, this podcast is for you -- just subscribe! You can also support the show and unlock exclusive Bayesian swag on Patreon (https://www.patreon.com/learnbayesstats)!

BITESIZE | Making Variational Inference Reliable: From ADVI to DADVI

Today’s clip is from episode 147 of the podcast, with Martin Ingram.

Alex and Martin discuss the intricacies of variational inference, particularly focusing on the ADVI method and its challenges. They explore the evolution of approximate inference methods, the significance of mean field variational inference, and the innovative linear response technique for covariance estimation. The discussion also delves into the trade-offs between stochastic and deterministic optimization techniques, providing insights into their implications for Bayesian statistics.

Get the full discussion here.

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
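For listeners who want to try the baseline method being discussed, here is a minimal sketch of running mean-field ADVI in PyMC. The toy data and model are hypothetical placeholders, not anything from the episode.

```python
import numpy as np
import pymc as pm

# Hypothetical toy data, just to make the sketch runnable
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(scale=0.5, size=200)

with pm.Model() as model:
    intercept = pm.Normal("intercept", 0, 1)
    slope = pm.Normal("slope", 0, 1)
    sigma = pm.HalfNormal("sigma", 1)
    pm.Normal("y_obs", mu=intercept + slope * x, sigma=sigma, observed=y)

    # Mean-field ADVI: fit a fully factorized Gaussian approximation
    approx = pm.fit(n=20_000, method="advi")
    idata = approx.sample(1_000)  # draws from the fitted approximation
```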

12-17
21:59

#147 Fast Approximate Inference without Convergence Worries, with Martin Ingram

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:
DADVI is a new approach to variational inference that aims to improve speed and accuracy.
DADVI allows for faster Bayesian inference without sacrificing model flexibility.
Linear response can help recover covariance estimates from mean estimates.
DADVI performs well in mixed models and hierarchical structures.
Normalizing flows present an interesting avenue for enhancing variational inference.
DADVI can handle large datasets effectively, improving predictive performance.
Future enhancements for DADVI may include GPU support and linear response integration.

Chapters:
13:17 Understanding DADVI: A New Approach
21:54 Mean Field Variational Inference Explained
26:38 Linear Response and Covariance Estimation
31:21 Deterministic vs Stochastic Optimization in DADVI
35:00 Understanding DADVI and Its Optimization Landscape
37:59 Theoretical Insights and Practical Applications of DADVI
42:12 Comparative Performance of DADVI in Real Applications
45:03 Challenges and Effectiveness of DADVI in Various Models
48:51 Exploring Future Directions for Variational Inference
53:04 Final Thoughts and Advice for Practitioners

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël...
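As background for the takeaways above, the mean-field objective and the deterministic twist DADVI puts on it can be written compactly. This is a sketch following the general published description of ADVI and DADVI, not a formula quoted from the episode:

\[
q(\theta) = \prod_i \mathcal{N}\!\left(\theta_i \mid \mu_i, \sigma_i^2\right),
\qquad
\mathrm{ELBO}(\mu, \sigma) = \mathbb{E}_{q}\!\left[\log p(y, \theta)\right] - \mathbb{E}_{q}\!\left[\log q(\theta)\right].
\]

ADVI maximizes a Monte Carlo estimate of this ELBO with fresh draws of \(z \sim \mathcal{N}(0, I)\) at every optimizer step, whereas DADVI fixes \(M\) draws \(z_1, \dots, z_M\) once and deterministically optimizes

\[
\frac{1}{M} \sum_{m=1}^{M} \log p\!\left(y, \mu + \sigma \odot z_m\right) + \sum_i \log \sigma_i + \text{const},
\]

so standard optimizers and convergence checks from deterministic optimization apply.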

12-12
01:09:55

BITESIZE | Why Bayesian Stats Matter When the Physics Gets Extreme

Today’s clip is from episode 146 of the podcast, with Ethan Smith.

Alex and Ethan discuss the application of Bayesian inference in high energy density physics, particularly in analyzing complex data sets. They highlight the advantages of Bayesian techniques, such as incorporating prior knowledge and managing uncertainties. They also share insights from an ongoing experimental project focused on measuring the equation of state of plasma at extreme pressures. Finally, Alex and Ethan advocate for best practices in managing large codebases and ensuring model reliability.

Get the full discussion here.

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
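For readers wondering what "incorporating prior knowledge" means operationally, it is just Bayes' theorem; the equation-of-state example in parentheses is our illustration, not Ethan's:

\[
p(\theta \mid y) = \frac{p(y \mid \theta)\, p(\theta)}{\int p(y \mid \theta')\, p(\theta')\, d\theta'} \;\propto\; p(y \mid \theta)\, p(\theta),
\]

where the prior \(p(\theta)\) can encode known physical constraints (e.g. plausible ranges for equation-of-state parameters) and \(p(y \mid \theta)\) is the likelihood of the experimental data.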

12-05
19:12

#146 Lasers, Planets, and Bayesian Inference, with Ethan Smith

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:
Ethan's research involves using lasers to compress matter to extreme conditions to study astrophysical phenomena.
Bayesian inference is a key tool in analyzing complex data from high energy density experiments.
The future of high energy density physics lies in developing new diagnostic technologies and increasing experimental scale.
High energy density physics can provide insights into planetary science and astrophysics.
Emerging technologies in diagnostics are set to revolutionize the field.
Ethan's dream project involves exploring pycnonuclear fusion.

Chapters:
14:31 Understanding High Energy Density Physics and Plasma Spectroscopy
21:24 Challenges in Data Analysis and Experimentation
36:11 The Role of Bayesian Inference in High Energy Density Physics
47:17 Transitioning to Advanced Sampling Techniques
51:35 Best Practices in Model Development
55:30 Evaluating Model Performance
01:02:10 The Role of High Energy Density Physics
01:11:15 Innovations in Diagnostic Technologies
01:22:51 Future Directions in Experimental Physics
01:26:08 Advice for Aspiring Scientists

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady,

11-27
01:35:19

BITESIZE | How to Thrive in an AI-Driven Workplace?

Today’s clip is from episode 145 of the podcast, with Jordan Thibodeau.

Alexandre Andorra and Jordan Thibodeau discuss the transformative impact of AI on productivity, career opportunities in the tech industry, and the intricacies of the job interview process. They emphasize the importance of expertise, networking, and the evolving landscape of tech companies, while also providing actionable advice for individuals looking to enhance their careers in AI and related fields.

Get the full discussion here.

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

11-20
19:34

#145 Career Advice in the Age of AI, with Jordan Thibodeau

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Joshua Meehl, Javier Sabio, Kristian Higgins, Matt Rosinski, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström, Stefan, Corey Abshire, Mike Loncaric, David McCormick, Ronald Legere, Sergio Dolia, Michael Cao, Yiğit Aşık, Suyog Chandramouli and Guillaume Berthon.

Takeaways:
AI is reshaping the workplace, but we're still in early stages.
Networking is crucial for job applications in top firms.
AI tools can augment work but are not replacements for skilled labor.
Understanding the tech landscape requires continuous learning.
Timing and cultural readiness are key for tech innovations.
Expertise can be gained without formal education.
Bayesian statistics is a valuable skill for tech professionals.
The importance of personal branding in the job market.
You just need to know 1% more than the person you're talking to.
Sharing knowledge can elevate your status within a company.
Embracing chaos in tech can create new opportunities.
Investing in people leads...

11-12
01:52:18

BITESIZE | Why is Bayesian Deep Learning so Powerful?

Today’s clip is from episode 144 of the podcast, with Maurizio Filippone.

In this conversation, Alex and Maurizio delve into the intricacies of Gaussian processes and their deep learning counterparts. They explain the foundational concepts of Gaussian processes, the transition to deep Gaussian processes, and the advantages they offer in modeling complex data. The discussion also touches on practical applications, model selection, and the evolving landscape of machine learning, particularly in relation to transfer learning and the integration of deep learning techniques with Gaussian processes.

Get the full discussion here.

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

11-05
19:00

#144 Why is Bayesian Deep Learning so Powerful, with Maurizio Filippone

Sign up for Alex's first live cohort, about Hierarchical Model building!
Get 25% off "Building AI Applications for Data Scientists and Software Engineers"

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:
Why GPs still matter: Gaussian Processes remain a go-to for function estimation, active learning, and experimental design – especially when calibrated uncertainty is non-negotiable.
Scaling GP inference: Variational methods with inducing points (as in GPflow) make GPs practical on larger datasets without throwing away principled Bayes.
MCMC in practice: Clever parameterizations and gradient-based samplers tighten mixing and efficiency; use MCMC when you need gold-standard posteriors.
Bayesian deep learning, pragmatically: Stochastic-gradient training and approximate posteriors bring Bayesian ideas to neural networks at scale.
Uncertainty that ships: Monte Carlo dropout and related tricks provide fast, usable uncertainty – even if they’re approximations.
Model complexity ≠ model quality: Understanding capacity, priors, and inductive bias is key to getting trustworthy predictions.
Deep Gaussian Processes: Layered GPs offer flexibility for complex functions, with clear trade-offs in interpretability and compute.
Generative models through a Bayesian lens: GANs and friends benefit from explicit priors and uncertainty – useful for safety and downstream decisions.
Tooling that matters: Frameworks like GPflow lower the friction from idea to implementation, encouraging reproducible, well-tested modeling.
Where we’re headed: The future of ML is uncertainty-aware by default – integrating UQ tightly into optimization, design, and deployment.

Chapters:
08:44 Function Estimation and Bayesian Deep Learning
10:41 Understanding Deep Gaussian Processes
25:17 Choosing Between Deep GPs and Neural Networks
32:01 Interpretability and Practical Tools for GPs
43:52 Variational Methods in Gaussian Processes
54:44 Deep Neural Networks and Bayesian Inference
01:06:13 The Future of Bayesian Deep Learning
01:12:28 Advice for Aspiring Researchers...
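To make the "inducing points" takeaway concrete, here is a minimal sparse variational GP sketch loosely following GPflow's SVGP interface. The data, kernel, and number of inducing points are arbitrary placeholders, and exact API details may vary across GPflow versions.

```python
import numpy as np
import gpflow

# Hypothetical 1-D regression data, just to make the sketch runnable
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(500, 1))
Y = np.sin(X) + 0.1 * rng.normal(size=(500, 1))

# Sparse variational GP: 20 inducing points summarize the full dataset
Z = np.linspace(0, 10, 20).reshape(-1, 1)
model = gpflow.models.SVGP(
    kernel=gpflow.kernels.SquaredExponential(),
    likelihood=gpflow.likelihoods.Gaussian(),
    inducing_variable=Z,
    num_data=X.shape[0],
)

# Maximize the ELBO over kernel, likelihood, and variational parameters
gpflow.optimizers.Scipy().minimize(
    model.training_loss_closure((X, Y)), model.trainable_variables
)

mean, var = model.predict_f(X[:5])  # posterior mean and variance at test inputs
```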

10-30
01:28:22

BITESIZE | Are Bayesian Models the Missing Ingredient in Nutrition Research?

Sign up for Alex's first live cohort, about Hierarchical Model building
Soccer Factor Model Dashboard

Today’s clip is from episode 143 of the podcast, with Christoph Bamberg.

Christoph shares his journey into Bayesian statistics and computational modeling, the challenges faced in academia, and the technical tools used in research. Alex and Christoph delve into a specific study on appetite regulation and cognitive performance, exploring the implications of framing in psychological research and the importance of careful communication in health-related contexts.

Get the full discussion here.

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

10-23
23:14

#143 Transforming Nutrition Science with Bayesian Methods, with Christoph Bamberg

Sign up for Alex's first live cohort, about Hierarchical Model building!

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:
Bayesian mindset in psychology: Why priors, model checking, and full uncertainty reporting make findings more honest and useful.
Intermittent fasting & cognition: A Bayesian meta-analysis suggests effects are context- and age-dependent – and often small but meaningful.
Framing matters: The way we frame dietary advice (focus, flexibility, timing) can shape adherence and perceived cognitive benefits.
From cravings to choices: Appetite, craving, stress, and mood interact to influence eating and cognitive performance throughout the day.
Define before you measure: Clear definitions (and DAGs to encode assumptions) reduce ambiguity and guide better study design.
DAGs for causal thinking: Directed acyclic graphs help separate hypotheses from data pipelines and make causal claims auditable.
Small effects, big implications: Well-estimated “small” effects can scale to public-health relevance when decisions repeat daily.
Teaching by modeling: Helping students write models (not just run them) builds statistical thinking and scientific literacy.
Bridging lab and life: Balancing careful experiments with real-world measurement is key to actionable health-psychology insights.
Trust through transparency: Openly communicating assumptions, uncertainty, and limitations strengthens scientific credibility.

Chapters:
10:35 The Struggles of Bayesian Statistics in Psychology
22:30 Exploring Appetite and Cognitive Performance
29:45 Research Methodology and Causal Inference
36:36 Understanding Cravings and Definitions
39:02 Intermittent Fasting and Cognitive Performance
42:57 Practical Recommendations for Intermittent Fasting
49:40 Balancing Experimental Psychology and Statistical Modeling
55:00 Pressing Questions in Health Psychology
01:04:50 Future Directions in Research

Thank you to my Patrons for...
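The "DAGs to encode assumptions" takeaway can be made tangible in a few lines; the variables below are hypothetical stand-ins for the kinds of constructs discussed, not the actual study variables.

```python
import networkx as nx

# Hypothetical causal assumptions: fasting state affects craving and cognition,
# stress affects both craving and cognition, craving affects eating behaviour.
dag = nx.DiGraph(
    [
        ("fasting", "craving"),
        ("fasting", "cognition"),
        ("stress", "craving"),
        ("stress", "cognition"),
        ("craving", "eating"),
    ]
)

assert nx.is_directed_acyclic_graph(dag)
print(sorted(dag.predecessors("cognition")))  # assumed direct causes of cognition
```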

10-15
01:12:56

BITESIZE | How Bayesian Additive Regression Trees Work in Practice

Soccer Factor Model Dashboard
Unveiling True Talent: The Soccer Factor Model for Skill Evaluation
LBS #91, Exploring European Football Analytics, with Max Göbel
Get early access to Alex's next live-cohort courses!

Today’s clip is from episode 142 of the podcast, with Gabriel Stechschulte.

Alex and Gabriel explore the re-implementation of BART (Bayesian Additive Regression Trees) in Rust, detailing the technical challenges and performance improvements achieved. They also share insights into the benefits of BART, such as uncertainty quantification, and its application in various data-intensive fields.

Get the full discussion here.

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

10-09
22:49

#142 Bayesian Trees & Deep Learning for Optimization & Big Data, with Gabriel Stechschulte

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Get early access to Alex's next live-cohort courses!

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:
BART as a core tool: Gabriel explains how Bayesian Additive Regression Trees provide robust uncertainty quantification and serve as a reliable baseline model in many domains.
Rust for performance: His Rust re-implementation of BART dramatically improves speed and scalability, making it feasible for larger datasets and real-world IoT applications.
Strengths and trade-offs: BART avoids overfitting and handles missing data gracefully, though it is slower than other tree-based approaches.
Big data meets Bayes: Gabriel shares strategies for applying Bayesian methods with big data, including when variational inference helps balance scale with rigor.
Optimization and decision-making: He highlights how BART models can be embedded into optimization frameworks, opening doors for sequential decision-making.
Open source matters: Gabriel emphasizes the importance of communities like PyMC and Bambi, encouraging newcomers to start with small contributions.

Chapters:
05:10 – From economics to IoT and Bayesian statistics
18:55 – Introduction to BART (Bayesian Additive Regression Trees)
24:40 – Re-implementing BART in Rust for speed and scalability
32:05 – Comparing BART with Gaussian Processes and other tree methods
39:50 – Strengths and limitations of BART
47:15 – Handling missing data and different likelihoods
54:30 – Variational inference and big data challenges
01:01:10 – Embedding BART into optimization and decision-making frameworks
01:08:45 – Open source, PyMC, and community support
01:15:20 – Advice for newcomers
01:20:55 – Future of BART, Rust, and probabilistic programming

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian...
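For readers who want to see what fitting a BART model looks like in practice, here is a minimal sketch using the pymc-bart Python package (assumed here as the relevant interface; Gabriel's Rust re-implementation is a separate effort). The data and hyperparameters are placeholders.

```python
import numpy as np
import pymc as pm
import pymc_bart as pmb

# Hypothetical tabular data, just to make the sketch runnable
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = X[:, 0] ** 2 + np.sin(X[:, 1]) + 0.3 * rng.normal(size=300)

with pm.Model() as model:
    # A sum of m shallow trees models the unknown mean function
    mu = pmb.BART("mu", X, y, m=50)
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y)
    idata = pm.sample()  # the tree part is sampled with a specialized step method
```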

10-02
01:10:28

BITESIZE | How Probability Becomes Causality?

Get early access to Alex's next live-cohort courses!

Today’s clip is from episode 141 of the podcast, with Sam Witty.

Alex and Sam discuss the ChiRho project, delving into the intricacies of causal inference, particularly focusing on Do-Calculus, regression discontinuity designs, and Bayesian structural causal inference. They explain ChiRho's design philosophy, emphasizing its modular and extensible nature, and highlight the importance of efficient estimation in causal inference, making complex statistical methods accessible to users without extensive expertise.

Get the full discussion here.

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

09-24
22:03

#141 AI Assisted Causal Inference, with Sam Witty

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Get early access to Alex's next live-cohort courses!

Enroll in the Causal AI workshop, to learn live with Alex (15% off if you're a Patron of the show)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:
Causal inference is crucial for understanding the impact of interventions in various fields.
ChiRho is a causal probabilistic programming language that bridges mechanistic and data-driven models.
ChiRho allows for easy manipulation of causal models and counterfactual reasoning.
The design of ChiRho emphasizes modularity and extensibility for diverse applications.
Causal inference requires careful consideration of assumptions and model structures.
Real-world applications of causal inference can lead to significant insights in science and engineering.
Collaboration and communication are key in translating causal questions into actionable models.
The future of causal inference lies in integrating probabilistic programming with scientific discovery.

Chapters:
05:53 Bridging Mechanistic and Data-Driven Models
09:13 Understanding Causal Probabilistic Programming
12:10 ChiRho and Its Design Principles
15:03 ChiRho’s Functionality and Use Cases
17:55 Counterfactual Worlds and Mediation Analysis
20:47 Efficient Estimation in ChiRho
24:08 Future Directions for Causal AI
50:21 Understanding the Do-Operator in Causal Inference
56:45 ChiRho’s Role in Causal Inference and Bayesian Modeling
01:01:36 Roadmap and Future Developments for ChiRho
01:05:29 Real-World Applications of Causal Probabilistic Programming
01:10:51 Challenges in Causal Inference Adoption
01:11:50 The Importance of Causal Claims in Research
01:18:11 Bayesian Approaches to Causal Inference
01:22:08 Combining Gaussian Processes with Causal Inference
01:28:27 Future Directions in Probabilistic Programming and Causal Inference

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad...
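As a pocket reference for the do-operator chapter above, the classic backdoor adjustment shows how an interventional quantity reduces to observational ones when a set of covariates \(Z\) blocks all backdoor paths from \(X\) to \(Y\). This is the standard textbook identity, not a ChiRho-specific formula:

\[
P\!\left(Y \mid \mathrm{do}(X = x)\right) = \sum_{z} P\!\left(Y \mid X = x, Z = z\right)\, P\!\left(Z = z\right).
\]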

09-18
01:37:47

BITESIZE | How to Think Causally About Your Models?

Get early access to Alex's next live-cohort courses!

Today’s clip is from episode 140 of the podcast, with Ron Yurko.

Alex and Ron discuss the challenges of model deployment, and the complexities of modeling player contributions in team sports like soccer and football. They emphasize the importance of understanding replacement levels, the Going Deep framework in football analytics, and the need for proper modeling of expected points. Additionally, they share insights on teaching Bayesian modeling to students and the difficulties they face in grasping the concepts of model writing and application.

Get the full discussion here.

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

09-10
24:01

#140 NFL Analytics & Teaching Bayesian Stats, with Ron Yurko

Get early access to Alex's next live-cohort courses!

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:
Teaching students to write out their own models is crucial.
Developing a sports analytics portfolio is essential for aspiring analysts.
Modeling expectations in sports analytics can be misleading.
Tracking data can significantly improve player performance models.
Ron encourages students to engage in active learning through projects.
The importance of understanding the dependency structure in data is vital.
Ron aims to integrate more diverse sports analytics topics into his teaching.

Chapters:
03:51 The Journey into Sports Analytics
15:20 The Evolution of Bayesian Statistics in Sports
26:01 Innovations in NFL WAR Modeling
39:23 Causal Modeling in Sports Analytics
46:29 Defining Replacement Levels in Sports
48:26 The Going Deep Framework and Big Data in Football
52:47 Modeling Expectations in Football Data
55:40 Teaching Statistical Concepts in Sports Analytics
01:01:54 The Importance of Model Building in Education
01:04:46 Statistical Thinking in Sports Analytics
01:10:55 Innovative Research in Player Movement
01:15:47 Exploring Data Needs in American Football
01:18:43 Building a Sports Analytics Portfolio

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell,...
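To illustrate the "write out your own models" takeaway, here is a toy hierarchical player-contribution model in PyMC. The structure, data, and names are purely illustrative; this is not Ron's WAR model.

```python
import numpy as np
import pymc as pm

# Hypothetical data: an outcome for each play and the index of the player involved
rng = np.random.default_rng(0)
n_players, n_plays = 30, 2000
player_idx = rng.integers(n_players, size=n_plays)
true_skill = rng.normal(scale=0.5, size=n_players)
outcome = true_skill[player_idx] + rng.normal(size=n_plays)

with pm.Model() as model:
    league_mean = pm.Normal("league_mean", 0, 1)
    skill_sd = pm.HalfNormal("skill_sd", 1)
    # Partial pooling: each player's effect is shrunk toward the league mean
    player_effect = pm.Normal("player_effect", league_mean, skill_sd, shape=n_players)
    sigma = pm.HalfNormal("sigma", 1)
    pm.Normal("obs", mu=player_effect[player_idx], sigma=sigma, observed=outcome)
    idata = pm.sample()
```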

09-03
01:33:01

BITESIZE | Is Bayesian Optimization the Answer?

Today’s clip is from episode 139 of the podcast, with Max Balandat.

Alex and Max discuss the integration of BoTorch with PyTorch, exploring its applications in Bayesian optimization and Gaussian processes. They highlight the advantages of using GPyTorch for structured matrices and the flexibility it offers for research. The discussion also covers the motivations behind building BoTorch, the importance of open-source culture at Meta, and the role of PyTorch in modern machine learning.

Get the full discussion here.

Attend Alex's tutorial at PyData Berlin: A Beginner's Guide to State Space Modeling

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

08-27
25:13

#139 Efficient Bayesian Optimization in PyTorch, with Max Balandat

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:
BoTorch is designed for researchers who want flexibility in Bayesian optimization.
The integration of BoTorch with PyTorch allows for differentiable programming.
Scalability at Meta involves careful software engineering practices and testing.
Open-source contributions enhance the development and community engagement of BoTorch.
LLMs can help incorporate human knowledge into optimization processes.
Max emphasizes the importance of clear communication of uncertainty to stakeholders.
The role of a researcher in industry is often more application-focused than in academia.
Max's team at Meta works on adaptive experimentation and Bayesian optimization.

Chapters:
08:51 Understanding BoTorch
12:12 Use Cases and Flexibility of BoTorch
15:02 Integration with PyTorch and GPyTorch
17:57 Practical Applications of BoTorch
20:50 Open Source Culture at Meta and BoTorch's Development
43:10 The Power of Open Source Collaboration
47:49 Scalability Challenges at Meta
51:02 Balancing Depth and Breadth in Problem Solving
55:08 Communicating Uncertainty to Stakeholders
01:00:53 Learning from Missteps in Research
01:05:06 Integrating External Contributions into BoTorch
01:08:00 The Future of Optimization with LLMs

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode,...
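A single step of the kind of Bayesian optimization loop BoTorch supports might look like the sketch below. The toy objective and settings are placeholders, and the function names follow BoTorch's documented interface (which may shift between versions).

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import ExactMarginalLogLikelihood
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf

# Hypothetical evaluations of a toy objective on the unit square
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = -(train_X - 0.5).pow(2).sum(dim=-1, keepdim=True)  # maximum at (0.5, 0.5)

# Fit a GP surrogate by maximizing the exact marginal log likelihood
gp = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_mll(mll)

# Pick the next point by maximizing Expected Improvement over the bounds
acqf = ExpectedImprovement(gp, best_f=train_Y.max())
bounds = torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double)
candidate, _ = optimize_acqf(acqf, bounds=bounds, q=1, num_restarts=5, raw_samples=64)
print(candidate)  # suggested next evaluation point
```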

08-20
01:25:23

BITESIZE | What's Missing in Bayesian Deep Learning?

Today’s clip is from episode 138 of the podcast, with Mélodie Monod, François-Xavier Briol and Yingzhen Li.

During this live show at Imperial College London, Alex and his guests delve into the complexities and advancements in Bayesian deep learning, focusing on uncertainty quantification, the integration of machine learning tools, and the challenges faced in simulation-based inference. The speakers discuss their current projects, the evolution of Bayesian models, and the need for better computational tools in the field.

Get the full discussion here.

Attend Alex's tutorial at PyData Berlin: A Beginner's Guide to State Space Modeling

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

08-13
20:34

#138 Quantifying Uncertainty in Bayesian Deep Learning, Live from Imperial College London

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Intro to Bayes Course (first 2 lessons free)
Advanced Regression Course (first 2 lessons free)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:
Bayesian deep learning is a growing field with many challenges.
Current research focuses on applying Bayesian methods to neural networks.
Diffusion methods are emerging as a new approach for uncertainty quantification.
The integration of machine learning tools into Bayesian models is a key area of research.
The complexity of Bayesian neural networks poses significant computational challenges.
Future research will focus on improving methods for uncertainty quantification.
Generalized Bayesian inference offers a more robust approach to uncertainty.
Uncertainty quantification is crucial in fields like medicine and epidemiology.
Detecting out-of-distribution examples is essential for model reliability.
Exploration-exploitation trade-off is vital in reinforcement learning.
Marginal likelihood can be misleading for model selection.
The integration of Bayesian methods in LLMs presents unique challenges.

Chapters:
00:00 Introduction to Bayesian Deep Learning
03:12 Panelist Introductions and Backgrounds
10:37 Current Research and Challenges in Bayesian Deep Learning
18:04 Contrasting Approaches: Bayesian vs. Machine Learning
26:09 Tools and Techniques for Bayesian Deep Learning
31:18 Innovative Methods in Uncertainty Quantification
36:23 Generalized Bayesian Inference and Its Implications
41:38 Robust Bayesian Inference and Gaussian Processes
44:24 Software Development in Bayesian Statistics
46:51 Understanding Uncertainty in Language Models
50:03 Hallucinations in Language Models
53:48 Bayesian Neural Networks vs Traditional Neural Networks
58:00 Challenges with Likelihood Assumptions
01:01:22 Practical Applications of Uncertainty Quantification
01:04:33 Meta Decision-Making with Uncertainty
01:06:50 Exploring Bayesian Priors in Neural Networks
01:09:17 Model Complexity and Data Signal
01:12:10 Marginal Likelihood and Model Selection
01:15:03 Implementing Bayesian Methods in LLMs
01:19:21 Out-of-Distribution Detection in LLMs

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer,...
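One lightweight way to get the kind of predictive uncertainty and out-of-distribution signal discussed above is Monte Carlo dropout combined with predictive entropy. This is a generic sketch of that standard trick, not a method proposed by the panelists; the network architecture and inputs are placeholders.

```python
import torch
import torch.nn as nn

# Hypothetical classifier with a dropout layer
model = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 10),
)

def predictive_entropy(model, x, n_samples=50):
    """Monte Carlo dropout: keep dropout active at test time, average the
    softmax outputs over repeated stochastic forward passes, and use the
    entropy of that average as an uncertainty / OOD score (higher = more uncertain)."""
    model.train()  # keeps dropout layers active during the forward passes
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
        ).mean(dim=0)
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)

scores = predictive_entropy(model, torch.randn(8, 16))
print(scores)  # one entropy score per input
```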

08-06
01:23:10

