Building brain-like AIs, with Alexander Ororbia
Description
Some people say that all that's needed to improve the capabilities of AI is to scale up existing systems: use more training data, build larger models with more parameters, and deploy more computer chips to crunch through that data. However, in this episode, we'll be hearing from a computer scientist who thinks there are many other options for improving AI. He is Alexander Ororbia, a professor at the Rochester Institute of Technology in New York State, where he directs the Neural Adaptive Computing Laboratory.
David had the pleasure of watching Alex give a talk at the AGI-24 conference in Seattle earlier this year, and found it fascinating. After you hear this episode, we hope you reach a similar conclusion.
Selected follow-ups:
- Alexander Ororbia - Rochester Institute of Technology
- Alexander G. Ororbia II - Personal website
- AGI-24: The 17th Annual AGI Conference - AGI Society
- Joseph Tranquillo - Bucknell University
- Hopfield network - Wikipedia
- Karl Friston - UCL
- Predictive coding - Wikipedia
- Mortal Computation: A Foundation for Biomimetic Intelligence - Quantitative Biology
- The free-energy principle: a unified brain theory? - Nature Reviews Neuroscience
- I Am a Strange Loop (book by Douglas Hofstadter) - Wikipedia
- Mark Solms - Wikipedia
- Conscium: Pioneering Safe, Efficient AI
- The Hidden Spring: A Journey to the Source of Consciousness (book by Mark Solms)
- Carver Mead - Wikipedia
- Event camera (includes Dynamic Vision Sensors) - Wikipedia
- ICRA (International Conference on Robotics and Automation)
- Brain-Inspired Machine Intelligence: A Survey of Neurobiologically-Plausible Credit Assignment
- A Review of Neuroscience-Inspired Machine Learning
- ngc-learn
- Taking Neuromorphic Computing to the Next Level with Loihi 2 Technology Brief - Intel
Music: Spike Protein, by Koi Discovery, available under CC0 1.0 Public Domain Declaration