Re-imagining how we train LLMs using physics-based AI
Description
Machine learning-based generative AI is inherently inefficient. Training models by sifting through data again and again until a suitable output is generated is a time-consuming – and energy-consuming – process. So, could there be a better way to look at training our AI systems?
Well, one possible option is physics-based AI, where training is viewed as an energy grid and the best possible route through that grid is mapped to find outputs. It’s a novel way of thinking, but it could change our whole approach to AI.
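To make the idea concrete, here is a minimal sketch in Python – an illustration only, not the method Ray or HPE describes. It treats training as finding the lowest point on a simple, hypothetical energy landscape by repeatedly stepping downhill, rather than by repeated trial-and-error sampling. The energy function, its gradient, and all parameter values are assumptions invented for this example.

import numpy as np

def energy(x):
    # Hypothetical 2-D energy landscape with a single minimum at (1, -2)
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def grad_energy(x):
    # Analytic gradient of the energy function above
    return np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)])

x = np.array([5.0, 5.0])          # arbitrary starting point on the landscape
for step in range(200):
    x = x - 0.1 * grad_energy(x)  # step "downhill" toward lower energy

print(f"lowest-energy point found: {x}, energy: {energy(x):.6f}")

In this toy picture, the "answer" is whatever configuration sits at the bottom of the energy landscape, and training is the process of routing toward it as directly as possible instead of sampling outputs over and over.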
Joining us again today to find out more is Ray Beausoleil, a physicist, senior fellow and senior vice president at HPE. He leads the large-scale integrated photonics lab at Hewlett Packard Labs.
This is Technology Now, a weekly show from Hewlett Packard Enterprise. Every week we look at a story that's been making headlines, examine the technology behind it, and explain why it matters to organizations and what we can learn from it.
Do you have a question for the expert? Ask it here using this Google form: https://forms.gle/8vzFNnPa94awARHMA
About this week's guest: Ray Beausoleil: https://www.linkedin.com/in/ray-beausoleil-22b148a/
Sources and statistics cited in this episode:
WEF paper on data centre energy usage: https://www.weforum.org/agenda/2024/07/generative-ai-energy-emissions/
IEA stats on energy usage in IT: https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks#overview
Novel insulins grand challenge: https://type1diabetesgrandchallenge.org.uk/funding/closed-funding/novel-insulins-innovation-incubator/