Deploying Edge and Embedded AI Systems with Heather Gorr - #655
Update: 2023-11-13
Description

Today we’re joined by Heather Gorr, principal MATLAB product marketing manager at MathWorks. In our conversation with Heather, we discuss the deployment of AI models to hardware devices and embedded AI systems. We explore the factors to consider during data preparation, model development, and deployment to ensure a successful project, including device constraints and latency requirements that dictate the amount and frequency of data flowing onto the device; modeling needs such as explainability, robustness, and quantization; the use of simulation throughout the modeling process; the application of robust verification and validation methodologies to ensure safety and reliability; and the adaptation of MLOps techniques for speed and consistency. Heather also shares noteworthy anecdotes about embedded AI deployments in industries including automotive and oil & gas.


The complete show notes for this episode can be found at twimlai.com/go/655.

Host: Sam Charrington