#66 From Will Smith to Meta's MovieGen: How AI Video Got Real. Plus Claude 3.5’s “Computer Use” & Open Source Tools
Description
Welcome to Datatopics Unplugged, where the tech world’s buzz meets laid-back banter. In each episode, we dive into the latest in AI, data science, and technology—perfect for your inner geek or curious mind. Pull up a seat, tune in, and join us for insights, laughs, and the occasional hot take on the digital world.
In this episode, we are joined by Vitale to discuss:
Meta’s video generation breakthrough: Explore Meta’s new “MovieGen” model family, which generates hyper-realistic, 16-second video clips with reflections, consistent spatial details, and multi-frame coherence. Also discussed: Sora, plus a sneak peek at Meta’s open-source possibilities.
For a look back, check out this classic AI-generated video of Will Smith eating spaghetti.
Anthropic’s Claude 3.5 updates: Meet Claude 3.5 and its “computer use” feature, which lets it navigate your screen for you; a rough API sketch follows after this list.
Easily fine-tune & train LLMs, faster with Unsloth: Discover tools that simplify model fine-tuning and deployment, making it easier for small-scale developers to harness AI’s power. Don’t miss Gerganov’s GitHub contributions in this space, too.
Deno 2.0 release hype: With a splashy promo video, Deno’s JavaScript runtime enters the scene as a streamlined, secure alternative to Node.js.
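For the curious, here is a rough sketch of what a “computer use” call looks like with Anthropic’s Python SDK. The model name, tool type, and beta flag follow the October 2024 beta docs and may have changed since; this is illustrative, not a full agent, since your own code still has to execute the screen actions Claude asks for.

```python
# Minimal sketch of Anthropic's "computer use" beta, assuming the `anthropic`
# Python SDK is installed and ANTHROPIC_API_KEY is set in the environment.
# Tool type and beta flag are taken from the October 2024 beta; check the
# current docs before relying on them.
import anthropic

client = anthropic.Anthropic()

response = client.beta.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[
        {
            "type": "computer_20241022",   # virtual screen, mouse and keyboard
            "name": "computer",
            "display_width_px": 1280,
            "display_height_px": 800,
        }
    ],
    messages=[
        {"role": "user", "content": "Open the browser and look up Datatopics Unplugged."}
    ],
    betas=["computer-use-2024-10-22"],
)

# Claude replies with tool_use blocks (screenshot, mouse_move, left_click, ...);
# an agent loop would execute them and feed the results back as tool_result messages.
print(response.content)
```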
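And for a taste of how little code Unsloth asks for, a minimal fine-tuning sketch in Python. Assumptions: a CUDA GPU, the unsloth, trl, and datasets packages installed, a placeholder model and dataset, and an SFTTrainer signature matching the trl version used in Unsloth’s notebooks around this release (it has shifted across versions).

```python
# Minimal fine-tuning sketch with Unsloth; model, dataset and hyperparameters
# are placeholders, not recommendations.
from unsloth import FastLanguageModel
from datasets import load_dataset
from trl import SFTTrainer
from transformers import TrainingArguments

# Load a 4-bit quantized base model with Unsloth's patched fast kernels.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of the weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Any dataset with a plain "text" column works for a quick smoke test.
dataset = load_dataset("imdb", split="train[:1%]")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        max_steps=60,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```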