Beyond the Magic Box: Solving AI Hallucinations with Precision RAG (with Evgeny Ilinykh)
Description
In this episode of the ShipTalk Podcast, host Dewan Ahmed (Principal Developer Advocate at Harness) sits down with Evgeny Ilinykh (Founder of GuidedMind.ai and former Tesla Engineering Manager) to move past the AI hype and get into the engineering reality of Retrieval-Augmented Generation (RAG).
If your AI agents are hallucinating, the problem probably isn't your model; it's your retrieval layer. Evgeny breaks down how to turn the "black box" of LLMs into a transparent, production-ready system that developers can actually trust.
What we cover:
- The Death of Deterministic Software: Moving from hardcoded paths to agentic AI logic.
- The "Dark Spots" of Vector Space: Why hallucinations are actually retrieval failures.
- Contextual Retrieval: Insights into how system-level context changes the game for accuracy.
- Scaling to Production: Solving the "dirty work" of messy PDFs, table parsing, and chunking.
- Standardizing AI Delivery: Will RAG become as common as CI/CD in the modern dev stack?
RESOURCES & LINKS:
1. Connect with our guest: https://www.linkedin.com/in/eilinykh/
2. Explore GuidedMind.ai: https://guidedmind.ai/
3. Read the Anthropic Research: https://www.anthropic.com/engineering/contextual-retrieval