Kubernetes for Humans
#049 - The AI Translator: Using LLMs & MCP for K8s Operations & Self-Healing Infra with Alexei Ledenev (doit)

Update: 2025-09-24

Description

In this episode, Itiel Shwartz kicks off a series on MLOps, LLMs, and GenAI in Kubernetes.

The series starts with Alexei Ledenev, who has over two decades of experience in software development and a deep background in cloud architecture and distributed systems. He shares his journey from CoreOS Fleet to his current role on the Platform Team at Doit.


The conversation focuses on tackling the complexity of Kubernetes, which Alexei notes can be overwhelming even for experienced DevOps engineers. He explains how he came up with the idea of using AI assistants and the Model Context Protocol (MCP) to discover and execute tools like kubectl. This creates a "translator between AI and the Kubernetes environment," letting users troubleshoot complex cluster issues or quickly spin up ad hoc testing environments using natural language.
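
To make the "translator" idea concrete, here is a minimal, hypothetical sketch of an MCP server that exposes a single read-only kubectl tool to an AI assistant. It assumes the official MCP Python SDK and a locally configured kubectl pointed at a non-production cluster; the server name and tool are illustrative, not the implementation Alexei describes in the episode.

```python
# Hypothetical sketch only, not the setup discussed in the episode.
# Assumes the official MCP Python SDK (`pip install mcp`) and a local kubectl
# already configured against a non-production cluster.
import subprocess

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("kubectl-translator")


@mcp.tool()
def kubectl_get(resource: str, namespace: str = "default") -> str:
    """Run a read-only `kubectl get` and return the raw output for the model to interpret."""
    # Restricting the tool to a single read-only verb limits the blast radius
    # of a hallucinated or malformed request.
    result = subprocess.run(
        ["kubectl", "get", resource, "-n", namespace, "-o", "wide"],
        capture_output=True,
        text=True,
        timeout=30,
    )
    return result.stdout if result.returncode == 0 else result.stderr


if __name__ == "__main__":
    # Serve over stdio so an MCP-capable AI client can discover and call the tool.
    mcp.run()
```

An AI client registered against this server can then answer a natural-language question like "why is my pod pending?" by calling the tool and reasoning over its output, rather than requiring the user to know the exact kubectl invocation.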


They also explore the challenges of implementation, such as hallucination, and how providing context helps the AI self-correct. Looking ahead, Alexei predicts that infrastructure is moving towards self-aware and self-healing platforms that integrate AI deeply.


Komodor