How AI Digital Twins Solve the Unavailable Co-worker Problem
Description
This podcast explores Viven, an AI digital twin startup launched by Eightfold co-founders Ashutosh Garg and Varun Kacholia. Viven recently emerged from stealth mode after raising $35 million in seed funding from investors including Khosla Ventures, Foundation Capital, and FPV Ventures.
Viven addresses the costly problem of project delays caused when colleagues with vital information are unavailable, perhaps due to being on vacation or in a different time zone. The co-founders believe that advances in large language models (LLMs) and data privacy technologies can solve aspects of this issue.
The company builds a specialized LLM for each employee, creating a digital twin from that person's internal data sources, such as Google Docs, Slack messages, and email. Colleagues can then query the twin and get immediate answers about shared knowledge and common projects. The goal is to let users "talk to their twin as if you're talking to that person and get the response," according to Ashutosh Garg.
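The episode doesn't detail Viven's architecture, but the described flow, indexing an employee's documents and answering colleagues' questions from them, can be sketched as a toy retrieval loop. Everything here (the `DigitalTwin` class, word-overlap scoring) is a hypothetical illustration; a real system would use embeddings plus an LLM.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Hypothetical sketch: a per-employee index of work documents
    that answers colleagues' questions on the employee's behalf."""
    owner: str
    documents: list = field(default_factory=list)  # ingested Docs/Slack/email text

    def ingest(self, text: str) -> None:
        self.documents.append(text)

    def query(self, question: str) -> str:
        # Toy retrieval: return the document sharing the most words with
        # the question (stand-in for embedding search + LLM generation).
        q_words = set(question.lower().split())
        return max(
            self.documents,
            key=lambda d: len(q_words & set(d.lower().split())),
            default="",
        )

twin = DigitalTwin(owner="ashutosh")
twin.ingest("The Q3 launch checklist lives in the shared drive under /launch/q3.")
twin.ingest("Payroll data is restricted to the finance team.")
print(twin.query("Where is the Q3 launch checklist kept?"))
```

The point of the sketch is the shape of the interaction: a colleague asks a question, and the twin answers from the owner's own documents without the owner being online.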
A major concern Viven addresses is privacy and the handling of sensitive information. Its technology relies on a concept the company calls pairwise context and privacy, which lets its LLMs determine precisely what information can be shared with whom across the organization. The models are designed to recognize personal context and keep private material private. As an added safeguard, every employee can see the full query history of their own digital twin, which deters colleagues from asking inappropriate questions.
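Viven hasn't published how pairwise context works, but the two safeguards described, per-pair sharing rules and owner-visible query logs, can be illustrated with a minimal access-control sketch. The `PairwisePolicy` class, its topic tags, and its method names are all hypothetical, not Viven's API.

```python
from collections import defaultdict

class PairwisePolicy:
    """Hypothetical sketch: what a twin may reveal depends on the
    (owner, asker) pair, and every query is logged for the owner."""

    def __init__(self):
        # (owner, asker) -> set of topic tags the asker may see
        self.allowed = defaultdict(set)
        # owner -> list of (asker, question), reviewable by the owner
        self.query_log = defaultdict(list)

    def grant(self, owner: str, asker: str, topic: str) -> None:
        self.allowed[(owner, asker)].add(topic)

    def ask(self, owner: str, asker: str, topic: str, question: str) -> str:
        self.query_log[owner].append((asker, question))  # deterrent: owner sees all queries
        if topic in self.allowed[(owner, asker)]:
            return f"[answer about {topic} from {owner}'s documents]"
        return "This information isn't shared with you."

policy = PairwisePolicy()
policy.grant("alice", "bob", "project-roadmap")
print(policy.ask("alice", "bob", "project-roadmap", "What ships next sprint?"))
print(policy.ask("alice", "carol", "project-roadmap", "What ships next sprint?"))
```

Here `bob` gets an answer while `carol` is refused for the same topic, and both attempts land in `alice`'s query log, mirroring the pair-specific sharing and audit-trail deterrent described above.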
Viven is already in use at several enterprise clients, including Eightfold and Genpact. Investors are enthusiastic, noting that Viven is automating a "horizontal problem across all jobs of coordination and communication" that no one else is tackling. While competitors such as Google's Gemini, Anthropic's Claude, Microsoft Copilot, and OpenAI's enterprise search products include personalization features, Viven hopes its pairwise context technology will serve as its moat.