Security You Should Know: Stopping AI Oversharing with Knostic

Updated: 2025-05-28

Description

Large language models are most useful to your business when they have access to your data. But by default, these models overshare, surfacing information beyond a user's need to know because they lack fine-grained access controls. Organizations that respond by restricting the data an LLM can reach risk the opposite problem: undersharing, which withholds the information employees need to do their jobs efficiently.

In this episode, Sounil Yu, CTO at Knostic, explains how Knostic addresses internal knowledge segmentation, offers continuous assessments, and helps prevent oversharing while also identifying undersharing opportunities. Joining him are our panelists, Ross Young, CISO-in-residence at Team8, and David Cross, CISO at Atlassian.

Huge thanks to our sponsor, Knostic


Knostic protects enterprises from LLM oversharing by applying need-to-know access controls to AI tools like Microsoft 365 Copilot. Get visibility into overshared data, fix risky exposures, and deploy AI confidently—without data leakage. If you’re rolling out Copilot or Glean, you need Knostic.
