[Linkpost] “We are likely in an AI overhang, and this is bad.” by Gabriel Alfour


Update: 2025-09-23

Description

This is a link post.


By racing to the next generation of models faster than we can understand the current one, AI companies are creating an overhang. This overhang is not visible, and our current safety frameworks do not take it into account.

1) AI models have untapped capabilities

When GPT-3 was released, most of its now-known capabilities had yet to be discovered.

As we experiment more with models, build better scaffolding, get better at prompting, inspect their internals, and study them, we discover more of what they can do.

This has also been my direct experience studying and researching open-source models at Conjecture.

2) SOTA models have a lot of untapped capabilities

Companies are racing hard.

There's a trade-off between studying existing models and pushing the frontier forward. Companies are doing the latter, and they are doing it hard.

There is much more research into boosting SOTA models than [...]

---

Outline:

(00:26 ) 1) AI models have untapped capabilities

(00:53 ) 2) SOTA models have a lot of untapped capabilities

(01:29 ) 3) This is bad news.

(01:48 ) 4) This is not accounted for.

---


First published:

September 23rd, 2025



Source:

https://www.lesswrong.com/posts/4YvSSKTPhPC43K3vn/we-are-likely-in-an-ai-overhang-and-this-is-bad



Linkpost URL:
https://cognition.cafe/p/we-are-likely-in-an-ai-overhang-and


---


Narrated by TYPE III AUDIO.

