Future of Life Institute Podcast
Darren McKee on Uncontrollable Superintelligence

Updated: 2023-12-01
Description

Darren McKee joins the podcast to discuss how AI might be difficult to control, which goals and traits AI systems will develop, and whether there's a unified solution to AI alignment.

Timestamps:
00:00 Uncontrollable superintelligence
16:41 AI goals and the "virus analogy"
28:36 Speed of AI cognition
39:25 Narrow AI and autonomy
52:23 Reliability of current and future AI
1:02:33 Planning for multiple AI scenarios
1:18:57 Will AIs seek self-preservation?
1:27:57 Is there a unified solution to AI alignment?
1:30:26 Concrete AI safety proposals
Hosted by Gus Docker