Apr 10, 2023
Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization | Lex Fridman Podcast #368
Posted by Dan Breeden in categories: alien life, robotics/AI
Eliezer Yudkowsky is a researcher, writer, and philosopher on the topic of superintelligent AI. Please support this podcast by checking out our sponsors:
- Linode: https://linode.com/lex to get $100 free credit.
- House of Macadamias: https://houseofmacadamias.com/lex and use code LEX to get 20% off your first order.
- InsideTracker: https://insidetracker.com/lex to get 20% off.
EPISODE LINKS:
Eliezer’s Twitter: https://twitter.com/ESYudkowsky
LessWrong Blog: https://lesswrong.com
Eliezer’s Blog page: https://www.lesswrong.com/users/eliezer_yudkowsky
Books and resources mentioned:
1. AGI Ruin (blog post): https://lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities
2. Adaptation and Natural Selection: https://amzn.to/40F5gfa