Max Tegmark: The Case for Halting AI Development | Lex Fridman Podcast


Max Tegmark is a physicist and AI researcher at MIT, co-founder of the Future of Life Institute, and author of Life 3.0: Being Human in the Age of Artificial Intelligence. Please support this podcast by checking out our sponsors:
Notion: https://notion.com
InsideTracker: https://insidetracker.com/lex to get 20% off
Indeed: https://indeed.com/lex to get $75 credit

EPISODE LINKS:
Max's Twitter: https://twitter.com/tegmark
Max's Website: https://space.mit.edu/home/tegmark
Pause Giant AI Experiments (open letter): https://futureoflife.org/open-letter/...
Future of Life Institute: https://futureoflife.org
Books and resources mentioned:
1. Life 3.0 (book): https://amzn.to/3UB9rXB
2. Meditations on Moloch (essay): https://slatestarcodex.com/2014/07/30...
3. Nuclear winter paper: https://nature.com/articles/s43016-02...

PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
Full episodes playlist: Lex Fridman Podcast
Clips playlist: Lex Fridman Podcast Clips

OUTLINE:
0:00 - Introduction
1:56 - Intelligent alien civilizations
14:20 - Life 3.0 and superintelligent AI
25:47 - Open letter to pause Giant AI Experiments
50:54 - Maintaining control
1:19:44 - Regulation
1:30:34 - Job automation
1:39:48 - Elon Musk
2:01:31 - Open source
2:08:01 - How AI may kill all humans
2:18:32 - Consciousness
2:27:54 - Nuclear winter
2:38:21 - Questions for AGI

SOCIAL:
Twitter: https://twitter.com/lexfridman
LinkedIn: https://www.linkedin.com/in/lexfridman
Facebook: https://www.facebook.com/lexfridman
Instagram: https://www.instagram.com/lexfridman
Medium: https://medium.com/@lexfridman
Reddit: https://www.reddit.com/r/lexfridman
Support on Patreon: https://www.patreon.com/lexfridman
