Episode #30 TRAILER - “Dangerous Days At OpenAI” For Humanity: An AI Risk Podcast


Please Donate Here To Help Promote For Humanity
https://www.paypal.com/paypalme/forhu...

In the episode 30 TRAILER, John Sherman interviews Professor Olle Häggström on a wide range of AI risk topics. At the top of the list is the instability at OpenAI and the exodus from its superalignment team following the resignations of Jan Leike and Ilya Sutskever.

This podcast is not journalism. But it’s not opinion either. It is a long-form public service announcement. This show simply strings together the existing facts and underscores the unthinkable but probable outcome: the end of all life on Earth.

For Humanity: An AI Risk Podcast is the accessible AI risk podcast for all humans, no tech background required. Our show focuses solely on the threat of human extinction from AI.

Peabody Award-winning former journalist John Sherman explores the shocking worst-case scenario of artificial intelligence: human extinction. The makers of AI openly admit their work could kill all humans, possibly within as little as two years. This podcast is solely about the threat of human extinction from AGI. We’ll meet the heroes and villains, explore the issues and ideas, and learn what you can do to help save humanity.


RESOURCES:

JOIN THE FIGHT, help Pause AI!
Pause AI

Join the Pause AI Weekly Discord Thursdays at 2pm EST
/ discord

22 Word Statement from Center for AI Safety
Statement on AI Risk | CAIS
https://www.safe.ai/work/statement-on...

Best Account on Twitter: AI Notkilleveryoneism Memes
  / aisafetymemes  
