Download or watch Jeff Dean & Noam Shazeer — 25 years at Google: from PageRank to AGI

  • Dwarkesh Patel
  • 2025-02-12
  • 196404

Here you can download Jeff Dean & Noam Shazeer — 25 years at Google: from PageRank to AGI for free in 4K (2K / 1080p), or watch the video from YouTube in the best available quality.

Video description: Jeff Dean & Noam Shazeer — 25 years at Google: from PageRank to AGI

This week I welcome two of the most important technologists in any field. Jeff Dean is Google's Chief Scientist, and through 25 years at the company, has worked on basically the most transformative systems in modern computing: from MapReduce, BigTable, Tensorflow, AlphaChip, to Gemini. Noam Shazeer invented or co-invented all the main architectures and techniques that are used for modern LLMs: from the Transformer itself, to Mixture of Experts, to Mesh Tensorflow, to Gemini and many other things. We talk about their 25 years at Google, going from PageRank to MapReduce to the Transformer to MoEs to AlphaChip – and soon to ASI.

𝐄𝐏𝐈𝐒𝐎𝐃𝐄 𝐋𝐈𝐍𝐊𝐒
Transcript: https://www.dwarkesh.com/p/jeff-dean-...
Apple Podcasts: https://podcasts.apple.com/us/podcast...
Spotify: https://open.spotify.com/episode/4atx...

𝐒𝐏𝐎𝐍𝐒𝐎𝐑𝐒
Meter wants to radically improve the digital world we take for granted. They’re developing a foundation model that automates network management end-to-end. To do this, they just announced a long-term partnership with Microsoft for tens of thousands of GPUs, and they’re recruiting a world class AI research team. To learn more, go to https://meter.com/dwarkesh

Scale partners with major AI labs like Meta, Google Deepmind, and OpenAI. Through Scale’s Data Foundry, labs get access to high-quality data to fuel post-training, including advanced reasoning capabilities. If you’re an AI researcher or engineer, learn about how Scale’s Data Foundry and research lab, SEAL, can help you go beyond the current frontier at https://scale.com/dwarkesh

Curious how Jane Street teaches their new traders? They use Figgie, a rapid-fire card game that simulates the most exciting parts of markets and trading. It’s become so popular that Jane Street hosts an inter-office Figgie championship every year. Download from the app store or play on your desktop at https://www.figgie.com/

To sponsor a future episode, visit https://www.dwarkesh.com/p/advertise

𝐓𝐈𝐌𝐄𝐒𝐓𝐀𝐌𝐏𝐒
00:00:00 - Intro
00:03:29 - Joining Google in 1999
00:06:20 - Future of Moore's Law
00:11:04 - Future TPUs
00:13:56 - Jeff’s undergrad thesis: parallel backprop
00:15:54 - LLMs in 2007
00:25:09 - “Holy shit” moments
00:27:28 - AI fulfills Google’s original mission
00:32:00 - Doing Search in-context
00:36:12 - The internal coding model
00:37:29 - What will 2027 models do?
00:43:20 - A new architecture every day?
00:49:10 - Automated chips and intelligence explosion
00:53:07 - Future of inference scaling
01:02:38 - Already doing multi-datacenter runs
01:08:15 - Debugging at scale
01:12:41 - Fast takeoff and superalignment
01:20:51 - A million evil Jeff Deans
01:24:22 - Fun times at Google
01:27:51 - World compute demand in 2030
01:34:37 - Getting back to modularity
01:44:48 - Keeping a giga-MoE in-memory
01:49:35 - All of Google in one model
01:57:59 - What’s missing from distillation
02:03:10 - Open research, pros and cons
02:09:58 - Going the distance
