How Did Open Source Catch Up To OpenAI? [Mixtral-8x7B]


Sign up for GTC24 now using this link!
https://nvda.ws/48s4tmc
For the RTX 4080 Super giveaway, the full details are still being worked out. It will likely be along the lines of taking a photo of yourself attending a GTC virtual session, so sign up for the conference now to set an early reminder!


What is Mixtral-8x7B? The secret Mixture of Experts (MoE) technique that has beaten OpenAI's GPT-3.5, a model released around a year earlier? In this video, you will learn what Mixtral-8x7B is and how Mixture of Experts works, the technique that has made it the new rising standard for LLM architectures.
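For a rough feel of the idea covered in the video: Mixtral-8x7B replaces each feed-forward block with 8 "expert" networks, and a small router picks the top 2 experts per token and mixes their outputs. The snippet below is only a minimal sketch of that sparse routing pattern, not Mistral's actual implementation; the class name, dimensions, and expert design are illustrative placeholders.

```python
# Minimal sketch of sparse Mixture-of-Experts routing (illustrative only,
# not Mistral's code): a router scores each token, only the top-2 of 8
# experts are run for it, and their outputs are mixed by softmaxed weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, dim=512, hidden=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(dim, n_experts, bias=False)
        # Each expert is an ordinary feed-forward block
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(n_experts)
        )

    def forward(self, x):                      # x: (n_tokens, dim)
        logits = self.router(x)                # (n_tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # mixing weights for chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e          # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

tokens = torch.randn(4, 512)
print(MoELayer()(tokens).shape)  # torch.Size([4, 512])
```

Because only 2 of the 8 experts run per token, the compute per token stays close to that of a much smaller dense model while the total parameter count (and capacity) is far larger.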


Mixture of Experts
[Paper] https://arxiv.org/abs/2401.04088
[Project Page] https://mistral.ai/news/mixtral-of-ex...
[Huggingface Doc] https://huggingface.co/docs/transform...


This video is supported by the kind Patrons & YouTube Members:
🙏Andrew Lescelius, alex j, Chris LeDoux, Alex Maurice, Miguilim, Deagan, FiFaŁ, Daddy Wen, Tony Jimenez, Panther Modern, Jake Disco, Demilson Quintao, Shuhong Chen, Hongbo Men, happi nyuu nyaa, Carol Lo, Mose Sakashita, Miguel, Bandera, Gennaro Schiano, gunwoo, Ravid Freedman, Mert Seftali, Mrityunjay, Richárd Nagyfi, Timo Steiner, Henrik G Sundt, projectAnthony, Brigham Hall, Kyle Hudson, Kalila, Jef Come, Jvari Williams, Tien Tien, BIll Mangrum, owned, Janne Kytölä, SO, Richárd Nagyfi

[Discord]   / discord  
[Twitter]   / bycloudai  
[Patreon]   / bycloud  

[Music] massobeats - waiting
[Profile & Banner Art]   / pygm7  
[Video Editor] @askejm
