S3E2 How Encoder-only LLMs really work -- BERT and ELECTRA

  • Partha Seetala
  • 2025-08-03
  • 38032 views

Video description: S3E2 How Encoder-only LLMs really work -- BERT and ELECTRA

Webseries #1: Comprehensive and Intuitive Introduction to Deep Learning -- Numbers and Text by @parthaseetala

SEASON 3 - EPISODE 2:

In this episode, we dive deep into Encoder-only LLMs like BERT and ELECTRA, breaking down *how they really work*.

We start with clear intuition, then peel back the layers to explain the internals, pretraining objectives, and finetuning strategies. Finally, we walk through 4 killer real-world use cases, including classification, labeling/tagging, entity extraction (NER), and semantic similarity — all grounded in practical examples.
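
As a taste of the first of those pretraining objectives, Masked Language Modeling (covered at 00:20:41), here is a minimal sketch of querying a pretrained BERT's MLM head. The Hugging Face transformers library and the bert-base-uncased checkpoint are assumptions for illustration; the episode's own demos live in the repo linked below.

```python
# Minimal MLM sketch: ask a pretrained BERT to fill in a masked token.
# Assumes the Hugging Face transformers library and the bert-base-uncased
# checkpoint (neither is named in the description).
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

# BERT was pretrained to recover tokens hidden behind [MASK].
for p in unmasker("The capital of France is [MASK]."):
    print(f"{p['token_str']!r}  score={p['score']:.3f}")
```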

Why watch this video?

Unlike many videos that stay at a surface level or get lost in dry math, this episode delivers:

  • Intuitive mental models that stick
  • Visual explanations and code-level insights side by side
  • A balance of theory, architecture, and implementation
  • Real-world applications mapped to actual tasks you're likely to encounter

Whether you're a developer, ML practitioner, or just Transformer-curious, this episode will help you truly understand and apply Encoder-only LLMs — not just memorize them.

SOURCE CODE

https://github.com/parthaseetala/cidl...

CONTENTS

00:00:28 Timeline of Encoder-only LLMs
00:01:57 What can you build with Encoder-only LLMs?
00:04:04 The BERT paper
00:05:35 Quick recap of Encoder-only Transformer
00:09:13 Breaking training into pre-training and fine-tuning steps
00:20:11 STEP 1: Pre-training an Encoder-only LLM
00:20:41 Technique #1: Masked Language Model (MLM) -- BERT
00:32:04 Technique #2: Replaced Token Detection (RTD) -- ELECTRA (sketch after this list)
00:39:03 Technique #3: Next Sentence Prediction (NSP) (sketch after this list)
00:43:36 Technique #4: Sentence Order Prediction (SOP)
00:45:32 What has an Encoder-only LLM learnt during Pre-training?
00:49:08 STEP 2: Task-specific Finetuning of an Encoder-only LLM
00:49:27 Task 1: Classification Use Case
00:51:42 Task 2: Labeling Use Case
00:53:32 Task 3: Multi-task Finetuning (Classification + Labeling in one-shot)
00:58:13 DEMO 1: Multi-task Finetuning (classifying and labeling support tickets)
01:26:08 Task 4: Named Entity Recognition (NER) Use Case (sketch after this list)
01:43:12 DEMO 2: Entity Extraction in milliseconds using an Encoder-only LLM without writing a single regex rule
02:02:29 Task 5: Similarity Recommendation/Duplication Detection Use Case
02:08:32 Sentence Transformer/Siamese Network (sketch after this list)
02:13:33 DEMO 3: Showing similar topics given a topic, purely based on Encoder-only LLMs and a VectorDB
02:27:28 Summary of Season 3, Episode 2 (this episode)
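
For Technique #2, Replaced Token Detection (00:32:04), here is a minimal sketch of running a pretrained ELECTRA discriminator over a corrupted sentence. The Hugging Face transformers library and the google/electra-base-discriminator checkpoint are assumptions, not something the video prescribes.

```python
# RTD sketch: ELECTRA's discriminator predicts, per token, whether that
# token was replaced by a small generator. Positive logit => "replaced".
import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

name = "google/electra-base-discriminator"  # assumed checkpoint
discriminator = ElectraForPreTraining.from_pretrained(name)
tokenizer = ElectraTokenizerFast.from_pretrained(name)

# "jumps" has been swapped for a plausible-looking fake token ("ate").
inputs = tokenizer("The quick brown fox ate over the lazy dog", return_tensors="pt")
with torch.no_grad():
    logits = discriminator(**inputs).logits  # shape: (1, seq_len)

for token, logit in zip(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]), logits[0]):
    print(f"{token:>8}  replaced={logit.item() > 0}")
```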
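
Technique #3, Next Sentence Prediction (00:39:03), can be probed the same way; again a minimal sketch assuming Hugging Face transformers and bert-base-uncased:

```python
# NSP sketch: BERT's auxiliary pretraining head scores whether sentence B
# actually follows sentence A. Assumes bert-base-uncased (an assumption).
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")

sent_a = "In Italy, pizza served in formal settings is presented unsliced."
sent_b = "The sky is blue due to the shorter wavelength of blue light."
encoding = tokenizer(sent_a, sent_b, return_tensors="pt")

with torch.no_grad():
    logits = model(**encoding).logits  # index 0 = "is next", index 1 = "is not next"
print("B follows A?", bool(logits[0, 0] > logits[0, 1]))
```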
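
For the NER use case (01:26:08), entity extraction reduces to per-token classification on top of the encoder. A minimal sketch, assuming the community dslim/bert-base-NER checkpoint rather than the episode's own fine-tuned model from the linked repo:

```python
# NER sketch: token classification with a BERT fine-tuned for entity tags.
# Assumes the dslim/bert-base-NER checkpoint (DEMO 2 in the episode uses
# its own fine-tuned model instead).
from transformers import pipeline

ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

for ent in ner("Angela Merkel visited the Google office in Zurich."):
    print(ent["entity_group"], ent["word"], f"{ent['score']:.2f}")
```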
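
Finally, for the similarity/duplicate-detection use case (02:02:29) and the Sentence Transformer/Siamese setup (02:08:32), a minimal sketch assuming the sentence-transformers library and the all-MiniLM-L6-v2 checkpoint, with plain cosine similarity standing in for the episode's VectorDB:

```python
# Siamese/Sentence-Transformer sketch: one shared-weight encoder embeds
# every text, and cosine similarity ranks topics against a query.
# Assumes sentence-transformers and all-MiniLM-L6-v2 (both assumptions).
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

topics = [
    "How masked language modeling pretrains BERT",
    "Detecting replaced tokens with ELECTRA",
    "Best hiking trails near Seattle",
]
query = "Pretraining objectives for encoder-only transformers"

scores = util.cos_sim(model.encode(query), model.encode(topics))[0]
for topic, score in sorted(zip(topics, scores), key=lambda t: -t[1].item()):
    print(f"{score.item():.3f}  {topic}")
```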
