Automatic Summarization using Deep Learning | Abstractive Summarization with Pegasus

So you're tired of reading Emma too?

Pegasus is here to help. The Pegasus model is built using a Transformer Encoder-Decoder architecture and is ridiculously powerful when it comes to summarizing big blocks of text.

You can get started with it super quickly using the Transformers library from Hugging Face and Python. This Python tutorial will walk you through how to do it all from start to finish.

In this video, you'll learn how to:
1. Install Dependencies for Transformers in Python
2. Import and Configure the Pegasus X-Sum Model
3. Perform Abstractive Summarization on Wikipedia, News and Scientific Journals
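The three steps above boil down to just a few lines of Python. Here's a minimal sketch, assuming the `google/pegasus-xsum` checkpoint linked below (the `summarize` helper name is my own, not from the video):

```python
# Minimal Pegasus X-Sum summarization sketch.
# Install dependencies first: pip install torch transformers sentencepiece
from transformers import PegasusForConditionalGeneration, PegasusTokenizer

MODEL_NAME = "google/pegasus-xsum"  # assumed checkpoint; see the X-Sum link below


def summarize(text: str) -> str:
    """Generate an abstractive summary of `text` with Pegasus X-Sum."""
    tokenizer = PegasusTokenizer.from_pretrained(MODEL_NAME)
    model = PegasusForConditionalGeneration.from_pretrained(MODEL_NAME)
    # Tokenize the input, truncating to the model's maximum input length
    tokens = tokenizer(text, truncation=True, padding="longest", return_tensors="pt")
    # generate() runs beam-search decoding to produce summary token ids
    summary_ids = model.generate(**tokens)
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    article = (
        "Pegasus is a Transformer encoder-decoder model pre-trained with a "
        "gap-sentence generation objective, which makes it well suited to "
        "abstractive summarization of long passages of text."
    )
    print(summarize(article))
```

The first call downloads the model weights from the Hugging Face Hub, so expect a wait; after that they're cached locally.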

Get the code: https://github.com/nicknochnack/Pegas...

Links:
Pegasus Paper: https://arxiv.org/pdf/1912.08777.pdf
X-Sum Model: https://huggingface.co/google/pegasus...
Install PyTorch: https://pytorch.org/get-started/locally/

Chapters:
0:00 - Start
2:56 - What you'll learn
4:09 - Tutorial Kickoff
4:42 - Install Dependencies
7:40 - Load Model and Tokenizer
11:53 - Perform Abstractive Summarization on Wikipedia Articles
17:36 - Results of Summarization
21:12 - Summarizing News Articles
22:52 - Summarizing Scientific Research

Oh, and don't forget to connect with me!
LinkedIn: https://bit.ly/324Epgo
Facebook: https://bit.ly/3mB1sZD
GitHub: https://bit.ly/3mDJllD
Patreon: https://bit.ly/2OCn3UW
Join the Discussion on Discord: https://bit.ly/3dQiZsV

Happy coding!
Nick

P.s. Let me know how you go and drop a comment if you need a hand!
