Generate Blogs | Articles with OpenAI-GPT2 🔥🔥🔥

#nlp #transformers #gpt2

Hey guys! In this video, we will "generate" blogs, articles, and other texts using the almighty OpenAI GPT-2 transformer model from the Hugging Face 🤗 transformers library, in just a few lines of code!

GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset of 8 million web pages. GPT-2 is trained with a simple objective: predict the next word, given all of the previous words within some text. The diversity of the dataset causes this simple goal to contain naturally occurring demonstrations of many tasks across diverse domains. GPT-2 is a direct scale-up of GPT, with more than 10X the parameters and trained on more than 10X the amount of data.
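A minimal sketch of the kind of generation shown in the video, using the Hugging Face `pipeline` API (the exact prompt and sampling settings here are illustrative assumptions, not the ones from the video):

```python
from transformers import pipeline, set_seed

# Load the pretrained GPT-2 model (the 124M-parameter base checkpoint;
# larger variants like "gpt2-large" can be substituted).
generator = pipeline("text-generation", model="gpt2")

# Fix the random seed so sampling is reproducible.
set_seed(42)

# Generate a continuation of the prompt: GPT-2 simply predicts the
# next token given all previous tokens, repeatedly.
outputs = generator(
    "Machine learning is",   # illustrative prompt
    max_length=50,           # total length in tokens, prompt included
    num_return_sequences=1,
)

print(outputs[0]["generated_text"])
```

Each element of `outputs` is a dict whose `"generated_text"` field holds the prompt plus the sampled continuation.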

So please do like, share and subscribe if you find the video interesting and helpful!

Blog Version: https://bit.ly/3kBKCLg
Code: https://bit.ly/3wZNVOF

More Resources:
https://openai.com/blog/better-langua...
https://jalammar.github.io/illustrate...
