Making the best NLU with Rasa and BERT, Rasa Developer Summit 2019

Mady Mantha, AI Platform Leader at Sirius Computer Solutions, shares how to build highly performant NLP by integrating BERT with a custom NLU pipeline.

Bidirectional Encoder Representations from Transformers (BERT) is an NLP pre-training technique released by Google. BERT's key innovation is that it pre-trains bidirectional, contextual language representations from a large text corpus. The pre-trained model can then be fine-tuned for downstream NLP tasks like Natural Language Understanding (NLU) and question answering. Named Entity Recognition (NER) is a subtask of NLU that identifies and classifies entities in a given text into pre-defined categories like names, places, organizations, currencies, and quantities. An NER model can be trained using BERT. Integrating BERT-based NER with Rasa through a custom pipeline resulted in highly performant NLP and engaging conversations between humans and Rasa agents.
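
The talk does not publish its exact pipeline code, so what follows is a minimal sketch of how such an integration can be wired up, assuming the Rasa 1.x custom Component API (current at the time of the summit) and Hugging Face transformers' token-classification pipeline; the class name BertNERExtractor, the module path bert_ner_extractor, and the default model checkpoint are illustrative assumptions, not the speaker's actual implementation.

# bert_ner_extractor.py -- a hedged sketch, not the speaker's published code.
from typing import Any, Dict, Optional, Text

from rasa.nlu.components import Component
from rasa.nlu.training_data import Message
from transformers import pipeline


class BertNERExtractor(Component):
    """Custom Rasa NLU component that annotates messages with BERT NER entities."""

    name = "BertNERExtractor"
    provides = ["entities"]

    def __init__(self, component_config: Optional[Dict[Text, Any]] = None) -> None:
        super().__init__(component_config)
        # Load a pre-trained token-classification (NER) model; the library's
        # default checkpoint is a BERT model fine-tuned on CoNLL-2003 entities.
        self._ner = pipeline("ner", aggregation_strategy="simple")

    def process(self, message: Message, **kwargs: Any) -> None:
        # Run BERT NER on the user message and map each prediction onto
        # Rasa's entity dict format, appending to any existing entities.
        entities = message.get("entities", [])
        for pred in self._ner(message.text):
            entities.append({
                "entity": pred["entity_group"],  # e.g. PER, ORG, LOC
                "value": pred["word"],
                "start": int(pred["start"]),
                "end": int(pred["end"]),
                "confidence": float(pred["score"]),
                "extractor": self.name,
            })
        message.set("entities", entities, add_to_output=True)

The component would then be registered at the end of the NLU pipeline in config.yml so its entities are merged into each parsed message, for example:

language: en
pipeline:
  - name: "WhitespaceTokenizer"
  - name: "CountVectorsFeaturizer"
  - name: "EmbeddingIntentClassifier"
  - name: "bert_ner_extractor.BertNERExtractor"

Running NER as its own component keeps the BERT model decoupled from the intent classifier, so either piece can be swapped out independently.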
