Distilling the Knowledge in a Neural Network

This is the first and foundational paper that started the research area of Knowledge Distillation.

Knowledge Distillation is the study of methods and techniques for extracting the knowledge learned by a cumbersome model (also called the Teacher model) and transferring it to a simpler model (also called the Student model). Student models are the ones deployed for inference, especially on resource-constrained devices, so they are expected to deliver both high accuracy and fast prediction. A sketch of the distillation loss is shown below.
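
The paper's core idea is to train the Student on "soft targets": the Teacher's output distribution softened by a temperature T, combined with the usual cross-entropy on the true labels. The following is a minimal sketch of that loss, assuming PyTorch and pre-computed Teacher and Student logits; the temperature T and mixing weight alpha are illustrative hyperparameters, not values prescribed by the paper.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: Teacher and Student class probabilities at temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_soft_student = F.log_softmax(student_logits / T, dim=-1)
    # KL term between the softened distributions; scaled by T^2 so its
    # gradients keep a comparable magnitude to the hard-label term,
    # as noted in the paper.
    soft_loss = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Standard cross-entropy on the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss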

Link to the paper:
https://arxiv.org/abs/1503.02531

Link to the summary of the paper:
https://towardsdatascience.com/paper-...

#KnowledgeDistillation
#deeplearning
#softmax
#machinelearning
