What is MaxOut in Deep Learning?


MaxOut is a technique introduced by Ian Goodfellow and his co-authors in 2013 that can learn a different activation function within each of its units.

In this tutorial, we'll review what a MaxOut network is and how it's constructed, as well as check out its relevance to more modern architectures.
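To make the idea concrete, here is a minimal sketch of a maxout unit in PyTorch (the class name, shapes, and k=2 are just illustrative, not the paper's exact setup): the layer applies k independent linear maps and keeps only the element-wise maximum, so the network effectively learns its own piecewise-linear activation function.

```python
import torch
import torch.nn as nn

class Maxout(nn.Module):
    """A maxout layer: k linear pieces, output is their element-wise max."""
    def __init__(self, in_features, out_features, k=2):
        super().__init__()
        self.out_features = out_features
        self.k = k
        # One linear map produces all k pieces at once.
        self.linear = nn.Linear(in_features, out_features * k)

    def forward(self, x):
        # Reshape to (batch, out_features, k), then max over the k pieces.
        z = self.linear(x).view(-1, self.out_features, self.k)
        return z.max(dim=2).values

# Hypothetical usage: a maxout layer with 2 pieces on 784-dim input.
layer = Maxout(784, 128, k=2)
out = layer(torch.randn(32, 784))  # -> shape (32, 128)
```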

Table of Contents
- What is Dropout: 0:00
- Dropout vs Stochastic Gradient Descent: 0:52
- What is MaxOut?: 1:27
- MaxOut can learn Activation Functions: 2:45
- MaxOut is a Universal Approximator: 4:02
- MaxOut Performance across Benchmarks: 4:41
- MaxOut vs Rectifiers: 5:50
- Conclusion: 8:08

To learn more about MaxOut, here are a few interesting links:
📌 [Paper] MaxOut Networks: https://arxiv.org/abs/1302.4389
📌 [Code] A PyTorch implementation that is very easy to read: https://github.com/paniabhisek/maxout...
📌 [Paper] MaxOut vs ReLU across multiple datasets: "Evaluation of maxout activations in deep learning across several big data domains": https://journalofbigdata.springeropen...

Abstract:
"We consider the problem of designing models to leverage a recently introduced approximate model averaging technique called dropout.

We define a simple new model called maxout (so named because its output is the max of a set of inputs, and because it is a natural companion to dropout) designed to both facilitate optimization by dropout and improve the accuracy of dropout's fast approximate model averaging technique.

We empirically verify that the model successfully accomplishes both of these tasks. We use maxout and dropout to demonstrate state of the art classification performance on four benchmark datasets: MNIST, CIFAR-10, CIFAR-100, and SVHN."
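As a rough illustration of the pairing the abstract describes (not the paper's exact architecture), a maxout layer can simply be followed by dropout, reusing the sketched Maxout class from above:

```python
import torch.nn as nn

# A minimal sketch: maxout layers interleaved with dropout,
# its natural companion per the paper. Layer sizes are illustrative.
model = nn.Sequential(
    Maxout(784, 256, k=2),
    nn.Dropout(p=0.5),
    Maxout(256, 10, k=2),
)
```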

----
Join the Discord for general discussion: / discord

----
Follow Me Online Here:

Twitter: / yacineaxya
GitHub: https://github.com/yacineMahdid
LinkedIn: / yacinemahdid
___

Have a great week! 👋
