How I think about Logistic Regression - Part 2
A (hopefully) simple and intuitive explanation of how logistic regression works with more than 2 classes.

We first talk about One-vs-Rest classification, then turn to the softmax function: how it's derived and what role it plays in logistic regression.

Note: if you watched the previous version of this video, I decided to republish it with a few minor edits. It's mostly the same, but I wanted to do a better job of clarifying the two equivalent ways to model more than 2 classes with multinomial logistic regression: one uses a default (reference) class, the other uses the softmax function. The main change from the previous version is clarifying the difference and equivalence between the two in the latter half of the video (04:15 onward).
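The equivalence between the two parameterizations mentioned above can be sketched in a few lines of Python. The logit values here are hypothetical, chosen only for illustration: fixing one class's logit to 0 (the "default class" formulation) shifts all logits by a constant, which the softmax cancels out, so both formulations give the same probabilities.

```python
import math

def softmax(zs):
    """Map a list of logits to probabilities that sum to 1."""
    m = max(zs)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for a 3-class problem.
logits = [2.0, 1.0, 0.1]
p_softmax = softmax(logits)

# Default-class parameterization: pin the last class's logit to 0
# by subtracting it from every logit. Softmax is invariant to this
# constant shift, so the probabilities are unchanged.
shifted = [z - logits[-1] for z in logits]
p_default = softmax(shifted)
```

Comparing `p_softmax` and `p_default` element by element shows they agree, which is why the video treats the two formulations as interchangeable.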


One-vs-Rest Classification 00:00-01:30
Adapting the Sigmoid Function 01:30-04:14
The Softmax Function 04:15-


Part 1:    • How I think about Logistic Regression...  
The Math Behind Logistic Regression:    • How I think about Logistic Regression...  
Part 3: Coming Soon!

Visualization and animation code on GitHub: https://github.com/gallettilance/repr...

Thumbnail by endless.yarning

#statistics #machinelearning #logisticregression #education #classification #explained #mathexplained #machinelearningalgorithms #mathformachinelearning #machinelearningbasics #datascience #datasciencebasics #linearregression #probability #probabilitytheory