Using Power Transformers (Box-Cox & Yeo-Johnson) to make features Gaussian-like | Machine Learning


In this tutorial, we'll look at Power Transformer, a powerful feature transformation technique for linear Machine Learning models.

In the tutorial, we'll go through all the nitty-gritty of Power Transformers: when to use them, when NOT to use them, and where they do and don't help.

Feature scaling matters: your model's performance can improve by several percentage points if you use the right feature scaling technique.

In a nutshell, a Power Transformer applies a monotonic power function (Box-Cox or Yeo-Johnson) to each individual feature to make its distribution as close to Gaussian as possible, which in turn helps induce homoscedasticity, a basic assumption of linear Machine Learning models.
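As a minimal sketch of the idea, here's how you might apply scikit-learn's PowerTransformer to a skewed feature (the log-normal data and parameter choices below are just an illustrative assumption, not the dataset from the video):

```python
import numpy as np
from sklearn.preprocessing import PowerTransformer

# Simulated skewed, strictly positive feature (e.g. incomes, prices)
rng = np.random.default_rng(0)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(1000, 1))

# Box-Cox requires strictly positive inputs;
# use method="yeo-johnson" if the feature has zeros or negatives.
pt = PowerTransformer(method="box-cox", standardize=True)
X_t = pt.fit_transform(X)

# The fitted lambda per feature; standardize=True also gives
# the output zero mean and unit variance.
print("fitted lambda:", pt.lambdas_)
print("mean ~", X_t.mean(), "std ~", X_t.std())
```

Note that the transformer learns one lambda per column during `fit`, so you should fit it on the training set only and reuse the same fitted object to transform the test set.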

Here are links to some useful resources to learn more about Power Transformers:

  / how-to-use-powertransformer-to-improve-mod...  

  / using-scipys-powertransformer  

https://blog.minitab.com/blog/applyin...

https://en.wikipedia.org/wiki/Homosce...

https://en.wikipedia.org/wiki/Power_t...

I've uploaded all the relevant code and datasets used here (and in all other tutorials, for that matter) to my GitHub page, which is accessible here:

Link:

https://github.com/rachittoshniwal/ma...

If you like my content, please do not forget to upvote this video and subscribe to my channel.

If you have any questions about the content here, please feel free to comment below and I'll be happy to help in whatever capacity I can.

Thank you!
