Activation Functions in Deep Learning | Sigmoid, Tanh, and ReLU Activation Functions

In artificial neural networks, each neuron forms a weighted sum of its inputs and passes the resulting scalar value through a function referred to as an activation function or transfer function. In this video, we cover the basics of Sigmoid, Tanh, and ReLU, three of the most widely used activation functions in deep learning.
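As a quick reference alongside the video's code demo, here is a minimal NumPy sketch of the three activation functions discussed (the function names and test inputs are illustrative, not taken from the video):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the range (0, 1); outputs 0.5 at x = 0.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centered variant; squashes inputs into the range (-1, 1).
    return np.tanh(x)

def relu(x):
    # Passes positive inputs through unchanged and zeroes out negatives.
    return np.maximum(0.0, x)

# Apply each activation to the same weighted-sum values for comparison.
z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # values in (0, 1)
print(tanh(z))     # values in (-1, 1), symmetric about 0
print(relu(z))     # [0. 0. 2.]
```

Each function is applied element-wise, which is how deep learning frameworks apply activations to a whole layer's outputs at once.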

👍If you find this video helpful, consider giving it a thumbs up and subscribing for more educational videos on data science!

💭Share your thoughts, experiences, or questions in the comments below. I love hearing from you!

============================
Do you want to learn from me?
Check my affordable mentorship program at : https://learnwith.campusx.in
============================

📱 Grow with us:
CampusX' LinkedIn:   / campusx-official  
CampusX on Instagram for daily tips:   / campusx.official  
My LinkedIn:   / nitish-singh-03412789  
Discord:   / discord  

✨ Hashtags✨
#SimpleLearning #ActivationFunctionsExplained #EasyTech

⌚Time Stamps⌚

00:00 - Intro
00:47 - What are activation functions?
03:28 - Importance of activation functions
04:58 - Code Demo
06:38 - Why are activation functions needed?
11:05 - The ideal activation function
18:41 - Sigmoid Activation Function
20:37 - Advantages
22:56 - Disadvantages
36:15 - Tanh Activation Function
38:00 - Advantages
39:02 - Disadvantages
40:17 - ReLU Activation Function
40:50 - Advantages
42:43 - Disadvantages
44:24 - Outro
