Leave one out and k-fold cross validation | using R | cv.glm | train and test data | prediction error


In this video, I show you in R (rough code sketches of these steps follow the list):
1. how to split data into training and test data
2. how to use training data to build the model and test data to check the model
3. how to do cross-validation using cv.glm()
4. how to do k-fold cross validation using cv.glm()
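A minimal sketch of steps 1 and 2, not the exact code from the video (see the GitHub link below for that): the dataset (mtcars) and the model formula (mpg ~ wt + hp) are placeholders chosen for illustration.

```r
set.seed(1)

# 1. Split the data into training and test sets (roughly 70/30 here)
n         <- nrow(mtcars)
train_idx <- sample(seq_len(n), size = round(0.7 * n))
train     <- mtcars[train_idx, ]
test      <- mtcars[-train_idx, ]

# 2. Build the model on the training data only
fit <- glm(mpg ~ wt + hp, data = train)

# ... and check it on the test data: mean squared prediction error
pred     <- predict(fit, newdata = test)
test_mse <- mean((test$mpg - pred)^2)
test_mse
```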
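Steps 3 and 4 use cv.glm() from the boot package; again a sketch with the same placeholder data and formula rather than the video's code.

```r
library(boot)
set.seed(1)

# Fit the model on the full data set; cv.glm() refits it on each fold internally
fit <- glm(mpg ~ wt + hp, data = mtcars)

# 3. Leave-one-out cross-validation: the default K equals the number of rows
loocv <- cv.glm(mtcars, fit)
loocv$delta[1]   # raw LOOCV estimate of the prediction error

# 4. k-fold cross-validation, e.g. with K = 10 folds
kfold <- cv.glm(mtcars, fit, K = 10)
kfold$delta[1]   # raw 10-fold estimate of the prediction error
```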

To learn about the idea behind cross-validation, please check out this video:    • What is cross validation? Why we need...   It takes 8 minutes and explains the basics of training and testing data and of training and testing errors, covering:
1. Why do we need cross-validation?
2. Two methods of cross-validation:
a. leave one out
b. k-fold.

Again, the full R code from this video is available here: https://github.com/yz-DataScience/R-f...
