Can I use cross_validate in sklearn with cv=10 instead of using KFold with n_splits=10? Do they work the same?
1 Answer
I believe that KFold will simply carve your training data into 10 splits of indices.
cross_validate, with the cv=10 parameter, will also carve the data into 10 splits, but it will additionally perform the cross-validation itself. In other words, it will fit and score your model 10 times, so you can report on its performance, which KFold does not do.
Put differently, KFold is one small step inside cross-validation.
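A minimal sketch of the difference, using a toy dataset and LogisticRegression purely for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_validate

X, y = make_classification(n_samples=200, random_state=0)
model = LogisticRegression(max_iter=1000)

# KFold only produces the train/test index splits; it never fits anything.
kf = KFold(n_splits=10)
for train_idx, test_idx in kf.split(X):
    pass  # you would have to fit and score the model yourself here

# cross_validate does the splitting AND the fitting/scoring in one call.
# You can pass cv=10 or an explicit splitter object like kf.
results = cross_validate(model, X, y, cv=kf)
print(results["test_score"])  # one score per fold, 10 in total
```

Note that when you pass a plain integer (cv=10) to cross_validate with a classifier, scikit-learn uses stratified folds under the hood, so the splits are not necessarily identical to a plain KFold.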
answered May 20, 2019 at 13:34
1 Comment
Dr Nisha Arora
And if you have unbalanced data, you should use StratifiedKFold() instead of KFold() to split your data into multiple folds, so each fold preserves the class proportions. cross_val_score() or cross_validate() are then used to perform the cross-validation.
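A short sketch of the stratified variant, on an assumed imbalanced toy dataset (roughly 90/10 class split):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_validate

# Imbalanced toy data: about 90% of samples belong to class 0.
X, y = make_classification(n_samples=300, weights=[0.9, 0.1], random_state=0)

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
results = cross_validate(LogisticRegression(max_iter=1000), X, y, cv=skf)

# Each test fold keeps roughly the same class ratio as the full dataset.
for _, test_idx in skf.split(X, y):
    print(np.bincount(y[test_idx]))
```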