
Can I use cross_validate in sklearn with cv=10 instead of using KFold with n_splits=10? Do they work the same?

asked May 20, 2019 at 13:10

1 Answer


I believe that KFold will simply carve your data into 10 splits of train/test indices; it never fits a model.

cross_validate, however, will also carve the data into 10 splits (with the cv=10 parameter), but it will additionally perform the cross-validation: it fits and scores your model 10 times, so you can report on the model's performance, which KFold alone does not do.

In other words, KFold is one small step inside cross-validation.
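A minimal sketch of the difference, using a toy classification dataset and LogisticRegression (both chosen just for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_validate

X, y = make_classification(n_samples=100, random_state=0)

# KFold only produces the index splits; it never fits anything.
kf = KFold(n_splits=10)
splits = list(kf.split(X))
print(len(splits))  # 10 (train_idx, test_idx) pairs

# cross_validate splits AND fits/scores the model 10 times.
results = cross_validate(LogisticRegression(), X, y, cv=10)
print(results["test_score"])  # array of 10 fold scores
```

You can also pass a KFold instance directly, e.g. `cross_validate(model, X, y, cv=kf)`, which makes the relationship explicit: the splitter decides *where* to cut, cross_validate does the fitting and scoring.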

answered May 20, 2019 at 13:34

1 Comment

And if you have imbalanced data, you should use StratifiedKFold() instead of KFold() to split your data into multiple folds, so that each fold preserves the class proportions. cross_val_score() or cross_validate() are then used to perform the cross-validation.
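A small sketch of the stratification point, on made-up imbalanced labels (90 of class 0, 10 of class 1):

```python
from collections import Counter

import numpy as np
from sklearn.model_selection import StratifiedKFold

# Illustrative imbalanced data: 90 samples of class 0, 10 of class 1.
X = np.zeros((100, 2))
y = np.array([0] * 90 + [1] * 10)

# StratifiedKFold keeps the 9:1 class ratio in every fold.
skf = StratifiedKFold(n_splits=5)
fold_counts = [Counter(y[test_idx]) for _, test_idx in skf.split(X, y)]
print(fold_counts)  # each fold: Counter({0: 18, 1: 2})
```

An unshuffled plain KFold on the same data would put some test folds entirely in class 0; a StratifiedKFold instance can be passed to cross_validate via its cv parameter just like KFold.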

