
Leave-one-out cross validation is very valuable #107

@TimothyMasters

Description


Near the end of Section 3.4.1 the statement is made that leave-one-out cross validation is deprecated. Because this version is the most compute-intensive of all (training is done most often, and training-set sizes are largest, an expensive combination), and many modern model-fitting applications are themselves compute-intensive, it may occasionally be the case that leave-one-out is impractical due to limitations on computing time.

However, when one is not faced with such limits, leave-one-out is usually the best possible choice for cross validation. This is because the stability of models is nearly always dependent on training-set size. As an extreme example, even simple linear regression is undefined when there are fewer cases than coefficients. On a more practical note, neural networks can be wildly unstable with small training sets. Thus, in order to minimize the variance due to model variation, not to mention prevent completely illegal training environments, it is nearly always in our best interest to make each fold's parameter-learning set as large as possible, which is obtained with leave-one-out cross validation.
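For concreteness, here is a minimal sketch of leave-one-out cross validation (not code from the book): it assumes scikit-learn and NumPy are available and uses ordinary linear regression on synthetic data purely as a stand-in for whatever model is being validated. Each fold trains on n-1 cases, the largest training set any cross-validation scheme can offer, which is the point being argued above.

```python
# Minimal leave-one-out cross validation sketch (assumes scikit-learn, NumPy).
# The model and data here are illustrative placeholders, not the book's example.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))                                   # 30 cases, 3 predictors
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=30)

squared_errors = []
for train_idx, test_idx in LeaveOneOut().split(X):
    # Train on the n-1 remaining cases, score the single held-out case.
    model = LinearRegression().fit(X[train_idx], y[train_idx])
    pred = model.predict(X[test_idx])
    squared_errors.append((pred[0] - y[test_idx][0]) ** 2)

print("Leave-one-out mean squared error:", np.mean(squared_errors))
```

The trade-off is visible in the loop: the model is refit n times, once per case, which is why leave-one-out is the most expensive scheme even though each fold's training set is as large as possible.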
