Model selection: Beyond the Bayesian/frequentist divide

Guyon, I., Saffari, A., Dror, G. and Cawley, G. (2010) Model selection: Beyond the Bayesian/frequentist divide. Journal of Machine Learning Research, 11, pp. 61-87. ISSN 1533-7928



The principle of parsimony, also known as "Ockham's razor", has inspired many theories of model selection. Yet such theories, although they all argue in favor of parsimony, rest on very different premises and have developed distinct methodologies for deriving algorithms. We have organized challenges and edited a special issue of JMLR and several conference proceedings around the theme of model selection. In this editorial, we revisit the problem of avoiding overfitting in light of the latest results. We note the remarkable convergence, in some approaches, of theories as different as Bayesian theory, Minimum Description Length, the bias/variance tradeoff, Structural Risk Minimization, and regularization. We also present new and interesting examples of the complementarity of theories leading to hybrid algorithms that are neither frequentist nor Bayesian, or perhaps both frequentist and Bayesian!
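To make the overfitting problem the editorial revisits concrete, here is a minimal sketch (not taken from the paper) of one simple frequentist model-selection procedure: least-squares polynomial fits of increasing degree, with the degree chosen to minimize error on a held-out validation set. The target function, noise level, and degree range are illustrative assumptions.

```python
# Model selection by hold-out validation: choosing a polynomial degree.
# Illustrative sketch only; all data and parameters are made up.
import random

random.seed(0)

def polyfit(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations,
    solved with Gaussian elimination (pure standard library)."""
    n = degree + 1
    # A = V^T V and b = V^T y for the Vandermonde matrix V of the inputs.
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Forward elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    w = [0.0] * n
    for i in reversed(range(n)):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, n))) / A[i][i]
    return w

def predict(w, x):
    return sum(c * x ** i for i, c in enumerate(w))

def mse(w, xs, ys):
    return sum((predict(w, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Noisy samples of a quadratic target on [-1, 1].
truth = lambda x: 1.0 - 2.0 * x + 0.5 * x * x
xs = [random.uniform(-1, 1) for _ in range(60)]
ys = [truth(x) + random.gauss(0, 0.2) for x in xs]
train_x, train_y = xs[:40], ys[:40]
val_x, val_y = xs[40:], ys[40:]

# Fit each candidate complexity on the training split,
# score it on the validation split.
val_err = {}
for degree in range(0, 7):
    w = polyfit(train_x, train_y, degree)
    val_err[degree] = mse(w, val_x, val_y)

best_degree = min(val_err, key=val_err.get)
print("validation MSE by degree:", {d: round(e, 4) for d, e in val_err.items()})
print("selected degree:", best_degree)
```

Training error alone would keep decreasing with the degree; the held-out estimate is what penalizes the over-complex fits, playing the role that priors, description length, or regularization terms play in the theories the editorial compares.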

Item Type: Article
Faculty \ School: Faculty of Science > School of Computing Sciences

University of East Anglia > Faculty of Science > Research Groups > Computational Biology (subgroups are shown below) > Machine learning in computational biology
Depositing User: EPrints Services
Date Deposited: 01 Oct 2010 13:42
Last Modified: 21 Apr 2020 17:04
