You Just Keep on Pushing My Love over the Borderline: A Rejoinder
Permanent link
https://hdl.handle.net/10037/13281
Date
2017-04-06
Type
Journal article
Peer reviewed
Author
Simpson, Daniel; Rue, Håvard; Riebler, Andrea Ingeborg; Martins, Thiago Guerrera; Sørbye, Sigrunn Holbek
Abstract
INTRODUCTION: The point of departure for our paper is that most modern statistical models are built to be flexible enough to model diverse data generating mechanisms. Good statistical practice requires us to limit this flexibility, which is typically controlled by a small number of parameters, to the amount “needed” to model the data at hand. The Bayesian framework provides a natural method for doing this although, as DD points out, this trend for penalising model complexity casts a broad shadow over all of modern statistics and data science.

The PC prior framework argues for setting priors on these flexibility parameters that are specifically built to penalise a certain type of complexity and avoid overfitting. The discussants raised various points about this core idea. First, DD pointed out that while overfitting a model is a bad thing, underfitting is not better: we do not want Occam’s razor to slit our throat. We saw this behaviour when using a half-Normal prior on the distance, whereas the exponential prior does not lead to obvious attenuation of the estimates. This is confirmed experimentally by Klein and Kneib (2016).
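To make the half-Normal versus exponential comparison concrete, the following is a minimal sketch, assuming the Gaussian random-effect example from the original PC prior paper (Simpson et al., 2017), where the distance to the base model (the effect being absent) is d(tau) = 1/sqrt(tau) up to a multiplicative constant. The function names, the rate lam, and the scale sigma are illustrative choices, not code or parameter values from the paper.

```python
import numpy as np

# PC prior construction: put a prior on the distance d(xi) from the
# base model, then transform back to the flexibility parameter xi.
# Here xi is a precision tau, with d(tau) = tau**-0.5 (hypothetical
# but standard Gaussian random-effect example; constant absorbed).

def pc_prior_precision(tau, lam):
    """Exponential(lam) prior on d = tau**-0.5, transformed to tau.

    This yields the type-2 Gumbel density (lam/2) * tau**-1.5
    * exp(-lam / sqrt(tau)) derived in Simpson et al. (2017)."""
    d = tau ** -0.5
    jacobian = 0.5 * tau ** -1.5  # |dd/dtau| for the change of variables
    return lam * np.exp(-lam * d) * jacobian

def half_normal_distance_prior(tau, sigma):
    """Alternative discussed above: half-Normal(sigma) on the same
    distance scale, transformed to tau by the same Jacobian."""
    d = tau ** -0.5
    jacobian = 0.5 * tau ** -1.5
    dens = np.sqrt(2.0 / np.pi) / sigma * np.exp(-d**2 / (2.0 * sigma**2))
    return dens * jacobian

# Compare the two induced densities on a grid of precisions.
tau = np.linspace(0.1, 20.0, 200)
print(pc_prior_precision(tau, lam=1.0)[:3])
print(half_normal_distance_prior(tau, sigma=1.0)[:3])
```

On the distance scale the exponential keeps a constant rate of penalisation, whereas the half-Normal’s Gaussian tail penalises large distances much more aggressively; this is consistent with the attenuation of estimates under the half-Normal noted above.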
Description
Source at https://doi.org/10.1214/17-sts576rej.