Daniel Herrmann (University of California, Irvine)
There are justifications for taking Occam's Razor as an epistemic principle in the computational learning theory and machine learning literature. I clarify and argue against one widely used justification of the epistemic value of Occam's Razor---that from probably approximately correct (PAC) learning. This result, first proved by Blumer, Ehrenfeucht, Haussler, and Warmuth in their 1987 paper (Blumer et al., 1987), remains widely reproduced and cited.
I present the theorem given to justify Occam's Razor in the PAC learning literature. I then motivate my work by stating and proving a similar theorem that, by the same reasoning, would justify Anti-Occam's Razor: the principle that we should favour complex hypotheses over simple ones. I discuss what both theorems actually say, why they do not conflict, and suggest an interpretation different from the one standard in the literature. Rather than providing a justification for Occam's Razor, what we find is that we are able to learn when we can restrict our hypothesis space to one that grows only polynomially with the length of the input string and when we are guaranteed to output an approximately correct hypothesis. This requires that each hypothesis space in the family exhibit a certain similarity property, and it is this requirement, an unstated assumption, that does most of the work. Finally, I also suggest some ways to move forward, highlighting why I think philosophers should care about PAC learning and computational complexity in general.
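For orientation, here is a minimal sketch of the cardinality argument that underlies the Blumer et al. result, assuming the standard PAC setup with a finite hypothesis space $H$, accuracy parameter $\epsilon$, confidence parameter $\delta$, and a learner that outputs a hypothesis consistent with the sample; it is an illustration of the style of reasoning, not the exact theorem discussed above. The probability that some hypothesis with true error greater than $\epsilon$ remains consistent with $m$ i.i.d. samples is at most
$$|H|\,(1-\epsilon)^m \;\le\; |H|\,e^{-\epsilon m},$$
so taking
$$m \;\ge\; \frac{1}{\epsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right)$$
samples guarantees an approximately correct output with probability at least $1-\delta$. Since a hypothesis space whose members are describable in $s$ bits satisfies $|H| \le 2^s$, restricting attention to short hypotheses keeps the required sample size small; this is the sense in which the theorem rewards small (and, on the standard reading, "simple") hypothesis spaces.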