MML Inference of Single-Layer Neural Networks

Enes Makalic, Lloyd Allison & David L. Dowe,
School of Computer Science and Software Engineering,
Monash University, Clayton, Victoria 3800, Australia.

(The Third IASTED International Conference on Artificial Intelligence and Applications, AIA 2003, September 8-10, 2003, Benalmadena, Spain.)


Abstract: The architecture selection problem is of great importance when designing neural networks. A network that is too simple does not learn the problem sufficiently well. Conversely, a larger than necessary network is prone to overfitting and generalises poorly. This paper presents a novel architecture selection criterion for single hidden layer feedforward networks. The optimal network size is determined using a version of the Minimum Message Length (MML) inference method. Performance is demonstrated on several problems and compared with a Minimum Description Length (MDL) based selection criterion.
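The two-part message idea behind such a selection criterion can be sketched as follows. This is a schematic illustration only: the fixed per-parameter coding cost and the Gaussian residual model below are assumptions made for the sketch, not the MML87 coding scheme used in the paper.

```python
import numpy as np

def n_params(d_in, h, d_out):
    """Parameter count of a single-hidden-layer network with bias terms."""
    return h * (d_in + 1) + d_out * (h + 1)

def two_part_score(k, sse, n, bits_per_param=10.0):
    """Schematic two-part message length for a candidate network.

    First part: cost of stating the model -- assumed here to be a flat
    `bits_per_param` per weight, a placeholder for a proper MML
    parameter coding.
    Second part: cost of the data given the model, taken as the Gaussian
    negative log-likelihood of the residuals up to additive constants.
    """
    model_cost = k * bits_per_param
    data_cost = 0.5 * n * np.log(sse / n)
    return model_cost + data_cost

# Choose the hidden-layer size with the smallest total message length.
# Illustrative numbers: with n = 200 cases, a 5-unit net fitting slightly
# better (SSE 8.5 vs 9.0) still loses once its extra weights are paid for.
candidates = {h: two_part_score(n_params(1, h, 1), sse, n=200)
              for h, sse in [(2, 9.0), (5, 8.5)]}
best_h = min(candidates, key=candidates.get)  # best_h == 2 here
```

The trade-off is the point: the second part always shrinks as hidden units are added, so the first part is what penalises a larger-than-necessary network.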




© L. Allison (or as otherwise indicated), Faculty of Information Technology (Clayton), Monash University, Australia 3800.