Finite size effects

P Sollich. Finite-size effects in learning and generalization in linear perceptrons. Journal of Physics A, 27:7771-7784, 1994.

P Sollich. Learning in large linear perceptrons and why the thermodynamic limit is relevant to the real world. In G Tesauro, D S Touretzky, and T K Leen, editors, Advances in Neural Information Processing Systems 7, pages 207-214, Cambridge, MA, 1995. MIT Press.

D Barber, D Saad, and P Sollich. Test error fluctuations in finite linear perceptrons. Neural Computation, 7:809-821, 1995.

D Barber, D Saad, and P Sollich. Finite-size effects and optimal test set size in linear perceptrons. Journal of Physics A, 28:1325-1334, 1995.

D Barber, D Saad, and P Sollich. Finite size effects in online learning of multilayer neural networks. Europhysics Letters, 34:151-156, 1996.

D Barber, P Sollich, and D Saad. Finite size effects in on-line learning in multilayer neural networks. In S W Ellacott, J C Mason, and I J Anderson, editors, Mathematics of Neural Networks: Models, Algorithms and Applications, pages 84-88, Boston, MA, 1997. Kluwer Academic.


