• Media type: Electronic article
  • Titel: Stochastic Optimization Methods for Fitting Polyclass and Feed-Forward Neural Network Models
  • Contributors: Kooperberg, Charles; Stone, Charles J.
  • Published by: American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America, 1999
  • Published in: Journal of Computational and Graphical Statistics
  • Language: English
  • ISSN: 1061-8600
  • Description: <p>Kooperberg, Bose, and Stone introduced POLYCLASS, a methodology that uses adaptively selected linear splines and their tensor products to model conditional class probabilities. The authors aimed to develop a methodology that would work well on small and moderate-sized problems and would scale up to large problems. However, the version of POLYCLASS that was developed for large problems was computationally impractical beyond a certain point. In this article we gain further insight into the fitting of large POLYCLASS models by simultaneously considering the fitting of large feed-forward neural network models with a single hidden layer (NEURALNET). In this combined setting, the stochastic gradient method, as used in the online version of the backpropagation method for fitting neural network models, and a stochastic version of the conjugate gradient method for fitting such models emerge as computationally attractive. In particular, these stochastic methods are successfully applied to the fitting of POLYCLASS and NEURALNET models in the context of a phoneme recognition problem involving 45 phonemes, 81 features, 150,000 cases in the training sample, up to 1,000 basis functions and 44,000 parameters for POLYCLASS, and up to 800 hidden nodes and about 100,000 parameters for NEURALNET.</p>
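The online backpropagation update mentioned in the abstract can be sketched as follows: a single-hidden-layer feed-forward network whose parameters are adjusted after each individual training case rather than after a full pass over the data. This is a minimal illustration only; the network size, data set (XOR), loss, and learning rate are invented here, and this is not the POLYCLASS/NEURALNET code from the article.

```python
# Sketch: online (stochastic gradient) backpropagation for a network
# with one hidden layer of sigmoid units and a linear output.
# All sizes and hyperparameters below are illustrative assumptions.
import math
import random


def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))


class Net:
    """Single-hidden-layer network: sigmoid hidden units, linear output."""

    def __init__(self, n_in, n_hid, rng):
        self.W1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)]
                   for _ in range(n_hid)]
        self.b1 = [0.0] * n_hid
        self.w2 = [rng.uniform(-0.5, 0.5) for _ in range(n_hid)]
        self.b2 = 0.0

    def forward(self, x):
        h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(self.W1, self.b1)]
        y = sum(w * hj for w, hj in zip(self.w2, h)) + self.b2
        return h, y

    def sgd_step(self, x, t, lr):
        """One online update on a single case (x, t), squared-error loss."""
        h, y = self.forward(x)
        err = y - t                                  # d(loss)/dy
        for j, hj in enumerate(h):
            dh = err * self.w2[j] * hj * (1.0 - hj)  # backprop through sigmoid
            self.w2[j] -= lr * err * hj
            self.b1[j] -= lr * dh
            for i, xi in enumerate(x):
                self.W1[j][i] -= lr * dh * xi
        self.b2 -= lr * err

    def loss(self, data):
        return sum(0.5 * (self.forward(x)[1] - t) ** 2 for x, t in data)


rng = random.Random(0)
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]
net = Net(n_in=2, n_hid=4, rng=rng)
loss_before = net.loss(data)
for _ in range(5000):
    rng.shuffle(data)                 # "stochastic": visit cases in random order
    for x, t in data:
        net.sgd_step(x, t, lr=0.5)
loss_after = net.loss(data)
print("loss before: %.4f  after: %.4f" % (loss_before, loss_after))
```

The per-case update is what makes the method attractive at the scale the article describes (150,000 training cases): each step touches only one case, so the cost of an update does not grow with the size of the training sample.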