Robust learning algorithm for the mixture of experts

Hector Allende, Romina Torres, Rodrigo Salas, Claudio Moraga

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

2 Citations (Scopus)

Abstract

The Mixture of Experts (ME) model is a type of modular artificial neural network (MANN) whose architecture is composed of different kinds of networks that compete to learn different aspects of the problem. This model is used when the search space is stratified. The learning algorithm of the ME model consists of estimating the network parameters to achieve a desired performance. To estimate the parameters, some distributional assumptions are made, so the learning algorithm, and consequently the parameters obtained, depend on the distribution. But when the data are contaminated with outliers, these assumptions are no longer valid, the model is affected, and it becomes very sensitive to the data, as is shown in this work. We propose a robust learning estimator by means of a generalization of the maximum likelihood estimator, the M-estimator. Finally, a simulation study is presented in which the robust estimator shows better performance than the maximum likelihood estimator (MLE).
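
The following sketch illustrates the idea described in the abstract: a mixture of experts whose parameters are fit by gradient ascent, with the Gaussian log-likelihood term replaced by a robust M-estimator rho so that outliers have bounded influence. The linear experts, the softmax gating network, the choice of Huber's rho, and the plain gradient updates are illustrative assumptions, not the authors' exact algorithm.

    # Minimal sketch of a Mixture of Experts trained with a robust
    # M-estimator loss. Illustrative assumptions (not from the paper):
    # linear experts, softmax gating, Huber's rho, gradient ascent.
    import numpy as np

    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    def huber_rho(r, delta=1.0):
        # Huber's rho: quadratic near zero, linear in the tails,
        # so gross residuals (outliers) contribute only linearly.
        a = np.abs(r)
        return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

    def huber_psi(r, delta=1.0):
        # psi = d rho / d r: the bounded influence function.
        return np.clip(r, -delta, delta)

    def train_robust_me(X, y, n_experts=3, lr=1e-2, epochs=500,
                        delta=1.0, seed=0):
        rng = np.random.default_rng(seed)
        n, d = X.shape
        W = rng.normal(scale=0.1, size=(n_experts, d))  # expert weights
        V = rng.normal(scale=0.1, size=(n_experts, d))  # gating weights
        for _ in range(epochs):
            g = softmax(X @ V.T)        # (n, k) gating probabilities
            preds = X @ W.T             # (n, k) expert outputs
            r = y[:, None] - preds      # residual per expert
            # Posterior responsibility of each expert, with the Gaussian
            # log-likelihood replaced by the robust rho (an assumption).
            h = g * np.exp(-huber_rho(r, delta))
            h /= h.sum(axis=1, keepdims=True) + 1e-12
            # Gradient steps: psi replaces the raw residual, so
            # outliers cannot dominate the expert updates.
            W += lr * (h * huber_psi(r, delta)).T @ X / n
            V += lr * (h - g).T @ X / n
        return W, V

    # Usage on toy stratified data with injected outliers.
    rng = np.random.default_rng(1)
    X = np.hstack([rng.uniform(-1, 1, (200, 1)), np.ones((200, 1))])
    y = np.where(X[:, 0] > 0, 2 * X[:, 0], -X[:, 0])
    y += 0.1 * rng.normal(size=200)
    y[:10] += 8.0  # gross outliers
    W, V = train_robust_me(X, y)

With the ordinary MLE (Gaussian) loss, the squared residuals of the ten outliers would dominate the expert updates; clipping the residual through psi is what gives the M-estimator its robustness.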

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Editors: Francisco Jose Perales, Aurelio J. C. Campilho, Nicolas Perez Perez
Publisher: Springer Verlag
Pages: 19-27
Number of pages: 9
ISBN (Print): 3540402179, 9783540402176
DOI
State: Published - 2003
Externally published: Yes

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 2652
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349
