Use of localized gating in mixture of experts networks
Paper, 25 March 1998
Viswanath Ramamurti, Joydeep Ghosh
Abstract
The mixture-of-experts (MOE) model is a popular architecture for function approximation. In the standard architecture, each expert is gated via a softmax function, so its domain of application is not very localized. This paper summarizes several recent results showing the advantages of using localized gating instead. These include a natural framework for model selection and adaptation by growing and shrinking the number of experts, modeling of non-stationary environments, improved generalization performance, and confidence intervals on network outputs. These results substantially increase the scope and power of MOE networks. Several simulation results are presented to support the theoretical arguments.
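The abstract contrasts the standard softmax gate with a localized gate. A common form of localized gating in the MOE literature uses normalized Gaussian kernels, so each expert's influence is confined to a neighborhood of its kernel center. The following is a minimal sketch of that idea, not the authors' implementation: all function names, parameter shapes, and values are illustrative assumptions.

```python
# Illustrative sketch only: forward pass of a mixture-of-experts model,
# comparing softmax gating with localized (normalized Gaussian) gating.
import numpy as np

def softmax_gate(x, V):
    """Standard MOE gate: softmax over linear scores V @ x."""
    scores = V @ x
    scores -= scores.max()                # numerical stability
    e = np.exp(scores)
    return e / e.sum()

def localized_gate(x, priors, means, var):
    """Localized gate: prior-weighted Gaussian kernels, normalized so the
    gate values sum to one; each expert dominates only near its center."""
    d = x.shape[0]
    diff = means - x                      # (num_experts, d)
    norm = (2.0 * np.pi * var) ** (-d / 2)
    k = priors * norm * np.exp(-0.5 * np.sum(diff**2, axis=1) / var)
    return k / k.sum()

def moe_output(x, gates, expert_weights):
    """MOE output: gate-weighted combination of (linear) expert outputs."""
    expert_out = expert_weights @ x       # (num_experts,)
    return gates @ expert_out

rng = np.random.default_rng(0)
d, m = 3, 4                               # input dim, number of experts
x = rng.normal(size=d)
V = rng.normal(size=(m, d))               # softmax-gate parameters
priors = np.full(m, 1.0 / m)              # mixing priors for localized gate
means = rng.normal(size=(m, d))           # Gaussian kernel centers
expert_weights = rng.normal(size=(m, d))  # one linear expert per row

g_soft = softmax_gate(x, V)
g_loc = localized_gate(x, priors, means, var=0.5)
print("softmax gates  :", np.round(g_soft, 3))
print("localized gates:", np.round(g_loc, 3))
print("output with localized gating:", moe_output(x, g_loc, expert_weights))
```

Because the localized gate values fall off with distance from each kernel center, an expert whose region attracts little data can be pruned, and new experts can be added where the fit is poor, which is the growing/shrinking model-selection mechanism the abstract refers to.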
© (1998) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Viswanath Ramamurti and Joydeep Ghosh "Use of localized gating in mixture of experts networks", Proc. SPIE 3390, Applications and Science of Computational Intelligence, (25 March 1998); https://doi.org/10.1117/12.304812
CITATIONS
Cited by 6 scholarly publications.
KEYWORDS
Data modeling, Network architectures, Performance modeling, Algorithm development, Expectation maximization algorithms, Solar radiation models, Computer simulations