Paper
11 March 2002 Interdependencies in data preprocessing, training methods, and neural network topology generation
Stephan Rudolph, Steffen Brueckner
Abstract
Artificial neural networks are adaptive methods that can be trained to approximate a functional relationship implicitly encoded in training data. The large variety of neural network types (e.g. linear versus non-linear) raises fundamental questions about the appropriateness of data pre-processing techniques, training methodologies, the resulting neural network topology, and their possible interdependencies. The a posteriori interpretation of the numerical results suggests guidelines for neural network applications in engineering. Data pre-processing techniques are a powerful means of pre-structuring a function approximation problem before adaptive training. In particular, carefully selected integral transforms may change the nature of the training problem significantly without loss of generality, and they offer an excellent opportunity to incorporate additional knowledge about the process to improve both the training and the interpretation of the results. Numerical examples from engineering domains illustrate the theoretical arguments in a practical setting.
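The role of integral transforms as a pre-processing step can be illustrated with a minimal sketch. The task, data, and model below are hypothetical (not taken from the paper): estimating the dominant frequency of a noisy sinusoid is awkward from raw time-domain samples, but after a discrete Fourier transform the target becomes nearly a linear function of the transformed features, so even a linear least-squares fit (the simplest "linear network") suffices.

```python
import numpy as np

# Hypothetical illustration: predict the dominant frequency of a noisy
# sinusoid from its samples. In the time domain this is a hard non-linear
# approximation problem; after a Fourier transform the frequency is
# encoded almost linearly in the magnitude spectrum.

rng = np.random.default_rng(0)
n_samples, n_points = 200, 64
t = np.arange(n_points) / n_points            # one-second window

freqs = rng.uniform(2, 20, n_samples)         # training targets
signals = np.sin(2 * np.pi * freqs[:, None] * t)
signals += 0.1 * rng.standard_normal(signals.shape)   # measurement noise

# Pre-processing: apply an integral transform (here the DFT) to every
# training example before any adaptive fitting takes place.
spectra = np.abs(np.fft.rfft(signals, axis=1))

# On the transformed inputs a linear model is enough: ordinary least
# squares plays the role of the simplest possible (linear) network.
X = np.hstack([spectra, np.ones((n_samples, 1))])     # bias column
w, *_ = np.linalg.lstsq(X, freqs, rcond=None)
pred = X @ w

mae = np.mean(np.abs(pred - freqs))
print("mean absolute error (in-sample):", mae)
```

The same idea extends to the wavelet and time-frequency transforms listed in the keywords: the transform re-expresses the data so that the remaining approximation task is simpler, and the learned weights become interpretable in terms of the transform's basis functions.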
© (2002) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Stephan Rudolph and Steffen Brueckner "Interdependencies in data preprocessing, training methods, and neural network topology generation", Proc. SPIE 4739, Applications and Science of Computational Intelligence V, (11 March 2002); https://doi.org/10.1117/12.458702
KEYWORDS
Neural networks, Wavelets, Acoustics, Time-frequency analysis, Fourier transforms, Genetic algorithms, Neurons