Parametric and additive perturbations for global optimization
James Ting-Ho Lo
1 July 1992
Abstract
A new iterative approach to global minimization is proposed. In each iteration, the approach rocks the `landscape' of the objective function and rolls the ball representing the current state of the variable down to the bottom of a nearby `valley.' In the process of lowering the rock level, some critical rock levels are sufficient to rock the ball out of the attraction region of a strictly local minimum, but insufficient to rock it out of that of a global minimum. If these critical rock levels are maintained long enough, the attraction region of a global minimum is expected to be reached. When the rock stops, the ball rolls right into the global minimum. The approach applies to both continuous and combinatorial optimization. Rocking is performed by perturbing the constants of the objective function, by adding a perturbing function to it, or by both. Rolling is carried out by any local minimization method. Although some initial numerical results are encouraging, a systematic way to schedule the lowering of the rock level, one that guarantees convergence to a global minimum, is yet to be discovered. To demonstrate the application of the rock and roll approach under the assumption that a rock schedule is available, we show in the last two sections how the backpropagation training algorithm can be rocked to produce a globally optimal multilayer perceptron and how the Hopfield net can be rocked to produce a combinatorially minimal solution.
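The rock-and-roll scheme described above can be sketched in a few lines. The sketch below is illustrative only: it uses an additive sinusoidal perturbation as the "rock," plain gradient descent (with a numerical derivative) as the "roll," and a hand-picked decreasing sequence of rock levels; the paper's open problem of a convergence-guaranteeing schedule is not addressed here, and the test function, perturbation frequency, and schedule are all assumptions, not taken from the paper.

```python
import math
import random

def descend(g, x, lr=1e-3, steps=2000, h=1e-6):
    """The 'roll': gradient descent on g using a central-difference derivative."""
    for _ in range(steps):
        d = (g(x + h) - g(x - h)) / (2 * h)
        x -= lr * d
    return x

def rock_and_roll(f, x0, rock_levels, local_min, rng=None):
    """Hypothetical sketch of the rock-and-roll approach: at each rock
    level a, locally minimize the perturbed objective
    f(x) + a*sin(w*x + phase) (the additive 'rock'), then lower a.
    When the rock stops (a = 0), roll on f itself."""
    rng = rng or random.Random(0)
    x = x0
    for a in rock_levels:
        phase = rng.uniform(0.0, 2.0 * math.pi)
        g = lambda x, a=a, phase=phase: f(x) + a * math.sin(5.0 * x + phase)
        x = local_min(g, x)
    return local_min(f, x)  # final roll into the bottom of the nearby valley

# Two-well test function: a strictly local minimum near x = 1.13 and the
# global minimum near x = -1.30.
f = lambda x: x**4 - 3.0 * x**2 + x
levels = [2.0, 1.0, 0.5, 0.25, 0.1, 0.0]
x_star = rock_and_roll(f, 1.1, levels, descend)
```

Whether the ball escapes the local basin depends on the rock levels and how long each is maintained, which is exactly the unsolved scheduling question the abstract raises; the final roll, however, always lands on a stationary point of the unperturbed objective.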
© (1992) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
James Ting-Ho Lo "Parametric and additive perturbations for global optimization", Proc. SPIE 1710, Science of Artificial Neural Networks, (1 July 1992); https://doi.org/10.1117/12.140090
KEYWORDS: Algorithms, Neurons, Artificial neural networks, Statistical analysis, Stochastic processes, Iterative methods