Gradient Schemes with Simultaneous Perturbation Stochastic Approximation
Bhatnagar S., Prasad H.
Published in Springer London
2013
Pages: 41 - 76
Abstract

In this chapter, we discuss an interesting class of gradient search algorithms which estimate the gradient by perturbing all parameter components simultaneously. These algorithms go by the name Simultaneous Perturbation Stochastic Approximation (SPSA) and require only one or two evaluations of the cost objective regardless of the parameter dimension. The original SPSA algorithm due to Spall [26] required two function measurements and used random perturbations whose properties are easily satisfied by independent, symmetric, zero-mean, ±1 valued, Bernoulli random variables. A one-measurement version of this algorithm was subsequently proposed [27] but did not perform as well as its two-measurement counterpart. In [8], certain deterministic constructions for the perturbation sequences were proposed. It was observed in particular that a construction based on Hadamard matrices considerably improves the performance of one-measurement SPSA over its random perturbation counterpart. We discuss gradient SPSA algorithms in this chapter, for both the expected cost and the long-run average cost objectives.
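The two-measurement estimator described above is straightforward to illustrate. The following Python sketch uses Bernoulli ±1 perturbations and illustrative step-size and perturbation-size sequences (the specific sequences and conditions analyzed in the chapter are not reproduced here); function names and constants are assumptions for the example.

```python
import numpy as np

def spsa_two_measurement(f, theta, a=0.1, c=0.1, n_iter=200, seed=0):
    """Minimize f via two-measurement SPSA with Bernoulli +/-1 perturbations.

    At each iteration the gradient is estimated from only two evaluations of f,
    regardless of the dimension of theta:
        g_i(theta) ~ (f(theta + c_n*Delta) - f(theta - c_n*Delta)) / (2*c_n*Delta_i),
    where Delta has independent, symmetric, zero-mean, +/-1 valued components.
    Step-size and perturbation sequences below are illustrative choices only.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta, dtype=float)
    for n in range(1, n_iter + 1):
        a_n = a / n             # diminishing step-size sequence (illustrative)
        c_n = c / n ** 0.25     # diminishing perturbation sequence (illustrative)
        delta = rng.choice([-1.0, 1.0], size=theta.shape)   # Bernoulli +/-1 perturbation
        y_plus = f(theta + c_n * delta)    # first measurement
        y_minus = f(theta - c_n * delta)   # second measurement
        grad_est = (y_plus - y_minus) / (2.0 * c_n * delta)  # valid since 1/Delta_i = Delta_i
        theta = theta - a_n * grad_est     # gradient-descent update
    return theta

if __name__ == "__main__":
    # Example: minimize a simple quadratic in 10 dimensions.
    f = lambda x: float(np.sum(x ** 2))
    print(spsa_two_measurement(f, np.ones(10)))
```

Note that only two evaluations of f are made per iteration even though theta has 10 components; a finite-difference scheme would need 20.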

About the journal
Journal: Stochastic Recursive Algorithms for Optimization
Publisher: Springer London
Open Access: No