Kiefer-Wolfowitz Algorithm
Bhatnagar S., Prasad H.
Published in Springer London
2013
Pages: 31–39
Abstract

In this chapter, we review the Finite Difference Stochastic Approximation (FDSA) algorithm, also known as the Kiefer-Wolfowitz (K-W) algorithm, and some of its variants for finding a local minimum of an objective function. The K-W scheme is a version of the Robbins-Monro stochastic approximation algorithm and incorporates balanced two-sided estimates of the gradient using two objective function measurements for a scalar parameter. When the parameter is an N-dimensional vector, the algorithm requires 2N function measurements per iteration. A one-sided variant of the algorithm in the latter case requires only N + 1 function measurements. We present the original K-W scheme, first for the case of a scalar parameter, and subsequently for a vector parameter of arbitrary dimension. Variants including the one-sided version are then presented. We only consider here the case when the objective function is a simple expectation over noisy cost samples and not when it has a long-run average form. The latter form of the cost objective would require multi-timescale stochastic approximation, the general case of which was discussed in Chapter 3. Stochastic algorithms for long-run average cost objectives will be considered in later chapters.
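The two-sided and one-sided finite-difference schemes described above can be sketched as follows. This is a minimal illustration, not the chapter's presentation: the quadratic objective, noise level, and the particular step-size and perturbation-width sequences (`a_n = 1/n`, `delta_n = n^(-1/4)`) are assumptions chosen for demonstration, subject only to the usual stochastic approximation conditions (step sizes summable in square but not in sum, perturbation widths vanishing slowly enough).

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_cost(theta):
    # Hypothetical objective: a noisy sample of J(theta) = ||theta - 1||^2,
    # i.e. the expectation over zero-mean additive noise (illustrative choice).
    return np.sum((theta - 1.0) ** 2) + rng.normal(scale=0.1)

def kw_fdsa(theta0, n_iter=5000, one_sided=False):
    """K-W / FDSA for an N-dimensional parameter.

    Two-sided (balanced) differences use 2N cost measurements per iteration;
    the one-sided variant uses N + 1 (one baseline plus N perturbed).
    """
    theta = np.asarray(theta0, dtype=float)
    N = theta.size
    for n in range(1, n_iter + 1):
        a_n = 1.0 / n              # step sizes: sum a_n = inf, sum a_n^2 < inf
        delta_n = 1.0 / n ** 0.25  # perturbation widths: delta_n -> 0
        base = noisy_cost(theta) if one_sided else None
        grad = np.empty(N)
        for i in range(N):
            e = np.zeros(N)
            e[i] = delta_n
            if one_sided:
                # one-sided difference estimate of dJ/dtheta_i
                grad[i] = (noisy_cost(theta + e) - base) / delta_n
            else:
                # balanced two-sided difference estimate of dJ/dtheta_i
                grad[i] = (noisy_cost(theta + e) - noisy_cost(theta - e)) / (2 * delta_n)
        theta -= a_n * grad
    return theta

print(kw_fdsa(np.zeros(3)))                  # approaches the minimizer [1, 1, 1]
print(kw_fdsa(np.zeros(3), one_sided=True))  # same limit, N + 1 measurements per step
```

The one-sided estimator is cheaper per iteration but has a larger bias (O(delta_n) rather than O(delta_n^2) for smooth objectives), which is the trade-off the chapter's variants address.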

About the book
Book: Stochastic Recursive Algorithms for Optimization
Publisher: Springer London
Open Access: No