We studied SPSA-based gradient estimation techniques in the previous chapter. Along similar lines, in this chapter we present smoothed functional (SF)-based estimators of the gradient. Like SPSA, SF simultaneously perturbs the entire parameter vector; unlike SPSA, however, the perturbations are generated from a smoothing function that possesses certain properties. An alternative view of the SF approach is that the gradient is convolved with the smoothing function, which in turn could help in finding the global minimum of the smoothed objective. We discuss SF-based algorithms in which the smoothing is performed using Gaussian and Cauchy density functions. The regular SF algorithms require only one measurement of the objective function; we also provide two-measurement variants of all the algorithms presented. © 2013, Springer-Verlag London.
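As a rough illustration of the idea summarized above (not the book's own notation or code), the one-measurement Gaussian SF estimator and its two-measurement variant can be sketched as follows. The estimate averages `(eta/beta) * J(theta + beta*eta)` over Gaussian perturbations `eta`; the function names, the quadratic test objective, and all parameter values below are illustrative assumptions.

```python
import numpy as np

def sf_grad_one(J, theta, beta=0.1, n_samples=20000, rng=None):
    """One-measurement Gaussian SF gradient estimate (illustrative sketch):
    averages (eta/beta) * J(theta + beta*eta) over eta ~ N(0, I)."""
    rng = np.random.default_rng(rng)
    grad = np.zeros_like(theta, dtype=float)
    for _ in range(n_samples):
        eta = rng.standard_normal(theta.size)  # Gaussian perturbation of the whole vector
        grad += (eta / beta) * J(theta + beta * eta)
    return grad / n_samples

def sf_grad_two(J, theta, beta=0.1, n_samples=20000, rng=None):
    """Two-measurement variant: same mean as the one-measurement form,
    but the difference of the two objective values reduces the variance."""
    rng = np.random.default_rng(rng)
    grad = np.zeros_like(theta, dtype=float)
    for _ in range(n_samples):
        eta = rng.standard_normal(theta.size)
        grad += (eta / (2 * beta)) * (J(theta + beta * eta) - J(theta - beta * eta))
    return grad / n_samples
```

On a smooth test function such as `J(theta) = theta @ theta` (true gradient `2*theta`), both estimators converge to the gradient of the smoothed objective as the sample count grows, with the two-measurement form typically needing far fewer samples for the same accuracy.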