
On the Convergence Rates of IPA and FDC Derivative Estimators

Bibliographic Details
Published in: Operations Research, 1994-07, Vol. 42 (4), pp. 643-656
Main Authors: L'Ecuyer, Pierre; Perron, Gaëtan
Format: Article
Language:English
Description
Summary: We show that under the (sufficient) conditions usually given for infinitesimal perturbation analysis (IPA) to apply for derivative estimation, a finite-difference scheme with common random numbers (FDC) has the same order of convergence, namely O(n^{-1/2}), provided that the size of the finite-difference interval converges to zero fast enough. This holds for both one- and two-sided FDC. It also holds for variants of IPA, such as some versions of smoothed perturbation analysis (SPA), which is based on conditional expectation, and for the estimation of steady-state performance measures by truncated-horizon estimators, under some ergodicity assumptions. Our developments do not involve monotonicity, but are based on continuity and smoothness. We give examples and numerical illustrations which show that the actual difference in mean square error (MSE) between IPA and FDC is typically negligible. We also obtain the order of convergence of that difference, which is faster than the convergence of the MSE to zero.
ISSN: 0030-364X, 1526-5463
DOI: 10.1287/opre.42.4.643
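
As a rough illustration of the kind of comparison described in the summary above, the sketch below estimates a derivative both by IPA and by one-sided FDC with common random numbers on a toy model. The model (X = -theta*ln(U) with performance measure alpha(theta) = E[X^2]), the replication count R, and the step size c_n = n^{-1/2} are assumptions chosen only for illustration; they are not taken from the paper.

```python
# Minimal sketch (not from the paper): compare the empirical mean squared
# error (MSE) of an IPA estimator and a one-sided FDC estimator of
# alpha'(theta), where alpha(theta) = E[X^2], X = -theta*ln(U),
# U ~ Uniform(0,1), so alpha(theta) = 2*theta^2 and alpha'(theta) = 4*theta.
# The toy model, R, and c_n = n^{-1/2} are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(12345)
theta, true_deriv = 1.0, 4.0
R = 200                                    # independent replications per sample size

def x_sample(theta, u):
    """Inverse-transform sample of X ~ Exponential(mean theta)."""
    return -theta * np.log(u)

for n in (10**2, 10**3, 10**4):
    c = n ** -0.5                          # finite-difference step, shrinking with n
    est_ipa = np.empty(R)
    est_fdc = np.empty(R)
    for r in range(R):
        u = rng.random(n)                  # common random numbers for both runs
        x = x_sample(theta, u)
        # IPA: pathwise derivative d(X^2)/dtheta = 2*X*(dX/dtheta) = 2*X^2/theta
        est_ipa[r] = np.mean(2.0 * x * x / theta)
        # FDC: one-sided finite difference, re-simulating at theta + c
        # with the *same* uniforms u
        est_fdc[r] = np.mean((x_sample(theta + c, u) ** 2 - x ** 2) / c)
    mse_ipa = np.mean((est_ipa - true_deriv) ** 2)
    mse_fdc = np.mean((est_fdc - true_deriv) ** 2)
    print(f"n={n:6d}  MSE(IPA)={mse_ipa:.3e}  MSE(FDC)={mse_fdc:.3e}")
```

In this toy model the one-sided FDC bias is proportional to c_n, so letting c_n shrink like n^{-1/2} keeps the squared bias of the same order as the variance of the sample mean; the two MSE columns then decrease at essentially the same rate and their difference stays comparatively small, which is the flavor of the result the abstract describes.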