We analyze the asymptotic behavior of two-stage procedures for multiple comparisons with the best (MCB) for comparing the steady-state means of alternative systems using simulation. The two procedures we consider differ in how they estimate the variance parameters of the alternatives in the first stage. One procedure uses a consistent estimator, and the other employs an estimator based on one of Schruben's standardized time series (STS) methods. While both procedures lead to mean total run lengths of the same asymptotic order of magnitude, the limiting variability of the run lengths is strictly smaller for the procedure based on a consistent variance estimator. We also analyze how the first-stage run length should be chosen.
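To fix ideas, the generic structure of a two-stage procedure can be sketched as follows. This is only an illustrative sketch of a Rinott-type rule, not the paper's exact procedure: the constant `h`, the indifference-zone parameter `delta`, and the use of the sample variance as the (consistent) variance estimator are all assumptions made for the example.

```python
import math
import statistics

def two_stage_run_length(first_stage_obs, h, delta):
    """Illustrative two-stage rule (hypothetical, not the paper's procedure):
    estimate the variance parameter from a first-stage sample, then set the
    total run length large enough to meet the precision target delta."""
    n0 = len(first_stage_obs)
    s2 = statistics.variance(first_stage_obs)  # consistent variance estimate
    # Total run length: at least the first-stage length, growing with s2/delta^2.
    return max(n0, math.ceil((h * h * s2) / (delta * delta)))

# Example: hypothetical first-stage simulation output for one alternative.
obs = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 10.0, 9.7]
total_n = two_stage_run_length(obs, h=2.0, delta=0.2)
```

The key point studied in the paper is how the choice of first-stage variance estimator (consistent vs. STS-based) affects the distribution of the resulting total run length, not just its mean.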