We consider the problem of estimating the time-average variance constant of a stationary process. A previous paper described an approach based on multiple integrations of the simulation output path and the efficiency improvement it can yield over the method of batch means (which arises as a special case of the method). In this paper we describe versions of the method that have low bias for moderate simulation run lengths. The method constructs an estimator by applying a quadratic function to the simulation output; the particular quadratic form is chosen to minimize variance subject to constraints on the order of the bias. We describe estimators that are first-order and second-order unbiased.
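As a point of reference for the "quadratic function of the output" structure, the sketch below shows the classical batch-means estimator of the time-average variance constant written both in its usual form and explicitly as a quadratic form \(x^\top Q x\). This is only the special case the abstract mentions, not the optimized quadratic forms of the paper; the AR(1) test process and all parameter choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stationary process: AR(1), X_t = phi * X_{t-1} + eps_t.
phi, n = 0.7, 2**12
eps = rng.standard_normal(n)
x = np.empty(n)
x[0] = eps[0] / np.sqrt(1.0 - phi**2)  # draw X_0 from the stationary distribution
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

def batch_means(x, num_batches):
    """Classical batch-means estimate of the time-average variance constant:
    batch size times the sample variance of the batch means."""
    m = len(x) // num_batches
    means = x[: m * num_batches].reshape(num_batches, m).mean(axis=1)
    return m * means.var(ddof=1)

def batch_means_quadratic_form(x, num_batches):
    """The same estimator written as x^T Q x, making explicit that batch
    means is one particular quadratic function of the simulation output."""
    b = num_batches
    m = len(x) // b
    y = x[: m * b]
    A = np.kron(np.eye(b), np.ones((1, m)) / m)  # maps the path to its b batch means
    C = np.eye(b) - np.ones((b, b)) / b          # centering matrix (idempotent)
    Q = (m / (b - 1)) * A.T @ C @ A              # the quadratic form's matrix
    return y @ Q @ y

v1 = batch_means(x, 32)
v2 = batch_means_quadratic_form(x, 32)
assert np.isclose(v1, v2)  # the two formulations agree
```

Choosing a different matrix \(Q\), subject to constraints on the order of the bias, is what distinguishes the estimators described in the paper from this batch-means special case.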