We study minimax rates of convergence for nonparametric regression under a random design with dependent errors. It is shown that when the errors are independent of the explanatory variables, long-range dependence among the errors does not necessarily hurt regression estimation, which at first glance contradicts earlier results by Hall and Hart, Wang, and Johnstone and Silverman under a fixed design. In fact we show that, in general, the minimax rate of convergence under the squared $L_2$ loss is simply the worse of two quantities: one determined by the massiveness of the function class alone and the other by the severity of the dependence among the errors alone. This clean separation of the effects of the function class and of the dependence among the errors in determining the minimax rate of convergence is somewhat surprising. Examples of function classes under different covariance structures, including both short- and long-range dependence, are given.
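To make "the worse of two quantities" concrete, the following is an illustrative sketch in which the specific class and exponents are assumed for illustration rather than quoted from the abstract:

```latex
% Illustrative sketch (assumed setting, not quoted from the abstract):
% suppose $f$ lies in a smoothness class with i.i.d.-error minimax rate
% $n^{-2\alpha/(2\alpha+1)}$, and the errors satisfy
% $|\mathrm{Cov}(e_i, e_j)| \asymp |i-j|^{-\gamma}$ for some
% $0 < \gamma < 1$ (long-range dependence). The separation described
% above would then read
\[
  \inf_{\hat f} \sup_{f} \mathbb{E}\,\bigl\|\hat f - f\bigr\|_2^2
  \;\asymp\;
  \max\!\Bigl( n^{-2\alpha/(2\alpha+1)},\; n^{-\gamma} \Bigr),
\]
% with the first term determined by the massiveness of the class alone
% and the second by the severity of the dependence alone: it is the rate
% at which the variance of averages of the errors decays.
```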
"Nonparametric regression with dependent errors." Bernoulli 7 (4) 633 - 655, August 2001.