Aggregating Regression Procedures for a Better Performance
A fundamental question regarding combining procedures concerns the potential gain and how much one must pay for it in terms of statistical risk. Juditsky and Nemirovski considered the case where a large number of procedures are to be combined. We give upper and lower bounds for complementary cases. Under an $\ell_1$ constraint on the linear coefficients, it is shown that, for pursuing the best linear combination of $n^{\tau}$ procedures, in terms of the rate of convergence under the squared $L_2$ loss, one can pay a price of order $O\left(\log n / n^{1-\tau}\right)$ when $0 < \tau < 1/2$ and a price of order $O\left((\log n / n)^{1/2}\right)$ when $1/2 \le \tau < \infty$. These rates cannot be essentially improved in a uniform sense. This result suggests that one should be cautious in pursuing the best linear combination: one may end up paying a high price for nothing when linear combination does not in fact help. We show that, with care in aggregation, the final procedure can automatically avoid paying the high price in such a case and then behave as well as the best candidate procedure.
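As a compact restatement of the rates above, consider the following sketch in our own notation, with $N_n = n^{\tau}$ for the number of candidate procedures and $\mathrm{price}(n,\tau)$ for the additional squared $L_2$ risk incurred in pursuing the best $\ell_1$-constrained linear combination; neither symbol is fixed by the text and both are introduced here only for illustration:

% Price of aggregation (added squared L2 risk) when combining
% N_n = n^tau candidate procedures under an l1 constraint on the
% linear coefficients; notation assumed, not from the source.
\[
  \mathrm{price}(n,\tau) =
  \begin{cases}
    O\!\left(\dfrac{\log n}{n^{1-\tau}}\right), & 0 < \tau < \tfrac{1}{2}, \\[1.5ex]
    O\!\left(\left(\dfrac{\log n}{n}\right)^{1/2}\right), & \tfrac{1}{2} \le \tau < \infty.
  \end{cases}
\]

Note that the first regime is the one in which the price decreases faster than $n^{-1/2}$, so that a modest number of candidates ($\tau < 1/2$) is comparatively cheap to aggregate over.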
This preprint was published as Yuhong Yang, "Aggregating Regression Procedures to Improve Performance", Bernoulli (2004): 25–47, doi: 10.3150/bj/1077544602.