Adaptive Regression by Mixing

dc.contributor.author Yang, Yuhong
dc.contributor.department Statistics
dc.date 2018-02-16T21:38:23.000
dc.date.accessioned 2020-07-02T06:55:55Z
dc.date.available 2020-07-02T06:55:55Z
dc.date.issued 1999
dc.description.abstract Adaptation over different procedures is of practical importance. Different procedures perform well under different conditions, and in many practical situations it is hard to assess which conditions are (approximately) satisfied so as to identify the best procedure for the data at hand. Automatic adaptation over various scenarios is therefore desirable. A practically feasible method, named adaptive regression by mixing (ARM), is proposed to convexly combine general candidate regression procedures. Under mild conditions, the resulting estimator is shown theoretically to perform optimally in rates of convergence without knowing which of the original procedures works best. Simulations are conducted in several settings, including comparing a parametric model with nonparametric alternatives, comparing a neural network with projection pursuit in multidimensional regression, and combining bandwidths in kernel regression. The results clearly support the theoretical property of ARM. The ARM algorithm assigns weights to the candidate models/procedures through a proper assessment of the performance of the estimators: the data are split into two parts, one for estimation and the other for measuring prediction behavior. Although there are many plausible ways to assign the weights, ARM has a connection with information theory that ensures the desired adaptation capability. Indeed, under mild conditions, we show that the squared L₂ risk of the estimator based on ARM is basically bounded above by the risk of each candidate procedure plus a small penalty term of order 1/n. Minimizing over the procedures gives the automatically optimal rate of convergence for ARM. Model selection often induces unnecessarily large variability in estimation; a proper weighting of the candidate models can be more stable, resulting in a smaller risk. Simulations suggest that ARM works better than model selection using the Akaike or Bayesian information criteria when the error variance is not very small.
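The data-splitting and weighting scheme described in the abstract can be sketched in code. The following is a minimal illustration under Gaussian errors, not the paper's exact algorithm: the candidate interface (fitting routines that return prediction functions), the number of random splits n_splits, and the residual-based variance estimate are assumptions made here for concreteness.

import numpy as np

def arm_weights(candidates, x, y, n_splits=20, seed=None):
    """Minimal sketch of ARM-style weights.

    Each element of `candidates` is assumed to be a fitting routine
    taking (x_train, y_train) and returning a prediction function.
    Gaussian errors are assumed; weights are averaged over random
    half/half data splits.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    half = n // 2
    weights = np.zeros(len(candidates))
    for _ in range(n_splits):
        perm = rng.permutation(n)
        tr, te = perm[:half], perm[half:]
        log_w = np.empty(len(candidates))
        for j, fit in enumerate(candidates):
            f_hat = fit(x[tr], y[tr])            # estimate on the first half
            sigma2 = max((y[tr] - f_hat(x[tr])).var(), 1e-12)  # error-variance estimate
            resid_te = y[te] - f_hat(x[te])      # prediction on the second half
            # log predictive density of the held-out half under N(f_hat, sigma2)
            log_w[j] = (-0.5 * len(te) * np.log(sigma2)
                        - 0.5 * np.sum(resid_te ** 2) / sigma2)
        log_w -= log_w.max()                     # stabilize before exponentiating
        w = np.exp(log_w)
        weights += w / w.sum()
    return weights / n_splits

With weights W_j in hand, the combined estimator is the convex combination sum_j W_j * f_hat_j(x), with each candidate refit on the full data.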
dc.description.comments This preprint was published as Yuhong Yang, "Adaptive Regression by Mixing," Journal of the American Statistical Association (2001): 574-588, doi: 10.1198/016214501753168262 (http://dx.doi.org/10.1198/016214501753168262).
dc.identifier archive/lib.dr.iastate.edu/stat_las_preprints/124/
dc.identifier.articleid 1110
dc.identifier.contextkey 7445808
dc.identifier.s3bucket isulib-bepress-aws-west
dc.identifier.submissionpath stat_las_preprints/124
dc.identifier.uri https://dr.lib.iastate.edu/handle/20.500.12876/90284
dc.language.iso en
dc.source.bitstream archive/lib.dr.iastate.edu/stat_las_preprints/124/1999_YangY_AdaptiveRegressionMixing.pdf
dc.subject.disciplines Statistics and Probability
dc.subject.keywords adaptive estimation
dc.subject.keywords combining procedures
dc.subject.keywords nonparametric regression
dc.title Adaptive Regression by Mixing
dc.type article
dc.type.genre article
dspace.entity.type Publication
relation.isOrgUnitOfPublication 264904d9-9e66-4169-8e11-034e537ddbca
File: 1999_YangY_AdaptiveRegressionMixing.pdf (1008.86 KB, Adobe Portable Document Format)