On certain criteria for optimum estimation
For a continuous bivariate distribution with the two variates possessing the same mean and finite (but unequal) variances, it has been shown that the variate having the smaller variance will be the closer estimator of the mean, except when the ratio of the variances lies in a finite interval determined by the distribution. From this it was possible to obtain a necessary and sufficient condition for the equivalence of closeness and minimum variance, namely:

If X and Y are two random variates having variances σ₁² and σ₂² respectively, and if (i) E(X) = E(Y) = θ, (ii) f(x, y) is the joint probability density function of X and Y, and (iii) f(0, 0) ≠ 0, and, further, if S = X/σ₁ and T = Y/σ₂, then a necessary and sufficient condition for closeness and minimum variance to be equivalent for X and Y is P(|S| < |T|) = 1/2.

The above results were then modified slightly for the case where the variates admit only a discrete bivariate distribution.

Asymptotic closeness was defined and, under certain general conditions, it was shown that asymptotically smaller variance and asymptotic closeness are equivalent. As an extension of the theorem concerning the equivalence of asymptotic closeness and asymptotically smaller variance, it was shown that the asymptotically closest estimator (if it be normally distributed in the limit) is an asymptotically efficient estimator. With certain modifications, the results were extended to the case of variates admitting only a discrete distribution.

The results were further extended to the case of biased estimators, using the concept of minimum mean square deviation rather than minimum variance.

A definition of closeness was formulated for the multi-parameter situation, and a solution corresponding to that provided in the one-parameter case was indicated.
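The criterion above can be illustrated numerically. The sketch below is not from the abstract: it assumes a bivariate normal pair with common mean θ = 0, variances σ₁² < σ₂², and an arbitrarily chosen correlation ρ, and it reads "closer estimator" as the probability P(|X − θ| < |Y − θ|) exceeding 1/2. Under these assumptions the smaller-variance variate comes out closer, and the standardized variates S = X/σ₁, T = Y/σ₂ satisfy P(|S| < |T|) = 1/2 by symmetry, matching the stated equivalence condition.

```python
import numpy as np

# Illustrative assumptions (not from the abstract): a bivariate normal
# pair with common mean theta = 0 and unequal variances.
rng = np.random.default_rng(0)
theta = 0.0
sigma1, sigma2 = 1.0, 2.0        # sigma1 < sigma2
rho = 0.3                        # correlation, chosen arbitrarily

cov = [[sigma1**2, rho * sigma1 * sigma2],
       [rho * sigma1 * sigma2, sigma2**2]]
n = 200_000
xy = rng.multivariate_normal([theta, theta], cov, size=n)
x, y = xy[:, 0], xy[:, 1]

# Closeness: probability that the smaller-variance variate lies
# nearer the common mean.  Here it exceeds 1/2, so minimum variance
# and closeness agree for this distribution.
p_closer = np.mean(np.abs(x - theta) < np.abs(y - theta))
print(p_closer)

# The equivalence condition: with S = X/sigma1 and T = Y/sigma2,
# P(|S| < |T|) should be 1/2.  For this symmetric normal case the
# standardized variates are exchangeable, so the estimate is near 1/2.
s, t = x / sigma1, y / sigma2
print(np.mean(np.abs(s) < np.abs(t)))
```

Varying ρ or the variance ratio in this sketch shows the exceptional behaviour mentioned in the first sentence: for some distributions the smaller-variance variate need not be the closer estimator.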