We revisit the age-old problem of estimating the parameters of a distribution from its observations. Traditionally, scientists and statisticians have attempted to obtain strong estimates by "extracting" the information contained in the observations taken as a set. However, the information contained in the sequence in which the observations appear has generally been ignored, except insofar as dependence information is modeled, as in Markov models and n-gram statistics. In this paper, we present what are, to the best of our knowledge, the first reported results that consider how estimation can be enhanced by utilizing both the information in the observations and the information in their sequence of appearance. The strategy, known as Sequence Based Estimation (SBE), works as follows. We first briefly allude to the results pertaining to computing the Maximum Likelihood Estimates (MLE) of the data when the samples are taken individually. We then derive the corresponding MLE results when the samples are taken two at a time, and extend these to the cases in which they are processed three at a time, four at a time, etc. In each case, we also experimentally demonstrate the convergence of the corresponding estimates. We then suggest various avenues for future research, including ways in which these estimates can be fused to yield a superior overall cumulative estimate of the parameter of the distribution. We believe that our new estimates have great potential for practitioners, especially when the cardinality of the observation set is small.
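To make the idea concrete, the following is a minimal sketch (not the paper's actual derivation) contrasting the standard single-observation MLE for a Bernoulli/binomial parameter with a pair-based estimate computed from counts of consecutive observation pairs. The function names and the choice of overlapping pairs are illustrative assumptions, not taken from the paper.

```python
import random

def mle_single(x):
    # Standard MLE of the Bernoulli parameter p from individual
    # observations: simply the sample mean.
    return sum(x) / len(x)

def mle_pairs(x):
    # Illustrative pair-based estimate: count occurrences of each
    # ordered pair of consecutive (overlapping) observations, then
    # recover p from the pair frequencies. Under independence, ones
    # appear twice in each "11" pair and once in each mixed pair.
    n11 = n10 = n01 = n00 = 0
    for a, b in zip(x, x[1:]):
        if a and b:
            n11 += 1
        elif a and not b:
            n10 += 1
        elif not a and b:
            n01 += 1
        else:
            n00 += 1
    total = n11 + n10 + n01 + n00
    return (2 * n11 + n10 + n01) / (2 * total)

random.seed(0)
p_true = 0.3
sample = [1 if random.random() < p_true else 0 for _ in range(10000)]
print(mle_single(sample), mle_pairs(sample))
```

For i.i.d. data both estimates converge to the true parameter; the paper's contribution lies in deriving and analyzing the pair-wise (and higher-order) estimators formally and demonstrating their convergence experimentally.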

Additional Metadata
Series: Lecture Notes in Computer Science
Oommen, J., Kim, S.-W. (Sang-Woon), & Horn, G. (Geir). (2006). On the theory and applications of sequence based estimation of independent binomial random variables. In Lecture Notes in Computer Science.