Abstract:
This paper compares the power divergence statistics of orders alpha > 1 with the information divergence statistic in the problem of testing the uniformity of a distribution. In this problem the information divergence statistic is equivalent to the entropy statistic. Extending some previously established results about information diagrams, it is proved that in this problem the information divergence statistic is more efficient in the Bahadur sense than any power divergence statistic of order alpha > 1. This means that the entropy provides, in a certain sense, the most efficient way of characterizing the uniformity of a distribution.
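The statistics compared in the abstract can be sketched as follows. This is a minimal illustration, not code from the paper: it assumes the standard Cressie-Read form of the power divergence statistic against a uniform null, and the function names are hypothetical. The alpha -> 1 limit gives the information divergence (likelihood-ratio) statistic, which under a uniform null is a monotone function of the empirical entropy, hence the equivalence mentioned in the abstract.

```python
import math

def power_divergence(counts, alpha):
    # Cressie-Read power divergence statistic of order alpha (alpha != 0, 1)
    # against the uniform distribution on k cells. Hypothetical helper,
    # following the standard Cressie-Read definition, not the paper's code.
    n = sum(counts)
    k = len(counts)
    e = n / k  # expected count per cell under uniformity
    s = sum(c * ((c / e) ** (alpha - 1) - 1) for c in counts)
    return 2.0 * s / (alpha * (alpha - 1))

def information_divergence(counts):
    # Limit alpha -> 1: the likelihood-ratio (G^2) statistic
    # 2 * sum_i n_i * log(n_i / e). Under a uniform null this equals
    # 2n * (log k - H(p_hat)), a monotone function of the empirical entropy.
    n = sum(counts)
    k = len(counts)
    e = n / k
    return 2.0 * sum(c * math.log(c / e) for c in counts if c > 0)

# Both statistics vanish on exactly uniform counts and grow with
# departure from uniformity; alpha = 2 recovers Pearson's chi-square.
uniform = [25, 25, 25, 25]
skewed = [40, 30, 20, 10]
```

For example, `power_divergence(skewed, 2.0)` equals Pearson's chi-square statistic sum_i (n_i - e)^2 / e for the same counts, which is 20.0 here, while both statistics are 0 for the uniform counts.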
Published in: 2007 IEEE International Symposium on Information Theory
Date of Conference: 24-29 June 2007
Date Added to IEEE Xplore: 09 July 2008