IEEE Transactions on Information Theory 50(1), 78-88.
ISSN: 0018-9448. DOI: Not available at this time.
Abstract: If a sequence of independent unbiased random bits is fed into a finite automaton, it is straightforward to calculate the expected number of acceptances among the first n prefixes of the sequence. This paper deals with the situation in which the random bits are neither independent nor unbiased, but are nearly so. We show that, under suitable assumptions concerning the automaton, if the difference between the entropy of the first n bits and n converges to a constant exponentially fast, then the change in the expected number of acceptances also converges to a constant exponentially fast. We illustrate this result with a variety of examples in which numbers following the reciprocal distribution, which governs the significands of floating-point numbers, are recoded in the execution of various multiplication algorithms.
Bibtex:
Not available at this time.
Reference Type: Journal Article
Subject Area(s): Probability Theory
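The reciprocal distribution mentioned in the abstract assigns a binary significand X on [1, 2) the density 1/(x ln 2), so that P(X < t) = log2(t). A minimal sketch (not taken from the paper; the sampling helper is an illustrative assumption) draws significands by inverse-CDF sampling and checks this property empirically:

```python
import math
import random

def reciprocal_significand(rng):
    # Inverse-CDF sampling: if U ~ Uniform[0, 1), then 2**U has the
    # reciprocal density 1/(x ln 2) on [1, 2), i.e. P(X < t) = log2(t).
    return 2.0 ** rng.random()

rng = random.Random(0)
samples = [reciprocal_significand(rng) for _ in range(100_000)]

# Empirical check against the closed-form CDF at t = 1.5.
frac_below = sum(s < 1.5 for s in samples) / len(samples)
print(f"empirical P(X < 1.5) = {frac_below:.3f}, "
      f"log2(1.5) = {math.log2(1.5):.3f}")
```

This is the same "logarithmic" law behind Benford-type leading-digit behavior of floating-point significands.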