An axiomatic derivation of the coding-theoretic possibilistic entropy
SGARRO, ANDREA
2004-01-01
Abstract
We revisit the possibilistic (strictly non-probabilistic) model for information sources and information coding put forward in Fuzzy Sets and Systems 132-1 (2002); there, the coding-theoretic possibilistic entropy is defined as the asymptotic rate of compression codes that are optimal with respect to a possibilistic (rather than probabilistic) criterion. By proving a uniqueness theorem, in this paper we also provide an axiomatic derivation of this possibilistic entropy, which allows us to support its use as an adequate measure of non-specificity, or rather of possibilistic ignorance, as we shall prefer to say. We compare our possibilistic entropy with two well-known measures of non-specificity: the Hartley measure from set theory and the U-uncertainty from possibility theory. The comparison shows that the latter possesses a coding-theoretic meaning.
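For readers unfamiliar with the two classical non-specificity measures the abstract compares, here is a minimal sketch of both, using their standard textbook definitions (the Hartley measure of a crisp set of n elements is log2 n; the U-uncertainty of a normalized possibility distribution, sorted non-increasingly with r_{n+1} = 0, is the sum of (r_i - r_{i+1}) log2 i). The function names and example distributions are illustrative, not taken from the paper.

```python
import math

def hartley(n: int) -> float:
    """Hartley measure of a crisp set with n elements: log2(n)."""
    return math.log2(n)

def u_uncertainty(poss: list[float]) -> float:
    """U-uncertainty of a normalized possibility distribution.

    With possibilities sorted non-increasingly, r_1 >= ... >= r_n, and
    r_{n+1} = 0 by convention:  U(r) = sum_{i=1}^{n} (r_i - r_{i+1}) * log2(i).
    """
    r = sorted(poss, reverse=True) + [0.0]
    return sum((r[i] - r[i + 1]) * math.log2(i + 1) for i in range(len(poss)))

# On a crisp set (all possibilities equal to 1) the U-uncertainty
# reduces to the Hartley measure:
print(u_uncertainty([1.0, 1.0, 1.0, 1.0]))  # 2.0, equal to hartley(4)
print(u_uncertainty([1.0, 0.5, 0.25]))      # strictly less than hartley(3)
```

This reduction to the Hartley measure on crisp sets is the usual sanity check for any candidate measure of possibilistic ignorance, including the coding-theoretic entropy the paper axiomatizes.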