- Newsgroups: comp.compression.research
- Path: sparky!uunet!mcsun!ieunet!tcdcs!maths.tcd.ie!tim
- From: tim@maths.tcd.ie (Timothy Murphy)
- Subject: Re: C source for Fractal compression, huh !
- Message-ID: <1992Nov20.033432.18247@maths.tcd.ie>
- Organization: Dept. of Maths, Trinity College, Dublin, Ireland.
- References: <1992Nov16.184754.3170@maths.tcd.ie> <Bxu712.LvA@metaflow.com> <1992Nov18.024912.24072@maths.tcd.ie> <1992Nov18.083335.18739@adobe.com>
- Date: Fri, 20 Nov 1992 03:34:32 GMT
- Lines: 29
-
- wtyler@adobe.com (William Tyler) writes:
-
- >In article <1992Nov18.024912.24072@maths.tcd.ie> tim@maths.tcd.ie (Timothy Murphy) writes:
-
- >>Chaitin/Kolmogorov Algorithmic Information Theory
- >>does set an absolute limit to the degree to which
- >>any given data can be compressed:
- >>the string s cannot be compressed
- >>beyond its entropy (or informational content) H(s).
- >>H(s) is precisely defined,
-
- >Ah, there's the rub. H(s) is only precisely defined with respect to
- >some particular model of the information source s, expressed in the
- >form of probabilities.
-
- I'm afraid this is completely wrong.
- Algorithmic Information Theory
- has nothing to do with probability.
-
- The informational content H(s) of a string
- is the length of the shortest program
- which will output the string
- when fed into the universal Turing machine U.
-
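[Archiver's note: H(s) itself is uncomputable, but the definition above gives an easy one-sided check: any compressor whose output a fixed decompressor can invert yields an *upper bound* on H(s), since "decompress this blob" is one program that prints s. A minimal Python sketch of that idea, using zlib purely as an illustrative compressor (the function name and thresholds are the note-writer's own, not from the post):]

```python
import os
import zlib

def h_upper_bound(s: bytes) -> int:
    """Upper bound on the algorithmic information content H(s).

    The length of a zlib-compressed copy of s bounds H(s) from above
    (up to an additive constant for the decompressor itself), because
    a program saying "inflate this blob" outputs s.  The true H(s) is
    uncomputable, so a bound is the best any concrete tool can give.
    """
    return len(zlib.compress(s, 9))

# A highly regular string compresses far below its raw length,
# reflecting its low informational content...
regular = b"ab" * 5000            # 10000 bytes of obvious pattern

# ...while typical random bytes barely compress at all: with high
# probability their shortest description is about as long as they are.
random_ish = os.urandom(10000)

print(h_upper_bound(regular))     # small: the pattern is cheap to describe
print(h_upper_bound(random_ish))  # near 10000: no short description found
```

Note the asymmetry: a small compressed size proves H(s) is small, but a large one proves nothing, since no computable method can certify that a string is incompressible.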
- --
- Timothy Murphy
- e-mail: tim@maths.tcd.ie
- tel: +353-1-2842366
- s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland
-