Path: sparky!uunet!charon.amdahl.com!pacbell.com!sgiblab!spool.mu.edu!yale.edu!qt.cs.utexas.edu!cs.utexas.edu!sun-barr!ames!agate!doc.ic.ac.uk!uknet!ieunet!tcdcs!maths.tcd.ie!tim
From: tim@maths.tcd.ie (Timothy Murphy)
Newsgroups: comp.compression.research
Subject: Re: C source for Fractal compression, huh !
Message-ID: <1992Nov18.024912.24072@maths.tcd.ie>
Date: 18 Nov 92 02:49:12 GMT
References: <MICHAEL.92Nov13144140@pullet.lanl.gov> <1992Nov16.184754.3170@maths.tcd.ie> <Bxu712.LvA@metaflow.com>
Organization: Dept. of Maths, Trinity College, Dublin, Ireland.
Lines: 29

rschnapp@metaflow.com (Russ Schnapp) writes:

>The short answer is: No, information theory does not set an absolute
>limit on maximum compression. It does set bounds on compression in
>a particular context, though.

I don't know what this means. As I understand it, Chaitin/Kolmogorov
algorithmic information theory does set an absolute limit on the degree
to which any given data can be compressed: a string s cannot be
compressed beyond its entropy (or information content) H(s). H(s) is
precisely defined, subject only to the choice of a particular universal
Turing machine. (In practice, this proviso makes no effective
difference.)
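The limit rests on a simple counting argument, which can be sketched in
Python (an illustration of the pigeonhole bound, not any real
compressor; the names and the choice of n and k here are mine):

```python
# Counting argument behind Kolmogorov-style incompressibility: there are
# 2**n binary strings of length n, but fewer than 2**(n-k+1) descriptions
# of length at most n-k, so no injective (lossless) compressor can shrink
# more than roughly a 2**-(k-1) fraction of n-bit strings by k bits.

def count_strings(length: int) -> int:
    """Number of distinct binary strings of exactly `length` bits."""
    return 2 ** length

n, k = 20, 8  # string length and desired saving, both arbitrary

total = count_strings(n)
# Descriptions of length 0 .. n-k, i.e. outputs at least k bits shorter.
short_descriptions = sum(count_strings(i) for i in range(n - k + 1))

fraction = short_descriptions / total
print(f"At most {fraction:.4%} of {n}-bit strings can shed {k}+ bits")
```

Whatever the compression scheme, the bound holds, because two different
strings can never share one compressed description and still be
recovered losslessly.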
Although I must admit that I haven't looked into it carefully, I've
read explanations of Barnsley's "fractal compression", and remain
completely unconvinced. Nothing in the comp.compression FAQ, which I
read at your suggestion, changed my mind.

--
Timothy Murphy
e-mail: tim@maths.tcd.ie
tel: +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland