- Newsgroups: comp.compression.research
- Path: sparky!uunet!zaphod.mps.ohio-state.edu!cis.ohio-state.edu!news.sei.cmu.edu!fs7.ece.cmu.edu!crabapple.srv.cs.cmu.edu!spot
- From: spot@CS.CMU.EDU (Scott Draves)
- Subject: Re: C source for Fractal compression, huh !
- In-Reply-To: tim@maths.tcd.ie's message of 18 Nov 92 02:49:12 GMT
- Message-ID: <By0sD7.HDv.1@cs.cmu.edu>
- Originator: spot@COBOL.FOX.CS.CMU.EDU
- Sender: news@cs.cmu.edu (Usenet News System)
- Nntp-Posting-Host: cobol.fox.cs.cmu.edu
- Organization: School of Computer Science, Carnegie Mellon University
- References: <MICHAEL.92Nov13144140@pullet.lanl.gov>
- <1992Nov16.184754.3170@maths.tcd.ie> <Bxu712.LvA@metaflow.com>
- <1992Nov18.024912.24072@maths.tcd.ie>
- Date: Fri, 20 Nov 1992 15:00:32 GMT
- Lines: 19
-
-
- Timothy> Chaitin/Kolmogorov Algorithmic Information Theory does set an
- Timothy> absolute limit to the degree to which any given data can be
- Timothy> compressed: the string s cannot be compressed beyond its
- Timothy> entropy (or informational content) H(s).
-
- there's no limit to how well any one particular input, or small group
- of possible inputs, can be compressed. consider a code that compresses
- input A (say this netnews post) to "0", and any other input B to "1B".
- clearly, that one bit is less than H(A), but the code, averaged over
- all inputs, satisfies the theory. the magic of (lossless) compression
- is deciding what inputs you want to compress and "concentrating" the
- code to make that portion of the possible input space take very little
- space, at the expense of everything else taking slightly more.
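- [a minimal sketch of the code described above, in Python; bits are
- represented as bytes for simplicity, and the favored input A is an
- arbitrary placeholder string, not the actual post]

```python
# Toy "concentrated" lossless code: the one favored input A encodes to
# a single symbol "0"; every other input B encodes to "1" followed by B
# verbatim, costing one extra symbol. (Real bit-level packing omitted.)

A = b"this netnews post"  # hypothetical stand-in for the favored input


def encode(data: bytes) -> bytes:
    if data == A:
        return b"0"        # the favored input costs one symbol
    return b"1" + data     # everything else costs one extra symbol


def decode(code: bytes) -> bytes:
    if code == b"0":
        return A
    assert code[:1] == b"1"
    return code[1:]


# round-trips losslessly for every input, favored or not
for msg in (A, b"some other input", b""):
    assert decode(encode(msg)) == msg
```

- the codeword "0" is far below H(A), but every other input grows by one
- symbol, so the average code length over all inputs still respects the
- entropy bound, exactly as the theory requires.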
-
- --
- orgasm
- Scott Draves nitrous
- spot@cs.cmu.edu death
-