- Xref: sparky sci.systems:151 alt.info-theory:70
- Newsgroups: sci.systems,alt.info-theory
- Path: sparky!uunet!caen!hellgate.utah.edu!asylum.cs.utah.edu!tolman
- From: tolman%asylum.cs.utah.edu@cs.utah.edu (Kenneth Tolman)
- Subject: Re: Can a part be greater than the whole?
- Date: 23 Nov 92 16:58:41 MST
- Message-ID: <1992Nov23.165842.6266@hellgate.utah.edu>
- Organization: University of Utah, CompSci Dept
- References: <ELIAS.92Nov23094545@fitz.TC.Cornell.EDU> <1992Nov23.210710.1395@odin.diku.dk>
- Distribution: alt
- Lines: 39
-
- >>> Can a part be greater than the whole?
- >>>
- >>> Specifically, can a part have more information than the whole?
- >>>
- >>> but consider this, the whole is a sequence like this:
- >>> 1111111111111111111111111111111111111111111111111111
- >>> and the part is:
- >>> 10110110010110101010010111100111110101101011110001000
- >
- >Bad example. I don't see why the second list of 0's and 1's is a part of the
- >first.
-
- Consider instead a volume and a cross section through it; the situation is
- equivalent.
-
- >>> Here, the part seems to have a higher information content than the whole.
-
- >Why does the second list contain more information than the first? Because
- >the 0 to 1 ratio is close to 0.5? But what about the list 00000...11111? This
- >list ``obviously'' has a low information content. By induction it follows
- >that the list that contains the most information is 0101010101010101 :-).
-
- It is not clear what you mean by information at all. One standard measure
- is Kolmogorov complexity, where the information content of a sequence
- is the length of the minimal description of that sequence on a universal
- Turing machine. Under that measure your comments are meaningless; it is
- not obvious what interpretation you intend, if any.
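Kolmogorov complexity itself is uncomputable, but an ordinary compressor gives a crude upper bound on description length. A minimal sketch in Python, using the two sequences from the original post (the choice of zlib as a stand-in is mine, purely illustrative):

```python
import zlib

# The two bit strings from the post, as ASCII bytes.
whole = b"1111111111111111111111111111111111111111111111111111"
part  = b"10110110010110101010010111100111110101101011110001000"

# Compressed size is a rough upper bound on description length: the
# repetitive "whole" compresses far better than the mixed "part".
len_whole = len(zlib.compress(whole, 9))
len_part  = len(zlib.compress(part, 9))
```

This is only a proxy, of course: a true minimal description could be far shorter than anything a general-purpose compressor finds.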
-
- >Anyway,
- >this is an old discussion on the net---what is the information content of pi?
- >Infinite? Is it bigger than that of sqrt(2) or 1/3?
-
- Old conversations are obviously the most interesting and the most likely to
- yield important ramifications when understood. The Kolmogorov complexity
- of pi is very small; it is not infinite in any way whatsoever.
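That the complexity of pi is small can be made concrete: a few lines of code emit its digits one after another forever, so its minimal description is bounded by the length of such a program. A sketch using Gibbons' unbounded spigot algorithm (Python; the function name is my own):

```python
from itertools import islice

def pi_digits():
    """Yield decimal digits of pi indefinitely (Gibbons' spigot algorithm)."""
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n  # the next digit is now settled and can be emitted
            q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
        else:
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                (q * (7 * k + 2) + r * l) // (t * l), l + 2)

digits = list(islice(pi_digits(), 10))  # first ten digits: 3 1 4 1 5 9 2 6 5 3
```

The program is finite and fixed, yet it outputs every digit of pi; that is exactly why the Kolmogorov complexity of pi is a small constant rather than infinite.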
-
- Shannon's measure of information depends on the probability of choosing some
- symbol from an ensemble. It does not appear directly applicable unless one
- can assign probabilities to the various sequences.
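For comparison, Shannon's measure for a memoryless source over symbols with probabilities p_i is H = -sum p_i log2 p_i. A minimal sketch that treats the empirical symbol frequencies of the two sequences above as those probabilities (that modelling choice is mine, and is exactly the kind of assignment the measure requires):

```python
from collections import Counter
from math import log2

def empirical_entropy(s):
    """Shannon entropy in bits/symbol of the empirical symbol distribution of s."""
    counts = Counter(s)
    total = len(s)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# The all-ones "whole" has entropy 0; the mixed "part" is close to 1 bit/symbol.
h_whole = empirical_entropy("1111111111111111111111111111111111111111111111111111")
h_part  = empirical_entropy("10110110010110101010010111100111110101101011110001000")
```

Note how this measure and Kolmogorov complexity agree on these two sequences only once a probability model is imposed; without one, Shannon's definition simply does not apply.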
-