- Newsgroups: comp.ai.fuzzy
- Path: sparky!uunet!world!tob
- From: tob@world.std.com (Tom O Breton)
- Subject: Re: Fuzzy logic and probability: entropy
- Message-ID: <C188LM.B9r@world.std.com>
- Organization: The World Public Access UNIX, Brookline, MA
- References: <gebhardt-210193081528@129.26.128.171>
- Date: Thu, 21 Jan 1993 23:15:21 GMT
- Lines: 22
-
- Friedrich:
-
- > Kosko argues somehow like this (I know I exaggerate):
- >
- > 1. There is something in probability theory called entropy.
- > 2. There is something in fuzzy set theory called entropy which
- > 2.1 has certain similarity to the probabilistic entropy,
- > 2.2 can be derived immediately from first principles of max-min.
- > 3. Therefore fuzzy set theory starts where Western logic ends.
-
- I had never heard that before. Would you care to describe this "fuzzy
- entropy" in a bit more detail?
-
- Entropy, as it appears in probability theory and thermodynamics, is
- inversely proportional to the information in the system. How is it
- generalized or extended to fuzzy logic?
-
- Tom
-
- --
- The Tom spreads its huge, scaly wings and soars into the wild sky...
- (tob@world.std.com)
-