Newsgroups: comp.std.c
Path: sparky!uunet!spool.mu.edu!think.com!enterpoop.mit.edu!eff!news.oc.com!utacfd.uta.edu!rwsys!sneaky!gordon
From: gordon@sneaky.lonestar.org (Gordon Burditt)
Subject: Re: Maximum depth of #if preprocessing directives
Message-ID: <C01q3x.Bpp@sneaky.lonestar.org>
Organization: Gordon Burditt
References: <JET.92Dec24133237@boxer.nas.nasa.gov> <BzsK12.5Ky@jrd.dec.com> <1992Dec29.001933.25655@lucid.com>
Date: Wed, 30 Dec 1992 00:16:40 GMT
Lines: 20

>Just to put my cards on the table. X3J16/SC22 (C++ standards
>committee) is sharply divided on whether to have a section on
>limits. Some of us, including myself, argue that without such
>a section any limit is a bug. With such a section arbitrary limits
>are somehow condoned as suggested by Norman's comment, and we
>don't want to condone any limits.

How can you possibly avoid some kind of limit? Even an implementation
that uses arbitrary-precision arithmetic for something such as the
#if nesting level is going to have SOME limit, like
2^(LONG_MAX*sizeof(unsigned long)*CHAR_BIT) (where ^ is an exponentiation
operator), even if it's outrageously large.
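
For concreteness, here is one way a preprocessor might keep that
count (a sketch of my own, with made-up entry points enter_if() and
leave_if(), not code from any real compiler). Whatever integer type
holds the count, the test against that type's maximum IS the limit:

#include <limits.h>
#include <stdio.h>

static unsigned long if_depth = 0;  /* current #if nesting depth */

/* Hypothetical hook, called on #if, #ifdef, or #ifndef. */
int enter_if(void)
{
    if (if_depth == ULONG_MAX) {    /* the counter's own ceiling */
        fprintf(stderr, "#if directives nested too deeply\n");
        return -1;
    }
    if_depth++;
    return 0;
}

/* Hypothetical hook, called on the matching #endif. */
void leave_if(void)
{
    if (if_depth > 0)
        if_depth--;
}

int main(void)
{
    enter_if();   /* as if we saw "#if FOO" */
    leave_if();   /* as if we saw "#endif" */
    return 0;
}

Swap the counter for a bignum and the same test simply moves into
the bignum package, which has a ceiling of its own.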

Is 32767 an unreasonable limit on the nesting level? Well, it wouldn't
cost that much to put the counts in a 32-bit long. Do I need to go to
(about) 2^(2^37), a number of about 137 billion bits? Or is that
too small?
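
As a sanity check on that figure, assume (my choice of representation,
nothing more) that the count is a bignum stored as an array of 32-bit
unsigned longs, with the array length itself held in a 32-bit unsigned
long:

#include <stdio.h>

int main(void)
{
    double words = 4294967296.0;  /* 2^32 possible array elements */
    double bits  = words * 32.0;  /* 32 bits per word: 2^37 bits */

    /* Prints 137438953472, i.e. about 137 billion bits, so the
     * largest representable count is about 2^(2^37). */
    printf("bits in the count: %.0f (about %.0f billion)\n",
           bits, bits / 1e9);
    return 0;
}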
-
- Gordon L. Burditt
- sneaky.lonestar.org!gordon
-