Newsgroups: comp.std.c
Path: sparky!uunet!stanford.edu!lucid.com!lucid.com!jss
From: jss@lucid.com (Jerry Schwarz)
Subject: Re: Maximum depth of #if preprocessing directives
Message-ID: <1992Dec31.023341.16602@lucid.com>
Sender: usenet@lucid.com
Reply-To: jss@lucid.com (Jerry Schwarz)
Organization: Lucid, Inc.
References: <JET.92Dec24133237@boxer.nas.nasa.gov> <BzsK12.5Ky@jrd.dec.com> <1992Dec29.001933.25655@lucid.com> <C01q3x.Bpp@sneaky.lonestar.org>
Date: Thu, 31 Dec 92 02:33:41 GMT
Lines: 35

Jss:
|> >Just to put my cards on the table.  X3J16/SC22 (C++ standards
|> >committee) is sharply divided on whether to have a section on
|> >limits.  Some of us, including myself, argue that without such
|> >a section any limit is a bug.  With such a section arbitrary limits
|> >are somehow condoned, as suggested by Norman's comment, and we
|> >don't want to condone any limits.
|>

Gordon Burditt:
|> How can you possibly avoid some kind of limit?  Even an implementation
|> that uses arbitrary-precision arithmetic for something such as
|> #if nesting level is going to have SOME limit, like
|> 2^(MAXLONG*sizeof(unsigned long)*CHAR_BIT) (where ^ is an exponentiation
|> operator), even if it's outrageously large.
|>

Jss:
Of course I'll run out of some resource eventually, but chances
are it will not translate directly into any of the numbers in
the standard, and it may vary from run to run.  Why should
the particular limits in the standard be singled out for special
treatment?
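
To make that concrete, here is a sketch (everything in it, starting
with the four-word counter, is invented for illustration) of a
preprocessor that tracks #if nesting in a fixed-size bignum.  Its
ceiling is a power of two that appears nowhere among the standard's
numbers, and it moves the moment the array is resized:

    /* Sketch only: a nesting counter kept in an array of unsigned
     * long runs out at 2^nbits - 1, a number matching nothing in
     * the standard's translation limits.  The array size is made
     * up for the example.
     */
    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        unsigned long counter[4];   /* hypothetical bignum storage */
        unsigned long nbits = sizeof counter * CHAR_BIT;

        counter[0] = 0;             /* quiet unused-variable warnings */
        printf("#if depth would top out at 2^%lu - 1\n", nbits);
        return 0;
    }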

My contention is that these limits are a quality-of-implementation
issue rather than a conformance issue, and as such they don't belong
in the standard proper.

Most of the actual limits in the C standard are ridiculously low.
If I tried to promulgate a compiler that enforced these limits on a
UNIX workstation (which is the market Lucid sells into), I doubt that
I would get a single customer.
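
For a sense of how low: the standard's translation limits (5.2.4.1)
guarantee only 8 nesting levels of conditional inclusion, so a
configuration header along the following lines (all the CFG_* macros
are invented for the example) is already past what a minimally
conforming compiler must translate.  Any compiler you could actually
sell handles far deeper nesting, which is the point: the documented
minimum says nothing about quality.

    /* Nine nested #if levels: one more than the 8 levels of
     * conditional inclusion that the translation limits guarantee.
     * Every macro here is hypothetical.
     */
    #if defined(CFG_OS)
    # if defined(CFG_CPU)
    #  if defined(CFG_ABI)
    #   if defined(CFG_FLOAT)
    #    if defined(CFG_THREADS)
    #     if defined(CFG_DEBUG)
    #      if defined(CFG_PROFILE)
    #       if defined(CFG_SHLIB)
    #        if defined(CFG_X11)   /* level 9: past the guarantee */
    int configured = 1;
    #        endif
    #       endif
    #      endif
    #     endif
    #    endif
    #   endif
    #  endif
    # endif
    #endif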

-- Jerry Schwarz