Re: deadcode optimization

"Norman Black" <>
14 Mar 2001 00:11:24 -0500

From comp.compilers


From: "Norman Black" <>
Newsgroups: comp.compilers
Date: 14 Mar 2001 00:11:24 -0500
Organization: Stony Brook Software
References: 01-03-012 01-03-022 01-03-034 01-03-060 01-03-075
Keywords: optimize
Posted-Date: 14 Mar 2001 00:11:24 EST

Those statics in source file 1 and source file 2 do not need to have
"names". Our compilers never give such symbols names. Note that I do
not have C compilers, but rather Modula-2, Ada and Pascal compilers.
These languages support compilation units and have no "static"
keyword, nor any use for one. The same effect comes from declaring a
variable in the "implementation" portion of a compilation unit. Such
symbols have no visibility outside the compilation unit.

I understand the legacy of C compilers going back to the original:
they have global and local symbols in the object format, and a static
in a file uses a local symbol. As a non-C user I couldn't care less
about that and have no use for it. The languages I use all support
compilation units, and the C "symbol issues" (namespace stuff)
disappear once compilation units exist. Even so, I still do not see
the need, other than legacy, for local symbols in a C compiler.

I also understand that the Unix tradition is to use the "symbol
table" (in some form) for debug information. Having come from the PC
world, again I don't care about this. Our compilers support multiple
debug output formats: CodeView, Borland, Watcom and DWARF. All of
these are independent of symbol table issues, and of any scoping
issues. Remember that DOS and 16-bit Windows programs had no concept
of a symbol table, and no use for one. The symbol-table-style debug
info I have seen is quite inadequate. As an industry standard people
can get behind, DWARF is the way to go. DWARF has complexity I don't
think it needs, but it is standardized.

> But if you're implementing a sufficiently C-like language, and you
> want to use the platform's existing debuggers, and you want
> high-quality debugger support, then you may need to deal with this
> issue.

Especially since the Unix development world is in something of a
time warp. I have come to this conclusion during our "Unix" ports of
the past year. Out of necessity I have used native C compilers to
"check certain things out".

Norman Black
Stony Brook Software
