From: "Derek M. Jones" <derek@_NOSPAM_knosof.co.uk>
Newsgroups: comp.compilers
Date: 22 Dec 2006 11:07:57 -0500
Organization: ntl Cablemodem News Service
References: 06-09-029 06-09-042 06-09-048 06-09-060 06-09-078 06-09-093 06-12-064 06-12-066 06-12-076 06-12-078 06-12-086 06-12-087
Keywords: C, parse, analysis
Posted-Date: 22 Dec 2006 11:07:57 EST
Ivan,
>> I know of at least one other tool vendor who fully parses C/C++ source
>> using syntax information only, before building a symbol table.
>
> Again, are there any practical benefits with this approach?
I cannot speak for why others have taken this approach, but
possibilities include:
o It gets a parser up and running quickly. This makes it possible
to see what characteristics the source has and to tune subsequent
processing appropriately (i.e., in this situation people are not
interested in handling 100% of the source 100% correctly).
In my case I am interested in measuring source code usage, and
so don't need a full semantic analysis of every construct.
o Improved error recovery. Syntax errors are often difficult to
recover from (i.e., there tend to be lots of cascading errors),
while semantic errors tend to be localized and are easier to attach
meaningful messages to.
In my case I often have to analyze source where I might not have any
information on where the headers that need to be included are located
(i.e., include path information), or in some cases I might not even have
the headers :-(
o It is possible to analyze fragments of code, e.g., single statements
or declarations (perhaps typed on the command line).