Designing a Domain-specific language (email@example.com, 2017-05-23)
Re: Designing a Domain-specific language (firstname.lastname@example.org, bartc, 2017-05-23)
Re: Designing a Domain-specific language (DrDiettrich1@netscape.net, Hans-Peter Diettrich, 2017-05-24)
Re: Designing a Domain-specific language (email@example.com, firstname.lastname@example.org, 2017-05-24)
Re: Designing a Domain-specific language (email@example.com, 2017-05-25)
Re: Designing a Domain-specific language (firstname.lastname@example.org, bartc, 2017-05-27)
From: Hans-Peter Diettrich <DrDiettrich1@netscape.net>
Date: Wed, 24 May 2017 06:42:15 +0200
Posted-Date: 25 May 2017 10:58:42 EDT
On 23.05.2017 at 16:12, firstname.lastname@example.org wrote:
> My dilemma is: how should the compile process work when you have many files? I thought about this process for the second requirement, and I'd like to hear your opinion:
> 1. Scan all the files (assume we have k files).
> 2. For each of the files:
>    2.1 Run the lexer together with the parser and build a parse tree.
> 3. Now I have k parse trees (but only one entry point/main).
> 4. If there are circular dependencies, return an error.
> 5. Resolve the "include"s by merging the trees into one tree.
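The quoted steps can be sketched in a few lines; this is only an illustration (the file names and the include map are invented), with the cycle check of step 4 done by a depth-first search over the include graph:

```python
# Hypothetical sketch of steps 1-4: each file's includes are known after
# parsing; a DFS then reports one include cycle, or None if there is none.

def find_cycle(includes):
    """includes: dict mapping file -> list of files it includes."""
    WHITE, GRAY, BLACK = 0, 1, 2          # unvisited / in progress / done
    color = {f: WHITE for f in includes}
    stack = []                            # current DFS path

    def dfs(f):
        color[f] = GRAY
        stack.append(f)
        for dep in includes.get(f, ()):
            if color.get(dep, WHITE) == GRAY:     # back edge: a cycle
                return stack[stack.index(dep):] + [dep]
            if color.get(dep, WHITE) == WHITE:
                cycle = dfs(dep)
                if cycle:
                    return cycle
        stack.pop()
        color[f] = BLACK
        return None

    for f in includes:
        if color[f] == WHITE:
            cycle = dfs(f)
            if cycle:
                return cycle
    return None

includes = {"main": ["util"], "util": ["io"], "io": ["util"]}
print(find_cycle(includes))     # reports the util -> io -> util cycle
```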
John already pointed out two commonly used methods, so let me only add some details.
A *preprocessor* can be implemented as a stand-alone tool that resolves
the "include ..." directives and produces a combined output file, which
can then be compiled. Alternatively it can do the same as a kind of
filter, sitting between the lexer and parser, so that it feeds one
sequential stream of tokens to the parser. The preprocessor either
switches its input among files, in order to produce a combined file, or
it switches among lexers, each of which processes a single input file.
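The "filter between lexer and parser" variant can be sketched as a generator that yields tokens from the current file and switches to a nested lexer whenever it sees an include directive. The tokenizer, the include syntax, and the in-memory "file system" below are all invented for illustration:

```python
import re

FILES = {                       # stands in for the real file system
    "main.dsl": "let x = 1 include util.dsl let y = 2",
    "util.dsl": "let z = 3",
}

def lex(text):
    return re.findall(r"\S+", text)   # trivially: whitespace-split tokens

def token_stream(name, active=()):
    """Yield tokens of `name`, splicing in included files transparently."""
    if name in active:                # guard against include cycles
        raise RuntimeError("circular include: " + name)
    tokens = iter(lex(FILES[name]))
    for tok in tokens:
        if tok == "include":
            target = next(tokens)     # the file name follows the directive
            yield from token_stream(target, active + (name,))
        else:
            yield tok

print(list(token_stream("main.dsl")))
# The parser downstream sees one sequential stream, as if a single
# combined file were being compiled.
```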
A *linker* resolves references between object modules, or ASTs, so that
the result is either a single object or executable file, or a single
combined AST (parse tree). A clever linker can cache individual
definitions (variables, functions...) and load, on demand, only the
definitions that are actually required. This technique eliminates all
unused parts of the source files from the final program. Again, a linker
can be implemented as a stand-alone tool or as part of a compiler.
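The on-demand loading amounts to a reachability walk from the entry point: only definitions transitively referenced from main make it into the program. A minimal sketch, with invented module contents (each definition maps to the names it references):

```python
MODULES = {
    "main": {"main": ["helper"]},          # def name -> names it references
    "util": {"helper": ["square"], "unused": ["square"]},
    "math": {"square": [], "cube": []},
}

def link(entry="main"):
    # flatten all definitions into one table, as a linker's cache would
    table = {name: refs
             for mod in MODULES.values() for name, refs in mod.items()}
    needed, worklist = set(), [entry]
    while worklist:                        # pull in only what is reachable
        name = worklist.pop()
        if name in needed:
            continue
        if name not in table:
            raise NameError("unresolved reference: " + name)
        needed.add(name)
        worklist.extend(table[name])
    return sorted(needed)

print(link())   # 'unused' and 'cube' never enter the final program
```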
Caveat: If your "include" directive allows inserting *incomplete*
definitions, e.g. individual lines of code into the middle of a
function, only a preprocessor can produce valid input for the parser.
Otherwise, if all definitions can be parsed without occurrences of
embedded "include" directives, all definitions (of identifiers) can be
parsed individually and merged into a common list of definitions, either
as the top level of the parse tree or as a list of individual parse
trees, one per file.
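The second case, parsing each file's definitions on its own and merging the results into one top-level list, can be sketched as follows; the toy "def <name>;" syntax and the file contents are invented, and the merge also catches one name being defined in two files:

```python
FILES = {
    "a.dsl": "def f; def g;",
    "b.dsl": "def h;",
}

def parse_defs(text):
    # a "definition" is just "def <name>;" in this toy syntax
    return [part.split()[1] for part in text.split(";") if part.strip()]

def merge(files):
    """Parse every file independently, then merge into one table."""
    top_level = {}                         # def name -> defining file
    for fname, text in files.items():
        for name in parse_defs(text):
            if name in top_level:
                raise NameError(f"{name} defined in both "
                                f"{top_level[name]} and {fname}")
            top_level[name] = fname
    return top_level

print(merge(FILES))
```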