Re: Designing a Domain-specific language

Hans-Peter Diettrich <DrDiettrich1@netscape.net>
Wed, 24 May 2017 06:42:15 +0200

          From comp.compilers

Related articles
Designing a Domain-specific language dror.openu@gmail.com (2017-05-23)
Re: Designing a Domain-specific language bc@freeuk.com (bartc) (2017-05-23)
Re: Designing a Domain-specific language DrDiettrich1@netscape.net (Hans-Peter Diettrich) (2017-05-24)
Re: Designing a Domain-specific language laguest9000@googlemail.com (lucretia9@lycos.co.uk) (2017-05-24)
Re: Designing a Domain-specific language dror.openu@gmail.com (2017-05-25)
Re: Designing a Domain-specific language bc@freeuk.com (bartc) (2017-05-27)

From: Hans-Peter Diettrich <DrDiettrich1@netscape.net>
Newsgroups: comp.compilers
Date: Wed, 24 May 2017 06:42:15 +0200
Organization: Compilers Central
References: 17-05-007
Keywords: design
Posted-Date: 25 May 2017 10:58:42 EDT

On 23.05.2017 at 16:12, dror.openu@gmail.com wrote:


> My dilemma is: "How should the compile process look when you have many files?" I thought about this process for the second requirement, and I'd like to hear your opinion:
> 1. scan all the files (assume we have k files).
> 2. for each of the files:
> 2.1 run the lexer together with the parser and build a parse tree.
> 3. now I have k parse trees (but only one entry point/main).
> 4. if there are circular dependencies, return an error.
> 5. resolve the "include"s by merging the trees into one tree.


John already pointed out two commonly used methods, so let me just add
a few key points.


A *preprocessor* can be implemented as a stand-alone tool that resolves
the "include ..." directives and produces a combined output file, which
can then be compiled. Or it can do the same as a kind of filter sitting
between the lexer and the parser, so that it feeds one sequential stream
of tokens to the parser. The preprocessor either switches its input
amongst files, in order to produce a combined file, or it switches
amongst lexers, each of which processes a single input file.
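
Just to illustrate the filter variant, here is a minimal Python sketch;
lex_file(), the (kind, value) token pairs and the INCLUDE token kind are
assumptions for illustration, not taken from any particular tool:

# Minimal sketch of a preprocessor as a token filter, assuming a
# lex_file() generator that yields (kind, value) pairs and an INCLUDE
# token kind carrying the included file name (all names illustrative).
import os

def preprocess(path, active=None):
    """Yield one sequential token stream, splicing in included files."""
    active = set() if active is None else active
    path = os.path.abspath(path)
    if path in active:                    # refuse circular includes
        raise RuntimeError("circular include: %s" % path)
    active.add(path)
    for kind, value in lex_file(path):    # one lexer per input file
        if kind == "INCLUDE":
            # switch to the lexer of the included file, then come back
            target = os.path.join(os.path.dirname(path), value)
            yield from preprocess(target, active)
        else:
            yield kind, value
    active.discard(path)                  # pop from the active include chain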


A *linker* resolves references between object modules, or ASTs, so that
the result is either a single object or executable file, or a single
combined AST (parse tree). A clever linker can cache individual
definitions (variables, functions, ...) and load only the definitions
that are really required, on demand. This technique makes it possible
to eliminate all unused parts of the source files from the final
program. Again, a linker can be implemented as a stand-alone tool or as
part of a compiler.
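
A rough sketch of that demand-driven linking, assuming every module has
already been parsed into a table mapping each definition name to its
AST node plus the names it references (the table layout and the "main"
entry point are only assumptions):

# Demand-driven linking sketch: pull in only what is reachable from the
# entry point; unused definitions are never loaded into the result.
def link(def_tables, entry="main"):
    """Merge only the definitions reachable from the entry point."""
    cache = {}                        # combined lookup over all modules
    for table in def_tables:
        cache.update(table)

    linked = {}                       # name -> AST node of used definitions
    worklist = [entry]
    while worklist:
        name = worklist.pop()
        if name in linked:
            continue
        try:
            node, refs = cache[name]
        except KeyError:
            raise RuntimeError("unresolved reference: %s" % name)
        linked[name] = node
        worklist.extend(refs)         # pull in whatever this definition uses
    return linked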


Caveat: If your "include" directive allows inserting *incomplete*
definitions, e.g. individual lines of code into the middle of a
function, only a preprocessor can produce valid input for the parser.
Otherwise, if all definitions can be parsed without embedded "include"
directives occurring inside them, the definitions (of identifiers) can
be parsed individually and merged into a common list of definitions,
either as the top level of the parse tree or as a list of individual
parse trees, one per definition.
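
For that second case, a small sketch of such a merge, assuming each
file parses into a list of complete, named definitions; parse_file()
and the Program node are illustrative only:

# Parse each file on its own, then merge all definitions at top level.
def build_program(paths):
    top_level = {}                             # name -> parsed definition
    for path in paths:
        for definition in parse_file(path):    # complete, named definitions
            if definition.name in top_level:
                raise RuntimeError("duplicate definition: %s"
                                   % definition.name)
            top_level[definition.name] = definition
    return Program(list(top_level.values()))   # one combined parse tree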


DoDi

