Designing a Domain-specific language — email@example.com (2017-05-23)
Date: Tue, 23 May 2017 07:12:14 -0700 (PDT)
Keywords: parse, design, question
Posted-Date: 23 May 2017 13:02:02 EDT
I'm currently implementing a "compiler" for a DSL that is used at my work, and I'd like your advice on a dilemma I have.
One of the requirements at the beginning was that the compiler should accept only a single input file.
But that has changed, and I've been asked to add two more capabilities:
1. The author who uses the language should be able to include common files (standard files that ship with the compiler).
for example: include "file.common"
2. The author should be able to compile a directory and split the project into many files, so the code will look something like this:
state = include "./sibling"
The project status is:
1. I have a lexer/tokenizer that generates tokens and pass them to the parser.
2. The parser, which is responsible for the syntax analysis, builds the parse tree.
My dilemma is: what should the compilation process look like when there are many files? Here is the process I thought of for the second requirement, and I'd like to hear your opinion:
1. Scan all the files (assume we have k files).
2. For each of the files:
2.1 Run the lexer together with the parser and build a parse tree.
3. Now I have k parse trees (but only one entry point/main).
4. If there are circular dependencies, return an error.
5. Resolve the "include"s by merging the trees into one tree.
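The steps above can be sketched as follows. Since the post doesn't describe the real parse-tree shape, the "parse trees" here are reduced to a plain dict mapping each file to the files it includes; the cycle check (step 4) and the merge ordering (step 5) only need that dependency graph, so the sketch stays independent of the actual lexer/parser.

```python
def find_cycle(deps):
    """Return one include cycle as a list of file names, or None.
    deps maps each file to the files it includes (the dependency
    graph built from the k parse trees). Uses a DFS three-coloring:
    hitting a GRAY node means a back edge, i.e. a cycle."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {f: WHITE for f in deps}
    stack = []

    def visit(f):
        color[f] = GRAY
        stack.append(f)
        for inc in deps.get(f, ()):
            if color.get(inc, WHITE) == GRAY:       # back edge: cycle
                return stack[stack.index(inc):] + [inc]
            if color.get(inc, WHITE) == WHITE:
                cycle = visit(inc)
                if cycle:
                    return cycle
        stack.pop()
        color[f] = BLACK
        return None

    for f in deps:
        if color[f] == WHITE:
            cycle = visit(f)
            if cycle:
                return cycle
    return None

def merge_order(deps, entry):
    """Post-order walk from the entry point: dependencies come first,
    so each included tree is ready to be spliced into its includer."""
    order, seen = [], set()

    def walk(f):
        if f in seen:       # already merged; ignore repeat includes
            return
        seen.add(f)
        for inc in deps.get(f, ()):
            walk(inc)
        order.append(f)

    walk(entry)
    return order

# Example graph mirroring the post's two requirements:
deps = {"main": ["sibling", "file.common"],
        "sibling": ["file.common"],
        "file.common": []}
assert find_cycle(deps) is None                 # no cycle here
assert merge_order(deps, "main")[-1] == "main"  # entry point merged last
```

Note that `merge_order` silently skips files it has already merged, which also handles the "diamond" case where two files include the same common file.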
For handling the first requirement, I'll store the "common files" somewhere in the tool's installation directory and resolve them as regular files.
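One way to make that concrete is a small search-path resolver. The search order below (the including file's own directory first, then the compiler's bundled directory) is an assumption, not something the post specifies:

```python
import os

def resolve_include(name, including_file, stdlib_dir):
    """Resolve an include name to an absolute path.
    Hypothetical search order: next to the including file first
    (for './sibling'-style includes), then the bundled common-file
    directory shipped with the compiler (for 'file.common')."""
    local = os.path.join(
        os.path.dirname(os.path.abspath(including_file)), name)
    if os.path.exists(local):
        return local
    bundled = os.path.join(stdlib_dir, name)
    if os.path.exists(bundled):
        return bundled
    raise FileNotFoundError(f'cannot resolve include "{name}"')
```

With this, common files really are resolved "as regular files": the only difference is which directory the name is found in.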
What do you think about this solution?
How do other compilers handle this kind of situation?
I'd like to get your feedback, and I'm really open to other suggestions.
Thank you guys.
[There's basically two approaches. One is more or less what you're
doing, treat each file as a separate module, and combine them at the
end. The other is what C does, combine everything into one lexical
stream and scan and parse it as a unit. Either way you can deal with
a recursive include either by treating it as an error, or by ignoring
includes of files you've seen before, depending on how the language
works. The C approach is easier to implement, the separate module
approach has more flexibility. For example, Python uses the separate
module approach, and caches parsed modules so on subsequent runs it
can just load the parsed version. -John]
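The C-style approach the moderator describes, combining everything into one lexical stream before parsing, can be sketched like this. The `include "file"` syntax matched by the regex is assumed from the post (the `state = include ...` form would need real parser support), and skipping already-seen files is the "ignore repeat includes" option, which also breaks include cycles:

```python
import io
import os
import re

# Assumed include syntax from the post: a line of the form
#   include "file.common"
INCLUDE_RE = re.compile(r'^\s*include\s+"([^"]+)"\s*$')

def expand(path, seen=None, out=None):
    """Splice each included file's text into one stream, C-preprocessor
    style, before the lexer ever runs. Files already seen are skipped,
    which deduplicates common headers and breaks include cycles."""
    seen = set() if seen is None else seen
    out = io.StringIO() if out is None else out
    path = os.path.abspath(path)
    if path in seen:
        return out
    seen.add(path)
    with open(path) as f:
        for line in f:
            m = INCLUDE_RE.match(line)
            if m:
                # Resolve relative to the including file's directory.
                target = os.path.join(os.path.dirname(path), m.group(1))
                expand(target, seen, out)
            else:
                out.write(line)
    return out
```

The lexer then tokenizes `expand(entry).getvalue()` as a single unit, exactly as if the project had stayed one file.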