|Problem with Flex/Bison and C/C++ firstname.lastname@example.org (Christoph B.) (2006-03-22)|
|Re: Problem with Flex/Bison and C/C++ email@example.com (John Millaway) (2006-03-27)|
|Re: Problem with Flex/Bison and C/C++ cfc@shell01.TheWorld.com (Chris F Clark) (2006-03-27)|
|Re: Problem with Flex/Bison and C/C++ firstname.lastname@example.org (2006-03-27)|
|From:||"Christoph B." <email@example.com>|
|Date:||22 Mar 2006 23:42:57 -0500|
|Keywords:||yacc, C++, question|
First I should mention that this is my very first project with compiler
tools such as Bison and Flex.
I got started with this simple tutorial and that worked pretty well. The tutorial starts with a lexer in C and the yacc file in C++. I believe that's the reason why I have severe problems now.
First, my YYSTYPE union looked like this:
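(The code did not survive the archive; a minimal sketch of what a C-only %union of the kind described would look like:)

```
%union {
    char *string;    /* token text */
    int   integer;   /* numeric token values */
}
```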
This works fine as long as only data types known to C are used. In fact, my tokens should all have either string or integer values, but I have also defined some nonterminals that should be instances of C++ classes, so now my union looks like this:
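(This union is also missing from the archive; a plausible sketch, with a hypothetical Expr class standing in for the C++ nonterminal types. Note that a union member can't be a class with a constructor anyway, so a pointer member is the usual form:)

```
%union {
    char *string;
    int   integer;
    Expr *expr;      /* hypothetical C++ class for nonterminals */
}
```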
As far as I know, yacc generates a file y.tab.h that is included by the lexer (so the lexer knows what data types the tokens can hold). Of course, a C lexer can't handle C++ classes.
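(For context, y.tab.h contains, roughly, the token numbers plus the whole YYSTYPE declaration, which is how every type in the union ends up in the lexer's translation unit; a simplified sketch, with placeholder names:)

```
/* simplified excerpt of a yacc/bison-generated y.tab.h */
#define TOKEN 258          /* token numbers start at 257 */
typedef union {
    char *string;
    int   integer;
} YYSTYPE;
extern YYSTYPE yylval;
```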
So my question is whether there is a simple way to have all the tokens as strings and integers while the nonterminals carry more complicated C++ data types. Is there a way to define the data format for nonterminals outside of YYSTYPE? There is no need for the lexer to know about these data types: structures using them are created in the yacc file, and lex doesn't need to know anything about these nonterminals...
If there is no other solution, can't I compile the lexer output with g++? What do I usually need to change in my lexer to make it compile under g++? Currently my lexer looks like this:
token   { yylval.string = strdup(yytext); return TOKEN; }
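(For reference, a fuller flex skeleton that such a rule would sit in; the pattern and token name are placeholders, not from the original post. Flex's generated scanner is close enough to plain C that compiling lex.yy.c with g++ often works, as long as yylex is declared with consistent linkage on both sides:)

```
%{
#include <stdlib.h>
#include <string.h>
#include "y.tab.h"      /* token #defines and YYSTYPE */
%}
%option noyywrap
%%
[A-Za-z_][A-Za-z0-9_]*  { yylval.string = strdup(yytext); return TOKEN; }
[ \t\n]+                ;  /* skip whitespace */
%%
```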
As you already have realized, I'm quite new to this topic. Any help is
gladly appreciated. Thanks very much in advance.