Related articles:
  How to handle identifiers with spaces? tellab5!odgate!mike@uunet.UU.NET (1993-01-15)
  How to handle identifiers with spaces? kgg@dcs.ed.ac.uk (Kees Goossens) (1993-01-18)
Newsgroups: comp.compilers
From: tellab5!odgate!mike@uunet.UU.NET (Mike J. Kelly)
Organization: Odesta Corporation
Date: Fri, 15 Jan 1993 16:41:11 GMT
Summary: Want variable names to have spaces
Keywords: lex, question, comment
I am writing a parser for a frontend query language to be used against a
database. In the backend (i.e. database), column names follow SQL
conventions: no spaces. We would like to be able to assign aliases to
these columns in the frontend (e.g., "PRODUCT_NUM" becomes "Product
Number") and then let users enter ad-hoc SQL-like queries using the
aliases, so that, for example,
Product Number = 100 or Product Number = 200
would be translated into
PRODUCT_NUM = 100 OR PRODUCT_NUM = 200
prior to being sent to the backend database. The question is, how
does one handle spaces in identifiers? It seems like the fact that
the list of identifiers is restricted (to the aliases in the database)
might help, but I'm not sure how. It also seems like packaging the
identifiers into a single token should be done by the lexical analyzer,
so to the yacc grammar, it appears as a COLUMN_ALIAS token. Does this
make sense? Does anyone have any practical experience in using identifiers
containing spaces in a Yacc grammar? I couldn't think of any other
languages that allow spaces in identifiers, which scared me.
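
To make the lexer approach concrete, here is a rough sketch of a
hand-written helper that greedily matches the longest known alias at the
current input position. The alias table and is_alias() lookup below are
hypothetical stand-ins for whatever the real frontend loads from the
database:

    #include <ctype.h>
    #include <string.h>

    /* Hypothetical alias table; in practice, loaded from the database. */
    static const char *aliases[] = { "Product Number", "Order Date", 0 };

    static int is_alias(const char *s)
    {
        int k;
        for (k = 0; aliases[k]; k++)
            if (strcmp(aliases[k], s) == 0)
                return 1;
        return 0;
    }

    /* Return the length of the longest alias starting at src, or 0. */
    static int match_alias(const char *src)
    {
        char buf[256];
        int i = 0, best = 0;

        for (;;) {
            /* copy one identifier word */
            while ((isalnum((unsigned char)src[i]) || src[i] == '_')
                   && i < (int)sizeof buf - 1) {
                buf[i] = src[i];
                i++;
            }
            buf[i] = '\0';
            if (i > 0 && is_alias(buf))
                best = i;               /* longest alias seen so far */
            /* continue only across a single space followed by a word */
            if (src[i] == ' '
                && (isalnum((unsigned char)src[i + 1]) || src[i + 1] == '_')
                && i < (int)sizeof buf - 1) {
                buf[i] = ' ';
                i++;
            } else {
                break;
            }
        }
        return best;
    }

The idea would be for yylex() to try match_alias() before its ordinary
identifier rule: on a hit it consumes that many characters and returns
COLUMN_ALIAS, otherwise it scans a plain identifier as usual.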
Thanks for your help.
--
Mike Kelly Odesta Corporation, Northbrook, Illinois, USA
..!clout!odgate!mike - Until odesta.com is registered.
odgate!mike@clout.uucp - From the Internet.
[You could do it either in the lexer or the parser. In most languages,
you can't have two adjacent identifiers, so you could have a rule in your
grammar to collect the parts of a complex identifier:
identifier: SIMPLE_IDENTIFIER | identifier SIMPLE_IDENTIFIER ;
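Fleshed out with actions, that rule might look like the following sketch,
where concat_words(), a helper that joins two strings with a single space,
is hypothetical:
    identifier
        : SIMPLE_IDENTIFIER              { $$ = $1; }
        | identifier SIMPLE_IDENTIFIER   { $$ = concat_words($1, $2); }
        ;
After the reduction, the collected name ("Product Number") is looked up in
the alias table and replaced with the backend column name (PRODUCT_NUM).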
-John]