
Related articles:
  References for scanner table compression using "comp-vector" sperber@informatik.uni-tuebingen.de (1994-10-07)
  Re: References for scanner table compression using "comp-vector" vern@daffy.ee.lbl.gov (1994-10-11)

Newsgroups: comp.compilers
From: sperber@informatik.uni-tuebingen.de (Michael Sperber [Mr. Preprocessor])
Keywords: lex, design
Organization: Compilers Central
Date: Fri, 7 Oct 1994 09:01:33 GMT

It seems to be a fairly widespread technique to compress scanner
tables using the comb-vector method described in the Dragon Book.
Rex and (I think) flex use it to merge the rows of the table into
four arrays that are customarily called Base, Check, Next, and
Default.

The state transition function is:

    trans(state, symbol) := IF Check[Base[state] + symbol] == state
                            THEN Next[Base[state] + symbol]
                            ELSE trans(Default[state], symbol)
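For concreteness, here is how that lookup plays out on a tiny
hand-packed table (a sketch in C; the three-state DFA and its tables
are made up for illustration, not the output of any real generator).
State 1 and state 2 each differ from state 0 on one symbol only, so
they store a single entry and fall back to state 0 via Default:

```c
#include <assert.h>

/* Hypothetical comb-vector tables for a 3-state DFA over symbols {0,1,2}.
 * State 0: 0->1, 1->2, 2->0
 * State 1: like state 0, except 2->1 (Default[1] = 0)
 * State 2: like state 0, except 2->2 (Default[2] = 0)          */
enum { NSTATES = 3, TABSIZE = 5 };

static const int Base[NSTATES]    = {  0, 1, 2 };
static const int Default[NSTATES] = { -1, 0, 0 };
static const int Next[TABSIZE]    = {  1, 2, 0, 1, 2 };
static const int Check[TABSIZE]   = {  0, 0, 0, 1, 2 };

/* The transition function from the post, written iteratively. */
static int trans(int state, int symbol)
{
    while (state >= 0) {
        int i = Base[state] + symbol;
        if (i >= 0 && i < TABSIZE && Check[i] == state)
            return Next[i];           /* entry belongs to this state */
        state = Default[state];       /* else retry in the default row */
    }
    return -1;                        /* no transition */
}
```

The two shared rows overlap the row of state 0, so five table slots
cover what a full matrix would store in nine.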

However, I haven't been able to find an algorithm that actually
computes these data structures. The Dragon Book is very vague.
Grosch (in the documentation for Rex) mentions an O(n^2 c)
algorithm, where n is the number of states and c the cardinality of
the input character set, but isn't specific either. Dencker, Duerre,
and Heuft (TOPLAS 6:4, 1984, pp. 546-572) also mention the technique
but again don't specify an algorithm.
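For reference, here is a rough sketch of the first-fit row packing
as I currently understand it (this is my reconstruction, not
necessarily what Rex or flex do, and it omits the Default step:
picking a similar earlier row as a default and storing only the
differing entries is presumably where the O(n^2 c) row comparisons
come in):

```c
#include <string.h>

#define NSTATES 3
#define NSYMS   3
#define MAXTAB  (NSTATES * NSYMS + NSYMS)   /* worst case plus margin */

static int Base[NSTATES], Next[MAXTAB], Check[MAXTAB];

/* Pack the rows of the full transition matrix `delta` (-1 = no move)
 * into Next/Check, finding each row's base offset by first fit.
 * Returns the number of table slots actually needed.               */
static int pack(int delta[NSTATES][NSYMS])
{
    int used = 0;
    memset(Check, -1, sizeof Check);        /* -1 marks a free slot */
    for (int st = 0; st < NSTATES; st++) {
        int b;
        for (b = 0; ; b++) {                /* first base where row fits */
            int ok = 1;
            for (int s = 0; s < NSYMS; s++)
                if (delta[st][s] >= 0 && Check[b + s] >= 0) {
                    ok = 0;
                    break;
                }
            if (ok)
                break;
        }
        Base[st] = b;
        for (int s = 0; s < NSYMS; s++)     /* claim the row's slots */
            if (delta[st][s] >= 0) {
                Next[b + s]  = delta[st][s];
                Check[b + s] = st;
                if (b + s + 1 > used)
                    used = b + s + 1;
            }
    }
    return used;
}
```

Sparse rows then interleave like the teeth of two combs, which is
presumably where the name comes from.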

Can anyone tell me more or give me a reference describing an algorithm

for this? Any help would be much appreciated.
