From: hbaker@netcom.com (Henry Baker)
Newsgroups: comp.lang.scheme,comp.compilers
Date: 16 May 1997 23:39:26 -0400
Organization: nil
References: 97-05-183
Keywords: C, assembler
Ray Dillinger <bear@sonic.net> wrote:
> And, I think a lobotomized subset of C does what I want. I can write
> this kind of pseudo-machine code in C, with a honking huge main()
> routine, global variables for the registers, etc.
>
> However, this will violate every "reasonable" assumption a maker of C
> compilers will have about programming style. It will mean a program is
> compiled into a *single routine* of C code, with Goto destinations
> that might be more than 64K bytes away -- and no templates, no library
> functions linked, no header files, etc etc....
>
> Will modern C systems handle this?
> [Probably not. Machine generated source code always seems to break
> compilers designed for code written by humans. -John]
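For concreteness, a minimal sketch of the style being described (the names here are invented for illustration, not taken from any real generator): global variables standing in for machine registers, and one large routine made of labels and gotos.

    /* Illustrative sketch only: global "registers" plus one big routine
       built from labels and gotos, roughly the style described above.  */
    #include <stdio.h>

    /* Pseudo-machine registers as globals (hypothetical names). */
    static long acc, arg;

    int main(void)
    {
        goto L_entry;

    L_entry:
        acc = 0;
        arg = 10;
        goto L_loop;

    L_loop:
        if (arg == 0) goto L_done;
        acc += arg;
        arg -= 1;
        goto L_loop;

    L_done:
        printf("%ld\n", acc);   /* prints 55 */
        return 0;
    }

A real back end would emit thousands of such labels in a single function, which is where the jump-distance limits and compile-time problems discussed below start to matter.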
'Kyoto' Common Lisp (KCL) generated C code essentially like this. I
used it to compile some pretty large Lisp files; compiling a Lisp file
generated a _very large_ C program, including some pretty large C
functions.
I was actually impressed that the C compilers didn't cause more
problems than they did. Perhaps it was because KCL generated pretty
basic C code.
On very rare occasions, I blew out the C compiler with optimization
turned on (-O), because its register coloring routine went nuts.
Apparently, the C compiler used some O(n^2) or worse algorithms,
because the longer C functions took a _lot_ longer to compile.
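If you want to check whether your own C compiler still degrades this way, a throwaway generator along these lines (a hypothetical test harness, not related to KCL's actual output) produces a single huge function you can time with and without -O:

    /* Throwaway generator: writes big.c, one function with N locals and
       many assignments, to stress a C compiler's register allocator.
       Hypothetical harness for illustration only.                      */
    #include <stdio.h>

    #define N 2000   /* number of local variables */

    int main(void)
    {
        int i;
        FILE *f = fopen("big.c", "w");
        if (!f) return 1;

        fprintf(f, "int huge(int x)\n{\n");
        for (i = 0; i < N; i++)
            fprintf(f, "  int v%d = x + %d;\n", i, i);
        fprintf(f, "  int sum = 0;\n");
        for (i = 0; i < N; i++)
            fprintf(f, "  sum += v%d * v%d;\n", i, (i + 1) % N);
        fprintf(f, "  return sum;\n}\n");

        fclose(f);
        return 0;
    }

Comparing the compile time of big.c as N grows gives a rough feel for whether the allocator is still superlinear in function size.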
Generally, however, the _assembly_ pass that followed the C compiler
took _longer_ than the C compilation itself! Some of these assembly
sources were several megabytes long. I even blew out the assembler
once.
BTW, the 'inlining' capabilities of the C compilers I tried were
pretty miserable, but this information is now >5 years old, so I don't
know if the situation has improved.
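For what it's worth, the usual workaround in generated code was to lean on the preprocessor rather than trust the compiler to inline small accessors. A hypothetical example of the two styles:

    /* Two ways a code generator can get "inlining": a small function the
       compiler may or may not inline, versus a macro that is always
       expanded in place. Hypothetical names, for illustration only.     */
    #include <stdio.h>

    typedef struct cons { long car; long cdr; } cons;

    /* Relies on the compiler choosing to inline the call. */
    static long car_fn(const cons *c) { return c->car; }

    /* Expanded textually by the preprocessor, no compiler help needed. */
    #define CAR(c) ((c)->car)

    int main(void)
    {
        cons c = { 42, 0 };
        printf("%ld %ld\n", car_fn(&c), CAR(&c));   /* prints 42 42 */
        return 0;
    }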
--