GCC is 25 years old today firstname.lastname@example.org (Rui Maciel) (2012-03-22)
Re: GCC is 25 years old today email@example.com (BGB) (2012-03-24)
Re: GCC is 25 years old today firstname.lastname@example.org (Dmitry A. Kazakov) (2012-03-26)
Re: GCC is 25 years old today email@example.com (2012-03-27)
Re: GCC is 25 years old today firstname.lastname@example.org (glen herrmannsfeldt) (2012-03-28)
Re: GCC is 25 years old today email@example.com (Dmitry A. Kazakov) (2012-03-28)
Re: GCC is 25 years old today firstname.lastname@example.org (Rui Maciel) (2012-03-28)
Re: GCC is 25 years old today email@example.com (BGB) (2012-03-28)
Re: GCC is 25 years old today DrDiettrich1@aol.com (Hans-Peter Diettrich) (2012-03-29)
Re: GCC is 25 years old today firstname.lastname@example.org (2012-03-29)
Re: GCC is 25 years old today Pidgeot18@verizon.net (Joshua Cranmer) (2012-03-29)
Re: GCC is 25 years old today email@example.com (BGB) (2012-03-29)
Re: GCC is 25 years old today DrDiettrich1@aol.com (Hans-Peter Diettrich) (2012-03-30)
Re: GCC is 25 years old today firstname.lastname@example.org (Dmitry A. Kazakov) (2012-03-30)
[7 later articles]

Date: Wed, 28 Mar 2012 13:46:35 -0700
Posted-Date: 29 Mar 2012 04:58:49 EDT
On 3/27/2012 2:27 AM, email@example.com wrote:
>> On Sat, 24 Mar 2012 08:44:36 -0700, BGB wrote:
>>> GCC showed up in various forms (such as DJGPP, and later Cygwin and
>>> MinGW), and in not much time, most previously non-free compilers (MSVC,
>>> Watcom, ...) became freely available as well.
>>> if not for GCC, maybe compilers would tend to still cost money?
> They still do, once you realize there are computers other than PCs and
> there are OS other than UNIX.
I am mostly using, and was mostly thinking about, Windows.
In Windows land, many compilers proper are free (yes, Visual Studio is
not free, but one is paying more for the IDE than for the compiler
itself, which can be obtained for free).
There are many GCC variants, and Watcom once cost money but is now free.
>>> either that, or maybe this trend was inevitable?
> On Linux/UNIX? Probably so given people expect free as in free and are
> not used to paying for anything. Hard to figure pricing models for
> that market, but the cheaper the better.
Again, I mean on Windows.
There was a time when compilers cost money for both DOS and Windows,
but this seemed to change once DJGPP, MinGW, and the like became available.
Tools and libraries for most scripting languages also tend to be freely
available by default.
>> It was. The software market was (and is) unregulated. The big software
>> vendors were able to fund incredibly cost-intensive compiler
>> development from sales of other, far less expensive to develop,
>> software and services.
> Yes but not necessarily only. Some of the compilers still cost a bundle and
> since they're mature I think they're making all their money back several
> times over. I don't have the numbers in front of me so my guess isn't any
> worse than yours unless you do.
Well, one can ask:
what if MS were to charge for the Windows SDK?
Most likely, people who might otherwise use it would just use MinGW instead.
Now, what if there were no MinGW or Cygwin or similar around?
Very possibly, a company like MS could and would charge for the Windows SDK
(unless something like PCC or LCC or similar had taken GCC's place).
>> This started a race to the bottom and, in the end, destroyed the whole
>> market of compilers with all the compiler vendors who were not quick
>> enough to diversify their business. Those who did, walked away anyway.
>> Why would you keep an unprofitable department? You cannot earn
>> anything from compilers now.
> This is not generally true. If you're talking about Intel x86 it's mostly
> true. As you know Adacore still makes plenty of money. So does Green
> Hills. So do others I haven't heard of.
> It's not true about IBM at all on System Z. They sell compilers and make a
> lot of money, it's probably all gravy now. I don't know if they make money
> on compilers for POWER and AIX. There are even a few companies selling C/C++
> compilers for System Z. It's not only IBM in the market, and people are
> still spending money.
I think the issue here is that "big money" needs to be involved for
people to be anywhere near an IBM mainframe, so a little more money is
not asking a whole lot.
However, in PC land, something like Visual Studio can cost more than a
low-end PC, and many more of the developers are people without a huge
budget (such as people in their teens and 20s who are not actively
employed as programmers), and these sorts of people are much less likely
to pay so much for a piece of software.
In this case, there is also a lot more incentive for such people to make
tools like these and make them freely available. Even if they don't get
a lot of money, they may get a lot more recognition and status and "a
sense of community" and similar for having done so.
this property may matter much less in cases where the barrier to entry
is much higher, since if people have lots of money, they are much more
likely to be willing to spend it, and the relative payoff of making
things freely available is much lower ("popularity" and "a sense of
community" aren't really going to help keep a company going, but selling
a product will).
Reasons why a company might make something freely available could
include promoting lock-in with their other products, or there being
little or no market for the product.
>> Consequently, there is no significant investments in compiler and
>> language research, of which effect the author of the article observed
>> as a "plateau." There is no mystery in it, no market means no
>> progress. Academic research very soon became irrelevant without an
>> input from the field, without industry hungry for fresh compiler
>> developers. So, here we are.
> I don't think this is correct. Like many other things about the industry, it
> has become a lot more focused and refined but I don't think there is no
> market. And I don't think there is no innovation and no progress. Things
> move along. There are several significant compiler companies. Not as many as in
> the 1970s and not on as many platforms but there are still some doing a
> pretty good business.
>> Was GCC responsible for that? No, its role was rather positive, to keep
>> some least diversity of compilers, to serve as an epitaph on the
> It's an epitaph on FSF's tombstone, with any luck!
I don't think free software or open source needs to go away, since it
does seem to have a fair amount of benefit in many areas.
Granted, not everyone who believes in FOSS necessarily agrees with RMS.
> [There's plenty of compiler work in embedded systems, too. ARM has
> compilers that basically just generate intermediate code, and the
> linker does sophisticated global optimization over the whole program.
Although not exactly the same, and probably for different reasons,
something vaguely similar is used a fair amount in my own stuff as well
(such as in the newer JIT for my scripting VM).
In many cases, the produced ASM code actually contains "requests", which
may in turn trigger the linker to call code generators for other pieces
of code (the request is itself encoded in the output as a call
instruction to a specially mangled name). The called generator will then
emit a piece of code specialized for handling this request.
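As a rough sketch of how such a scheme might look on the linker side (the `__req$` prefix, the mangling format, and the request kinds here are all invented for illustration; the post does not spell out the actual scheme):

```c
#include <string.h>

/* Hypothetical mangling: a request appears in the object code as a call
 * to a symbol of the form "__req$<kind>$<detail>".  This prefix and
 * format are assumptions, not the actual encoding used by the VM. */
#define REQ_PREFIX "__req$"

/* Linker-side check: does this unresolved call target encode a request? */
int is_request(const char *sym)
{
    return strncmp(sym, REQ_PREFIX, strlen(REQ_PREFIX)) == 0;
}

/* Stand-in for the specialized code generator: a real linker would emit
 * a machine-code thunk and return its address; here it just reports what
 * kind of thunk would be generated for the request. */
const char *generate_thunk(const char *sym)
{
    const char *kind = sym + strlen(REQ_PREFIX);
    if (strncmp(kind, "getfield$", 9) == 0)
        return "thunk: load object field";
    if (strncmp(kind, "loadconst$", 10) == 0)
        return "thunk: materialize run-time constant";
    return "thunk: marshal into a C function call";
}
```

During the link pass, any unresolved call whose target passes `is_request` would be redirected to the thunk produced by `generate_thunk`, rather than being resolved against a real symbol.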
I had idly considered also allowing this to be done at the level of the
assembler, essentially producing code that would be "inlined", but thus
far have not done so (it seems more like a big nasty hack than doing it
at link time).
A technically cheaper/simpler option would be to encode the request
using a jump instruction instead, relying on the fact that two direct
jumps seem to be a bit faster than a call/return pair; but this would
prevent the generated code thunks from being reused (since each thunk
would need to return to a fixed address).
This could involve the linker "magically" transforming a call
instruction into a jump instruction (since, for example, on x86 both
CALL and JMP differ solely in the opcode byte).
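That in-place rewrite is a one-byte patch on x86; a minimal sketch (the surrounding linker machinery is omitted):

```c
#include <assert.h>
#include <stdint.h>

/* On x86/x86-64, CALL rel32 is encoded as E8 xx xx xx xx and JMP rel32
 * as E9 xx xx xx xx: the two differ solely in the opcode byte, so the
 * linker can turn the call into a jump in place, leaving the 4-byte
 * displacement untouched. */
void call_to_jmp(uint8_t *insn)
{
    assert(insn[0] == 0xE8);  /* expect CALL rel32 */
    insn[0] = 0xE9;           /* now JMP rel32; displacement unchanged */
}
```

Note this only works for the rel32 forms; the indirect forms (FF /2 vs. FF /4) differ in the ModRM byte instead.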
As-is, the mechanism is a bit cheaper than a traditional function call:
values in both directions are typically passed in caller-defined
registers, or as immediate values (encoded into the request signature);
typically, a mask value is provided to tell the specialized code
generator which registers it may safely use without needing to preserve
them. This makes requests easier to generate code for than a "proper"
function call.
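One way such a scratch-register mask might look (the bit-to-register assignment here is an assumption for illustration; the post does not describe the actual encoding):

```c
#include <stdint.h>

/* One bit per x86 GPR; a set bit tells the specialized code generator
 * that it may clobber the register without saving/restoring it.  The
 * particular bit assignment is invented for this sketch. */
enum {
    REQ_SCRATCH_EAX = 1u << 0,
    REQ_SCRATCH_ECX = 1u << 1,
    REQ_SCRATCH_EDX = 1u << 2,
    REQ_SCRATCH_EBX = 1u << 3
};

/* May the generated thunk use this register without preserving it? */
int may_clobber(uint32_t mask, uint32_t reg_bit)
{
    return (mask & reg_bit) != 0;
}
```

A caller that keeps live values only in EBX could pass `REQ_SCRATCH_EAX | REQ_SCRATCH_ECX | REQ_SCRATCH_EDX`, sparing the thunk any push/pop traffic for those registers.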
Sometimes it may be slightly more expensive, though, such as when the
request is handled essentially by marshaling it into a function call
into C land.
Typically, this is also used to implement things like:
getting/setting object fields;
calling many functions;
(sometimes) loading constant values (some constants may either depend on
run-time data, or otherwise not have a value that can be readily
determined during code generation).
Another slight merit is that it allows things to be a little more
loosely coupled, since the main codegen (compiling the script bytecode
to native code) doesn't need to know the exact details of every piece of
machinery it interfaces with (nor is it necessary to write "plug-ins" to
interface these pieces of machinery with the codegen).
However, there are potential merits to allowing codegen plugins as well
(rather than handling everything by plugging "request handlers" into the
linker).
Sadly, using this mechanism at present would likely require writing a
custom static linker if I want to be able to statically compile my
scripting language to native code (not currently possible).