|Year 2000 and C Compiler Libserv3-Ocisi.FranceTelecom@wanadoo.fr (ocisi3) (1998-10-01)|
Date: 4 Oct 1998 01:10:10 -0400
On 1 Oct 1998 00:56:14, "ocisi3" <Libserv3-Ocisi.FranceTelecom@wanadoo.fr> asked about
>> ... about Year 2000 and C compiler.
And offered a good distinction:
>> Are there a problem when you compile the code or when you execute the code
The compilation aspect can certainly also entail the full build
process. If it is permitted to expand the thread slightly into that
more general area, then perhaps these comments will be valuable.
There are SMALL concerns in the areas of the make facility and version
control software. Proper functioning of these may require very
up-to-date releases of those tools. This would not impact the
integrity of the C source code base itself.
This is more of an issue for large and complex development
environments. But with modern languages even the individual
practitioner is using a bundle of tools just to generate one program.
The make facility and version control COULD go somewhat erratic at or
about the Y2K event threshold, IF the software used is a little weak.
Basically, the error would be something like a lot of extra,
unnecessary recompilation, although it could get stranger (for
example, poorly time-stamped items might not get regenerated when
they should be).
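The timestamp rule at the heart of this can be sketched as follows. This is only an illustration of the usual newer-than comparison, not the code of any real make implementation, and the name `needs_rebuild` is invented for the example:

```python
import os

def needs_rebuild(target, source):
    """Sketch of make's core rule: rebuild when the target is missing,
    or when the source's modification time is newer than the target's.
    Real make implementations add many refinements on top of this."""
    if not os.path.exists(target):
        return True
    return os.path.getmtime(source) > os.path.getmtime(target)
```

A tool that is "a little weak" here -- say, one that reduces these timestamps to two-digit years somewhere internally -- could see a file dated 00 as older than one dated 98, producing exactly the extra recompilation (or the missed regeneration) described above.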
Because of the criticality of the dates on component files, you should
have the latest release of your operating system in place also. It is
a good idea to get rigorous control of any special device (like the
touch program) that could impact a file date stamp. Also the
linkers/binders should be controlled.
In a network environment with workers using a diversity of software
disciplines, it can be a political challenge to reduce the number of
linkers, binders, resource (script file) compilers, and help-facility
compilers in use.
The key to success with Y2K is to reduce the problem to manageable
size. To meet all of your requirements fifteen months from now, you
will probably need just as many tools as you are currently using. But
in some environments it is definitely worth noting that you do not
have to use so many versions of the same tool.
Testing your ability to rebuild a system ultimately requires you to
set the system clock. Any such experiment should be physically
isolated from real-date material.
After doing some testing with artificial forward clock settings, a
special gotcha category arises when you set the clock back to attempt
your trial runs again. This creates an UNREAL set of dates on files,
with some components appearing to be future dated.
Generally, do not go backwards in time. This can drain your resources
in analyzing problems that will not happen, or mask problems that will.
The test plan to prove the ability to compile and rebuild software
needs to have a competent scheme for refreshing the test machine to an
earlier point in time. Simply setting the clock back is usually not
enough. You would hope that make facilities are just concerned with
the relation between two file dates, but unless you know the tool's
code itself you cannot be sure there is not a branch point involving a
comparison to the current date. The code can break in unreal-date
environments with future-dated files. So planning methodical rollbacks
in the test environment can be involved.
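One concrete safeguard after a clock rollback is to sweep the build tree for future-dated components before trusting any rebuild. A minimal sketch of that check (the name `future_dated` is invented for the example):

```python
import os
import time

def future_dated(paths, now=None):
    """Return the files whose modification time lies ahead of the
    current clock -- the residue left behind when the system clock is
    set back after a forward-dated test run."""
    if now is None:
        now = time.time()
    return [p for p in paths if os.path.getmtime(p) > now]
```

Anything this flags should be re-stamped, or the whole tree restored from a pre-test image, before the make facility is relied on again.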
BTW, the general advice that I have heard for individuals with just
one machine is that you should NOT experiment with the system clock,
especially if you do not have recent releases of your OS.