'[EE] Time t and The Year 2038 Problem'
2008\04\19@084543 by

home.netcom.com/~rogermw/Y2038.html

http://en.wikipedia.org/wiki/Year_2038_problem

"Computer experts say the wired world will turn upside down in January
2038, a date that seems far away but is already figuring in mortgage
calculations and commercial agreements. The problem is akin to that of
Y2K, only this time it is for real, warn experts. Experts have set the
disaster time for computers at 03:14:07 GMT, January 19, 2038, a
Tuesday. At that moment, many computers will "run out of time." The
next second, they will begin showing the time as 8:45:52 PM GMT,
December 13, 1901.

If you are willing to take a risk and crash your computer, you could try
the following exercise: change your computer system date to January 19,
2038, and time to 03:14:07 or beyond, and send a message on Yahoo
Messenger. The messenger crashes. Chances are that if you are using
Internet Explorer, it could get corrupted. All network-based applications
could stop functioning.

The 2038 problem affects programs that use the Posix time representation,
which represents time as the number of seconds since January 1, 1970.
This representation is standard in Unix-like operating systems and also
affects software written for most operating systems because of the broad
deployment of C.

"In case you think we can sit on this issue for another three decades
before addressing it, consider that reports of temporal echoes of the 2038
problem are already starting to appear in future date calculations for
mortgages and vital statistics," said M.H. Noble, managing director of
Zoom Technologies (India) Ltd. What actually is the 2038 problem and
how does it strike? Be prepared for a little lesson in programming.

time_t is a data type used by C and C++ programs to represent time. It
is a count of the number of seconds since 00:00:00 GMT on January 1,
1970. A time_t value of 0 is midnight of January 1, 1970; a value of 1
is 00:00:01 GMT on January 1, 1970. By the time the world moves to
2038, the time_t value will be over 2,140,000,000.

And therein lies the problem. A modern 32-bit computer stores a "signed
integer" data type, such as time_t, in 32 bits. The first of these bits is
used for the positive/negative sign of the integer, while the remaining 31
bits are used to store the number itself. The highest number that 32-bit
computers can store in C and C++ based programs is 2,147,483,647. In
time_t terms, it translates to 3:14:07 GMT, January 19, 2038. At that
moment, time_t in a 32-bit C or C++ program will reach its upper limit,
and the computer will "run out of time." One second later, on January
19, 2038, at 3:14:08 GMT, disaster strikes, explains Murali Krishna of
Hutch. Every time calculation goes haywire.

The next second is represented as minus-2,147,483,648, and the computer
does not understand this. Thrown in a tizzy, it "thinks" that the time is
8:45:52 PM GMT, December 13, 1901. So, according to the soothsayers,
January 19, 2038, will turn into December 13, 1901. Every date calculation
will go haywire. And it gets worse. Most of the support functions that use
the time_t data type cannot handle negative time_t values at all. They
simply fail and return an error code.

But some experts say there is nothing to worry about. Says senior
mathematician B. Moinuddin: "The problem is only with 32-bit computers.
Anyhow 64-bit computers are going to be the norm in just a few years, let
alone 34 years. We need not go into the panic mode. There is time to
resolve this issue in a calm, cool and collected manner. Y2K was definitely
blown way out of proportion. This too is going to be that way"

On Sat, Apr 19, 2008 at 8:43 PM, Jinx <joecolquittclear.net.nz> wrote:
>
> But some experts say there is nothing to worry about. Says senior
> mathematician B. Moinuddin: "The problem is only with 32-bit computers.
> Anyhow 64-bit computers are going to be the norm in just a few years, let
> alone 34 years. We need not go into the panic mode. There is time to
> resolve this issue in a calm, cool and collected manner. Y2K was definitely
> blown way out of proportion. This too is going to be that way"
>

I will agree with this assertion. Y2K contributed to an upgrade of
computer systems, but basically there were no major problems.

Xiaofan

:: But some experts say there is nothing to worry about. Says senior
:: mathematician B. Moinuddin: "The problem is only with 32-bit computers.
:: Anyhow 64-bit computers are going to be the norm in just a few years,
:: let alone 34 years. We need not go into the panic mode. There is time
:: to resolve this issue in a calm, cool and collected manner. Y2K was
:: definitely blown way out of proportion. This too is going to be that way"

That means that Murkeysoft and Linenuts can charge extra for 64-bit
OS's - the marketing departments will have a field day.

Colin
--
cdb, colinbtech-online.co.uk on 19/04/2008

Web presence: http://www.btech-online.co.uk

Hosted by:  http://www.1and1.co.uk/?k_id=7988359

Even if faith is lacking, and hope is non existent, you always have
charity and compassion to offer.


Personally I doubt we will still be using a 32-bit time_t 30 years from
now...

Tamas

On Sat, Apr 19, 2008 at 2:02 PM, Xiaofan Chen <xiaofancgmail.com> wrote:


For PCs that's probably true. The one to watch might be long-
life or unrevised embedded applications

> http://home.netcom.com/~rogermw/Y2038.html
>
> http://en.wikipedia.org/wiki/Year_2038_problem
>
> "Computer experts say the wired world will turn upside down
> in January 2038, a date that seems far away but is already
> commercial agreements. The problem is akin to that of Y2K,
> only this time it is for real, warn experts. Experts have set
> the disaster time for computers at 03:14:07 GMT, January 19,
> 2038, a Tuesday. At that moment, many computers will "run out
> of time."
> The next second, they will begin showing the time as 8:45:52
> PM GMT, December 13, 1901.

Thirty years is too long to wait for the next Y2K-esque money spinner.
Let's claim the Mayans were right and December something-or-other 2012 is
doomsday for our computers.

I worked for a company that had a 1990 rollover problem.  The software was
written in 1980, and to save space on the twin floppies, only one digit was
used for the year, so 3 meant 1983.  All good until 1989 when the problem
was realised, and a solution was found: make it two digits!  Yeah, we know,
it'll fail in 2000, but we won't be using it then...

(...and nevermind they could have stored the dates better, ie as days since
1980 or something.)

Come 2000...

..and I get to fix it again.  As a bonus, I get to fix another system.  This
was even older, the joke was that it was so old it pre-dated computers.

This one was a bit smarter.  Written in Pascal, it had a date structure,
something like:

TYPE
  Date = PACKED RECORD
    Day   : 0..31;
    Month : 0..12;
    Year  : 0..99;
  END;

Pascal lets you set limits in the structs, so you can't set the month to 16
or something silly, you get a runtime error.  Nice.  It also had PACKED
records, which meant it used those limits to compress data, so it knows Day
only needs 5 bits, Month needs 4 and Year needs 7.  16 bits all up, meaning
it fits in two bytes, not three (or six) like before.  Simply add 1900 to
the year, and you're done.

    Year  : 0..127;

The 1900 + Year trick still works fine.  The UI needed fixing a bit,
especially things like Date = "19" & Year that didn't work so well anymore.
This still shows up on the web, do a search for 19108...

Of course, this will fall over come 2027, but they've promised it won't be
in use then.  :)

Tony

Yes, but, we do use 32 bit now.

We have a financial program at work that calculates the present value of
future investments, etc. Now, tell me what will happen when you
calculate the value of a 30 year mortgage ... ;-)

Years ago we had to revise the program to ensure that the date
structures were not based on time_t but some better methods of

Rolf

Tamas Rudnai wrote:

>> --
THIS is a perfect example of how pure science affects the PIC world.
I have written many lines of PIC code for time functions, and THIS is good
stuff.

Good catch, Jinx!

--Bob Axtell.

Jinx wrote:
{Quote hidden}

>
> For PCs that's probably true. The one to watch might be long-
> life or unrevised embedded applications
>

And, I'm using time_t in some PIC24 products. I wonder when the compiler
will be modified to use signed 64 bit values for time_t.

Harold

--
FCC Rules Updated Daily at http://www.hallikainen.com - Advertising
opportunities available!
How many computers built 30 years ago are still used right now ?
In 2038 there will be no computer in use that was manufactured today.
Joe, this subject is a waste of EE time...
:)

On 4/19/08, Jinx <joecolquittclear.net.nz> wrote:

On Sat, 19 Apr 2008, Vasile Surducan wrote:

> How many computers built 30 years ago are still used right now ?
> In 2038 will be no computer in use manufactured today.
> Joe, this subject is a waste of EE time...
> :)

The architecture has nothing to do with this.

The word length for the z80 was / is 8 bits, yet it could still cope with
large numbers and relatively large discs. People seem to have latched onto
64 bit as a cure-all. I would expect this from the average PC end user who
knows next to nothing about software or hardware, but come on I thought
most of the people on this list had a bit more sense.

The biggest problems we will face in 2038 will be can we find all the
source, compilers, linkers, libraries etc needed to rebuild an app and
will the average programmer know the difference between an integer and a
string :-)

Regards
Sergio

> We have a financial program at work that calcualtes the present value of
> future investments, etc. Now, tell me what will happen when you
> calculate the value of a 30 year mortgage ... ;-)

Yep, that's true actually :-)

> Years ago we had to revise the program to ensure that the date
> structures were not based on time_t but some better methods of

One thing I have learned from developing in a multi-platform environment
is to use my own typedefs. That helps to make sure the application will
compile and run on all systems, but it is also good in situations like
this: just redefine the type you are using for time calculations and
pretty much that's it - well, not quite when you are interfacing with
standard lib functions, but that's another issue.

Tamas

On Sat, Apr 19, 2008 at 5:13 PM, Rolf <learrrogers.com> wrote:


> The word length for the z80 was / is 8 bits, yet it could still cope with
> large numbers and relatively large discs. People seem to have latched onto
> 64 bit as a cure-all.

Yes, the architecture does not have to be 64-bit, but time_t does. How the
64-bit number is calculated is irrelevant. This problem could also be
solved if the time_t-related lib functions counted the dates from 2000,
for example, and not from 1970, but I am not sure if that is the best
solution. Then someone will say "what if a software wants to go back in
time...".

Tamas

On Sat, Apr 19, 2008 at 10:37 PM, sergio masci <smplxallotrope.net> wrote:


At 11:37 AM 4/19/2008, Harold Hallikainen wrote:

>And, I'm using time_t in some PIC24 products. I wonder when the compiler
>will be modified to use signed 64 bit values for time_t.

This is a question from someone who doesn't grok C - can't they
declare time_t as unsigned?

dwayne

--
Dwayne Reid   <dwaynerplanet.eon.net>
Trinity Electronics Systems Ltd    Edmonton, AB, CANADA
(780) 489-3199 voice          (780) 487-6397 fax
http://www.trinity-electronics.com
Custom Electronics Design and Manufacturing

> At 11:37 AM 4/19/2008, Harold Hallikainen wrote:
>
>>And, I'm using time_t in some PIC24 products. I wonder when the compiler
>>will be modified to use signed 64 bit values for time_t.
>
> This is a question from someone who doesn't grok C - can't they
> declare time_t as unsigned?
>
> dwayne
>

I suppose so, but then they'd have to rewrite the functions anyway. Why
not just go out to signed 64 bit now? Though I suspect that some people
are storing date/time in 4 bytes, and this upgrade would mess them up.

Harold

--
FCC Rules Updated Daily at http://www.hallikainen.com - Advertising
opportunities available!
> How many computers built 30 years ago are still used right
> now ? In 2038 will be no computer in use manufactured today.
> Joe, this subject is a waste of EE time...
> :)

Yes, I should have put it in [OT] where all the programmers and
engineers lurk. Seriously, as micros acquire PC functionality and
WILL end up anywhere doing anything, it's an issue to address
for future-proofing products. It's quite conceivable that a micro
programmed 10 years from now (look how many legacy bugs
MPASM has for example) relying on time_t will be in service for
20 years after that.

Is the 2038 problem mentioned in C classes or tutorials ?

> Maybe there will be minor glitches but I do not think we need to

Wouldn't it be nice to have a "Y2K38 Compliant" sticker on a
product before anyone else ?

On Sun, 20 Apr 2008, Jinx wrote:
> Wouldn't it be nice to have a "Y2K38 Compliant" sticker on a
> product before anyone else ?

Heh.. check out the bottom of my web page.. it's been compliant with a
particular date issue since pre-2000.  :-)

http://www.ian.org

--
Ian Smith

> Wouldn't it be nice to have a "Y2K38 Compliant" sticker on a
> product before anyone else ?

One afternoon, sitting in a Dr's office, I noticed that his exam light
had been certified to be Y2K compliant...  A light bulb, cord, switch,
lens, and some fiber...

Gargh..

The hysterics like to quote the end of the Mayan calendar, but I think
it's far more likely that whoever commissioned the work said something
like "oh, run it out to there, that's enough for now".   Notice that
pretty much any physical calendar you find has an "End Date"..

I still wonder why certain voting machines count votes with a signed integer.
David VanHorn wrote:

> The hysterics like to quote the end of the Mayan calendar, but I think
> it's far more likely that whoever commissioned the work said something
> like "oh, run it out to there, that's enough for now".   Notice that
> pretty much any physical calendar you find has an "End Date"..

I'm really ticked off at those Mayans, according to their calendar
I don't get to enjoy my birthday party. ;-) Ah, I'll just celebrate
it one day early and very hard. That way if it's the end of the
world then I'll be too busy to worry about it.

--
Linux Home Automation         Neil Cherry       ncherrylinuxha.com
http://www.linuxha.com/                         Main site
http://linuxha.blogspot.com/                    My HA Blog
Author of:            Linux Smart Homes For Dummies

>> Maybe there will be minor glitches but I do not think we need to

The thing is that time_t is much more of an "internal" format than
the two digit "year" that led to the Y2K "problems."  The number of
programs that store binary time_t in a database is vanishingly small
compared to the number of programs that stored "YY", and you could
probably change the base year, or the increment, and 80% of programs
wouldn't even notice...

BillW

I remember some of the articles before 2000 claimed that many elevators,
vehicles, machinery etc. would stop working in 2000. I guess many
microcontroller-based applications that control such products are written
in C and use time_t instead of "YY", but I am not sure an elevator would
stop working in 2038 because of this.

Tamas

On Sun, Apr 20, 2008 at 5:43 AM, William Chops Westfield <westfwmac.com>
wrote:

On 4/19/08, Jinx <joecolquittclear.net.nz> wrote:
> > How many computers built 30 years ago are still used right
> > now ? In 2038 will be no computer in use manufactured today.
> > Joe, this subject is a waste of EE time...
> > :)
>
> Yes, I should have put it in [OT] where all the programmers and
> engineers lurk. Seriously, as micros acquire PC functionality and
> WILL end up anywhere doing anything, it's an issue to address
> for future-proofing products. It's quite conceivable that a micro
> programmed 10 years from now (look how many legacy bugs
> MPASM has for example) relying on time_t will be in service for
> 20 years after that

Let's try it another way.
How many of the people involved in this discussion have designed and
built something resembling a PC architecture? Having an RTC with battery
backup and at least one PCI, PCIe, SATA etc. and one motherboard and a
few daughtercards with specific functions. Manufactured in a few dozen
pieces at least.
Because anyone here who has will definitely know that the predicted
life of such a device is below 7 years. Such equipment is rarely still
in service after 10 years.
And after 30 years definitely nobody is offering support for those,
even if the operating systems, BIOSes etc. are still available.
A computer is already 50% aged when you take it home from the shop.
There will be no problem in 2038, except maybe if we don't kill the
earth till then.
greetings,
Vasile

On Sun, 20 Apr 2008, Vasile Surducan wrote:


I wrote the software for a news distribution system back in 1990. It was a
big box with 96 telegraph lines. At its heart is an AT PC with specialised
ISA daughtercards. The software was written using MS C 5.1. The box was
expected to have an operational life in excess of 10 years and the
software was written to cope with years in excess of 1999.

Regards
Sergio
> Because if there is someone here, will definitely know that the
> predicted life of such device is below 7 years.

For hardware yes, but this is a software problem.

> Rarely such equipments
> are still in life after 10 years.

In places where things could matter HW and certainly SW is often used
much much longer. Just ~10y ago the Dutch Hoogovens (steel industry)
still used a number of PDP-11's. DEC did not even exist anymore. They
carefully stocked what was still usable of every one that died and
repaired what they could...

--

Wouter van Ooijen

-- -------------------------------------------
Van Ooijen Technische Informatica: http://www.voti.nl
consultancy, development, PICmicro products
docent Hogeschool van Utrecht: http://www.voti.nl/hvu

> How many of the people involved in this discussion have designed
> and built something looking closely to a PC arhitecture ?

But it's not about architecture, it's about the C type time_t, and
C-compiled code is and will be in micros for many years to come. I
personally have some original micros still in service after almost 20
years, and there's no reason why they won't go for another 20. They
don't have C in them, but ones in production do. Micros now are
quite capable of performing tasks that would once have required a
PC. As you say, you don't expect a PC to have a long lifetime, but
micros are completely different.

Dwayne Reid wrote:

> At 11:37 AM 4/19/2008, Harold Hallikainen wrote:
>
>>And, I'm using time_t in some PIC24 products. I wonder when the compiler
>>will be modified to use signed 64 bit values for time_t.
>
> This is a question from someone who doesn't grok C - can't they
> declare time_t as unsigned?

The implications are not really specific to C... How would you represent
dates before 1970 in that case? (I'm told there are people around who were
born before that year :)

Gerhard

Jinx wrote:

> Is the 2038 problem mentioned in C classes or tutorials ?

I don't think this is the most important question. IMO it is whether the
really needed time span is mentioned in the specs of the software to be
written. Probably most time bugs are due not to the programmer
misunderstanding the implications of using limited date ranges, but to
insufficiently specified operational date ranges.

Gerhard

> Because if there is someone here, will definitely know that the
> predicted life of such device is below 7 years. Rarely such equipments
> are still in life after 10 years.
It's not so much the equipment as the software, file formats, protocols,
ABIs etc. Those things often long outlive the machines they run on.

The definition of standard types like time_t cannot be changed without
causing an ABI break, and ABI breaks are an extremely big deal.

Most if not all 64-bit operating systems, when running native 64-bit
apps, use a 64-bit time_t, but they also support 32-bit apps which still
use a 32-bit time_t, and many programmers use int where they should use
time_t (int is still 32 bits on many 64-bit systems, including x64 Linux
and Windows).

And then there are all the ARM/ColdFire/etc. chips running a Unix-like
operating system. I don't see microcontrollers going 64-bit in the
foreseeable future (admittedly 30 years is a long time, but think how
many 8-bit microcontrollers are still in use today).

And then there is the issue of file formats. If there are only four
bytes in the file format and you need to be able to represent dates
before the epoch, you are kind of screwed.

I guess many of us on this list today will be the old retired or about
to retire coders who still know C and are called on to fix the bugs in
the years running up to 2038.

On Sun, 20 Apr 2008, Gerhard Fiedler wrote:


But you usually find that the person writing the specs does not know
about programming limitations; that's one of the tasks of the programmer,
to read the specs and highlight any problems he foresees (limitations).

The average H/W eng, chemist, microbiologist, vet etc. (all pretty
intelligent people) would assume that asking for a date / time stamp (on
data being captured) would work for all dates and not just some small
subset. To put this another way, the person writing the spec doesn't need
to tell the programmer that certain variables should be 8, 16, or 32 bits
wide integers or 32 or 64 bit floats. The programmer needs to sort that
out. So why should the spec writer assume anything less for dates?

Regards
Sergio
> I  personally have some original micros still in service
> after almost 20
> years, and there's no reason why they won't go for another
> 20.

Do they have UV / Time erasable EPROMS in them?

R

sergio masci wrote:

>> Jinx wrote:
>>
>>> Is the 2038 problem mentioned in C classes or tutorials ?
>>
>> I don't think this is the most important question. IMO it is whether the
>> really needed time span is mentioned in the specs of the software to be
>> written. Probably most of the time bugs are not due to misunderstanding on
>> the side of the programmer of the implications of using limited date
>> ranges, but not sufficiently specified operational date ranges.
>
> But you usually find that the person writing the specs does not know
> about programming limitations, that's one of the tasks of the programmer,
> to read the specs and highlight any problems he foresees (limitations).
>
> The average H/W eng, chemist, microbiologist, vet etc (all pretty
> intelligent people) would assume that asking for a date / time stamp (on
> data being captured) would work for all dates and not just some small
> subset.

I don't agree. Date representations are always limited, and writing a spec
for any project where dates are important requires a range spec for those
dates. I don't have any statistics, but I think that the majority of Y2K
fixes were due to someone saying "Dates up until 1999? That's plenty, let's
use this" somewhere along the spec path.

After the media exposure that Y2k got, no average person specifying a
program should assume that a valid date is just "given".

> To put this another way, the person writing the spec doesn't need
> to tell the programmer that certain variables should be 8, 16, or 32 bits
> wide integers or 32 or 64 bit floats. The programmer needs to sort that
> out.

This is only partly correct. Of course the spec doesn't (or doesn't have
to) specify how many bits a variable should have. But it needs to specify
whether it should work for values up to $1G or up to $1T for a given
entity, for example. For most private finance software, $1G may be good
enough to balance checking accounts; if you want to keep track of
government debt, you need to go higher (and $1T is not enough). That needs
to be specified, and the programmer then chooses the appropriate variable
structure. There's no way for a programmer to determine how to represent a
value if no required range is given.

> So why should the spec writer assume anything less for dates?

Not anything less, but exactly the same with dates... the spec needs to
specify the required ranges, and the programmer chooses the appropriate
representation. It's not the programmer's job to know whether the client
wants to spend the extra money to be able to compute beyond 2038 now, or
rather leave that until later. I'm certain that for many MP3 players in the
market that contain some time/date handling, the time_t type fits the specs
and the client doesn't want anything different (better and probably more
expensive). I'm also certain that for some other devices it doesn't. But
even if you used a date spec that can represent all dates from year 1 to
year 9999 (for example) you can't know whether this is enough for the job.
You may have to be able to go further into the past or the future, or
handle different calendars, or ... This is spec stuff.

It still is the programmer's responsibility (a shared responsibility) to
make sure the specs are complete enough. But the decision about the
required range is outside of the programming domain; it's in the spec
domain.

Sometimes a programmer can infer something from the context -- but whenever
you infer something, you better make sure that it gets added to the spec,
even if informally (like in an email where the client or spec writer
confirms that your assumptions are correct). You may infer wrongly.

Gerhard

> Probably most of the time bugs are not due to misunderstanding on
> the side of the programmer of the implications of using limited date
> ranges, but not sufficiently specified operational date ranges
Thing is, even if a programmer knows about the issue, if the OS uses a
32-bit signed time_t (e.g. most Unix-like operating systems for 32-bit
or lower bit count processors) there isn't much they can do.

Why don't we start thinking about dates in terms of scientific notation? In
other words, we don't need to keep track of ranges of time longer than our
current precision, but when we set an arbitrary starting point, and use a
number system that doesn't include an exponent, eventually it has to fail.
Why not use a mantissa and exponent for the year or overall date?

Then you can represent dates +/- the precision of the mantissa centered on
the year with exponent.

--
James.

So what are your great, great, great... grandchildren going to do in
10001?

--
James.


>> This is a question from someone who doesn't grok C - can't they
>> declare time_t as unsigned?

In most applications, if used with care, this will work fine. Unfortunately
time_t is used to hold two completely different kinds of quantities:

A) Points in time: like "15 seconds after noon on December 12, 2012"

B) Durations like "6 days, 2 hours, 27 minutes, 33 seconds" (expressed as a
number of seconds).

It is the latter use that can easily cause trouble. If an interval is
expressed as 'end-start' the result is positive, but if it is represented as
'start-end' it becomes negative.

--- Bob Ammerman
RAm Systems

James Newton wrote:

> Why don't we start thinking about dates in terms of scientific notation? In
> other words, we don't need to keep track of ranges of time longer than our
> current precision, but when we set an arbitrary starting point, and use a
> number system that doesn't include an exponent, eventually it has to fail.
> Why not use a mantissa and exponent for the year or over all date?

The problem with mantissa and exponent is that the resolution depends on
the distance from the reference point 0. That has its own set of problems.

Also for going back into the past things are a bit more complex. Once you
go back more than a couple hundred years, you usually have to take into
account the different calendars that at different times in different
cultures were used. Subtracting 730'485 days from today and saying that
this was April 20, 8 is not going to help anybody much.

Date as we know it is not a scientific measure, it's a social measure.
(Here the dreaded "human factor" creeps in again :)

Gerhard

"03:14:07 UTC on Tuesday, January 19, 2038," is not the universal date
for this projected epoch. Things are getting localised at a fast rate,
and I believe that regions with daylight savings time are going to be
affected at a time different than the projected one.

Places in the southern hemisphere will be on DST in January. New Zealand,
being the easternmost timezone, will be the first one to be hit by the
'bug.'

Soumendra
--
If thought is life and strength and breath
And the want of thought is death
Then am I a happy fly
If I live
Or if I die

On Sun, Apr 20, 2008 at 12:07:31PM -0700, James Newton wrote:
> Why don't we start thinking about dates in terms of scientific notation? In
> other words, we don't need to keep track of ranges of time longer than our
> current precision, but when we set an arbitrary starting point, and use a
> number system that doesn't include an exponent, eventually it has to fail.
> Why not use a mantissa and exponent for the year or over all date?
>
> Then you can represent dates +/- the precision of the mantissa centered on
> the year with exponent.

In general, why bother?

A 64-bit signed integer, with 1 second resolution, gives you 292 billion
years before and after the epoch. I'd hate to think that we'd be still
remembering the 70's until well after the sun has blown up.

If that's not enough, just upgrade to 256-bit resolution, which has
enough bits to encompass Wikipedia's "Orders of Magnitude (time)"
section from Planck time, 10^-44, to 10^11 (closed universe total
lifetime) with another 14-odd digits to spare.

That said, from a real-world implementation perspective, go for it.
Modern computers add 64-bit floating point doubles in one clock cycle,
just like 64-bit integers. There is precedent as well: Python, for
instance, uses floats for everything time related; time() returns the
usual seconds since 1970 plus as many fractional seconds of resolution
as the OS provides, microseconds on Linux for instance.

The Tuke EDA toolkit I'm writing uses standard 64-bit floats to store all
dimension data. Most EDA tools I know of tend to store geometry data as
integers, gEDA for instance has a 1 micron resolution, but I figured
it'd be easier just to use floats given that most math and geometry
libraries expect them. Besides, it's fun to think that with meters as my
underlying unit, 99.99% of designs will never have a dimension of more
than 1.0 internally.

--
http://petertodd.org 'peter'[:-1]@petertodd.org

> Why don't we start thinking about dates in terms of scientific notation?
> In
> other words, we don't need to keep track of ranges of time longer than our
> current precision, but when we set an arbitrary starting point, and use a
> number system that doesn't include an exponent, eventually it has to fail.
> Why not use a mantissa and exponent for the year or over all date?
>
> Then you can represent dates +/- the precision of the mantissa centered on
> the year with exponent.
>
> --
> James.

What happens if you increment a floating point number once the number is
high enough that the lsb is 2 (instead of 1 or some fraction)? Does time
stop?

Harold

--
FCC Rules Updated Daily at http://www.hallikainen.com - Advertising
opportunities available!
>I worked for a company that had a 1990 rollover problem.  The software was
>written in 1980, and to save space on the twin floppies, only one digit was
>used for the year, so 3 meant 1983.  All good until 1989, when the problem
>was realised, and a solution was found: make it two digits!  Yeah, we know,
>it'll fail in 2000, but we won't be using it then...

I remember a similar scenario in a company I worked for. They had a legal
package where someone had done some 'quick and dirty' coding to get the
product out the door. Again it lasted about 10 years before falling over
with errors. We hardware engineers came in on the Monday morning to a heap
of calls, only to find it was a software problem ... ;)

The software department did get some ribbing over that one ...

>I suppose so, but then they'd have to rewrite the functions anyway. Why
>not just go out to signed 64 bit now? Though I suspect that some people
>are storing date/time in 4 bytes, and this upgrade would mess them up.

And why waste 3 bytes, when really all that is needed is 40 bits .... ???
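The 4-byte storage point above is easy to demonstrate: a signed 32-bit field simply cannot hold a timestamp past the 2038 instant. A quick sketch in Python, using struct as a stand-in for any fixed-width binary record format:

```python
import struct

T_MAX = 2**31 - 1                    # last second a signed 32-bit time_t can hold

struct.pack(">i", T_MAX)             # fits: 03:14:07 UTC, 19 Jan 2038
try:
    struct.pack(">i", T_MAX + 1)     # one second later overflows the field
except struct.error as e:
    print("overflow:", e)
```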
>> Because anyone here will definitely know that the predicted life of
>> such a device is below 7 years.
>
>For hardware yes, but this is a software problem.
>
>> Such equipment is rarely still in service after 10 years.

And how many of us have replaced the BIOS battery in PCs that have gone
through that sort of age ???

>In places where things could matter, HW and certainly SW is often used
>much, much longer. Just ~10 years ago the Dutch Hoogovens (steel industry)
>still used a number of PDP-11's; DEC did not even exist anymore. They
>carefully stocked what was still usable from every one that died and
>repaired what they could...

and it is reported (possibly apocryphally) that NASA trawls eBay for 8086-
based equipment to support the ground-based portion of shuttle missions ...

>So what are your great, great, great.... grand children going
>to do in 10001?

"In the year 2525
If man is still alive
If women can survive ..."

On Sun, Apr 20, 2008 at 06:50:56PM -0700, Harold Hallikainen wrote:
{Quote hidden}

That's exactly what happens:

0.000000000001 + 10000.0
= 10000.000000000002

0.000000000001 + 100000.0
= 100000.0
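You can reproduce this in any language with IEEE-754 doubles; in Python:

```python
import math

# Near 1e4 the spacing between adjacent doubles (~1.8e-12) is still finer
# than the increment, so the sum rounds up to the next representable value...
print(10000.0 + 0.000000000001)    # 10000.000000000002

# ...but near 1e5 the spacing (~1.5e-11) swallows the increment entirely.
print(100000.0 + 0.000000000001)   # 100000.0

# math.ulp gives the spacing of doubles at a given magnitude directly.
print(math.ulp(10000.0))           # 2**-39, about 1.8e-12
```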

--
http://petertodd.org 'peter'[:-1]@petertodd.org
Soumendra wrote:
>
> "03:14:07 UTC on Tuesday, January 19, 2038," is not the universal date
> for this projected epoch. Things are getting localised at a fast rate,
> and I believe that regions with daylight savings time are going to be
> affected at a time different than the projected one.
>
Indeed, if an app uses the Unix-time-style format to store local time as
well as universal time, it will be hit by the bug slightly earlier in
certain timezones.

We are talking under a day's difference, though, so this is mostly nitpicking.
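The exact universal rollover instant is easy to check (a Python sketch; the local wall-clock time at that instant of course varies by timezone):

```python
from datetime import datetime, timedelta

epoch = datetime(1970, 1, 1)                       # Unix epoch, in UTC

rollover = epoch + timedelta(seconds=2**31 - 1)    # last representable second
print(rollover)    # 2038-01-19 03:14:07

# One second later, a signed 32-bit counter wraps around to -2**31:
wrapped = epoch + timedelta(seconds=-2**31)
print(wrapped)     # 1901-12-13 20:45:52
```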

On Mon, 21 Apr 2008, Alan B. Pearce wrote:

> >So what are your great, great, great.... grand children going
> >to do in 10001?
>
> "In the year 2525
> If man is still alive
> If women can survive ..."

Let me guess, a cleopatra fan?
> > "In the year 2525
> > If man is still alive
> > If women can survive ..."
>
> Let me guess, a cleopatra fan?

Zager and Evans!
>> > "In the year 2525
>> > If man is still alive
>> > If women can survive ..."
>>
>> Let me guess, a cleopatra fan?
>
>Zager and Evans!

That's them, I couldn't remember who it was that sang it.

And as to the comment about 10001, well if the program can last (near
enough) 8 millennia, and still find operational hardware that can run it,
then I guess it will be doing extremely well.

> The thing is that time_t is much more of an "internal" format than
> the two digit "year" that led to the Y2K "problems."  The number of
> programs that store binary time_t in a database is vanishingly small
> compared to the number of programs that stored "YY", and you could
> probably change the base year, or the increment, and 80% of programs
> wouldn't even notice...

I'm not so sure. I was in a university research lab today, and there,
battered in a corner, was a computer from circa the mid nineties. It still
requires a 5.25" floppy to crank it up, runs DOS, and is still used, as
it still performs the measurements required by their fluoroscopic
microscopy. I have seen lab equipment made by a well known Austrian
lab equipment company that still uses DOS based firmware to run their
systems. Apart from providing the experiment date, not much else
is date based - but the point is, if high tech facilities are using
what many or most would consider outdated technology, it isn't beyond
the realms of possibility that in 2038 computers or equipment made in
2000 will still be in use.
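The "change the base year, or the increment" idea quoted above can be sketched concretely: the same 32 bits on disk, reinterpreted as unsigned against the 1970 epoch, buy decades more range. A hypothetical Python sketch, not how any particular OS actually handles it:

```python
import struct
from datetime import datetime, timedelta

# A post-2038 timestamp that has "wrapped" negative in a signed 32-bit field:
raw = struct.pack(">i", -2**31 + 100)

# Reinterpreting the same 4 bytes as unsigned extends the range to 2106,
# at the cost of no longer representing pre-1970 dates.
(u,) = struct.unpack(">I", raw)
print(datetime(1970, 1, 1) + timedelta(seconds=u))   # a date in 2038, not 1901
```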

Colin
--
cdb, colinbtech-online.co.uk on 24/04/2008

Web presence: http://www.btech-online.co.uk

Hosted by:  http://www.1and1.co.uk/?k_id=7988359



{Quote hidden}

I think the software will live longer than the hardware. The Y2K issues, I
believe, were due to code written in the 1960s and 1970s in COBOL still
running fine on relatively recent hardware. We may find some of our own
code, especially if it gets stuck in a library somewhere so no one ever
looks at the source, still running in 30 years.

Harold

--
FCC Rules Updated Daily at http://www.hallikainen.com - Advertising
opportunities available!
