[quote name='Servant of the Lord' timestamp='1347833400' post='4980723']
I bet you think a minute has 60 seconds in it, too.
I will confirm that statement, depends the start point of second counting.
Else i am not scientist and approx value of 60 seconds in minute suffices for my needs.
And much thanks for explaining that sizeof function works i will do homework
[/quote]
It's not about being scientific, or about a minute having approximately 60 seconds, or about it depending on how you count. A scientific minute does have exactly 60 seconds by definition. The problem here is the effect of using a time scale that is bound to the rotation of the earth, like GMT or UTC. Leap seconds are added to (or, in principle, removed from) that time scale to keep the clock within 0.9 seconds of the time defined by the earth's rotation. On some days, the very last minute of the very last hour of that day has either 61 or 59 seconds instead of the normal 60. If you have a time-stamp critical application, you may be in trouble when some systems are suddenly off by a second and your time stamps are no longer in sync.
There are other time systems that don't have leap seconds, in which a minute really is always 60 seconds: atomic time, for example. At its introduction, atomic time was set to match GMT. The difference is that atomic time ticks strictly according to the definition of a second, 60 seconds to a minute and so on, but with no leap seconds inserted. Today, atomic time is 30 or so seconds ahead of GMT/UTC because of the accumulated leap seconds.
Is this necessary to know? In everyday life, probably not. You're most likely fine assuming a minute always has 60 seconds, just like in everyday development you can assume that a byte has eight bits. But some applications are time-stamp critical, and on some platforms a byte isn't eight bits, and then your assumptions are wrong. There is often more behind trivial and well-known facts than there seems, and it is not always as easy as it may sound.