Unix time

Unix time, or POSIX time, represents a date and time as the number of seconds since the "epoch" of January 1, 1970 at 00:00 (midnight) UTC. Seconds for this purpose are reckoned so that a calendar day is always exactly 86,400 seconds, which means that leap seconds are skipped in the numbering.
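As a sketch of the conversion in Python (using the standard library; the sample timestamp 1,000,000,000 is an arbitrary illustration):

```python
from datetime import datetime, timezone

# A Unix timestamp maps to a UTC calendar date by counting whole
# 86,400-second days from 1970-01-01; leap seconds are not counted.
ts = 1_000_000_000
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 2001-09-09T01:46:40+00:00

# The reverse conversion recovers the same count of seconds.
print(int(dt.timestamp()))  # 1000000000
```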

Strictly speaking, the modern definition of UTC with leap seconds did not come into use until 1972, so the definition of the 1970 epoch is a bit hazy (before 1972, seconds were actually variable in length, to match the varying length of the solar day). In practice, however, the standard simply consists of days of a fixed number of seconds extending indefinitely into the past and future, with the count behaving slightly irregularly at the exact moment of a leap second.

Many file formats contain timestamps of this form, stored either as a binary-encoded number to some degree of precision, or as ASCII strings of digits. When stored as binary, the endianness of the representation is an issue for the file format.
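A minimal illustration of the endianness issue, packing the same sample timestamp both ways with Python's struct module:

```python
import struct

ts = 1_000_000_000  # sample timestamp (0x3B9ACA00)

big = struct.pack(">I", ts)     # big-endian unsigned 32-bit
little = struct.pack("<I", ts)  # little-endian unsigned 32-bit
print(big.hex())     # 3b9aca00
print(little.hex())  # 00ca9a3b

# Reading the bytes with the wrong endianness silently yields
# a completely different (and plausible-looking) timestamp:
wrong = struct.unpack(">I", little)[0]
print(wrong)  # 13277755
```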

When stored as a signed 32-bit integer, Unix timestamps will overflow at 03:14:08 UTC on 19 January 2038, one second after the last representable moment. This may become a software problem similar to the much-hyped "Y2K" problem, in which software using 2-digit years malfunctioned when the year 2000 arrived.

Switching to unsigned integers would extend the useful life of the format through 06:28:15 UTC on 7 February 2106, at the expense of removing the ability to represent times before the epoch with negative numbers.
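The two 32-bit limits can be verified directly by converting the maximum signed and unsigned values:

```python
from datetime import datetime, timezone

# Largest timestamps representable in 32 bits, signed and unsigned.
signed_max = 2**31 - 1    # 2,147,483,647
unsigned_max = 2**32 - 1  # 4,294,967,295

print(datetime.fromtimestamp(signed_max, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00
print(datetime.fromtimestamp(unsigned_max, tz=timezone.utc))
# 2106-02-07 06:28:15+00:00
```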

The more robust solution would be to adopt 64-bit integers, which would support times through the year 292,277,026,596.

If fractional seconds are needed, the number of bits required and the time of overflow will vary. Milliseconds (1/1000 sec) are often used, adding three decimal digits to the representations.
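A short sketch of millisecond timestamps (the sample value is arbitrary; timedelta is used rather than floating-point division to keep the arithmetic exact):

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# A millisecond timestamp is the whole-second count scaled by 1000,
# i.e. three extra decimal digits: here, .123 s past the sample second.
ms = 1_000_000_000_123
dt = EPOCH + timedelta(milliseconds=ms)
print(dt.isoformat(timespec="milliseconds"))
# 2001-09-09T01:46:40.123+00:00
```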