Epoch (computing)


In computing, an epoch is a date and time from which a computer measures system time. Most computer systems determine time as a number representing the seconds that have elapsed from some particular arbitrary date and time. For instance, Unix and POSIX measure time as the number of seconds that have passed since 00:00:00 UT on 1 January 1970, a point in time known as the Unix epoch. The NT time epoch on Windows NT and later refers to the Windows NT system time, counted in 100-nanosecond intervals from 00:00 on 1 January 1601.
Computing epochs are nearly always specified as midnight Universal Time on some particular date.
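
As a minimal sketch of the Unix case in Python (assuming a POSIX-style clock, where time.time() returns seconds since the Unix epoch), the count 0 maps straight back to the epoch itself:

```python
import time
from datetime import datetime, timezone

# Current system time, as a count of seconds since the Unix epoch.
now = time.time()

# The count 0 corresponds to the epoch itself.
print(datetime.fromtimestamp(0, tz=timezone.utc))    # 1970-01-01 00:00:00+00:00
print(datetime.fromtimestamp(now, tz=timezone.utc))  # the current date and time
```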

Variation in detail

Software timekeeping systems vary widely in the precision of time measurement; some systems may use time units as large as a day, while others may use nanoseconds. For example, for an epoch date of midnight UTC on 1 January 1900, and a time unit of a second, the time of the midnight between 1 January 1900 and 2 January 1900 is represented by the number 86400, the number of seconds in one day. When times prior to the epoch need to be represented, it is common to use the same system, but with negative numbers.
Such representation of time is mainly for internal use. On systems where date and time are important in the human sense, software will nearly always convert this internal number into a date and time representing a human calendar.
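
A short Python sketch of both points, using the Unix epoch of 1 January 1970 rather than the 1900 epoch of the example above (note that negative timestamps are rejected on some platforms):

```python
from datetime import datetime, timezone

# With a time unit of one second, the midnight one day after the epoch
# is represented by 86400, the number of seconds in one day.
print(datetime.fromtimestamp(86400, tz=timezone.utc))   # 1970-01-02 00:00:00+00:00

# Times before the epoch use the same scheme with negative numbers.
print(datetime.fromtimestamp(-86400, tz=timezone.utc))  # 1969-12-31 00:00:00+00:00
```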

Epoch in satellite-based time systems

There are at least six satellite navigation systems, all of which function by transmitting time signals. Of the only two satellite systems with global coverage, GPS calculates its time signal from an epoch, whereas GLONASS calculates time as an offset from UTC, with the UTC input adjusted for leap seconds. Of the only two other systems aiming for global coverage, Galileo calculates from an epoch and BeiDou calculates from UTC without adjustment for leap seconds. GPS also transmits the offset between UTC time and GPS time, and must update this offset every time there is a leap second, requiring GPS receiving devices to handle the update correctly. In contrast, leap seconds are transparent to GLONASS users.
The complexities of calculating UTC from an epoch are explained by the European Space Agency in Galileo documentation under "Equations to correct system timescale to reference timescale".
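
At its core, that correction is a fixed epoch offset plus the accumulated leap-second count. A Python sketch follows; the function and its interface are invented for illustration, it ignores GPS week-number rollover, and it assumes the 18-second GPS-UTC offset in effect since the end of 2016 (a real receiver takes the offset from the broadcast navigation message).

```python
from datetime import datetime, timedelta, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)

def gps_to_utc(week: int, seconds_of_week: float, leap_offset: int = 18) -> datetime:
    """Convert a GPS week/seconds-of-week pair to UTC (illustrative sketch).

    GPS time runs ahead of UTC by leap_offset seconds because GPS does not
    apply leap seconds; the offset valid at the time of interest must be used.
    """
    gps_time = GPS_EPOCH + timedelta(weeks=week, seconds=seconds_of_week)
    return gps_time - timedelta(seconds=leap_offset)
```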

Problems with epoch-based computer time representation

Computers do not generally store arbitrarily large numbers. Instead, each number stored by a computer is allotted a fixed amount of space. Therefore, when the number of time units that have elapsed since a system's epoch exceeds the largest number that can fit in the space allotted to the time representation, the time representation overflows, and problems can occur. While a system's behavior after overflow occurs is not necessarily predictable, in most systems the number representing the time will reset to zero, and the computer system will think that the current time is the epoch time again.
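
A toy Python sketch of that wraparound, assuming an unsigned 32-bit counter of seconds:

```python
def tick(t: int) -> int:
    """Advance an unsigned 32-bit second counter, wrapping on overflow."""
    return (t + 1) & 0xFFFFFFFF  # keep only the low 32 bits

print(tick(2**32 - 1))  # 0 -- the counter has wrapped back to the epoch
```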
Most famously, older systems which counted time as the number of years elapsed since the epoch of 1900, and which only allotted enough space to store the numbers 0 through 99, experienced the Year 2000 problem. These systems would interpret the year 2000 as the year 1900, leading to unpredictable errors at the beginning of the year 2000.
Even systems which allocate more storage to the time representation are not immune from this kind of error. Many Unix-like operating systems which keep time as seconds elapsed from the epoch date of 1 January 1970, and allot timekeeping enough storage to store numbers as large as 2,147,483,647 (the largest signed 32-bit value), will experience an overflow problem on 19 January 2038 if not fixed beforehand. This is known as the Year 2038 problem. A correction involving doubling the storage allocated to timekeeping on these systems will allow them to represent dates more than 290 billion years into the future.
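
The rough arithmetic behind that figure, as a one-line check in Python: a signed 64-bit count of seconds lasts roughly 292 billion years before overflowing.

```python
SECONDS_PER_YEAR = 365.2425 * 86400  # average Gregorian year, in seconds

years = (2**63 - 1) / SECONDS_PER_YEAR
print(f"about {years / 1e9:.0f} billion years")  # about 292 billion years
```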
Other more subtle timekeeping problems exist in computing, such as accounting for leap seconds, which are not observed with any predictability or regularity. Additionally, applications which need to represent historical dates and times must use specialized timekeeping libraries.
Finally, some software must maintain compatibility with older software that does not keep time in strict accordance with traditional timekeeping systems. For example, Microsoft Excel observes the fictional date of 29 February 1900 in order to maintain bug compatibility with older versions of Lotus 1-2-3. Lotus 1-2-3 observed the date due to an error; by the time the error was discovered, it was too late to fix it—"a change now would disrupt formulas which were written to accommodate this anomaly".
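
A hedged Python sketch of the resulting conversion quirk (the helper is invented for illustration): serial number 60 in Excel's 1900 date system is the fictional 29 February 1900, so every later serial number is off by one relative to the real calendar.

```python
from datetime import date, timedelta

def excel_1900_serial_to_date(serial: int) -> date:
    """Convert an Excel 1900-system serial day number to a calendar date."""
    if serial == 60:
        raise ValueError("serial 60 is the nonexistent 29 February 1900")
    offset = 1 if serial > 60 else 0     # skip the fictional leap day
    return date(1899, 12, 31) + timedelta(days=serial - offset)

print(excel_1900_serial_to_date(1))   # 1900-01-01
print(excel_1900_serial_to_date(61))  # 1900-03-01
```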

Notable epoch dates in computing

The following table lists epoch dates used by popular software and other computer-related systems. The time in these systems is stored as the quantity of a particular time unit that has elapsed since a stated time.
Epoch date | Notable uses | Rationale for selection
0 January 1 BC | MATLAB |
1 January AD 1 | Microsoft .NET, Go, REXX, Rata Die | Common Era, ISO 2014, RFC 3339
14 October 1582 | SPSS |
15 October 1582 | UUID version 1 | The date of the Gregorian reform to the Christian calendar.
1 January 1601 | NTFS, COBOL, Win32/Win64 | 1601 was the first year of the 400-year Gregorian calendar cycle at the time Windows NT was made.
31 December 1840 | MUMPS programming language | 1841 was a non-leap year several years before the birth year of the oldest living US citizen when the language was designed.
17 November 1858 | VMS, United States Naval Observatory, DVB SI 16-bit day stamps, other astronomy-related computations | 17 November 1858, 00:00:00 UT is the zero of the Modified Julian Day, equivalent to Julian day 2400000.5.
30 December 1899 | Microsoft COM DATE, Object Pascal, LibreOffice Calc, Google Sheets | Technical internal value used by Microsoft Excel; for compatibility with Lotus 1-2-3.
31 December 1899 | Dyalog APL, Microsoft C/C++ 7.0 | Chosen so that the number of days since the epoch, modulo 7, would produce 0=Sunday, 1=Monday, 2=Tuesday, 3=Wednesday, 4=Thursday, 5=Friday, and 6=Saturday. Microsoft's last version of non-Visual C/C++ used this, but it was subsequently reverted.
0 January 1900 | Microsoft Excel, Lotus 1-2-3 | While logically 0 January 1900 is equivalent to 31 December 1899, these systems do not allow users to specify the latter date. Since 1900 is incorrectly treated as a leap year in these systems, 0 January 1900 actually corresponds to the historical date of 30 December 1899.
1 January 1900 | Network Time Protocol, IBM CICS, Mathematica, RISC OS, VME, Common Lisp, Michigan Terminal System |
1 January 1904 | LabVIEW, Apple Inc.'s classic Mac OS, JMP Scripting Language, Palm OS, MP4, Microsoft Excel, IGOR Pro | 1904 is the first leap year of the 20th century.
1 January 1960 | SAS System |
31 December 1967 | Pick OS and variants | Chosen so that the number of days since the epoch, modulo 7, would produce 0=Sunday, 1=Monday, 2=Tuesday, 3=Wednesday, 4=Thursday, 5=Friday, and 6=Saturday.
1 January 1970 | Unix Epoch, also known as POSIX time; used by Unix and Unix-like systems, and by programming languages: most C/C++ implementations, Java, JavaScript, Perl, PHP, Python, Ruby, Tcl, ActionScript. Also used by the Precision Time Protocol. |
1 January 1978 | AmigaOS. The Commodore Amiga hardware systems were introduced between 1985 and 1994. Latest OS version 4.1. AROS, MorphOS. |
1 January 1980 | IBM BIOS INT 1Ah, DOS, OS/2, FAT12, FAT16, FAT32, exFAT filesystems | The IBM PC with its BIOS, as well as 86-DOS, MS-DOS and PC DOS with their FAT12 file system, were developed and introduced between 1980 and 1981.
6 January 1980 | Qualcomm BREW, GPS, ATSC 32-bit time stamps | GPS counts weeks, and 6 January is the first Sunday of 1980.
1 January 2000 | AppleSingle, AppleDouble, PostgreSQL, ZigBee UTCTime |
1 January 2001 | Apple's Cocoa framework | 2001 is the year of the release of Mac OS X 10.0.
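
Because these epochs differ only by fixed offsets and time units, converting a timestamp between systems is simple arithmetic. A Python sketch for one common pair, Windows FILETIME (100-nanosecond ticks since 1 January 1601) and Unix time; the 11,644,473,600-second constant is the gap of 369 years, including 89 leap days, between the two epochs:

```python
EPOCH_DIFF_SECONDS = 11_644_473_600   # 1601-01-01 to 1970-01-01, in seconds
TICKS_PER_SECOND = 10_000_000         # FILETIME counts 100 ns ticks

def filetime_to_unix(filetime: int) -> float:
    """Convert a Windows FILETIME value to seconds since the Unix epoch."""
    return filetime / TICKS_PER_SECOND - EPOCH_DIFF_SECONDS

def unix_to_filetime(unix_seconds: float) -> int:
    """Convert seconds since the Unix epoch to a Windows FILETIME value."""
    return round((unix_seconds + EPOCH_DIFF_SECONDS) * TICKS_PER_SECOND)

print(filetime_to_unix(116_444_736_000_000_000))  # 0.0 -- the Unix epoch
```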