What is Unix epoch time?
Unix epoch time (also called POSIX time or Unix timestamp) is the number of seconds elapsed since 1970-01-01T00:00:00Z, not counting leap seconds. It is the universal way to store moments in time inside Unix-derived systems, databases, log files, and almost every programming language. A timestamp of 0 means midnight UTC at the start of 1970; 1000000000 was September 9, 2001.
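The two reference points above are easy to verify in Python (any language's epoch functions will agree):

```python
from datetime import datetime, timezone

# Render two well-known epoch values as UTC instants.
for ts in (0, 1_000_000_000):
    dt = datetime.fromtimestamp(ts, tz=timezone.utc)
    print(ts, "->", dt.isoformat())
```

The second line prints 2001-09-09T01:46:40+00:00, the billion-second mark of September 9, 2001.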
Why is my timestamp 13 digits?
13 digits means milliseconds since the epoch — the precision JavaScript Date.now() returns. 10 digits is seconds (the original Unix convention used by C, Python time(), Go, and most databases), 16 digits is microseconds (Python time.time_ns() // 1000, also PostgreSQL's internal timestamp resolution), and 19 digits is nanoseconds (Go time.UnixNano(), Python time.time_ns()). This converter detects the digit count and picks the right precision automatically.
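A sketch of that detection logic — detect_unit and its digit thresholds are illustrative, not the converter's actual code:

```python
def detect_unit(ts: int) -> str:
    """Guess the precision of an epoch value from its digit count
    (hypothetical helper mirroring what the converter does)."""
    digits = len(str(abs(ts)))
    if digits <= 11:       # ~10 digits: seconds
        return "seconds"
    if digits <= 14:       # ~13 digits: milliseconds
        return "milliseconds"
    if digits <= 17:       # ~16 digits: microseconds
        return "microseconds"
    return "nanoseconds"   # ~19 digits
```

The bands overlap slightly (an 11-digit value is still almost certainly seconds, since 13-digit milliseconds start at 1973), which is why thresholds rather than exact digit counts are used.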
What is the Y2038 problem?
The Y2038 problem is the moment a 32-bit signed integer storing Unix seconds overflows: 03:14:07 UTC on January 19, 2038 (2147483647 seconds). Systems still using time_t as int32 will wrap to a negative number and interpret the time as December 13, 1901. Modern Linux, macOS, BSD, and 64-bit kernels use a 64-bit time_t, which does not overflow for roughly 292 billion years. Embedded devices and old binaries are the remaining risk.
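The overflow is easy to simulate by reinterpreting an epoch value as a 32-bit integer (a sketch — real failures happen inside C code holding a 32-bit time_t):

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2147483647 seconds == 03:14:07 UTC, 2038-01-19

def as_int32(ts: int) -> int:
    """Reinterpret an epoch value as a signed 32-bit integer (wraparound)."""
    return (ts + 2**31) % 2**32 - 2**31

wrapped = as_int32(INT32_MAX + 1)   # one second past the limit
dt = datetime.fromtimestamp(wrapped, tz=timezone.utc)
print(wrapped, dt)                  # -2147483648, a date in December 1901
```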
Are negative timestamps valid?
Yes — a negative Unix timestamp represents a date before 1970-01-01 UTC. -1 is 1969-12-31T23:59:59Z, -2208988800 is 1900-01-01. JavaScript Date handles negative epoch values correctly, and so does this converter when you paste a leading minus sign. Many databases and APIs reject them in their schema validation, however, so check before storing historical events as negative epochs.
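In Python, for example, negative epochs round-trip cleanly when you attach an explicit UTC timezone:

```python
from datetime import datetime, timezone

# Dates before 1970 are just negative second counts.
print(datetime.fromtimestamp(-1, tz=timezone.utc))           # 1969-12-31 23:59:59+00:00
print(datetime.fromtimestamp(-2208988800, tz=timezone.utc))  # 1900-01-01 00:00:00+00:00
```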
How do I get the current Unix time in different languages?
JavaScript: Math.floor(Date.now()/1000) for seconds, Date.now() for ms. Python: int(time.time()) or time.time_ns(). Go: time.Now().Unix() or .UnixNano(). PostgreSQL: extract(epoch from now()). Bash: date +%s (or date +%s%3N for ms with GNU date). Java: Instant.now().getEpochSecond() / .toEpochMilli(). Click the Now button above to see the current epoch in this tool.
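The Python variants from the list above, runnable as-is:

```python
import time

secs = int(time.time())   # 10-digit seconds
ns = time.time_ns()       # 19-digit nanoseconds
ms = ns // 1_000_000      # 13-digit milliseconds, derived from the ns clock
print(secs, ms, ns)
```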
What is the difference between epoch in seconds vs milliseconds?
They count the same instant but at different precisions. Seconds give whole-second resolution (1714471234), milliseconds give thousandths (1714471234567). Convert by multiplying or dividing by 1000. JavaScript natively works in ms, most Unix tools in seconds. Mixing them is the most common bug — a 13-digit value passed where 10 is expected lands you in the year 56000.
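A sketch of that failure mode: feed a millisecond value where seconds are expected and the implied date lands tens of millennia out (31_556_952 is the average Gregorian year in seconds):

```python
ms = 1714471234567     # 13 digits: milliseconds
secs = ms // 1000      # correct conversion: 1714471234

# Misread as seconds, the same number lies ~54,000 years in the future,
# past datetime.MAXYEAR (9999), so estimate the year arithmetically.
approx_year = 1970 + ms // 31_556_952
print(secs, approx_year)
```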
Can I convert milliseconds back to seconds losslessly?
Going from ms → seconds always loses sub-second precision (1714471234567 ms truncates to 1714471234 s; the trailing 567 ms are gone). Going seconds → ms appends three zeros and is lossless because seconds had no sub-second info to lose. The same applies between ms/μs and μs/ns: dividing loses precision, multiplying preserves it.
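The asymmetry in one short snippet, using the same values as above:

```python
ms = 1714471234567

secs = ms // 1000        # down-convert: the 567 ms fraction is discarded
back = secs * 1000       # 1714471234000, not the original value
up = 1714471234 * 1000   # seconds -> ms: appends zeros, nothing lost
print(secs, back, up)
```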
How accurate are Unix timestamps?
A Unix timestamp is an exact integer count of SI seconds (or smaller units) and is therefore as precise as the source clock. The browser Date object is millisecond-accurate, performance.now() offers sub-millisecond resolution (deliberately coarsened by browsers to blunt timing attacks), and OS-level monotonic clocks reach nanoseconds. Wall-clock accuracy depends on NTP sync — most servers stay within a few milliseconds of UTC. Leap seconds are not counted: during a leap second insertion the epoch count effectively repeats one second, or is smeared over many hours by servers that implement leap smearing.