Epoch Time Converter — Seconds, Milliseconds, Microseconds, Nanoseconds

Paste any Unix epoch value and instantly see it in all four precisions plus a human-readable date. Auto-detects whether your number is seconds, ms, μs, or ns from its digit count, and uses BigInt math so nanoseconds never lose precision.

1 second = 1,000 ms = 1,000,000 μs = 1,000,000,000 ns
10 digits ⇒ seconds · 13 ⇒ ms · 16 ⇒ μs · 19 ⇒ ns

What is an epoch time converter?

An epoch time converter takes a Unix timestamp expressed in any precision — seconds, milliseconds, microseconds, or nanoseconds — and returns the equivalent value in every other precision plus a human-readable date. Modern systems disagree on which unit they emit (Go uses nanoseconds, JavaScript uses milliseconds, classic Unix uses seconds, PostgreSQL stores microseconds), so a converter is essential glue when piping logs, metrics, and database rows between languages.

This converter uses native BigInt arithmetic so a 19-digit nanosecond value is never coerced to an IEEE-754 double — meaning no silent precision loss. Detection is automatic from the digit count, and the "Now" button fills the field with the current instant in whichever precision you have selected.

How to convert epoch precision — 4 steps

  1. Paste your epoch value. Any 10/13/16/19-digit integer works. Auto-detect figures out the unit; uncheck it to choose the precision yourself.
  2. Or click Now. Inserts the current instant at the selected precision so you can copy it straight into a SQL query, log line, or test fixture.
  3. Read the side-by-side table. Same instant, four precisions. UTC and local human-readable strings appear underneath.
  4. Copy any value. Click the copy icon on a single row, or the Copy button in the toolbar to grab the whole conversion table.

Sample input / output

Input:  1714521600 (10 digits → seconds)

seconds      = 1714521600
milliseconds = 1714521600000
microseconds = 1714521600000000
nanoseconds  = 1714521600000000000
UTC          = Wed, 01 May 2024 00:00:00 GMT

Input:  1714521600123456789 (19 digits → nanoseconds)

seconds      = 1714521600
milliseconds = 1714521600123
microseconds = 1714521600123456
nanoseconds  = 1714521600123456789  ← preserved via BigInt
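
The second conversion above can be sketched with BigInt integer division; `fromNanos` is a hypothetical helper for illustration, not the tool's actual source.

```javascript
// Hypothetical sketch: derive every precision from a nanosecond epoch.
// BigInt division truncates, so lower precisions simply drop digits.
function fromNanos(ns) {
  return {
    seconds:      ns / 1_000_000_000n,
    milliseconds: ns / 1_000_000n,
    microseconds: ns / 1_000n,
    nanoseconds:  ns,
  };
}

const r = fromNanos(1714521600123456789n);
console.log(r.seconds.toString());      // "1714521600"
console.log(r.nanoseconds.toString());  // "1714521600123456789" (exact)
```

Because every value stays a BigInt, the 19-digit input survives the round trip untouched.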

Auto-Detect Precision

The digit count tells the tool whether you pasted seconds, ms, μs, or ns. Override with the dropdown when the heuristic is wrong.
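
The heuristic can be sketched as a few digit-count thresholds; `detectPrecision` and the exact cutoffs below are illustrative assumptions, not the tool's implementation.

```javascript
// Illustrative digit-count heuristic (detectPrecision is hypothetical).
function detectPrecision(input) {
  const digits = input.trim().replace(/^-/, "").length;
  if (digits <= 11) return "seconds";       // 10-digit values, current era
  if (digits <= 14) return "milliseconds";  // 13 digits
  if (digits <= 17) return "microseconds";  // 16 digits
  return "nanoseconds";                     // 19 digits
}

console.log(detectPrecision("1714521600"));          // "seconds"
console.log(detectPrecision("1714521600123"));       // "milliseconds"
console.log(detectPrecision("1714521600123456789")); // "nanoseconds"
```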

BigInt-Safe Math

Conversions are done in BigInt so 19-digit nanoseconds round-trip exactly. A plain JavaScript Number would round away the trailing digits.

UTC & Local

Every value also renders as a UTC string and your browser’s local time so you can spot off-by-one-day timezone bugs.

Common use cases

  • Translating a Go time.UnixNano() value from a trace log into milliseconds for a JavaScript chart
  • Converting PostgreSQL EXTRACT(epoch FROM ts)*1000000 microseconds to plain Unix seconds
  • Comparing Kafka message timestamps (ms) with OpenTelemetry span start_time_unix_nano (ns)
  • Reading a 13-digit JavaScript Date.now() value in your terminal as a real date
  • Generating a current epoch in any precision to seed a test fixture or migration
  • Debugging a 32-bit Y2038 wraparound by inspecting the underlying integer
  • Verifying that nanoseconds returned by clock_gettime are not silently truncated
  • Converting CloudWatch / Datadog log millisecond timestamps to nanosecond OTel spans

Why BigInt matters for nanoseconds

JavaScript's Number type is a 64-bit IEEE-754 double, which can represent integers exactly only up to 2^53 - 1 = 9,007,199,254,740,991. A 19-digit nanosecond timestamp like 1714521600123456789 exceeds that bound: parse it as a Number and the trailing digits are rounded away, leaving 1714521600123456800. BigInt is arbitrary-precision and represents the value exactly, which is why this converter parses with BigInt(input) and only narrows to Number when calling new Date(ms) for the human-readable view (milliseconds stay within the safe-integer range for roughly the next 285,000 years, far beyond Date's own cap in the year 275760).
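
The rounding is easy to see in any modern JS engine; a sketch:

```javascript
const ns = "1714521600123456789";    // 19-digit nanosecond timestamp
console.log(Number(ns));             // 1714521600123456800: trailing digits rounded
console.log(BigInt(ns).toString());  // "1714521600123456789": exact

// Narrow only at the last step, where milliseconds are safely small.
const ms = Number(BigInt(ns) / 1_000_000n);
console.log(new Date(ms).toISOString()); // "2024-05-01T00:00:00.123Z"
```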

Need other DateTime tools?

Pair the epoch converter with the rest of OpenFormatter's browser-side date and time toolkit.

Frequently Asked Questions

What is epoch time?

Epoch time (also called Unix time or POSIX time) is the number of time units that have elapsed since 1970-01-01T00:00:00Z. The default unit is seconds, but modern systems express the same instant in milliseconds (JavaScript), microseconds (PostgreSQL TIMESTAMP), or nanoseconds (Go time.UnixNano, Linux clock_gettime). The instant is the same — only the unit changes.

Why does my epoch have 13 / 16 / 19 digits?

A 10-digit value is seconds, 13 is milliseconds, 16 is microseconds, and 19 is nanoseconds (current era). The converter auto-detects the precision from the digit count and shows the equivalent in every other unit. The boundary years where digit counts roll over are well past 2100 for seconds, so the heuristic is reliable today.

What is the difference between microseconds (μs) and nanoseconds (ns)?

A microsecond (μs) is one millionth of a second; a nanosecond (ns) is one billionth — 1 μs = 1,000 ns. Microseconds are the standard precision for SQL TIMESTAMP columns (PostgreSQL, MySQL DATETIME(6)). Nanoseconds are used by high-resolution OS clocks, Go time, and tracing systems like OpenTelemetry.

When do I need nanosecond precision?

Three common cases: distributed-systems tracing (Jaeger, Zipkin, OpenTelemetry use nanosecond start/end times to compute span durations); Go programs that call time.Now().UnixNano(); and Linux/eBPF tooling reading CLOCK_MONOTONIC values. For most application logging, milliseconds are sufficient and easier to read.

Can I lose precision converting nanoseconds to milliseconds?

Yes — going from ns to ms is integer division by 1,000,000, which discards the sub-millisecond remainder. This converter performs the math with BigInt so the conversion itself does not overflow JavaScript number range, but the truncation is permanent. To preserve the original instant, store the highest-precision value you have and convert on the way out.
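
A sketch of the truncation: BigInt division simply drops the remainder, and multiplying back cannot recover it.

```javascript
const ns   = 1714521600123456789n;
const ms   = ns / 1_000_000n;   // truncating integer division
const back = ms * 1_000_000n;   // round-trip attempt

console.log(ms.toString());          // "1714521600123"
console.log((ns - back).toString()); // "456789": the sub-millisecond part, gone
```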

What is the maximum 32-bit Unix timestamp (Y2038)?

A signed 32-bit integer rolls over at 2,147,483,647 seconds — 2038-01-19T03:14:07Z. After that any system still using a 32-bit time_t (older C, embedded firmware, some database columns) wraps to a negative number representing 1901. 64-bit systems and any timestamp stored as int64 milliseconds, microseconds, or nanoseconds are unaffected for hundreds of millennia.

Why does PostgreSQL use microseconds?

PostgreSQL stores TIMESTAMP / TIMESTAMPTZ as a 64-bit integer count of microseconds since 2000-01-01 (a compile-time option for floating-point storage exists but is rarely used). Microsecond resolution gives six sub-second digits, plenty for application logging, without paying the storage cost of nanoseconds. EXTRACT(epoch FROM ts) returns seconds with fractional precision; multiply by 1,000,000 to get the underlying integer.

How do I generate epoch time in different languages?

JavaScript: Date.now() (ms). Python: time.time() (float seconds), time.time_ns() (int ns). Go: time.Now().Unix() (s), .UnixMilli(), .UnixMicro(), .UnixNano(). Ruby: Time.now.to_i / .to_f / .strftime("%s%N"). Java: System.currentTimeMillis() / Instant.now().toEpochMilli(). Rust: SystemTime::now().duration_since(UNIX_EPOCH). PostgreSQL: EXTRACT(epoch FROM now()).
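
On the JavaScript side of the list above, all four precisions can be derived from Date.now() with BigInt; note that the digits beyond milliseconds are zero-padding, not real clock resolution.

```javascript
const ms = BigInt(Date.now());

const s  = ms / 1000n;        // seconds
const us = ms * 1000n;        // microseconds (zero-padded)
const ns = ms * 1_000_000n;   // nanoseconds (zero-padded)

console.log(s.toString().length);  // 10 digits in the current era
```

For genuine sub-millisecond readings, Node's process.hrtime.bigint() is nanosecond-resolution but monotonic, not epoch-based.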
