Unix Time Format Converter — Seconds, Milliseconds, Nanoseconds
Unix time exists at four distinct precision levels. Seconds (10-digit) are the standard for Unix and POSIX APIs, cron schedulers, and most SQL databases. Milliseconds (13-digit) are the default for JavaScript Date.now() and Java System.currentTimeMillis(). Microseconds (16-digit) appear in high-precision system logs and Python time.time_ns() // 1000. Nanoseconds (19-digit) are used by Go time.Now().UnixNano() and Linux clock_gettime(). When data flows between different systems — a Go microservice writing to a JavaScript frontend, or a Python script reading from a PostgreSQL table — a precision mismatch produces timestamps that are off by a factor of 1,000, 1,000,000, or 1,000,000,000. This tool auto-detects the precision of any integer timestamp you paste and converts it to all four scales simultaneously, eliminating the guesswork.
Input
≤ 10 digits → seconds
11–13 digits → milliseconds
14–16 digits → microseconds
17–19 digits → nanoseconds
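The digit-count heuristic above can be sketched in Python (the function name detect_precision is illustrative, not part of any library):

```python
def detect_precision(ts: str) -> str:
    """Guess the precision of a Unix timestamp from its digit count."""
    digits = len(ts.lstrip("-"))  # tolerate a leading sign for pre-1970 dates
    if digits <= 10:
        return "seconds"
    elif digits <= 13:
        return "milliseconds"
    elif digits <= 16:
        return "microseconds"
    elif digits <= 19:
        return "nanoseconds"
    raise ValueError("more than 19 digits: not a plausible Unix timestamp")

print(detect_precision("1700000000"))            # seconds
print(detect_precision("1700000000000000000"))   # nanoseconds
```

Note the heuristic is only reliable for timestamps near the present: a seconds-precision timestamp from 1971 has fewer than 10 digits, and the boundaries blur for dates far in the past or future.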
All Formats
Enter a Unix timestamp to see all formats.
Unix Time Precision — Format Reference Table
| Format | Digits | Example | Multiplier from seconds | Common in |
|---|---|---|---|---|
| Seconds | 10 | 1700000000 | × 1 | Linux, POSIX APIs, databases |
| Milliseconds | 13 | 1700000000000 | × 1,000 | JavaScript Date, Java System.currentTimeMillis |
| Microseconds | 16 | 1700000000000000 | × 1,000,000 | Python time.time_ns() // 1000, gettimeofday |
| Nanoseconds | 19 | 1700000000000000000 | × 1,000,000,000 | Go time.Now().UnixNano(), Linux clock_gettime |
How to Convert Between Unix Time Formats
From seconds to other formats:
ms = seconds × 1,000
µs = seconds × 1,000,000
ns = seconds × 1,000,000,000
From other formats back to seconds:
seconds = ms ÷ 1,000
seconds = µs ÷ 1,000,000
seconds = ns ÷ 1,000,000,000
Cross-format conversions:
µs = ms × 1,000
ns = ms × 1,000,000
ns = µs × 1,000
Note: JavaScript numbers are IEEE 754 doubles, safe only up to 2^53 − 1 (9,007,199,254,740,991 ≈ 9 × 10^15). Microsecond timestamps near the current time are within this range, but nanosecond timestamps exceed it. Use BigInt in JavaScript or int64 in Go/Java for lossless nanosecond arithmetic.
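A minimal sketch of these conversions in Python, where ints are arbitrary-precision so no BigInt is needed; the float() call at the end reproduces the IEEE 754 double behavior that JavaScript numbers (and Python floats) share:

```python
sec = 1_700_000_000

# Scaling up from seconds:
ms = sec * 1_000                  # 1700000000000
us = sec * 1_000_000              # 1700000000000000
ns = sec * 1_000_000_000          # 1700000000000000000

# Back down to seconds (integer division drops the sub-second part):
assert ms // 1_000 == sec
assert us // 1_000_000 == sec
assert ns // 1_000_000_000 == sec

# IEEE 754 doubles are exact only up to 2**53 - 1; a 19-digit
# nanosecond timestamp exceeds that, so single nanoseconds vanish:
print(float(ns + 1) == float(ns))  # True — the +1 ns is lost in a double
```

The last line is why a nanosecond timestamp round-tripped through a JavaScript number can silently change value, while a millisecond timestamp (13 digits, well under 2^53) cannot.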
Frequently Asked Questions
Which Unix time format does my system use?
A 10-digit integer means seconds — the standard for most Unix and Linux APIs, POSIX system calls, and SQL databases. A 13-digit integer means milliseconds — the default for JavaScript Date.now() and Java System.currentTimeMillis(). A 16-digit integer means microseconds — used by gettimeofday() on Linux and Python time.time_ns() // 1000. A 19-digit integer means nanoseconds — the default for Go time.Now().UnixNano() and Linux clock_gettime(CLOCK_REALTIME).
What precision does JavaScript Date give?
JavaScript Date works in milliseconds (13-digit timestamps). Date.now() returns the number of milliseconds since the Unix epoch. To get a seconds-precision timestamp, divide by 1000 and floor: Math.floor(Date.now() / 1000). To convert a seconds timestamp back to a Date, multiply by 1000: new Date(ts * 1000).
What does Python time.time() return?
Python time.time() returns seconds as a floating-point number (e.g. 1700000000.123456), giving sub-second precision but relying on float arithmetic. For integer nanosecond precision, use time.time_ns() which returns a plain integer with nanosecond resolution. Divide by 1000 to get microseconds, by 1_000_000 to get milliseconds.
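A short sketch of these calls side by side:

```python
import time

sec_float = time.time()        # float seconds, e.g. 1700000000.123456
ns = time.time_ns()            # int nanoseconds, 19 digits

us = ns // 1_000               # microseconds
ms = ns // 1_000_000           # milliseconds
sec = ns // 1_000_000_000      # whole seconds

# Both functions read the same clock, so the whole-second counts
# agree (allowing for the tiny gap between the two calls):
print(abs(int(sec_float) - sec) <= 1)  # True
```

Prefer time.time_ns() when timestamps will be stored or compared as integers; it avoids the rounding that float arithmetic introduces.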
What is Go's default timestamp precision?
Go uses nanoseconds by default. time.Now().UnixNano() returns a 19-digit int64 representing nanoseconds since the Unix epoch. For seconds use time.Now().Unix(), for milliseconds use time.Now().UnixMilli() (Go 1.17+), and for microseconds use time.Now().UnixMicro() (Go 1.17+).
When should I use microseconds versus nanoseconds?
Microseconds are sufficient for most network logging and distributed tracing — sub-millisecond events like HTTP request durations or database query times are well within the microsecond range. Nanoseconds are needed for CPU profiling, hardware event timing, Go runtime internals, and Linux kernel instrumentation where individual CPU cycles matter.
Related Tools
Unix Timestamp Converter
Convert Unix timestamps to human-readable dates and back, instantly.
Epoch Converter
Convert epoch timestamps to human-readable dates and times.
Unix Timestamp
Learn about Unix timestamps and get the current value live.
Timestamp to Date
Convert any Unix timestamp to a readable date and time.
Date to Timestamp
Convert dates and times into Unix timestamps.
Epoch Time
Explore epoch time history, standards, and real-world usage.
UTC Time Now
See the current UTC time live with a real-time clock.
Year 2038 Problem
Learn about the Y2K38 overflow and which systems are affected.
Seconds Since 1970
See how many seconds have elapsed since the Unix epoch.
Epoch Time to Date
Convert epoch time to a readable date with code examples in 6 languages.
Linux Timestamp to Date
Convert Linux/Unix timestamps to readable dates using command-line tools.
Unix Time to Date
Convert Unix time to a readable date with real-world examples and code.
Convert Time to Unix Time
Convert any date or time to a Unix timestamp in seconds, milliseconds, and ISO 8601.
Convert Unix Time to Time
Decode any Unix timestamp to ISO 8601, RFC 2822, and relative time instantly.
Unix Timestamp to Date
Convert any Unix timestamp to a readable date with code examples for Python, JS, SQL, and more.