Convert Unix Time to Date — Free Epoch to Date Converter

A Unix timestamp is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC — a fixed reference point known as the Unix epoch. You encounter them everywhere: server logs record events as 10-digit second values, JavaScript APIs and databases often return 13-digit millisecond variants, and JWTs embed expiry and issue times in the exp and iat claims as Unix seconds. Converting a raw number back to a readable date is essential when debugging API responses, inspecting log files, auditing authentication tokens, or verifying scheduled-job timestamps. This free tool decodes any Unix timestamp to ISO 8601, RFC 2822, a localized date-time string in your chosen timezone, and a human-friendly relative time — all instantly, with no sign-up required.
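If you want to reproduce the same conversion in code, a minimal JavaScript sketch looks like this (1700000000 is just a sample timestamp in seconds):

```javascript
// Decode a Unix timestamp given in seconds into readable formats.
const seconds = 1700000000;            // sample 10-digit timestamp
const date = new Date(seconds * 1000); // JS Date counts in milliseconds

console.log(date.toISOString()); // ISO 8601: "2023-11-14T22:13:20.000Z"
console.log(date.toUTCString()); // RFC-1123-style string (close to RFC 2822)
console.log(date.toLocaleString("en-US", { timeZone: "America/New_York" }));
```

The multiplication by 1000 is the one step people most often forget: JavaScript's `Date` constructor expects milliseconds, not seconds.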


Tip: Paste any number into the input field. The tool automatically detects whether the value is in seconds (10 digits) or milliseconds (13 digits). Use the timezone selector to display the result in your local time — the underlying timestamp does not change.

Frequently Asked Questions

How do I tell if a Unix timestamp is in seconds or milliseconds?

Count the digits. A 10-digit Unix timestamp represents seconds since the epoch (e.g. 1700000000), while a 13-digit timestamp represents milliseconds (e.g. 1700000000000). A threshold of 10^12 separates the two cleanly: 10-digit second values top out near 10^10, while 13-digit millisecond values start at 10^12, so values above the threshold are treated as milliseconds and values at or below it as seconds. This tool detects the format automatically.
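The heuristic above can be sketched in a few lines of JavaScript (the function name `detectUnit` is illustrative, not part of this tool's API):

```javascript
// Classify a Unix timestamp as seconds or milliseconds by magnitude.
function detectUnit(value) {
  // 10-digit second values stay below 1e12; 13-digit ms values exceed it.
  return Math.abs(value) > 1e12 ? "milliseconds" : "seconds";
}

console.log(detectUnit(1700000000));    // "seconds"
console.log(detectUnit(1700000000000)); // "milliseconds"
```

Using `Math.abs` keeps the check working for negative (pre-1970) timestamps as well.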

Can a Unix timestamp be negative?

Yes. Negative Unix timestamps represent moments before January 1, 1970 UTC. For example, -86400 corresponds to December 31, 1969 at 00:00:00 UTC. Modern programming languages, including JavaScript's Date, handle negative timestamps correctly.
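You can verify the -86400 example directly in JavaScript:

```javascript
// -86400 seconds is exactly one day before the epoch.
const preEpoch = new Date(-86400 * 1000);
console.log(preEpoch.toISOString()); // "1969-12-31T00:00:00.000Z"
```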

How precise are Unix timestamps in milliseconds?

Millisecond timestamps give 1 ms resolution, which is sufficient for most application-level logging and APIs. High-performance systems sometimes use microsecond (1/1,000,000 s) or nanosecond (1/1,000,000,000 s) precision, stored as larger integers or floating-point numbers.
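For illustration, here is one way to bring a hypothetical microsecond timestamp down to the millisecond resolution that JavaScript's `Date` supports, using BigInt so the 16-digit value stays exact:

```javascript
// Truncate a microsecond-precision timestamp to milliseconds for Date.
const micros = 1700000000123456n;  // assumed microsecond timestamp (BigInt)
const ms = Number(micros / 1000n); // BigInt division truncates the extra digits
console.log(new Date(ms).toISOString()); // "2023-11-14T22:13:20.123Z"
```

The sub-millisecond digits (456 here) are discarded; if you need them, keep the original integer alongside the derived `Date`.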

How does timezone affect the display of Unix time?

It does not change the underlying value. A Unix timestamp is always an absolute count of seconds from the UTC epoch, so it is the same everywhere on Earth. Timezone only affects how that moment is displayed as a local date and time. Selecting a different timezone in this tool shifts the displayed hours, not the timestamp itself.
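This is easy to demonstrate: rendering the same instant in two zones changes only the formatted string, never the stored value.

```javascript
// One instant, two displays: the timestamp itself never changes.
const ts = 1700000000 * 1000;
const instant = new Date(ts);

console.log(instant.toLocaleString("en-GB", { timeZone: "UTC" }));
console.log(instant.toLocaleString("en-GB", { timeZone: "Asia/Tokyo" }));
console.log(instant.getTime() === ts); // true — the value is untouched
```

The Tokyo rendering is nine hours ahead of the UTC one, but `getTime()` returns the same number either way.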

What is the Year 2038 problem?

Many legacy systems store Unix time as a signed 32-bit integer, which can hold values only up to 2,147,483,647. That value corresponds to January 19, 2038, at 03:14:07 UTC. After that second the integer overflows to a large negative number, causing date calculations to fail. Modern 64-bit systems are unaffected because they store timestamps as 64-bit integers with a range far beyond the foreseeable future.
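You can confirm the rollover moment yourself by decoding the 32-bit maximum:

```javascript
// The largest value a signed 32-bit integer can hold, decoded as Unix seconds.
const INT32_MAX = 2 ** 31 - 1; // 2147483647
console.log(new Date(INT32_MAX * 1000).toISOString());
// "2038-01-19T03:14:07.000Z" — the last representable second on 32-bit systems
```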

Related Tools
