DevBolt

Epoch Milliseconds Converter

Convert between millisecond-precision Unix timestamps and human-readable dates. JavaScript and Java use millisecond timestamps (13 digits), while Unix and Python default to seconds (10 digits).

Quick Reference

The Unix epoch is January 1, 1970 00:00:00 UTC; a Unix timestamp is the number of seconds elapsed since that instant. Timestamps with 13 or more digits are automatically treated as milliseconds. Negative values represent dates before 1970.
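For example, a negative timestamp is simply a count of time before the epoch, which JavaScript's Date handles directly:

```javascript
// A negative timestamp counts backwards from the epoch.
const dayBefore = new Date(-86400 * 1000); // 86,400 seconds = one day before 1970
console.log(dayBefore.toISOString()); // "1969-12-31T00:00:00.000Z"
```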

Seconds vs milliseconds

Unix timestamps come in two common formats: seconds since the epoch (10 digits, e.g., 1710720000) and milliseconds since the epoch (13 digits, e.g., 1710720000000). JavaScript's Date.now() returns milliseconds; Python's time.time() returns seconds as a float. This converter detects and handles both formats automatically.

Converting between seconds and milliseconds

To convert seconds to milliseconds, multiply by 1000; to convert milliseconds to seconds, divide by 1000 and floor. For quick detection, count the digits: 13 digits almost certainly means milliseconds, 10 digits means seconds. A timestamp in the billions (1,000,000,000+) is a seconds value falling after September 2001; one in the trillions (1,000,000,000,000+) is milliseconds.

Frequently Asked Questions

How do I tell if a timestamp is seconds or milliseconds?

Count the digits: 10 digits = seconds (e.g., 1710720000), 13 digits = milliseconds (e.g., 1710720000000). In practice, values above 1 trillion are milliseconds; read as seconds, 1 trillion would land tens of thousands of years in the future.

Why does JavaScript use milliseconds?

JavaScript's Date object was designed for the browser, where sub-second timing matters (animations, events). Date.now() returns milliseconds for consistency with Date.prototype.getTime() and with the millisecond delays used by setTimeout/setInterval.