Development • 2025-07-05 • 7 min • alltools.one Team
Tags: Unix Timestamp, Epoch, Time Zone, Date

Unix Timestamps Explained: Conversion and Common Pitfalls

A Unix timestamp is one of the simplest yet most misunderstood concepts in programming. It is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC, a moment known as the Unix epoch. Despite its simplicity, timestamps are a source of bugs related to time zones, precision, and overflow.

What Is the Unix Epoch?

The Unix epoch, January 1, 1970, 00:00:00 UTC, was chosen as the starting point for Unix time. Every timestamp is measured relative to this moment:

Timestamp       Date and Time (UTC)
0               Jan 1, 1970 00:00:00
86400           Jan 2, 1970 00:00:00
1000000000      Sep 9, 2001 01:46:40
1700000000      Nov 14, 2023 22:13:20
2000000000      May 18, 2033 03:33:20

Negative timestamps represent dates before the epoch. For example, -86400 is December 31, 1969.
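You can check this quickly in JavaScript (multiplying by 1,000 because the Date constructor expects milliseconds):

// -86400 seconds is exactly one day before the epoch
new Date(-86400 * 1000).toISOString();   // "1969-12-31T00:00:00.000Z"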

Convert timestamps instantly with our Timestamp Converter.

Seconds vs. Milliseconds

This is the most common source of confusion. Different systems use different precisions:

System                             Precision          Example
Unix/POSIX                         Seconds            1700000000
JavaScript                         Milliseconds       1700000000000
Java (System.currentTimeMillis)    Milliseconds       1700000000000
Python (time.time)                 Seconds (float)    1700000000.123
PostgreSQL (extract epoch)         Seconds (float)    1700000000.123456

Rule of thumb: If the number has 13 digits, it is milliseconds. If it has 10 digits, it is seconds.

// JavaScript returns milliseconds
const nowMs = Date.now();          // 1700000000000
const nowSec = Math.floor(nowMs / 1000);  // 1700000000

Time Zone Handling

Unix timestamps are always UTC. They do not contain time zone information. This is actually a feature: it provides a universal reference point.

The confusion arises when converting timestamps to human-readable dates:

const ts = 1700000000;
const date = new Date(ts * 1000);

date.toUTCString();      // "Tue, 14 Nov 2023 22:13:20 GMT"
date.toLocaleString();    // Depends on user's local time zone
date.toISOString();       // "2023-11-14T22:13:20.000Z"

Best practice: Store and transmit timestamps in UTC. Convert to local time only at the display layer, as close to the user as possible.
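For example, the zone conversion can live right where the value is rendered; a small sketch using toLocaleString (the "America/New_York" zone is just an illustrative choice):

const ts = 1700000000;
const date = new Date(ts * 1000);   // stored and transmitted as UTC seconds

// Convert to a specific zone only when rendering
date.toLocaleString('en-US', { timeZone: 'America/New_York' });
// "11/14/2023, 5:13:20 PM"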

The Year 2038 Problem

Traditional Unix systems store timestamps as a 32-bit signed integer. The maximum value is 2,147,483,647, which corresponds to January 19, 2038, 03:14:07 UTC.

After this moment, 32-bit timestamps overflow to negative values, wrapping back to December 13, 1901. This is analogous to the Y2K bug.
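The wraparound is easy to simulate; a minimal JavaScript sketch (the Int32Array here only stands in for a 32-bit time_t, since JavaScript numbers themselves do not overflow):

// Force the addition into 32-bit signed arithmetic
const t = new Int32Array([2147483647]);   // 2038-01-19T03:14:07Z, the 32-bit maximum
t[0] = t[0] + 1;                          // wraps to -2147483648
new Date(t[0] * 1000).toISOString();      // "1901-12-13T20:45:52.000Z"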

Current status:

  • Most modern systems use 64-bit timestamps (good until the year 292 billion)
  • The Linux kernel has handled timestamps as 64-bit values, even on 32-bit architectures, since version 5.6 (2020)
  • Embedded systems and legacy databases remain at risk
  • If you are building software that handles dates beyond 2038, verify your timestamp storage

Conversion in Different Languages

JavaScript

// Current timestamp (seconds)
const now = Math.floor(Date.now() / 1000);

// Timestamp to Date
const date = new Date(1700000000 * 1000);

// Date to timestamp
const ts = Math.floor(new Date('2023-11-14').getTime() / 1000);

Python

import time, datetime

# Current timestamp
now = int(time.time())

# Timestamp to datetime
dt = datetime.datetime.fromtimestamp(1700000000, tz=datetime.timezone.utc)

# Datetime to timestamp
ts = int(dt.timestamp())

SQL (PostgreSQL)

-- Current timestamp
SELECT EXTRACT(EPOCH FROM NOW());

-- Timestamp to date
SELECT TO_TIMESTAMP(1700000000);

-- Date to timestamp
SELECT EXTRACT(EPOCH FROM '2023-11-14'::timestamp);

Common Pitfalls

1. Mixing Seconds and Milliseconds

If a date shows as January 1970, you probably passed seconds where milliseconds were expected (or vice versa). Always check which precision the API expects.
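One defensive pattern is to normalize incoming values by magnitude; a rough sketch (the 1e12 cutoff assumes millisecond inputs only represent dates after September 2001):

// Normalize a timestamp of unknown precision to milliseconds.
// Heuristic: 10-digit values (< 1e12) are seconds, 13-digit values are already ms.
function toMilliseconds(ts) {
  return ts < 1e12 ? ts * 1000 : ts;
}

toMilliseconds(1700000000);      // 1700000000000
toMilliseconds(1700000000000);   // 1700000000000 (unchanged)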

2. Ignoring Time Zones in Date Strings

Parsing a date string without an explicit time zone is ambiguous. Many parsers interpret it in the server's local time zone; JavaScript, for example, parses a date-only string like "2023-11-14" as UTC midnight but a date-time string without an offset as local time. The result therefore varies by runtime and server location. Always include the time zone: "2023-11-14T00:00:00Z".
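JavaScript shows how subtle this gets; the local-time result below assumes a server running in UTC-5 and will differ elsewhere:

new Date('2023-11-14').toISOString();           // "2023-11-14T00:00:00.000Z" (date-only -> UTC)
new Date('2023-11-14T00:00:00').toISOString();  // "2023-11-14T05:00:00.000Z" (no offset -> local)
new Date('2023-11-14T00:00:00Z').toISOString(); // "2023-11-14T00:00:00.000Z" (explicit UTC)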

3. Floating Point Precision

When storing timestamps as floating-point numbers, you may lose precision beyond milliseconds. For microsecond or nanosecond precision, use integers with the appropriate multiplier.
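A quick illustration of where doubles run out of room (the example value is just 1700000000 seconds written out in nanoseconds with extra digits):

// Doubles have a 53-bit mantissa (Number.MAX_SAFE_INTEGER is about 9.0e15),
// so a nanosecond-resolution timestamp (about 1.7e18) cannot be stored exactly.
const ns = 1700000000123456789;           // the literal is rounded as it is parsed
Number.isSafeInteger(ns);                 // false -- precision already lost

// BigInt (or an integer column holding microseconds/nanoseconds) keeps every digit
const exact = 1700000000123456789n;
exact.toString();                         // "1700000000123456789"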

4. Leap Seconds

Unix timestamps do not account for leap seconds. A Unix day is always exactly 86,400 seconds, even though actual UTC days occasionally have 86,401 seconds. For most applications, this is irrelevant. For scientific or satellite applications, use TAI (International Atomic Time) instead.

ISO 8601: The Human-Readable Alternative

While timestamps are great for computation, ISO 8601 is the standard for human-readable date representation:

2023-11-14T22:13:20Z          # UTC
2023-11-14T17:13:20-05:00     # Eastern Time
2023-11-14                     # Date only

Most APIs should accept and return ISO 8601 strings. Use timestamps internally for calculations and storage.
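In practice that means round-tripping at the API boundary, roughly like this:

// ISO 8601 at the boundary, Unix timestamp (seconds) internally
const iso = new Date(1700000000 * 1000).toISOString();   // "2023-11-14T22:13:20.000Z"
const back = Math.floor(Date.parse(iso) / 1000);          // 1700000000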

FAQ

Why does Unix time start on January 1, 1970?

The date was chosen arbitrarily when Unix was being developed at Bell Labs in the early 1970s. A recent-enough date was needed to avoid wasting bits on distant past dates. Since 32-bit integers can store about 68 years in each direction, starting at 1970 covered dates from 1901 to 2038.

Should I store dates as timestamps or formatted strings in my database?

Store dates as timestamps (integer or native datetime types) for efficient sorting, comparison, and arithmetic. Formatted strings are harder to query and sort correctly. Most databases have native datetime types that handle this well. Reserve string formatting for display and API responses.
