Epoch Time Explained: Seconds vs Milliseconds

Site Developer
2025-12-25

Quick answer: epoch time (also called Unix time) is a single number that identifies a moment by counting from January 1, 1970, 00:00:00 UTC. Most systems count in seconds (10 digits) or milliseconds (13 digits). Use /epoch-converter to convert safely.
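
As a minimal TypeScript sketch (the value is just an example), converting epoch seconds to a readable UTC date looks like this; note that JavaScript's Date expects milliseconds:

  // Date() expects milliseconds, so a 10-digit seconds value is scaled by 1000.
  const epochSeconds = 1735110000;              // 10 digits: epoch seconds
  const asDate = new Date(epochSeconds * 1000); // scale to milliseconds
  console.log(asDate.toISOString());            // "2024-12-25T07:00:00.000Z"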

Why epoch time is popular

Epoch timestamps are:

  • Compact
  • Language agnostic
  • Easy to compare and sort

Key takeaways

  • Definition: Why it is popular explains what you are looking at and why it matters in practice.
  • Context: this section helps you interpret inputs and outputs correctly, not just run a tool.
  • Verification: confirm assumptions (format, encoding, units, or environment) before changing anything.
  • Consistency: apply one approach end-to-end so results are repeatable and easy to debug.

Common pitfalls

  • Mistake: skipping validation and trusting the first output you see from Why it is popular.
  • Mistake: mixing formats or layers (for example, decoding the wrong field or using the wrong unit).

Quick checklist

  1. Identify the exact input format and whether it is nested or transformed multiple times.
  2. Apply the minimal transformation needed to make it readable.
  3. Validate the result (structure, encoding, and expected markers).
  4. If the result still looks encoded, repeat step-by-step and stop as soon as it becomes clear.

The seconds vs milliseconds trap

  • 10 digits is usually seconds
  • 13 digits is usually milliseconds

If you use the wrong unit, your date will be off by a factor of 1,000: seconds read as milliseconds land back in January 1970, and milliseconds read as seconds land tens of thousands of years in the future.
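
Here is what the mistake looks like in a TypeScript sketch: the same example value is interpreted two different ways, and only one of them is the date you meant.

  const epochSeconds = 1735110000;

  // Wrong: Date() reads the argument as milliseconds, so this lands
  // about 20 days after the epoch instead of in 2024.
  console.log(new Date(epochSeconds).toISOString());        // "1970-01-21T01:58:30.000Z"

  // Right: scale seconds to milliseconds first.
  console.log(new Date(epochSeconds * 1000).toISOString()); // "2024-12-25T07:00:00.000Z"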

Example (quick identification)

  • 1735110000 (10 digits) is likely seconds
  • 1735110000000 (13 digits) is likely milliseconds
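
The digit count is easy to check in code. This TypeScript helper is a sketch of the heuristic; the function name and return type are illustrative, not from any library.

  // Classify a raw epoch value by its digit count (a heuristic, not a guarantee).
  type EpochUnit = "seconds" | "milliseconds" | "unknown";

  function guessUnit(raw: number): EpochUnit {
    const digits = Math.trunc(Math.abs(raw)).toString().length;
    if (digits === 10) return "seconds";
    if (digits === 13) return "milliseconds";
    return "unknown";
  }

  console.log(guessUnit(1735110000));    // "seconds"
  console.log(guessUnit(1735110000000)); // "milliseconds"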

More examples to test

  • Example A: 1700000000 (10 digits), a plain seconds value that should convert to a date in November 2023.
  • Example B: 1700000000000 (13 digits), the same instant expressed in milliseconds.
  • Example C: " 1700000000 " with stray whitespace, which should still convert after trimming.

What to look for

  • Does the converted date land in a plausible range (not January 1970, not tens of thousands of years away)?
  • Is the time zone explicit in the output (a trailing Z or a numeric offset)?
  • If the output feeds another system, does that system expect the same unit you produced?
  • Reminder: verify your converter against a known-good sample, such as 0 = 1970-01-01T00:00:00Z.

Another common trap: microseconds and nanoseconds

Some systems log:

  • microseconds (16 digits) or
  • nanoseconds (19 digits)

If your converted date lands tens of thousands of years in the future (or is rejected as out of range), the unit is probably too fine: divide by 1,000 for microseconds or by 1,000,000 for nanoseconds to get back to milliseconds.
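
One way to cope with all four units is to normalize everything to milliseconds before building a date, as in this TypeScript sketch. The digit cutoffs follow the rules of thumb above and are assumptions, not a standard; true nanosecond values can exceed JavaScript's safe integer range, so real code might accept a string or BigInt instead.

  // Normalize seconds / milliseconds / microseconds / nanoseconds to
  // milliseconds using the digit-count heuristic. Cutoffs are assumptions.
  function normalizeToMillis(raw: number): number {
    const digits = Math.trunc(Math.abs(raw)).toString().length;
    if (digits <= 10) return raw * 1000;   // seconds
    if (digits <= 13) return raw;          // milliseconds
    if (digits <= 16) return raw / 1000;   // microseconds
    return raw / 1_000_000;                // nanoseconds
  }

  console.log(new Date(normalizeToMillis(1735110000000000)).toISOString()); // microseconds in, "2024-12-25T07:00:00.000Z" out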

A fast conversion routine

  1. Identify the unit.
  2. Convert to ISO 8601.
  3. Adjust for your target time zone.
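
The routine fits in a few lines of TypeScript; treat this as a sketch. The time zone name is an example, and the Intl-backed toLocaleString does the zone math so no offsets are adjusted by hand.

  // 1. identify the unit, 2. convert to ISO 8601 UTC, 3. render for a target zone.
  function convert(raw: number, timeZone: string): { utc: string; local: string } {
    const digits = Math.trunc(Math.abs(raw)).toString().length;
    const millis = digits <= 10 ? raw * 1000 : raw;       // step 1
    const date = new Date(millis);
    return {
      utc: date.toISOString(),                            // step 2
      local: date.toLocaleString("en-US", { timeZone }),  // step 3 (display only)
    };
  }

  console.log(convert(1735110000, "America/New_York"));
  // { utc: "2024-12-25T07:00:00.000Z", local: "12/25/2024, 2:00:00 AM" }
  // (exact local format varies by runtime)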

Example output formats you should recognize

  • ISO 8601 UTC: 2025-12-25T13:30:00Z
  • ISO 8601 with offset: 2025-12-25T08:30:00-05:00
  • Human local display: depends on locale/time zone (great for debugging, not for storage)
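
In TypeScript terms (a sketch, with example values): toISOString gives the UTC form, locale formatting gives the human display, and an offset string from another system still identifies the same instant.

  const instant = new Date(1735110000 * 1000);

  // ISO 8601 UTC: always ends in "Z".
  console.log(instant.toISOString());    // "2024-12-25T07:00:00.000Z"

  // Human local display: depends on the runtime's locale and time zone.
  console.log(instant.toLocaleString()); // e.g. "12/25/2024, 2:00:00 AM"

  // ISO 8601 with an offset names the same instant and parses back to it.
  console.log(new Date("2024-12-25T02:00:00-05:00").getTime() === instant.getTime()); // true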

When you should keep it in UTC

Logs and APIs should stay in UTC. Convert to local time only for display.
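
A minimal sketch of that separation, with a hypothetical record type and an example locale and zone:

  // Persist UTC; convert only at the display edge.
  interface LogRecord {
    message: string;
    timestampUtc: string; // ISO 8601, always ending in "Z"
  }

  const record: LogRecord = {
    message: "user signed in",
    timestampUtc: new Date().toISOString(),
  };

  // Rendering for a viewer (zone and locale are examples); never write this back.
  const display = new Date(record.timestampUtc).toLocaleString("en-GB", {
    timeZone: "Europe/Paris",
  });
  console.log(display);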

Quick rule

If the number has more than 10 digits, it is probably milliseconds (or an even finer unit).
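
A related safeguard, shown here as a sketch with arbitrary bounds: after converting, check that the result lands in a believable range and treat anything outside it as a probable unit mix-up.

  // A converted date far outside a plausible window usually means the unit
  // was wrong, not the data. The 2000-2100 bounds are arbitrary examples.
  function looksPlausible(millis: number): boolean {
    const year = new Date(millis).getUTCFullYear();
    return year >= 2000 && year <= 2100;
  }

  console.log(looksPlausible(1735110000));        // false: seconds passed as milliseconds
  console.log(looksPlausible(1735110000 * 1000)); // true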

FAQ

Why does the time look off by hours?

That is usually a time zone display issue. Convert to UTC to confirm the underlying moment, then convert to your local zone for readability.

Why does JavaScript often use milliseconds?

Many JavaScript APIs (like Date.now()) use milliseconds. Server logs often use seconds. Mixing them creates the classic 1000x bug.
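
For example, as a small TypeScript sketch using the standard Date.now() API:

  const nowMs = Date.now();                    // 13 digits: milliseconds (JavaScript)
  const nowSeconds = Math.floor(nowMs / 1000); // 10 digits: seconds (typical server logs)

  // Bring both sides to the same unit before comparing or subtracting,
  // otherwise the difference is off by a factor of 1,000.
  console.log(nowMs, nowSeconds);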

What should I do if the converted date still looks wrong?

Work through it step by step: confirm the unit first, convert in UTC, then apply the target time zone, and stop at the first step where the value stops making sense.

What is the safest way to avoid bugs?

Keep the original input, change one thing at a time, and validate after each step so you know exactly what fixed the issue.

Should I store the converted local time?

Usually no. Convert to local time for inspection and display, but keep the original UTC value (or the raw epoch number) as the source of truth.

Why does it work in one environment but not another?

Different environments often have different settings (time zones, locales, default units, or parsing rules). Compare a known-good sample side-by-side.
