Why a Unix Timestamp Looks Wrong and What to Check Next
A focused Unix timestamp troubleshooting FAQ for dates that look wrong because of unit mistakes, UTC-versus-local confusion, browser timezone assumptions, or JavaScript millisecond handling.
When a timestamp feels wrong but the number is still valid
Unix timestamps often look wrong for ordinary reasons rather than because the raw value is broken. The most common causes are the wrong unit, the wrong timezone expectation, or a mismatch between browser and backend assumptions.
That is why the best troubleshooting path is to check a few specific things in order instead of assuming the timestamp itself is invalid.
Start with the unit before you check the calendar
The fastest check is whether the number should be treated as seconds or milliseconds. Unix-seconds values for recent dates have 10 digits (and will until far into the future), while millisecond values for the same era have 13 digits.
If you interpret seconds as milliseconds, the result often lands near 1970. If you interpret milliseconds as seconds, the date can jump far into the future.
- Treat 10-digit values as seconds unless you know the system does something unusual.
- Treat 13-digit values as milliseconds in most modern frontend and analytics workflows.
- If the year is obviously wrong, verify the unit before anything else.
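The digit-count heuristic above can be sketched in a few lines. This is a minimal illustration, not a library API; `guessUnit` is a hypothetical helper name, and the 10-versus-13 digit rule only holds for dates in the modern era.

```javascript
// Guess the unit of a raw Unix value by digit count.
function guessUnit(raw) {
  const digits = String(Math.trunc(Math.abs(raw))).length;
  if (digits <= 10) return "seconds";
  if (digits === 13) return "milliseconds";
  return "unknown"; // e.g. 16 digits could be microseconds
}

// Misreading seconds as milliseconds lands near 1970:
const ts = 1700000000; // a plausible 10-digit Unix-seconds value
console.log(guessUnit(ts));                     // "seconds"
console.log(new Date(ts).toISOString());        // wrong unit: 1970-01-20T16:13:20.000Z
console.log(new Date(ts * 1000).toISOString()); // correct:    2023-11-14T22:13:20.000Z
```

The same raw number produces a date near the epoch or a date in 2023 depending purely on the unit assumption, which is why the unit check comes before everything else.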
UTC versus local time is the next common source of confusion
The same timestamp can display as different readable dates depending on whether the output is shown in UTC or in your local timezone. That does not mean one value is wrong. It means you are looking at the same moment through two display contexts.
This is especially easy to miss when logs are in UTC but your browser, laptop, or screenshots show local time instead.
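One way to see the two display contexts side by side is to format a single instant both as UTC and through an explicit timezone. A minimal sketch, using `America/New_York` purely as an example zone:

```javascript
// One instant, two readable views.
const moment = new Date(1700000000 * 1000);

// UTC view — what server logs usually show:
console.log(moment.toISOString()); // "2023-11-14T22:13:20.000Z"

// Local-style view — what a browser in New York would show:
const local = new Intl.DateTimeFormat("en-US", {
  timeZone: "America/New_York",
  dateStyle: "short",
  timeStyle: "long",
}).format(moment);
console.log(local); // e.g. "11/14/23, 5:13:20 PM EST"
```

Both strings describe the same underlying timestamp; only the rendering differs, which is the point of comparing them before concluding that a value changed.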
Browser date input and server assumptions can disagree
When you start from a date picker instead of from a raw timestamp, the browser usually treats that input as local time on your current device. A server, API, or teammate may later interpret the same intended moment in UTC or in another timezone.
That mismatch can make a generated Unix value seem wrong even though the browser did exactly what your local timezone implied.
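The mismatch can be demonstrated with one `datetime-local` string read two ways. This is a sketch under the assumption that the string came from a browser date picker; the parsing helper is illustrative, not a standard API:

```javascript
// One picker string, two interpretations.
const picked = "2024-01-01T12:00"; // typical <input type="datetime-local"> value

// JavaScript treats a date-time string WITHOUT a zone as local time,
// so this result shifts by the device's UTC offset:
const asLocalSeconds = new Date(picked).getTime() / 1000;

// Interpreting the same fields as UTC gives one fixed answer everywhere:
const [datePart, timePart] = picked.split("T");
const [y, mo, d] = datePart.split("-").map(Number);
const [h, mi] = timePart.split(":").map(Number);
const asUtcSeconds = Date.UTC(y, mo - 1, d, h, mi) / 1000;
console.log(asUtcSeconds); // 1704110400, regardless of device
```

If the browser produced the local-time value but the server expected the UTC one, both sides behaved consistently with their own assumption, and the difference is exactly the device's UTC offset.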
JavaScript often wants milliseconds even when APIs use seconds
A common frontend bug is pasting a 10-digit Unix timestamp directly into JavaScript logic that expects milliseconds. The code is not misparsing the timestamp; it is reading the right number in the wrong unit for that environment.
If a backend returns seconds and frontend code uses JavaScript Date objects, multiplying by 1000 is often the missing step.
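A small normalization step at the boundary is often enough. A minimal sketch: `toMillis` is a hypothetical helper, and the `1e12` cutoff (roughly September 2001 when read as milliseconds) is a heuristic, not a rule:

```javascript
// Normalize a backend timestamp before handing it to Date.
function toMillis(ts) {
  // Values below ~1e12 read as milliseconds would all fall before late 2001,
  // so for modern data they are almost certainly seconds.
  return ts < 1e12 ? ts * 1000 : ts;
}

new Date(toMillis(1700000000));    // seconds input, multiplied by 1000
new Date(toMillis(1700000000000)); // already milliseconds, left untouched
```

Putting this in one place, right where API responses are parsed, keeps the rest of the frontend free of unit guesswork.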
Quick troubleshooting map
| What you see | Most likely reason | What to check next |
|---|---|---|
| The date looks near 1970 | Seconds were treated as milliseconds | Check whether the value should be read as Unix seconds |
| The date is far in the future | Milliseconds were treated as seconds | Check whether the value should be read as Unix milliseconds |
| UTC and local outputs disagree | You are viewing the same moment in different timezone formats | Compare the UTC and local displays before assuming the timestamp changed |
| Frontend code shows the wrong time but the backend value looks plausible | JavaScript logic expected milliseconds | Confirm whether the timestamp needs to be multiplied by 1000 first |
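The map above can be folded into one quick check: read the suspicious value under both unit assumptions and see which read yields a plausible year. A sketch with a hypothetical `diagnose` helper; the year cutoffs are illustrative thresholds, not standards:

```javascript
// Read a raw value as both seconds and milliseconds and report a hint.
function diagnose(raw) {
  const asSecondsYear = new Date(raw * 1000).getUTCFullYear();
  const asMillisYear = new Date(raw).getUTCFullYear();
  if (asMillisYear < 1971) {
    return "probably Unix seconds (milliseconds read lands near 1970)";
  }
  if (asSecondsYear > 2100) {
    return "probably Unix milliseconds (seconds read lands far in the future)";
  }
  return "both reads are plausible; check the producing system";
}

console.log(diagnose(1700000000));    // hints at seconds
console.log(diagnose(1700000000000)); // hints at milliseconds
```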
A local browser workflow helps with private debugging
ToolBaseHub runs timestamp conversion locally in the browser, which is useful when the values come from private logs, internal payloads, support tickets, or unreleased product work.
That local workflow does not remove the need to understand your app's timezone rules, but it does make it easier to inspect raw values without sending them to a third-party service.
Frequently Asked Questions
Why does a Unix timestamp show 1970?
The most common reason is that a Unix-seconds value was interpreted as milliseconds. Check the unit first before assuming the timestamp itself is broken.
Why does the same timestamp show different UTC and local times?
Because UTC and local time are different readable views of the same moment. The timestamp has not changed. Only the display context has.
Does JavaScript usually use seconds or milliseconds?
JavaScript Date workflows usually use milliseconds, which is why a 10-digit Unix-seconds value often needs to be multiplied by 1000 first in frontend code.
Why can a browser date picker create the wrong Unix value for my server?
A browser date picker usually starts from your local timezone. If the server expects UTC or another timezone assumption, the generated Unix value can look wrong even though the local browser conversion was consistent.
Can I troubleshoot timestamps without uploading logs or payloads?
Yes. ToolBaseHub runs the timestamp conversion workflow locally in the browser, which is useful for private debugging and internal values.