That’s a fairly long title to describe a simple discovery: I found a 10-digit value in a field of a JSON response from a web service where I was expecting a date. I’m used to seeing 13-digit timestamps, so this one surprised me and looked a little odd.
After some research I found that both the 13-digit timestamps I’ve grown accustomed to seeing and this 10-digit timestamp are Unix-style timestamps, measured from January 1, 1970 at 00:00:00 GMT (the Unix epoch). The difference is the unit: 13-digit timestamps count the milliseconds since the epoch, while 10-digit timestamps count the seconds.
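You can see the relationship directly in JavaScript. A quick sketch (the commented values are illustrative, not real output): `Date.now()` gives you the 13-digit millisecond form, and dividing by 1,000 gives the 10-digit second form.

```js
// Date.now() returns the number of milliseconds since the Unix epoch,
// which is currently a 13-digit number.
const millis = Date.now(); // e.g. 1331209044000

// Dividing by 1000 (and flooring) yields the 10-digit, seconds-based form.
const seconds = Math.floor(millis / 1000); // e.g. 1331209044

console.log(String(millis).length);  // 13
console.log(String(seconds).length); // 10
```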
My first instinct was to pass the value straight to JavaScript’s Date constructor, and then to try Date’s setTime method. But neither of those worked with the 10-digit timestamp, which seems pretty obvious now that I know that both the constructor and setTime expect milliseconds. So when you are given a 10-digit timestamp, you should first convert it from seconds to milliseconds by multiplying by 1000:
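Here’s a minimal sketch of both the problem and the fix, using a hypothetical 10-digit timestamp of 1331209044:

```js
const tenDigit = 1331209044; // hypothetical 10-digit timestamp (seconds)

// Passing seconds where milliseconds are expected lands you in January 1970:
console.log(new Date(tenDigit).toUTCString());
// → "Fri, 16 Jan 1970 09:46:49 GMT"

// Multiplying by 1000 converts seconds to the milliseconds
// the Date constructor expects:
console.log(new Date(tenDigit * 1000).toUTCString());
// → "Sun, 18 Mar 2012 12:17:24 GMT"

// setTime expects milliseconds too, so the same conversion applies:
const viaSetTime = new Date();
viaSetTime.setTime(tenDigit * 1000);
console.log(viaSetTime.toUTCString());
// → "Sun, 18 Mar 2012 12:17:24 GMT"
```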