I'm confused, but in JavaScript:
> new Date('2012-1-15') - new Date('2012-01-15')
21600000
Why is that? (21600000 / 1000 / 3600 == 6 hours)
In the first case ('2012-1-15'), the constructor sets the time to 00:00 in your local time zone. But in the second case ('2012-01-15'), it initializes the time relative to GMT+00:00.
The date format yyyy-mm-dd (2012-01-15) is parsed as a UTC date, while yyyy-m-dd (2012-1-15) is parsed as a local date. This is shown if you use .toString on each. Note that I am in California, hence the Pacific Standard Time; if you are in a different time zone you will get different results.
When JavaScript parses a date string, it tries the ISO 8601 format first, before falling back to implementation-specific, localized formats. The last part of an ISO timestamp is a time-zone offset from GMT, which is assumed to be 0 when it is missing (as it is in this example). To get the same date in Pacific Standard Time, you would need the full timestamp with an explicit offset: 2012-01-15T00:00:00-08:00.
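A sketch of how an explicit offset removes the ambiguity; the -08:00 value assumes Pacific Standard Time, as in the example above.

```javascript
// A trailing "Z" (or a numeric offset) pins the string to one specific
// instant, so the result no longer depends on the runtime's time zone.
const dateOnly = new Date('2012-01-15');           // ISO date-only: UTC midnight
const explicit = new Date('2012-01-15T00:00:00Z'); // the same instant, spelled out

console.log(dateOnly.getTime() === explicit.getTime()); // true

// With the PST offset, the string names local midnight in California,
// which is 8 hours later than UTC midnight as an absolute instant:
const pacific = new Date('2012-01-15T00:00:00-08:00');
console.log(pacific.getTime() - explicit.getTime()); // 28800000 (8 hours in ms)
```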
The result of
new Date('2012-1-15')
is implementation-dependent (ECMAScript standard, clause 15.9.4.2).