Javascript dates are weird. Famously, Brendan Eich wrote the first version of Javascript in 10 days - and the Date function was no exception. It is based on java.util.Date, an API that was largely deprecated in Java itself. That means Javascript inherited a Date function that had already been found buggy and problematic in Java, leaving it full of issues - you may well have encountered some yourself. You may be wondering, then, "what's so weird about it?". Let's look at the quirks and common pitfalls of Javascript's Date constructor, so you can avoid them.
It sounds counterintuitive, given that the main Javascript date constructor is called Date, but Javascript does not actually support dates - only date times. Every Javascript date is a Unix timestamp underneath, so when we try to create a date, we are actually creating a date time. Any Javascript date created with no time specified defaults to midnight on that day.
let date = new Date(2011, 1, 22);
// Notice the date produced has a time attached:
// Tue Feb 22 2011 00:00:00 GMT+0000 (Greenwich Mean Time)
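We can see the timestamp hiding underneath with getTime() - a quick sketch:

```javascript
// Every Date wraps a millisecond Unix timestamp, even when we only
// asked for a calendar date:
let date = new Date(2011, 1, 22); // Feb 22nd 2011, local midnight

console.log(typeof date.getTime()); // "number" - the raw timestamp
console.log(date.getHours(), date.getMinutes()); // 0 0 - midnight was filled in for us
```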
Parsing dates as we did above works fine once you know months start at 0, but parsing date strings varies significantly across browsers, and it is strongly advised not to parse date strings at all. Before the ECMAScript 5 specification, how Date parsed string dates was never defined, and different browsers carry many historical quirks that make it very unreliable.
According to the current specification, only strings conforming to the ISO-8601 standard are guaranteed to be parsable by Javascript; anything else should return NaN. For example:
let parseMyDate = Date.parse('2022-03-21T11:00:01+00:00');
However, that is not the case in practice: many browsers accept date strings outside this format, and this is where it has the potential to get confusing. Let's say you want to parse a date in the common dd/mm/yyyy format. You take a standard date string and pass it to the Date constructor:
let myDate = new Date("5/1/2020");
console.log(myDate);
In all modern browsers, this string is read in the US date format, i.e. mm/dd/yyyy - meaning it returns May 1st, not Jan 5th, leading to unexpected results.
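One way around this - a minimal sketch, assuming the input is always well-formed dd/mm/yyyy - is to split the string yourself and hand the parts to the numeric constructor, which is unambiguous:

```javascript
// Parse a dd/mm/yyyy string without relying on Date.parse()
// or the constructor's US-centric string handling.
// Assumes the input is well-formed dd/mm/yyyy.
function parseDMY(str) {
  const [day, month, year] = str.split("/").map(Number);
  return new Date(year, month - 1, day); // months count from 0
}

console.log(parseDMY("5/1/2020")); // Jan 5th 2020, not May 1st
```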
Suppose you have a date which has no time or timezone associated with it:
let myDate = Date.parse('01 Jan 1999');
console.log(myDate);
You might think there is nothing immediately confusing about this - it represents a fixed day. However, the string is parsed as midnight in your local timezone, so the timestamp you get depends on where the code runs:

If your timezone is UTC, this returns 915148800000.
If your timezone is UTC+3:00, this returns 915138000000, i.e. 3 hours less.
If your timezone is UTC-5:00, this returns 915166800000, i.e. 5 hours more.

So if your timezone is west of UTC, for example UTC-5:00, Javascript adds 5 hours to the Unix timestamp, since local midnight falls 5 hours after UTC midnight. That means if this timestamp is read in a different timezone - for example, in a backend system running in UTC - a value created in UTC+3:00 doesn't give us 1st Jan 1999, but 9pm on 31st Dec 1998! All of this is because Javascript does not implement dates - every date has a time associated with it - in this case, midnight.
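One way to avoid the shifting timestamp - a minimal sketch - is Date.UTC(), which interprets the components as UTC regardless of the machine's timezone:

```javascript
// Date.UTC() returns the millisecond timestamp for the given
// components interpreted as UTC, independent of the local timezone:
let utcMidnight = Date.UTC(1999, 0, 1); // Jan 1st 1999, 00:00 UTC

console.log(utcMidnight); // 915148800000, on every machine
console.log(new Date(utcMidnight).toUTCString()); // always Fri, 01 Jan 1999
```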
If we want to create a date in Javascript, we can pass numbers representing the year, month and day. For example, to create a date for Feb 22nd, 2011, we'd write this, right?
let date = new Date(2011, 2, 22);
Only, that gives us Tue Mar 22 2011 00:00:00 GMT+0000 (Greenwich Mean Time)
. That's because months in Javascript start counting from 0, so February is 1, not 2:
let date = new Date(2011, 1, 22);
Let's say you have accidentally created an incorrect date, say 31st Feb 2022 - perhaps it arrived, by mistake, from a database or API - and you pass it into the Date constructor:
let date = new Date(2022, 1, 31);
You might think that this will just return Invalid Date or NaN, but you'd be wrong: Javascript skips to March 3rd! Since February only has 28 days in 2022, the 3 extra days are rolled over into the start of the next month. In other words, you can't trust Date to return errors on all incorrect dates.
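If you need to reject impossible dates like 31st Feb yourself, one approach - a sketch, not the only option - is to build the Date and check that the components survived the round trip:

```javascript
// A date is "real" only if the constructor didn't roll it over
// into the next month. Compare what came out with what went in:
function isValidDate(year, month, day) {
  const d = new Date(year, month, day);
  return d.getFullYear() === year &&
         d.getMonth() === month &&
         d.getDate() === day;
}

console.log(isValidDate(2022, 1, 28)); // true  - Feb 28th 2022 exists
console.log(isValidDate(2022, 1, 31)); // false - rolled over to March 3rd
```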
The weirdest behavior of all appears when we give Javascript incomplete strings to parse. For example:
let myDate = new Date("0");
console.log(myDate);
You might think that this will return the year 0, or perhaps the Unix epoch, but it actually returns the year 2000 - Sat Jan 01 2000 00:00:00 GMT+0000 (Greenwich Mean Time).
Even more strangely, though, if we increase this number, Javascript starts reading it as a month:
console.log(new Date("5")); // Tue May 01 2001 00:00:00 GMT+0100 (British Summer Time)
console.log(new Date("11")); // Thu Nov 01 2001 00:00:00 GMT+0000 (Greenwich Mean Time)
console.log(new Date("4")); // Sun Apr 01 2001 00:00:00 GMT+0100 (British Summer Time)
To top it off, if we try new Date("13"), we get Invalid Date as the result, since there is no 13th month.
If we only pass one number to new Date(), it is treated as a Unix timestamp in milliseconds. A timestamp is an absolute moment in time, but the resulting date is displayed in the local timezone. For example, in UTC, the following code returns Thu Jan 01 1970 00:00:00 GMT+0000 (Greenwich Mean Time):
console.log(new Date(0));
That makes sense, since it's the Unix epoch - however, if we are in UTC-5:00, the same code returns Wed Dec 31 1969 19:00:00 GMT-0500 (Eastern Standard Time) - i.e. 5 hours earlier. That means, by default, timezones can lead to a lot of confusion: if we expected the date to display as 1st Jan 1970, we immediately have an issue when using methods like toLocaleString(). We can resolve this with .toUTCString(), but the complication trips up a lot of developers.
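A short sketch of the difference - the local formatting shifts with your timezone, while the UTC formatters are the same everywhere:

```javascript
let epoch = new Date(0); // the Unix epoch

// Local formatting depends on where this runs, so no fixed output:
console.log(epoch.toLocaleString());

// UTC formatting is stable on every machine:
console.log(epoch.toUTCString());  // "Thu, 01 Jan 1970 00:00:00 GMT"
console.log(epoch.toISOString()); // "1970-01-01T00:00:00.000Z"
```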
You might have thought we'd gotten off easy, and that only timestamps and timezones are broken - but even years are inconsistent. If we wanted to create a date for 1st Jan in the year 0, you might think we'd write this:
console.log(new Date(0, 0, 0));
Since months start from 0, this looks right - but if the year is less than 100, it is treated as an offset from 1900, so 0 means the year 1900. Alright, you might think, then this should return 1st Jan 1900 instead - but that's wrong too, since days are indexed from 1, not 0. The above code actually returns Sun Dec 31 1899 00:00:00 GMT+0000 (Greenwich Mean Time) - the 0th day of a month is counted as the last day of the previous month. Here are a few other examples:
console.log(new Date(0, 0, 0)); // Sun Dec 31 1899 00:00:00 GMT+0000 (Greenwich Mean Time)
console.log(new Date(50, 0, 0)); // Sat Dec 31 1949 00:00:00 GMT+0000 (Greenwich Mean Time)
console.log(new Date(30, 0, 0)); // Tue Dec 31 1929 00:00:00 GMT+0000 (Greenwich Mean Time)
console.log(new Date(24, 0, 0)); // Mon Dec 31 1923 00:00:00 GMT+0000 (Greenwich Mean Time)
From the year 100 onwards, Javascript goes back to counting years normally. So the code below treats 101 as the literal year 101, not 2001 (and the day-0 quirk then lands us on the last day of the year 100):
console.log(new Date(101, 0, 0)); // Fri Dec 31 0100 00:00:00 GMT-0001 (Greenwich Mean Time)
This may be tolerable if you are only using years after 1900, but it is incredibly counterintuitive for anything before.
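If you genuinely need a year below 100, one workaround - a minimal sketch - is setFullYear(), which takes the year literally and bypasses the 1900 mapping:

```javascript
// The constructor maps years 0-99 onto 1900-1999...
let d = new Date(0, 0, 1);
console.log(d.getFullYear()); // 1900

// ...but setFullYear() takes the year at face value:
d.setFullYear(50);
console.log(d.getFullYear()); // 50 - genuinely the year 50
```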
The Javascript Date function is fundamentally broken in many ways - which is why most people reach for libraries like Moment.js. So why hasn't it been fixed? The main reason is that much of the web was built on code that works around Date's flaws, so changing its behavior now would simply break many websites. To remedy the situation, Javascript is introducing a new API called Temporal, which occupies a separate namespace from Date and solves most of the problems described in this article. Until then, we are stuck with the quirks Javascript dates produce.