Thank god. JavaScript is the only language where I... wait for it... roll my own datetimes. Moment's central failure, beyond the surprise mutability, is the conflation of Dates and Times with Datetimes. They are not the same, and this causes so many problems. Python's Arrow library made the same mistake, likely inspired by Moment. I've heard JS devs insist that Dates and Times have no place as standalone constructs outside of a Datetime, but this is incompatible with practical use cases.
Rust's Chrono? Outstanding. You can find flaws, but it's generally predictable, and most importantly, has fewer flaws than those in other languages. Python's? Kind of messy, but usable. JS's Date and Moment? Unusable.
> I've heard JS devs insist that Dates and Times have no place as standalone constructs outside of a Datetime
As somebody who has both caused and fixed many time-related bugs in a financial/payments context, I get an almost physiological reaction to this sentiment at this point.
If your real-world process is concerned with things happening on particular days, don't represent these as midnight, i.e. 00:00, of that day. You'll thank me the day your company starts hosting servers or interacting with a customer/vendor in a different timezone.
Essentially, this is just a corollary of "don't imply more precision in your internal data representation than was present in your data source", but it bears repeating when it comes to date/time processing too.
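A minimal TypeScript sketch of that failure mode, plus the date-only alternative (Temporal is still a TC39 proposal, assumed here via the @js-temporal/polyfill package):

```typescript
// Temporal is a TC39 proposal; assumed here via the @js-temporal/polyfill package.
import { Temporal } from "@js-temporal/polyfill";

// A "date" stored as midnight is really an instant, and instants shift across time zones.
const billingDay = new Date("2024-03-10"); // date-only strings parse as 2024-03-10T00:00:00Z

// Rendered for a server or customer in New York, the "same" day is suddenly March 9.
console.log(billingDay.toLocaleDateString("en-US", { timeZone: "America/New_York" })); // "3/9/2024"

// Keeping the value as a calendar date (no time, no zone) sidesteps the problem entirely.
const billingDate = Temporal.PlainDate.from("2024-03-10");
console.log(billingDate.toString()); // "2024-03-10" on every machine, in every zone
```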
(And please, please don't ask me about the difference between "Y" and "y" in Java's SimpleDateFormat, but do look it up before you ever use it thinking they're equivalent... because on some days of some years, they are not.)
My favorite Date vs Datetime bug is that Chile changes to DST at midnight and not 1am. So it goes 11:59pm > 1am. Many systems that conflate dates and datetimes take the existence of midnight as an invariant, which it is not.
Oh, that's another beautiful counterexample, thank you!
I was racking my brain trying to come up with a scenario where what we did could go wrong with DST shifts alone (i.e. without a timezone conversion/relocation), but fortunately, most (all?) of Europe shifts DST on a weekend, never on the last day of the month, and with ample safety buffer from midnight, for which I am very thankful.
Another fun thing is that some countries that celebrate Ramadan may change back to normal time during Ramadan. So you can have 2-4 DST changes per year depending on when Ramadan falls. Luckily most time systems don't really care, but it's still a fun edge case. You really can't hard code anything.
DST being on a weekend doesn't mean things can't go wrong. If it's Friday at 1pm and you want to schedule something for 1 week later (next Friday at 1pm) and there's a DST transition over the weekend, are you correctly getting next Friday at 1pm? Or is it 2pm because you didn't account for the jump forward in time, thus making this particular week 1 hour shorter than normal?
DST problems can manifest in a lot of interesting ways. That's why it's important to use a datetime library that does DST safe arithmetic (like Temporal).
I didn't mean to imply nothing could go wrong (and a lot does go wrong; I once watched a Windows PC freeze completely exactly at the time of a DST shift!), but it does slightly reduce the number of ways non-defensive code can go wrong.
Not doing the shift on a weekday or on the last day of the month alone has probably prevented countless bank batch jobs from exploding.
This is an interesting question because it's the difference between clocks and calendars. If you add 7 calendar days it's 1pm. If you add 168 hours then it's 2pm. Generally calendar systems will try to encode reasonable semantics for this based on what a human would expect. In this case that's "add 7 days". Calendars and clocks are not equivalent.
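A sketch of that distinction using Temporal's arithmetic (again assuming the @js-temporal/polyfill package; the dates are just an illustrative US spring-forward week):

```typescript
import { Temporal } from "@js-temporal/polyfill";

// Friday 1pm in New York, two days before the 2024 spring-forward transition.
const friday = Temporal.ZonedDateTime.from("2024-03-08T13:00:00-05:00[America/New_York]");

// Calendar arithmetic: "the same wall-clock time, 7 calendar days later".
console.log(friday.add({ days: 7 }).toString());
// 2024-03-15T13:00:00-04:00[America/New_York]  -> still 1pm

// Clock (absolute) arithmetic: exactly 168 elapsed hours later.
console.log(friday.add({ hours: 168 }).toString());
// 2024-03-15T14:00:00-04:00[America/New_York]  -> 2pm, because that week was only 167 hours long
```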
Oh god your comment brought back so many horrible memories. At my last company we had a reporting system in NodeJS that ran daily reports for clients around the globe, and whoever set it up decided that the reports should be set to 00:00.
The amount of hell that ensued was never-ending. I'm not sure we ever truly fixed the issue, it's so hard.
There's still apps at my company that use the server's local time for everything, built in a pre-cloud world. Now we "cloud host" many of these, by running sets of VMs with their clocks set to different timezones and setting up each tenant on one matching their zone. Unfortunately, it's a product that makes a fair bit of money but isn't our focus (not much growth or new sales), so it's never been worth it to actually do something better.
It's not just the timezone/DST issues, either. Another product I've worked on did the "reports at midnight" thing, but as it grew it became impossible, as it took hours to complete all reports for all customers. So some "scheduled at midnight" reports would happen at 1:28am. This alone isn't a huge deal, but many reports were time-dependent (eg: you get different results running at 1:28 vs 0:00), and for a subset of those it was actually a big problem. So some got fixed by removing the time dependency (basically making the queries in the report take explicit end times vs using now()). Another workaround was to stagger times manually on other reports - run at 2:30am - but many customers would still pick 1:00 or 2:00 (which is of course someone else's midnight), and there were still big I/O bursts on :00, :30, :15 and :45, especially overnight.
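The rough shape of that fix, as a sketch (all names here are hypothetical; the point is only that the reporting window is passed in rather than derived from now() at run time):

```typescript
// Hypothetical shape of the fix (names are illustrative, not a real API): the reporting
// window is fixed when the job is scheduled, so a report meant for midnight but actually
// run at 01:28 still covers exactly the intended window instead of "everything up to now()".
interface DailyReportJob {
  customerId: string;
  periodStart: Date; // intended window boundaries, e.g. the customer's local midnight,
  periodEnd: Date;   // computed once at scheduling time, never at execution time
}

// Stand-in for the real data access layer.
async function queryEvents(customerId: string, from: Date, to: Date): Promise<string[]> {
  // e.g. SELECT ... WHERE customer_id = $1 AND ts >= $2 AND ts < $3
  return [];
}

async function runDailyReport(job: DailyReportJob): Promise<string[]> {
  // Nothing in here calls new Date() / now(); the window comes entirely from the job.
  return queryEvents(job.customerId, job.periodStart, job.periodEnd);
}
```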
My personal takeaway was to avoid building report systems like that, and try to do things like let the user pick "Daily" without picking the time-of-day. For time-sensitive "reports" it's been my experience that if you actually dig into the user needs, they're really looking for an "operational view", not a report. It requires specific dev work to build of course, but the end result is better both technically and for the customer, and is less overall work than all the fixes/workarounds needed when you try to shoehorn it in with all the other reports, especially as the usage grows.
> don't imply more precision in your internal data representation than was present in your data source
Matrix's e2ee message formats have a form of this issue. The AES IVs generated are only a subset of the possible IV space that the cryptography actually uses. This gets expressed in the JSON base64 string storing the IV always having an AAAAA prefix for the zero-padding.
I'm not sure if this is still true and I don't believe it was responsible for any security vulnerability due to how the IVs are used, but it's still a sloppy design.
In defense of Java's SimpleDateFormat, it's purposefully designed to be compatible with the ICU formatting.
The choice of 'Y' for the rarely used "week year" is unfortunate. For those unaware, in ISO-8601 every year has either 52 or 53 full weeks. If you want to format a date like "2023-W52-6" (the 6th day of the 52nd week of 2023) you would use "YYYY-'W'ww-u".
Unfortunately ICU's mistake here has spread to a lot of places because deferring to what ICU does is nearly always the correct decision even if you aren't actually using ICU.
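To make the divergence concrete, here is a TypeScript sketch that computes the ISO week-year by hand (this is not Java's formatter, just the underlying rule it implements):

```typescript
// ISO week-year: the year of the Thursday that falls in the same ISO week as the given date.
function isoWeekYear(date: Date): number {
  const d = new Date(Date.UTC(date.getUTCFullYear(), date.getUTCMonth(), date.getUTCDate()));
  const isoDay = d.getUTCDay() === 0 ? 7 : d.getUTCDay(); // Mon=1 ... Sun=7
  d.setUTCDate(d.getUTCDate() + (4 - isoDay));            // move to Thursday of this week
  return d.getUTCFullYear();
}

// Most days of the year, "Y" (week-year) and "y" (calendar year) agree...
console.log(isoWeekYear(new Date(Date.UTC(2024, 5, 15))));  // 2024 (June 15, 2024)

// ...but not near the year boundary: Mon Dec 30, 2024 already belongs to ISO week 1 of 2025.
console.log(isoWeekYear(new Date(Date.UTC(2024, 11, 30)))); // 2025
```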
How is a function that converts an internal date/time representation to an external/string-based one not "date/time stuff", regardless of the package it happens to be in?
java.time.format.DateTimeFormatter seems to still point the exact same loaded footgun (i.e. the format patterns "Y" and "y") at every junior developer that dares to use it, so I'm not sure that would have helped junior-me much.
The important information I was missing was "the week-based year and the year-of-era representations of one and the same date differ on some days, but not all days, and especially not those days you're currently using in your unit tests".
And even if it wouldn't – SimpleDateFormat was the non-deprecated standard library class I found at the time that seemed to do my job, so that was the one I used.
Have you ever used date-fns or Luxon? Although Moment is still popular, I thought it was succeeded by other libraries. Unless my memory is completely wrong, I thought Luxon might have been made by the creators of Moment?
Date-fns uses the native Date object which makes the same mistake, but then has a bunch of additional mistakes like not accounting for timezones and being painful to use. Date-fns can resolve some of these issues, but it's generally too simple to do a good job. It's great for manipulating timestamps in the user's local time zone, but beyond that it starts becoming difficult to get right.
Ruby on Rails (ActiveSupport::TimeWithZone) is a solid API--have never had a problem with it. Also looks like Elixir is good too.
One of the biggest problems with JavaScript's Date is that there is no way to change the timezone context--you are stuck with whatever the system timezone is.
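For example, with nothing but the built-in Date and Intl APIs:

```typescript
const d = new Date("2024-01-15T12:00:00Z");

// All the accessor methods read the *system* time zone; there is no way to ask
// "give me this instant's hours in Asia/Tokyo".
console.log(d.getHours());          // depends entirely on where the code runs
console.log(d.getTimezoneOffset()); // the system zone's offset, nothing else

// The only built-in escape hatch is formatting through Intl, which produces a string,
// not a Date in another zone.
console.log(d.toLocaleString("en-US", { timeZone: "Asia/Tokyo" })); // "1/15/2024, 9:00:00 PM"
```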
Beyond timezones and DST, which I don't use much, and mutability, which is already solved with dayjs, is there any problem with Moment? Really curious because I've used it in production for years without problems.