https://gist.github.com/timvisee/fcda9bbdff88d45cc9061606b4b...
Uhhh...
explains the naming conventions of every culture on the planet
Honestly, sometimes I think about the linear algebra, AI, or robotics I learned in school and get this feeling of, "Is this what I'm doing? Stuff that feels like it should be simple?"
It's funny, even our product manager - who is a great guy - can fall into that "come on, this should be easy" mode, and I'll admit I sometimes get lulled into it too. But to his credit, every time I walk him through the actual edge cases, he totally gets it and admits it's easy to forget the on-the-ground complexity when you're in 'planning mode'.
So yeah, seeing your comment is incredibly validating.
The first is significantly easier as it requires remembering only a single offset and then going with societal conventions.
Agree I've never met a farmer who cares about DST. Though also, for non-ag farmers, sometimes "crack of dawn" isn't early enough lol. Cow: "Dairy barn has electric lights, why aren't you awake at 4am tending to my needs, Human? Vacation? Lol no, stay here. Every morning. 4am."
What it is about is moving an extra hour of daylight from say 5:30am-6:30am (when it is only of use to farmers and a few other early risers) to say 7pm-8pm when 95% of the population is still awake.
Likewise, daylight saving time is a concept that had its uses, but makes less sense as technology progresses. I don't think even farmers care much about 7AM approximating sunrise and 6PM sunset.
Stargate SG-1 is one of my favorite instances of this. The first couple of episodes address the fact that the Earth characters do not speak the same languages as everyone else in the galaxy. Then, having established the point that A: the show runners understand this is an issue and B: it makes for a rather tedious watch, they moved on to "everyone speaks English" and we all breathed a sigh of relief. I just think of it as part of the "camera" now. It turns out that we don't necessarily want a truly literal recording of what such things would look like.
[1]: https://tvtropes.org/pmwiki/pmwiki.php/Main/CallARabbitASmee...
Like, in a story background I'm pushing around, there's a coalition of a large number of species that developed on different planets. And you're a military officer, and you need to coordinate shifts, but - assuming some collectively normalized number of hours - some of your tiny dudes are tuned to 3 hours of sleep, 3 hours of leisure and 3 hours of work, other weird dudes with 2 arms and 2 legs are tuned to 38-hour cycles, and some huge dudes with a trunk in their face are tuned to 356-hour cycles.
Even if you could train and adjust this by an hour or two (which, for the 3-hour dudes, would compare to an 8 earth-hour extension of duty for us), how the heck would you coordinate any kind of shifts across this? Or does every species have their own schedule? Good luck finding crossover meetings then. Some of the small guys would have to do overtime for longer meetings even.
But you have to make it a point of the story and the challenges if you want to include it. If it is just a weird side note, just say that they figured out a conversion and that's it.
If you’ve read David Weber’s Safehold series, this point gets super clear. It's written with names like "Zherald Ahdymsyn" (Gerald Adamson), but that makes it quite the slog for many.
You couldn’t translate that novel to Italian or Finnish, or any language with proper phonetical spelling.
"Armageddon" actually. Poignant because it's a movie about a nuclear ballistic submarine. But not a particularly non-English word.
That doesn't make it unusable as a cross galactic time unit, and I think the same goes for years and hours.
(... it's not exactly pi-squared because the French yanked it around a bit before settling into the modern number based on light in a vacuum and cesium atoms).
Currently, a Mars day is called a "sol", FWIW.
If we find other species out there I won't speculate on how they think about time.
I don't think that helps with the original concern.
"Our dates start x trillion rotations of pulsar y ago and our unit is defined as z wiggles of cesium" is a starting point.
But you can tell an alien species our units are expressed in multiples of that, and they can translate it into how theirs works. (Vinge, for example, has space-faring humans talk about "megaseconds" and "gigaseconds" rather than days/years.)
I take joy in exuberantly pushing back on their insistence on clinging to such archaic time units as "minutes", "hours", and "days", telling them to come back when they embrace kiloseconds. It is telling that most of my friends accept this with equal joy and laughter (:
It probably doesn't hurt that I've also spent time drilling metric conversions so that I can code-switch pretty seamlessly among units. Neurotic tendencies can have payoffs.
https://babylon5.fandom.com/wiki/Measurements_of_Time
Aliens use phrases like " 2 of your Earth days "
> The essence of this story doesn't lie in the quantity of bizarre terms we might have invented; it lies, rather, in the reaction of a group of people somewhat like ourselves, living on a world that is somewhat like ours in all but one highly significant detail, as they react to a challenging situation that is completely different from anything the people of Earth have ever had to deal with. Under the circumstances, it seemed to us better to tell you that someone put on his hiking boots before setting out on a seven-mile walk than to clutter the book with quonglishes, vorks, and gleebishes.
The Galactic Standard Calendar or Galactic Standard Time was the standard measurement of time in the galaxy. It was based on the Coruscant solar cycle. The Coruscant solar cycle was 368 days long with a day consisting of 24 standard hours.
60 standard minutes = 1 standard hour
24 standard hours = 1 standard day
5 standard days = 1 standard week
7 standard weeks = 1 standard month
10 standard months + 3 festival weeks + 3 holidays = 368 standard days = 1 standard year
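A quick sanity check of the arithmetic above (using the figures as quoted from the wiki):

```python
# Galactic Standard Calendar arithmetic, as quoted above
days_per_week = 5
weeks_per_month = 7
days_per_month = days_per_week * weeks_per_month      # 35 days
festival_days = 3 * days_per_week + 3                 # 3 festival weeks + 3 holidays
days_per_year = 10 * days_per_month + festival_days
print(days_per_year)  # 368
```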
I don't care how much they talk themselves down on their homepage, begging me to choose a different library - I like it and I'll continue using it.
> We now generally consider Moment to be a legacy project in maintenance mode. It is not dead, but it is indeed done.
> We will not be adding new features or capabilities.
> We will not be changing Moment's API to be immutable.
> We will not be addressing tree shaking or bundle size issues.
> We will not be making any major changes (no version 3).
> We may choose to not fix bugs or behavioral quirks, especially if they are long-standing known issues.
I consider this a strength, not a weakness. I love a library that's "done" so I can just learn it once and not deal with frivolous breaking changes later. Extra bonus that they plan to continue doing appropriate maintenance:
> We will address critical security concerns as they arise.
> We will release data updates for Moment-Timezone following IANA time zone database releases.
I asked them why they couldn't use DATEDIFF, since this was in a SQL db.
They said they hadn’t heard of it and that it must be new.
2025-01-01 - 2024-12-31 = 20250101 - 20241231 = 8870
i.e. 90 months and 10 days
or 7 years 6 months and 10 days
How is that the same thing as one day?
2025-02-01 - 2025-01-31 = 20250201 - 20250131 = 70
When is 30 days after today? 25206+30
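The failure mode being described, subtracting dates as YYYYMMDD integers, is easy to demonstrate against real date arithmetic. A minimal Python illustration:

```python
from datetime import date

# Naive "subtraction" of YYYYMMDD integers
naive = 20250101 - 20241231
print(naive)  # 8870 -- not a day count in any sense

# Actual calendar arithmetic
real = date(2025, 1, 1) - date(2024, 12, 31)
print(real.days)  # 1
```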
But subtle plug of something I made long ago for when you find your data pipelines are running hot parsing timestamp strings etc: https://github.com/williame/TimeMillis
I’m still pumped by the performance of the thing! :)
https://learn.microsoft.com/en-us/dotnet/api/system.globaliz...
https://learn.microsoft.com/en-us/windows/apps/design/global...
I encourage everyone to learn how to parse the Japanese calendar format.
The more people know the better!
The benefit of DIY parsing is to make the problem simple by restricting it to the set of plausible inputs your users will want your code to handle, not to make a highly general library. The right takeaway for juniors is to stop over-complicating things.
This is spot on. So many of the "X is really hard and your intuition is wrong" takes ignore the fact that most people are not building something which needs to be usable in every country, language, and culture on this earth. Yes, human behavior is insanely complex, but for any given application you can probably ignore huge swathes of it.
Make your own load balancer software
Make firewall software
Make a date parsing library
Attempt to verify an email with a regular expression.
Accept people's names
Anything
.+@.+
That one always seemed sufficient for me; every issue after that is the user's problem.
While I am fairly sure this is a locale-defined thing, locales are this huge pile of worms and I have never figured out how to change it to show YYYY-MM-DD format.
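For reference, the permissive pattern above is trivial to apply; anything stricter quickly runs into the full RFC 5322 grammar. A sketch:

```python
import re

# Deliberately permissive: just "something@something"
EMAIL = re.compile(r".+@.+")

print(bool(EMAIL.fullmatch("alice@example.com")))  # True
print(bool(EMAIL.fullmatch("not-an-email")))       # False
```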
“Simple” example that anyone who’s ever worked on a scheduling application will probably be familiar with:
“Get a list with all of today’s events.”
Well, whose “today” (timezone) are we talking about? Server, client, setting in the user account? Or none of the above, and actually timezone at the physical location of the event, if there is one?
And what does “today” mean, anyway? Truncate the date? 00:00-23:59? Business hours?
And what does “today’s event” even mean? Events can cross midnight… Does an event need to start today? End today? Both? Can events span multiple days?
The fun never ends!
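Even the narrowest reading of "today's events" - events within one calendar day in one named timezone - already needs explicit timezone handling. A sketch with Python's zoneinfo (the zone names are just examples):

```python
from datetime import datetime, time, timedelta
from zoneinfo import ZoneInfo

def today_bounds(tz_name: str, now_utc: datetime):
    """Start and end of 'today' in tz_name, as timezone-aware instants."""
    tz = ZoneInfo(tz_name)
    local_now = now_utc.astimezone(tz)
    start = datetime.combine(local_now.date(), time.min, tzinfo=tz)
    return start, start + timedelta(days=1)

now = datetime(2024, 6, 1, 1, 30, tzinfo=ZoneInfo("UTC"))
# The same instant falls on different calendar days in different zones:
print(today_bounds("Asia/Tokyo", now)[0].date())        # 2024-06-01
print(today_bounds("America/New_York", now)[0].date())  # 2024-05-31
```

And this still dodges every other question in the comment above: midnight-crossing events, multi-day events, business hours, and so on.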
You just need to understand how time works if you write code handling time.
And still, you regularly run into issues, because our API or a third party did something silly
So go ahead, write your own date library, your own Unicode font rendering, compiler, OS, game engine or whatever else people tell you to never do because it's hard.
all the people who say C is not safe have downvoted me for quoting them
The messaging here is that you should be careful about using what you build on your own because it:
- hasn't been battle tested
- likely has bugs
- isn't mature
The only way that it will be all of those things is if someone invests time and energy in them.
From an ecosystem perspective this is absolutely the right thing. You want duplicate projects. You want choice. You want critical knowledge to be spread around.
I see it as “Don't write your own X, unless you want to maintain X. Here be dragons: this problem is deeper than it appears; the first 80% will be easy, the next 15% will annoy you, and the last 5% will consume your life for weeks, months, or even years. Or you could use a library”
The point is, before you release your new thing, make sure it addresses all of the pain points the previous solutions have already slogged through,
or that if it doesn't, people are still aware of when they can arise, and why your thing has chosen not to mitigate them yet,
or ever, if it's an opinionated piece of tech.
The things that people write that everyone uses have had HUGE exposure.
They've been exposed to all the edge cases, they've been tested millions, if not billions of times. All the bugs ironed out.
The people who've worked on them are now the greatest domain experts on that little corner of comp-sci.
Yours won't unless it hits prime time.
So yours will be weak, brittle and dangerous.
Most of the time you build something else.
Like, if you build a todo app and have to deal with scheduling, you don't spend time making a date library, because that's not your goal. But people do exactly that.
Heck, most developers, instead of starting a blog on a blogging platform, start writing code for their own blogging engine.
Having said that, I don't think date libraries are hard, I think they're messy. Mostly because humans keep introducing convenience fudges - adding a second here, taking eleven days off there, that kind of thing.
See the number of unit tests in the Linux kernel, for example.
You could run a fuzzer against two libraries at the same time to find discrepancies....... hmm. That might actually be a good exercise.
You can have reasonable confidence that here there be dragons, but not so much that your assumptions about something will hold.
Messy is just a particular kind of tedious, which is the most common form of hard.
It's not like typical things that need doing tend to include solving lots of unsolved problems.
2) Many people (especially the elderly) take enough medications on different schedules that managing them all would be a significant cognitive load for anyone
It’s just an illustrative example, though. My point is getting dates right (including parsing their string representations) often matters quite a bit. If you disagree, let’s argue about that rather than quibble about the minutiae of the example
My most valuable resource is time. Sure, I could learn more low-level aspects of my craft ... and sometimes I find it useful to do so.
When I focus on doing the hardest, already solved things by re-implementing them my own way, what value am I adding?
I've never met a client who cared about a library or how I did something in code - until it broke. Then, they didn't care who wrote it, they just cared it started working again.
If you wanted to grow your wood, plane and dry it yourself, etc... then you'd be "hard way" building a table.
I assume you use tools?
We aren't talking about the same thing: I stated "I don't think tables are the hard thing."
Note the word "the" in front of "hard thing" -- I'm referencing the article we're discussing, which mentions "the hard thing"
There is difference between “never build your own for a professional need” and “never build your own”.
I build my own stuff if it is for my own purposes, and I use proper tools and libraries for my professional work.
There is a difference between things that are difficult and things that just take a lot of work.
[0] https://www.space.com/astronomy/earth/earth-will-spin-faster...
I agree with you though: do the hard things; even if it doesn't work 100% right, you will have learned a lot. In university I had to implement all of the standard template library data structures and their features. It wasn't as robust as the actual STL, but the knowledge of how those work under the covers still comes up in my day-to-day job.
Just, why.
If you want to build it to scratch an itch, go ahead. If you want to build it for fun, go ahead. If you want to build it because an existing solution gets something wrong and you can do better, go ahead (but know that it is a way bigger undertaking than you might assume at first glance).
The real advice is “don’t casually build your own X”, but that’s less punchy.
It's not interesting, it's not fun, it's just a process of getting complaints it's wrong in edge cases and then fixing them, over and over until no one can find another broken edge case.
You can start by writing down England is +0, Germany is +2, etc... someone's going to mention DST and you'll put in a field for switching on the Nth Sunday of month X... later you'll run into a country that uses a different rule and you'll add a bunch of spaghetti code or write a Turing-complete DSL, etc... One day someone tells you about a village where they count 17 hour days on seashells and then you'll give up.
And if your DB doesn't produce identical results to the Olson DB in all cases then you created an incompatibility anyway. Might as well just use the Olson DB.
It's all about the nuisance created by human behavior. Calendars, DST, timezones: all the problems you never imagined can happen, and they only show up in real-life scenarios. You will hit the same problem, struggle, then find out it was solved long ago by a mature library, and the solution doesn't require any smart or advanced technique; it's just another corner case.
Firstly because I have a great imagination, but secondly because I am old and have a lot of real life scenarios to think about.
State-of-the-art here has changed a few times in my professional career: Once upon a time, most time/date libraries used a single integral type and tried to make it do double duty as both interval and absolute (whatever that means) time, by taking the interval from an epoch.
Relatively recently however, that's started to change, and that change has been made possible by people using languages with better type systems reinventing the date/time approach. This has led to fewer bugs, and more predictability with regards to calendar operations in different programs.
But bugs still happen, so this approach is still unsatisfying. One thing I keep having to worry about is distance; I record RTT as part of my events, since when I am looking for contemporaneous events, the speed-of-light actually tends to be a real factor for me.
So I don't think this is solved simply because my problems aren't solved by existing libraries, and I keep getting into arguments with people who think GMT=TAI or something dumb like that.
It's not "all about" anything: Nobody knows shit about what's happening in the next room over, and if there are 12 different date/time libraries now, I guarantee there'll be a 13th that solves problems in all of them, and is still incomplete.
Datetime libs themselves? No thanks.
The only way you understand X is by making your own X and trying to support it for a few decades, and our industry needs more people who understand X; fewer who just ask chatgpt/stackoverflow/google for "the answer".
If you work for a company building a todo app, most likely it will not be beneficial to implement an in-house library, because there is other work that will bring much more value.
You don't have two years to cover all the hard stuff, because you have to build synchronization of tasks between devices, and your boss most likely won't appreciate the detour.
"Never roll your own cryptography" is always used in the context of building another application; it is never "don't become a cryptography specialist".
And I don't want to pay my employees to learn; I want to pay them to produce output I can sell.
Doing hard things is good if the hard thing has never been done before, like going to the moon.
Doing hard things that have been done, just not by you, is not good unless it's for "entertainment" and personal-development purposes, which is fine, and I encourage people to do it, on their own dime. Like climbing Mount Everest, or going to the south pole.
But if you are doing a project for someone else, you don't get to piggy back your personal wants and desires unrelated to the project on to it.
Getting better at your job is not just a "personal want" but very much something that the employer appreciates as well.
Of course reinventing the wheel isn't good in a corporate setting, because the reinvented wheel is buggier than the ready-made npm package, but employers should go out of their way to find hard problems that they can pass to their employees. It's called a growth opportunity.
This can be a bad local optimum. It probably depends on what exactly your business does, but it can make sense to pay an employee to acquire knowledge and skills that are needed in the business. You can't buy this off the shelf in all circumstances. Of course, it also has to make economic sense and be viable for the company. Unfortunately, I often see employees doing things quite badly that they don't really understand because they are not given the opportunity to learn properly. I can't imagine that this burns less money in the medium and long term than giving paid employees adequate space to learn.
What a silly example. ASML is valuable because it does something no one else does. It's not because it's hard, it's because they have the know-how and infrastructure to do it whereas others don't.
Juggling is hard. Do you know any millionaire jugglers?
You should try instead.
I bought juggling balls for my team and in a few weeks we had 4 or 5 fluent jugglers.
For some of the stuff that has been done already, it might still make sense to do your own implementation, for example if you want to be able to experiment without having to navigate and learn a huge codebase and then have to maintain a fork just to have your own stuff in.
Another project we are starting now involves replacing software which is outright crappy and wastes our time. Thankfully my employer was able to see and understand this after talking it through with them.
Then how do you expect them to learn?
Good luck getting more blood out of that stone, smh.
There are things which, once worked through, will make your mind click into another way of comprehending a problem space and how to navigate through it.
And there are things which are hard due to the pure accumulation of concurrent conventions, because coordinating the whole of humanity toward harmony, with the full, happy, peaceful agreement of everyone, is tricky.
Handling dates is rather the latter. If you dig in the lucky direction, you might also fall into cosmological considerations, which are a rabbit hole of their own, but basically that's it: calendars are a mess.
That's perfectly fine. Your time, your hobbies.
> Nobody wants people who can do easy things, people want people who can do hard things.
No, not really. People want people who do easy things, because they are clever enough to avoid needlessly wasting their time on hard things they could easily have avoided.
It's this kind of foolish mindset that brought us so much accidental complexity and so many overdue projects. There's a saying: work smart instead of working hard.
> So go ahead, write your own date library, your own Unicode font rendering, compiler, OS, game engine or what ever else people tell you to never do because its hard.
You can cut it out, this isn't LinkedIn.
OTOH writing, e.g., your own renderer could cause some funny display at worst and maybe some unnecessary effort.
Due to my work I rely on web scraped data for cybersecurity incidents. For Amazon Linux, they are disclosed with the fvcked up US datetime format (Pacific Time) and not in ISO8601 formatted strings which could imply Juliet/Local time.
In 2007 there was a new law that changed when Pacific Time enters/leaves Daylight Saving Time. Instead of fixing this to a specific day of a specific month in numbered form, like say "YYYY-03-01 to YYYY-11-01", they literally wrote the law as "second Sunday in March" to "first Sunday in November". Before 2007 it was "first Sunday in April" to "last Sunday in October".
I'm not making this shit up, go ahead and read the law, come back and realize it's even more complex for other timezones, because some nations seem to make fun of this by going to +14:00 hours and -11:30 hours depending on the president's mood on Christmas or something.
In order to find out the day of the week for a specific calendar date, there's this cool article about Determination of the day of the week [1], which is quite insane on its own already. There is no single canonical algorithm; each method of determining the day of the week has its own tradeoffs (and implied computational complexity).
Then you need to get all the Sundays of a month, count to the right one depending on the year, map the date back to ISO 8601, and then you know whether or not it was daylight saving time they were talking about. Also make sure you use the correct local time to shift by, because that changed in the law too (from 02:00 LST to 03:00 LDT and 02:00 LDT to 01:00 LST before, to 02:00 LST to 03:00 LDT and 02:00 LDT to 01:00 LST after the changes).
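The "nth Sunday of a month" step is a few lines with the stdlib; a sketch in Python (a hypothetical helper, not the author's Go code):

```python
import calendar
from datetime import date

def nth_weekday(year: int, month: int, weekday: int, n: int) -> date:
    """n-th occurrence of a weekday in a month; weekday: Monday=0 ... Sunday=6."""
    days = [d for week in calendar.monthcalendar(year, month)
            for i, d in enumerate(week) if d and i == weekday]
    return date(year, month, days[n - 1])

# US DST since 2007: starts second Sunday in March,
# ends first Sunday in November.
print(nth_weekday(2024, 3, 6, 2))   # 2024-03-10
print(nth_weekday(2024, 11, 6, 1))  # 2024-11-03
```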
Took me over 4 fvcking weeks to implement this in Go (due to lack of parsers), and I hate Amazon for this to this date.
PS: Write your own Datetime parser, this will help you realize how psychotic the human species is when it comes to "standards". After all this I'm in huge favor of the Moon Phase based International Fixed Calendar [2]
[1] https://en.wikipedia.org/wiki/Determination_of_the_day_of_th...
[2] https://en.wikipedia.org/wiki/International_Fixed_Calendar
I wrote my own, so had to click, but mine was for a very different use case: converting extremely varied date strings into date ranges,
where a significant % of cases are human-entered, human-readable date and date-range specifiers, as used in periodicals and other material dating back a century or two.
I.e. I had to correctly interpret not just ISO dates but ambiguous dates and date ranges as accepted in (library catalog) MARC records, which allow uncertain dates such as "[19--]" and "19??", and natural-language descriptors such as "Winter/Spring 1917" and "Third Quarter '43" and "Easter 2001". In as many languages as possible for the corpus being digitized.
Output was a date range, where precision was converted into a range. I'd like to someday enhance things to formalize the distinction between ambiguity and precision, but that's a someday.
When the schema is totally uncontrolled, many cases are ambiguous without other context (e.g. XX-ZZ-YYYY could be month-day-year or day-month-year for a large overlap); some require fun heuristics (looking up Easter in a given year... but did they mean Orthodox or...?) and arbitrary standards (when do seasons start? what if a publication is from the southern hemisphere?) and policies for illegal dates (February 29 in non-leap years being a surprisingly common value)...
In a dull moment I should clean up the project (in Python) and package it for general use...
Some of the things in the standard are surprising, like maybe they were a special request. At the time, I commented something like: Somewhere, in the French countryside, there is a person who runs an old family vineyard, who is still stamping their barrels with the timepoint information [...]. And that person's lover was on the ISO 8601 committee.
(I once wrote an time library in Scheme that supported everything in ISO 8601. It did parsing, representation, printing, calendar conversion, and arithmetic. Including arithmetic for mixed precision and for relative values. It was an exercise in really solving the problem the first time, for a core library, rather than cascading kludges and API breakage later. I don't recall offhand whether I tried to implement arithmetic between different calendar systems, without converting them to the same system.)
e.g. dd/mm/yyyy (British) and mm/dd/yyyy (USA) can be confused for the first twelve days of every month.
So, given the high volume of international communication, I think we should hand-write months in full, or at least as the first three letters (Jan, Feb, Mar, ..., Dec)
We should also abandon three-letter acronyms (but that's another story).
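The dd/mm vs mm/dd confusion mentioned above is mechanically detectable: a numeric date is only decidable when one field exceeds 12. A tiny check (a sketch, not a parser):

```python
def is_ambiguous(a: int, b: int) -> bool:
    """True if a/b/yyyy could be read as either day/month or month/day."""
    return 1 <= a <= 12 and 1 <= b <= 12 and a != b

print(is_ambiguous(4, 5))   # True  -- 4 May or April 5?
print(is_ambiguous(13, 5))  # False -- must be day 13 of May
```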
https://github.com/macintux/jam
I’d like to get back to it. If nothing else, I dearly miss using Erlang.
I tried it. I will never try again unless I take, like, six months to plan out how the system will work before I even write a single line of code.
> As an aside, this search has made me tempted to ask: do we need to keep Dual publishing packages? I prefer ESM over CJS but maybe just pick one?
Pick ESM. CJS doesn't work in browsers without being transformed to something else. Node can require(esm) now, so it's time to ditch CJS completely.
IMHO needing to handle multiple, possibly obscure, date formats simultaneously is nearly never a problem in practice.
The bundle size reductions are impressive (230kB client-side savings!), and your RFC 9557 alignment is a smart forward-looking move. Two questions:
- Edge cases: How does your parser handle leap seconds or pre-1582 Julian dates (e.g., astronomical data)?
- Temporal readiness: Will @11ty/parse-date-strings become a temporary polyfill until the Temporal API stabilizes, or a long-term solution?
Minor observation: Your comparison table shows Luxon supports YYYY-MM-DD HH (space separator) while RFC 9557 doesn't. This might break existing Eleventy setups using space-delimited dates. Maybe worth an explicit migration note?
Regardless, fantastic work balancing pragmatism and standards. The web needs more focused libraries like this!
I like such subtle branding. I will try 11ty when I need a static site generator.
All engineers please follow this example when you want to promote your product, even when you don't want to promote your product.
there is so much wrong with this paragraph, it's a nest of people who shouldn't work on date parsing. there is no way 200 is any kind of date, but if you're going to insist it is, 2000 to 2010 is 11 years unless "to" means "up to but not including" in which case it should say 2001 to 2011 if you want to refer to the 200th decade, since decade 1 was 1AD through 10AD...
there is no saving this post
This is obviously wrong by induction.
If 2000 to 2010 is 11 years, then:
2000 to 2009 would be length 10 years
...
2000 to 2001 would have length 2 years
and finally 2000 to 2000 would be a span lasting "1 year".
But any span with the same start and end point must have length zero, it's nonsensical to have a system without that property.
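The fencepost distinction in two lines of arithmetic: inclusive counting touches 11 calendar years, but the span length is 10.

```python
years = list(range(2000, 2011))  # 2000 through 2010, inclusive
print(len(years))   # 11 calendar years touched
print(2010 - 2000)  # 10 -- the length of the span
```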
As for the spec, ISO 8601 defines a decade as a period of 10 years starting with a year divisible by 10 without a remainder.
Decade 1 is year(s) 10 through 19.
“Never write your own x” kind of titles come off as arrogant and demotivating.
Maybe some other person will write an excellent date parsing library that is better than the current ones? Maybe they think it is worth spending some time on it?
These kinds of hard things tend to have libraries that are extremely bloated because everyone uses one library, and that one library has to work for everyone’s use case.
You can see this in the post too, not everyone needs to be able to parse every single date format.
There may be some obscure cases.
Like, for example, let's say you are writing very performance-sensitive code where nanoseconds count. All of the date parsing libraries available for your language are too slow for your requirements. So you might roll your own lighter-weight, faster one.
That's what it took to not write my own date parsing library.
If you wrote your own date parsing library, there would be six ISO 8601 parsers available, all bad.
You should feel grateful for not having wasted your time.
There's a need for a standard that locks down the commonly used variants of it and gets rid of all the ambiguity.
For me, timestamps following the pattern 'YYYY-MM-DDThh:mm:ss.xxxxxZ' are all that I use and all my APIs will accept/produce. It's nice that other legacy systems exist that don't normalize their timestamps to UTC for whatever reason, that consider seconds and fractions of a second optional, etc. But for unambiguous timestamps, all I want is this. It's fairly easy to write a parser for it; a simple regular expression will do the job. Of course, add unit tests. Technically the Z is redundant if we can all agree to normalize to UTC, which IMHO we should. In the same way, the T and the separators are redundant too, but they are nice for human readability.
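A sketch of that single-variant parser (the pattern is an illustration; it checks shape, and relies on strptime to reject impossible calendar values):

```python
import re
from datetime import datetime, timezone

# Only accept the one normalized UTC shape: YYYY-MM-DDThh:mm:ss.<frac>Z
TS = re.compile(r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+Z$")

def parse_utc(s: str) -> datetime:
    if not TS.match(s):
        raise ValueError(f"not a normalized UTC timestamp: {s!r}")
    return datetime.strptime(s, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)

print(parse_utc("2024-06-01T12:30:00.12345Z"))
```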
You can use datetime libraries that are widely available to localize timestamps as needed in whatever way is required locally. But timestamps should get stored and transmitted in a normalized and 100% unambiguous way.
It's only when you get into supporting all the pointless and highly ambiguous but valid variants of ISO 8601 that parsing becomes a problem. There's actually no such thing as a parser that can parse all valid variants with no knowledge of which variant is being used. There are lots of libraries with complex APIs that support some or all of the major and minor variants of course. But not with just one function called parse().
I think the main challenge with ISO 8601 is that it never called out this variant as a separate thing that you should be using. This really should be its own standard. And not using that would be a mistake. ISO 8601 is what happens when you do design by committee.
If by "timestamp" you mean past dates and deterministic future dates, then agreed. (Although I prefer unix epoch in ms for those, to be able to use integers and skip string parsing steps completely.)
But if you're unlucky enough to need to handle future dates, especially "clock on the wall" ones ("let's meet in Frankfurt on July 26th 2029 at 1pm"), then you just can't know the UTC offset in advance. The reasons can be many, mostly political; in this particular case, there's a high probability that the EU will have removed daylight saving time by then.
So in those cases, if you want to be correct, you'd need to include the geolocation in the stored timestamp.
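One way to sketch that (Python; here an IANA zone name stands in for the stored geolocation, and the wall-clock fields are resolved to an instant only at read time, since the rules may have changed by then):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Store the local wall-clock string plus a location-derived zone name,
# NOT a UTC instant: the zone's rules in 2029 are not knowable today.
stored = ("2029-07-26T13:00", "Europe/Berlin")

def resolve(local_str: str, zone: str) -> datetime:
    # Attach the zone at read time, using whatever rules apply then.
    return datetime.fromisoformat(local_str).replace(tzinfo=ZoneInfo(zone))
```

The key property is that the stored value still says "1pm on the wall in Frankfurt" even if the EU abolishes DST before 2029; only the resolved UTC instant changes.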
Introducing: the ip:port,date/time,longlat string. Oh yes, its format is also dependent on the language you encode it in and on which parts you leave out to be defaulted. `.:,` is now a valid locationdateip
kccqzy•15h ago
That's because the developers use datetimes (aka timestamps) to store a single date. Just pick an arbitrary epoch date (such as January 1, 1900 as used by Excel, or my favorite, January 1, 1600, since 1600 is a multiple of 400, making leap year calculations even simpler) and store the number of days elapsed since then. The rules involving leap years are much, much simpler than rules involving timezones and timezone databases. The translation from/to this representation to a broken-down y/m/d takes only ~50 lines of code anyway.
Of course if you don't need to do arithmetic on dates, just store three numbers, year, month, and day.
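A minimal sketch of that representation (Python; leaning on the stdlib's proleptic Gregorian calendar instead of hand-rolling the ~50 lines, and using the 1600-01-01 epoch suggested above):

```python
from datetime import date, timedelta

EPOCH = date(1600, 1, 1)  # 1600 % 400 == 0, so the leap cycle starts fresh

def days_since_epoch(y: int, m: int, d: int) -> int:
    # Dates become plain integers: comparisons and differences are trivial.
    return (date(y, m, d) - EPOCH).days

def ymd_from_days(n: int) -> tuple[int, int, int]:
    # Broken-down form recovered only when a human needs to read it.
    dt = EPOCH + timedelta(days=n)
    return dt.year, dt.month, dt.day
```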
happytoexplain•15h ago
In this case, I'd suggest storing what you mean (the user wasn't born 9,487 days after Jan 1 1970. They were born Dec 23, 1995.)
Storing the literal units (and ONLY the relevant units), as the parent has, is robust and logically+semantically correct (they could add a translation layer for UX so the user doesn't have to be particular, but that's beside the point). Whether you use a string or a struct or some date-only type is moot, as long as you're literally storing the year, month, and day, and only those three things. You can ephemerally convert it to your platform's date type if you need to.
PaulHoule•14h ago
https://en.wikipedia.org/wiki/Julian_day
which are the moral equivalent of Unix timestamps with a different offset and multiplier. These work OK for human history but will break if you go far enough into the past or the future because uncertainty in the earth's rotation adds up over time.
If you don't care about timezones, timezones may still care about you. If you want to minimize trouble, it makes sense to properly use timezone-aware Zulu (GMT) dates for everything if you can.
In certain cases, say you're doing data analysis or building an operational database for throttling access to an API, and you know there are 16 bits' worth of days, hours, or 5-minute periods, it can make sense to work relative to your own epoch.
habibur•14h ago
You need to deal with 1600 and 2000 being leap years,
while 1700, 1800, and 1900 are not.
I limit dates to the range 1900–2100, so every year with year % 4 == 0 is a leap year.
Things get especially tricky when you try to convert an int_date back to y, m, d.
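The two rules side by side (Python; the function names are mine), showing why the 1900–2100 restriction makes the shortcut safe:

```python
def is_leap_full(year: int) -> bool:
    # Full Gregorian rule: every 4th year, except centuries,
    # except every 400th year.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def is_leap_simplified(year: int) -> bool:
    # Valid only for 1901..2099: the only century in range is 2000,
    # which is divisible by 400 and therefore a leap year anyway.
    assert 1900 < year < 2100
    return year % 4 == 0
```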
jerf•14h ago
It's a wacky rule for sure.
2000 was fun. Everyone knows about "unless % 4", but there was also an interesting and very vocal set of people who knew about the "unless % 100" but somehow knew that without knowing about the "unless % 400" part. A very specific level of knowledge.
oneshtein•3h ago
Just use a simple database as the source of truth, with all days passed since a start of human history (e.g. 6,000 years ago), with labels such as "day 12345678 was known as XXXX-xx-xx in these regions, also known as YYYY-yy-yy in those regions, also known as ZZZZZ in this specific region". It's not a hard task to automatically compress such a database into a compact representation.
pavel_lishin•14h ago
Didn't the article explicitly tell us not to write our own date parsing library?
mattkrause•15h ago
I'd hate it less if typing updated the widget.
8organicbits•15h ago
Is this DD/MM/YYYY or MM/DD/YYYY? I can tell from the 18 that it's the latter, but that convention isn't universal. I'd recommend YYYY/MM/DD as a less ambiguous format, but I don't have a perfect answer.
PaulHoule•14h ago
https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...
or a text field with some code that converts vernacular dates to a structured format. I don't think users are going to be too weirded out by seeing "1997-04-15", and they will probably learn to use it natively.
The hard part is that a lot of devs aren't aware that there's a standard and that standard is superior to the alternatives.
homebrewer•14h ago
https://ijmacd.github.io/rfc3339-iso8601/
dragonwriter•14h ago
Strictly, it is the extended form of the ISO 8601 calendar date format. (The basic format has no separators.)
Since ISO 8601 allows any of its date formats (calendar date, week date, or ordinal date) to be combined with a time representation for a combined date/time representation, it is inaccurate both to call any of the date formats part of the time format and to call the calendar date format the one format used in the combined date/time format.
(There's a reason why people who want to refer to a simple and consistent standard tend to choose RFC-3339 over ISO 8601.)
PaulHoule•15h ago
I faced a similar problem with a form where people were supposed to submit a date and probably not aware of what timezone was involved. I figured that so long as they selected "02/28/1993" and people always saw "02/28/1993" that was correct and if they ever saw it differently it was wrong. So I used non-TZ aware dates throughout the whole system.
kaoD•15h ago
Temporal has other cool types, each with distinct semantics:
- Instant: a fixed point in time with no calendar or location. Think e.g. "the user logged in at X date and time" but valid across the world for any timezone or calendar system. This is what we usually use "Unix UTC timestamps" for.
- ZonedDateTime: like an Instant but associated with a particular calendar and location. Think an Instant but rendered "real" into a calendar system and timezone so the user can see a meaningful time for them.
- PlainDate: already discussed. Think e.g. birthdates.
- PlainTime: think "run task every day at 6:30pm".
- PlainDateTime: like an Instant but associated with a calendar system, but no timezone. Think e.g. what a user would insert in a datetime picker, where the timezone is implied instead of explicitly selected.
- PlainYearMonth: think e.g. "we'll run our reports during October 2025".
- PlainMonthDay: think e.g. "my birthday is June 13".
- Duration: think e.g. "the task ran for 3hrs 30min".
Also see its important concepts[2].
[0] https://tc39.es/proposal-temporal/docs/
[1] https://tc39.es/proposal-temporal/docs/#Temporal-PlainDate
[2] https://tc39.es/proposal-temporal/docs/timezone.html
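Temporal is a JavaScript API, but the Instant/ZonedDateTime distinction maps onto Python's stdlib types as a rough analogy (this mapping is mine, not part of Temporal):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# "Instant": one fixed point in time, no local meaning attached yet
instant = datetime(2025, 7, 1, 12, 0, tzinfo=timezone.utc)

# Two "ZonedDateTime" renderings of the same instant
berlin = instant.astimezone(ZoneInfo("Europe/Berlin"))
tokyo = instant.astimezone(ZoneInfo("Asia/Tokyo"))

# Same instant, different wall-clock readings:
# berlin.hour == 14 (CEST, +02:00); tokyo.hour == 21 (JST, +09:00)
```

Similarly, PlainDate ≈ `datetime.date`, PlainTime ≈ `datetime.time`, and PlainDateTime ≈ a naive `datetime`, though Temporal's types make the distinctions explicit instead of relying on a missing `tzinfo`.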
Terr_•11h ago
As-is, we assume lots of things are facts and we just hope it's true enough to avoid problems. (Starting with the business requirements. :p )
geocar•3h ago
It might also be relevant: Ever ask an older Korean person their age?
> Instant: a fixed point in time with no calendar or location. Think e.g. "the user logged in at X date and time" but valid across the world for any timezone or calendar system. This is what we usually use "Unix UTC timestamps" for.
This is not a thing. Those are intervals to some Epoch, maybe taking into account leap-seconds and maybe not. They are not very useful except grossly over long ranges.
> - ZonedDateTime: like an Instant but associated with a particular calendar and location. Think an Instant but rendered "real" into a calendar system and timezone so the user can see a meaningful time for them.
Like when a user logged in at X date and time. They don't do this from no location, but from some location.
> - PlainDate: already discussed. Think e.g. birthdates.
And already wrong.
> - PlainTime: think "run task every day at 6:30pm".
Erm no. You can say 18:30 hours after midnight, or you can say when the calendar says 6:30pm, but these are different things. Imagine the poor fool who wants to run the task every day at "1:30am" and has it run twice on some days.
Bars close in some parts of the world at 30h (30時) to mean 6am the following day.
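The 1:30am hazard is easy to demonstrate with Python's `fold` attribute (the chosen date assumes the current US DST rules):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")
# On 2025-11-02 the clocks fall back, so 1:30am happens twice.
first = datetime(2025, 11, 2, 1, 30, tzinfo=tz, fold=0)   # EDT, -04:00
second = datetime(2025, 11, 2, 1, 30, tzinfo=tz, fold=1)  # EST, -05:00
# Same wall-clock reading, one hour apart as instants.
```

A scheduler keyed on "1:30am local" that doesn't account for this will fire twice on that day (and never at all on the spring-forward day, when 2:30am doesn't exist).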
> - PlainDateTime: like an Instant but associated with a calendar system, but no timezone. Think e.g. what a user would insert in a datetime picker, where the timezone is implied instead of explicitly selected.
No, like a string.
> - PlainYearMonth: think e.g. "we'll run our reports during October 2025".
Nonsense. Also a string.
> - PlainMonthDay: think e.g. "my birthday is June 13".
Your birthday might be on the 29th of February. You cannot do reasonable arithmetic with such things, so it might as well be a string like many of these others.
> I like how Temporal[0] does this.
I don't, if you can't tell. This stuff is complicated, and I'd like more people exploring it because I don't know The Right Answer™ either; but I know enough to know that every existing solution is wrong in some way that can cause real harm.
dragonwriter•14h ago
You seem to have a reasonably expedient solution for that problem, but it's surprising that the combination of things you'd have to have, and things you'd have to be missing, to hit that problem exists in the first place.
legulere•12h ago
FHIR, in my opinion, has a pretty good system for dates (including birthdates): YYYY, YYYY-MM, or YYYY-MM-DD. (Not knowing one's exact birthday is common in some countries.)
https://build.fhir.org/datatypes.html#date
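A sketch of parsing that family of partial dates (Python; the FHIR spec's actual regex is stricter about digit ranges, so treat this as an approximation):

```python
import re

# YYYY, YYYY-MM, or YYYY-MM-DD; finer parts are simply absent, not defaulted
FHIR_DATE = re.compile(r"^(\d{4})(?:-(\d{2})(?:-(\d{2}))?)?$")

def parse_fhir_date(s: str):
    m = FHIR_DATE.match(s)
    if not m:
        raise ValueError(f"not a FHIR date: {s!r}")
    y, mo, d = m.groups()
    # None (rather than a filled-in 01) preserves "we don't know".
    return int(y), int(mo) if mo else None, int(d) if d else None
```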