
I spent a year making an ASN.1 compiler in D

https://bradley.chatha.dev/blog/dlang-propaganda/asn1-compiler-in-d/
174•BradleyChatha•4h ago

Comments

BradleyChatha•4h ago
In short: I wanted to talk a bit about ASN.1, a bit about D, and a bit about the compiler itself, but couldn't think of any real cohesive format.

So I threw a bunch of semi-related ramblings together and I'm daring to call it a blog post.

Sorry in advance since I will admit it's not the greatest quality, but it's really not easy to talk about so much with such brevity (especially since I've already forgotten a ton of stuff I wanted to talk about more deeply :( )

whizzter•2h ago
As someone who had the displeasure of working with ASN.1 data (yes, certificates), I fully sympathise with the anguish you've gone through (that 6 months of Ansible HR comments cracked me up too :D ).
BradleyChatha•2h ago
It makes me laugh that absolutely no one can say "I've worked with ASN.1" in a positive light :D
StopDisinfo910•1h ago
There was an amusing chain of comments the last time protobuf was mentioned, in which some people were arguing that it had been a terrible idea and that ASN.1, as a standard, should have been used instead.

It was hilarious because clearly none of the people who were in favor had ever used ASN.1.

mananaysiempre•43m ago
Cryptonector[1] maintains an ASN.1 implementation[2] and usually has good things to say about the language and its specs. (Kind of surprised he's not in the comments here already :) )

[1] https://news.ycombinator.com/user?id=cryptonector

[2] https://github.com/heimdal/heimdal/tree/master/lib/asn1

hamburglar•1h ago
As a former PKI enthusiast (tongue firmly in cheek with that description) I can say if you can limit your exposure to simply issuing certs so you control the data and thus avoid all edge cases, quirks, non-canonical encodings, etc, dealing with ASN.1 is “not too terrible.” But it is bad. The thing that used to regularly amaze me was the insane depths of complexity the designers went to … back in the 70’s! It is astounding to me that they managed to make a system that encapsulated so much complexity and is still in everyday use today.

You are truly a masochist and I salute you.

cyberax•12m ago
It's also amazing that we're basically using only a couple of free-form text fields in the WebPKI for the most crucial parts of validation.

Completely ignoring the ASN.1 support for complicated structures, with more than one CVE linked to incorrect parsing of these text fields.

throw_a_grenade•2h ago
Don't worry, it's your blog, and your way. Keep it up, if it makes you whole.
giancarlostoro•2h ago
At least you might be summoning Walter Bright in talking about D. One of my favorite languages I wish more companies would use. Unfortunately for its own sake, Go and Rust are way more popular in the industry.
pjmlp•1h ago
Unfortunately it lost the opportunity back when Remedy Games and Facebook were betting on it.

The various WIP features, and the switching focus of what might bring more people into the ecosystem, have given way to other languages.

Even C#, Java and C++ have gotten many of the features that were only available in D when Andrei Alexandrescu's book came out in 2011.

mort96•49m ago
I feel like back when D might've been a language worth looking into, it was hampered by the proprietary compilers.

And still today, the first thought that comes to mind when I think D is "that language with proprietary compilers", even though there has apparently been some movement on that front? Not really worth looking into now that we have Go as an excellent GC'd compiled language and Rust as an excellent C++ replacement.

Having two different languages for those purposes seems like a better idea anyway than having one "optionally managed" language. I can't even imagine how that could possibly work in a way that doesn't just fragment the community.

sfpotter•38m ago
Sounds like you should look into it instead of idly speculating! Also, the funny thing about a divisive feature is that it doesn't matter if it fragments the community if you can use it successfully. There are a lot of loud people in the D community who freak out and whine about the GC, and there are plenty more quiet ones who are happily getting things done without making much noise. It's a great language.
mort96•36m ago
Are you saying that if I'm using D-without-GC, I can use any D library, including ones written with the assumption that there is a GC? If not, how does it not fracture the community?

> There are a lot of loud people in the D community who freak out and whine about the GC, and there are plenty more quiet ones who are happily getting things done without making much noise

This sounds like an admission that the community is fractured, except with a weirdly judgemental tone towards those who use D without a GC?

olvy0•55m ago
Just wanted to say I enjoyed your post very much. Thank you for writing it. I love D but unfortunately I haven't touched it for several years. I also have some experience writing parsers and implementing protocols.
mananaysiempre•39m ago
A small nitpick: I don’t think the intersection example that you use does what you want it to do. Perhaps there’s some obscure difference in “PER-visibility” or whatnot, but at least set-theoretically,

  LegacyFlags2 ::= INTEGER (0 | 2 ^ 4..8) -- as in the article
is exactly equivalent to

  LegacyFlags2 ::= INTEGER (0) -- only a single value allowed
as (using standard mathematical notation and making precedence explicit) {0} ∪ ({2} ∩ {4,5,6,7,8}) = {0} ∪ ∅ = {0}.
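The precedence point is easy to check mechanically. A quick Python sketch of the same set arithmetic (purely illustrative; the sets just mirror the constraint notation):

```python
# ASN.1 constraint "0 | 2 ^ 4..8": intersection (^) binds tighter than
# union (|), so set-theoretically it parses as {0} ∪ ({2} ∩ {4,...,8}).
allowed = {0} | ({2} & set(range(4, 9)))
print(allowed)  # {0} -- equivalent to INTEGER (0), a single allowed value
```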
Keyframe•3h ago
I salute you for the deep dive into this. History would have it that ASN.1 was already there as both an IDL and serialization format when HTTPS certs were defined. If it were today, would it be the same, or would we end up with protobuf or Thrift or similar?
woodruffw•3h ago
> If it were today, would it be the same or would we end up with protobuf or thrift or similar?

The main advantage of ASN.1 (specifically DER) in an HTTPS/PKI context is that it's a canonical encoding. To my understanding Protobuf isn't; I don't know about Thrift.

(A lot of hay is made about ASN.1 being bad, but it's really BER and other non-DER encodings of ASN.1 that make things painful. If you only read and write DER and limit yourself to the set of rules that occur in e.g. the Internet PKI RFCs, it's a relatively tractable and normal looking serialization format.)

jcranmer•2h ago
I'm hardly a connoisseur of DER implementations, but my understanding is that there are two main problems with DER. The first is that the format isn't really parseable without using a schema, unlike (say) XML or JSON. This means your generic DER parser needs to have an ASN.1 schema passed into it to parse the DER, and this leads to the second problem, which is that this ends up being complex enough that basically every attempt to do so is full of memory safety issues.
whizzter•2h ago
I wrote an ASN.1 decoder, and since the encoding contains type/size info you can often read a subset and handle the rest as opaque data objects if you need round-tripping. This is required as there can be plenty of data that is unknown to older consumers (like the ETSI eIDAS/PAdES personal information extensions in PDF signatures).

However, to have a sane interface for actually working with the data you do need a schema that can be compiled to a language specific notation.

woodruffw•2h ago
> The first is that the format isn't really parseable without using a schema, unlike (say) XML or JSON.

You can parse DER perfectly well without a schema; it's a self-describing format. ASN.1 definitions give you shape enforcement, but any valid DER stream can be turned into an internal representation even if you don't know the intended structure ahead of time.

rust-asn1[1] is a nice demonstration of this: you can deserialize into a structure if you know your structure AOT, or you can deserialize into the equivalent of a "value" wrapper that enumerates/enforces all valid encodings.

> which is that this ends up being complex enough that basically every attempt to do so is full of memory safety issues.

Sort of -- DER gets a bad rap for two reasons:

1. OpenSSL had (has?) an exceptionally bad and permissive implementation of a DER parser/serializer.

2. Because of OpenSSL's dominance, a lot of "DER" in the wild was really a mixture of DER and BER. This has caused an absolutely obscene amount of pain in PKI standards, which is why just about every modern PKI standard that uses ASN.1 bends over backwards to emphasize that all encodings must be DER and not BER.

(2) in particular is pernicious: the public Web PKI has successfully extirpated BER, but it still skulks around in private PKIs and more neglected corners of the Internet (like RFC 3161 TSAs) because of a long tail of OpenSSL (and other misbehaving implementation) usage.

Overall, DER itself is a mostly normal looking TLV encoding; it's not meaningfully more complicated than Protobuf or any other serialization form. The problem is that it gets mashed together with BER, and it has a legacy of buggy implementations. The latter is IMO more of a byproduct of ASN.1's era -- if Protobuf were invented in 1984, I imagine we'd see the same long tail of buggy parsers regardless of the quality of the design itself.
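The "self-describing TLV framing" point can be made concrete with a minimal schemaless DER walk. A sketch only: it handles low tag numbers and definite lengths, nothing more, and the example bytes are hand-assembled for the demo:

```python
# Walk a DER stream with no schema: every element is tag, length, value,
# so the framing alone recovers the shape (and universal types) of the data.
UNIVERSAL_NAMES = {0x02: "INTEGER", 0x0C: "UTF8String", 0x10: "SEQUENCE"}

def walk(der: bytes, depth: int = 0):
    """Return (depth, type-name) pairs for every TLV in the stream."""
    out = []
    i = 0
    while i < len(der):
        tag = der[i]
        length = der[i + 1]
        i += 2
        if length & 0x80:            # long-form length: next N bytes hold it
            n = length & 0x7F
            length = int.from_bytes(der[i : i + n], "big")
            i += n
        value = der[i : i + length]
        out.append((depth, UNIVERSAL_NAMES.get(tag & 0x1F, f"tag {tag & 0x1F}")))
        if tag & 0x20:               # constructed: recurse into the contents
            out.extend(walk(value, depth + 1))
        i += length
    return out

# SEQUENCE { INTEGER 5, UTF8String "hi" } -- hand-encoded DER
der = bytes.fromhex("30070201050c026869")
print(walk(der))  # [(0, 'SEQUENCE'), (1, 'INTEGER'), (1, 'UTF8String')]
```

Note what's missing from the output: field names and semantics, exactly as discussed above; only the structure and universal tags are recoverable.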

jcranmer•2h ago
> You can parse DER perfectly well without a schema, it's a self-describing format. ASN.1 definitions give you shape enforcement, but any valid DER stream can be turned into an internal representation even if you don't know the intended structure ahead of time.

> rust-asn1[1] is a nice demonstration of this: you can deserialize into a structure if you know your structure AOT, or you can deserialize into the equivalent of a "value" wrapper that enumerates/enforces all valid encodings.

Almost. The "tag" of the data doesn't actually tell you the type of the data by itself (most of the time at least), so while you can say "there is something of length 10 here", you can't say if it's an integer or a string or an array.

woodruffw•2h ago
> The "tag" of the data doesn't actually tell you the type of the data by itself (most of the time at least), so while you can say "there is something of length 10 here", you can't say if it's an integer or a string or an array.

Could you explain what you mean? The tag does indeed encode this: for an integer you'd see `INTEGER`, for a string you'd see `UTF8String` or similar, for an array you'd see `SEQUENCE OF`, etc.

You can verify this for yourself by using a schemaless decoder like Google's der-ascii[1]. For example, here's a decoded certificate[2] -- you get fields and types, you just don't get the semantics (e.g. "this number is a public key") associated with them because there's no schema.

[1]: https://github.com/google/der-ascii

[2]: https://github.com/google/der-ascii/blob/main/samples/cert.t...

jcranmer•1h ago
It's been a long time since I last stared at DER, but my recollection is that for the ASN.1 schema I was decoding, basically all of the tags ended up not using the universal tag information, so you just had to know what the type was supposed to be. The fact that everything was implicit was why I qualified it with "most of the time"; it was that way in my experience.
woodruffw•1h ago
Oh, that makes sense. Yeah, I mostly work with DER in contexts that use universal tagging. From what I can tell, IMPLICIT tagging is used somewhat sparingly (but it is used) in the PKI RFCs.

So yeah, in that instance you do need a schema to make progress beyond "an object of some size is here in the stream."

jeroenhd•2h ago
DER is TLV. You don't know the specifics ("this integer is a value between 10 and 53") that the schema contains, but you know it's an integer when you read it.

PER lacks type information, making encoding much more efficient as long as both sides of the connection have access to the schema.

jeroenhd•2h ago
You can parse DER, but you have no idea what you've just parsed without the schema. In a software library, that's often not very useful, but at least you can verify that the message was loaded correctly, and if you're reverse engineering a proprietary protocol you can at least figure out the parts you need without having to understand the entire thing.
woodruffw•2h ago
Yes, it's like JSON in that regard. But the key part is that the framing of DER doesn't require a schema; that isn't true for all encoding formats (notably protobuf, where types have overlapping encodings that need to be disambiguated through the schema).
jeroenhd•2h ago
I'd argue that JSON is still easier, as it allows you to reason about the structure and build up a (partial) schema at least. You have the keys of the objects you're trying to parse. Something like {"username":"abc","password":"def","userId":1,"admin":false} would end up as something like Utf8String(3){"abc"}+Utf8String(3){"def"}+Integer(1){1}+Integer(1){0} if encoded in DER style.

This has the fun side effect that DER essentially allows you to process data ("give me the 4th integer and the 2nd string of every third optional item within the fifth list") without knowing what you're interpreting.
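The DER-style rendering in the comment above can be written out as actual bytes. A hedged sketch (the helper names are made up, and single-byte lengths and values are assumed for brevity):

```python
# Encode the comment's four values as bare DER primitives. The values and
# their universal types survive; the JSON keys ("username", "userId", ...)
# do not, which is exactly the schemaless-decoding limitation under discussion.
def utf8(s: str) -> bytes:
    return bytes([0x0C, len(s)]) + s.encode()  # UTF8String, short-form length

def integer(n: int) -> bytes:
    return bytes([0x02, 0x01, n])              # INTEGER, single content byte

record = utf8("abc") + utf8("def") + integer(1) + integer(0)
print(record.hex())  # 0c036162630c03646566020101020100
```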

BradleyChatha•2h ago
> You can parse DER perfectly well without a schema, it's a self-describing format.

If the schema uses IMPLICIT tags then - unless I'm missing something - this isn't (easily) possible.

The most you'd be able to tell is whether the TLV contains a primitive or constructed value.

This is a pretty good resource on custom tagging, and goes over how IMPLICIT works: https://www.oss.com/asn1/resources/asn1-made-simple/asn1-qui...

> Because of OpenSSL's dominance, a lot of "DER" in the wild was really a mixture of DER and BER

:sweat: That might explain why some of the root certs on my machine appear to be BER encoded (barring decoder bugs, which is honestly more likely).
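The IMPLICIT point fits in a few bytes. A small sketch with hand-assembled TLVs (short-form lengths assumed):

```python
# UTF8String "hi" with its universal tag vs re-tagged as [0] IMPLICIT.
# IMPLICIT replaces the tag byte entirely, so a schemaless reader sees only
# "context-specific tag 0, primitive" and can no longer recover the type.
universal = bytes([0x0C, 0x02]) + b"hi"  # 0x0C = universal tag for UTF8String
implicit = bytes([0x80, 0x02]) + b"hi"   # 0x80 = context class, tag number 0

print(universal.hex())  # 0c026869
print(implicit.hex())   # 80026869
```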

woodruffw•1h ago
Ah yeah, IMPLICIT is the main edge case. That's a good point.
syncsynchalt•1h ago
One of my big problems with ASN.1 (and its encodings) is how _crusty_ it is.

You need to populate a string? First look up whether it's a UTF8String, NumericString, PrintableString, TeletexString, VideotexString, IA5String, GraphicString, VisibleString, GeneralString, UniversalString, CHARACTER STRING, or BMPString. I'll note that three of those types have "Universal" / "General" in their name, and several more imply it.

How about a timestamp? Well, do you mean a TIME, UTCTime, GeneralizedTime, or DATE-TIME? Don't be fooled, all those types describe both a date _and_ time, if you just want a time then that's TIME-OF-DAY.

It's understandable how a standard with teletex roots got to this point but doesn't lead to good implementations when there is that much surface area to cover.

jeroenhd•3h ago
Protobuf is pretty much ASN.1 with better tooling, optimized for message exchange protocols rather than files, when it comes down to the details. Without ASN.1 and the lessons learned from it, another binary serialization protocol would probably have taken its place, and I bet Protobuf and similar tools would look and perhaps work quite differently. The same way JSON would look and act quite differently if XML had never been invented.
dathinab•3h ago
> Protobuf is pretty much ASN.1

no, not at all

they share some ideas, that doesn't make it "pretty much ASN.1". Its only "pretty much the same" if you argue all schema based general purpose binary encoding formats are "pretty much the same".

ASN.1 also isn't "file" specific at all; its main use case is, and always has been, message exchange protocols.

(Strictly speaking, ASN.1 is also not a single binary serialization format but 1. one schema language, 2. some rules for mapping things to some intermediate concepts, 3. a _docent_ different ways to "exactly" serialize things. And in the 3rd point the difference can be pretty huge, from having something you can partially read even without a schema (like protobuf) to more compact representations you can't read without a schema at all.)

jeroenhd•2h ago
> if you argue all schema based general purpose binary encoding formats are "pretty much the same"

At the implementation level they are different, but when integrating these protocols into applications, yeah, pretty much. Schema + data goes in, encoded data comes out, or the other way around. In the same way YAML and XML are pretty much the same, just different expressions of the same concepts. ASN.1 even comes with multiple expressions of exactly the same grammar, both in text form and binary form.

ASN.1 was one of the early standardised protocols in this space, though, and suffers from being used mostly in obscure or legacy protocols, often with proprietary libraries if you go beyond the PKI side of things.

ASN.1 isn't file specific, it was designed for use in telecoms after all, but encodings like DER work better inside of file formats than Protobuf and many protocols like it. Actually having a formal standard makes including it in file types a lot easier.

WorldMaker•3h ago
If it were designed today, I would imagine it could end up looking like JWT (JOSE) and use JSON. I've seen several key exchange formats in JSON beyond JWT/JOSE in the wild today as well, so we may even get there eventually in a future upgrade of TLS.
whizzter•2h ago
Yes and no: the JSON handling of things like binary data (hashes) and big-ints leaves a bit to be desired (sure, we can use base64 encoding). ASN.1 isn't great by any stretch, but for this JSON really isn't much better apart from library support.

Yes, JOSE is still infinitely better than XmlSignatures and the canonical XML madness to allow signatures _inside_ the document to be signed.
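The base64 cost alluded to above is easy to quantify. A one-liner sketch for a typical 32-byte hash (the all-zero digest is just a stand-in):

```python
import base64

digest = bytes(32)  # stand-in for a 32-byte SHA-256 digest
encoded = base64.b64encode(digest)
print(len(digest), len(encoded))  # 32 -> 44: roughly 33% size overhead
```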

dathinab•2h ago
possible but unlikely for multiple reasons

- huge breaking change with the whole cert infrastructure

- this question was asked of the people who chose ASN.1 for X.509, and AFAIK they said that today they would use protobuf. But I don't remember where I have that from.

- JOSE/JWT etc. aren't exactly well regarded in the crypto community AFAIK, nor designed with modern insights about how best to do such things (too much header malleability, too much crypto flexibility, too little deterministic encoding of JSON, too many imprecisely defined corner cases related to JSON, too much encoding overhead for keys and similar (which for some PQ stuff can get into the 100KiB range)), and the argument of it being readable with a text editor falls apart if anything you care about is binary (keys, etc.) and often encrypted (producing binary). (And IMHO the plain-text argument also falls apart for most non-crypto stuff: if you anyway add a base64 encoding you already need tooling to read it, and whether your debug tooling does a base64 decode or a (maybe additional) data decode step isn't really relevant; same for viewing in an IDE, which can handle binary formats just fine, etc. But that's an off-topic discussion.)

- if we look at some modern protocols designed by security specialists/cryptographers and have been standardized we often find other stuff (e.g. protobuf for some JWT alternatives or CBOR for HSK/AuthN related stuff).

otabdeveloper4•2h ago
ASN.1 seems orders of magnitude simpler than Protobuf or Thrift.
dathinab•2h ago
how did you end up believing that?

- ASN.1 is a set of a docent different binary encodings

- ASN.1's schema language is IMHO way better designed than Protobuf's, but also more complex, as it has more features

- ASN.1 can encode many more different data layouts (e.g. things where in Protobuf you have to use "tricks"), each being laid out in the output differently depending on the specific encoding format, annotations on the schema, and options during serialization

- ASN.1 has many ways to represent things more "compact" which all come with their own complexity (like bit mask encoded boolean maps)

overall the problem of ASN.1 is that it's absurdly over-engineered, leading to you needing to know many hundreds of pages across multiple standards documents just to implement one single encoding of the docent existing ones, and even then you might run into ambiguous, unclear definitions where you have to ask on the internet for clarification

if we ignore the schema languages for a moment, most senior devs probably could write a crappy protobuf implementation over a weekend, but for ASN.1 you might not even be able to digest all the relevant standards in that time :/

Realistically, if ASN.1 weren't as badly over-engineered and had shipped with only some of the more modern of its encoding formats, we probably would all be using ASN.1 for many things, maybe including your web server responses, and this probably would cut non-image/video network bandwidth by 1/3 or more. But then the network is overloaded by image/video transmissions and similar, not other stuff, so I guess who cares???!???

anentropic•27m ago
for "docent", do you mean "dozen"?

I had to look up https://www.merriam-webster.com/dictionary/docent

thadt•2h ago
No, we would use something similar to S-Expressions [1]. Parsing and generation would be at most a few hundred lines of code in almost any language, easily testable, and relatively extensible.

With the top level encoding solved, we could then go back to arguing about all the specific lower level encodings such as compressed vs uncompressed curve points, etc.

[1] https://datatracker.ietf.org/doc/rfc9804
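As a rough gauge of that "few hundred lines at most" claim, a toy S-expression reader fits in a couple dozen lines of Python. A sketch only: atoms and parens, with none of the string, escaping, or canonical-form handling RFC 9804 specifies:

```python
# Minimal S-expression reader: tokenize on parens and whitespace, then
# build nested lists recursively. Atoms stay as strings.
def parse_sexpr(text: str):
    tokens = text.replace("(", " ( ").replace(")", " ) ").split()

    def read(pos):
        if tokens[pos] == "(":
            items, pos = [], pos + 1
            while tokens[pos] != ")":
                item, pos = read(pos)
                items.append(item)
            return items, pos + 1          # skip the closing paren
        return tokens[pos], pos + 1        # a bare atom

    tree, _ = read(0)
    return tree

print(parse_sexpr("(cert (issuer example) (serial 42))"))
# ['cert', ['issuer', 'example'], ['serial', '42']]
```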

dfox•2h ago
It is not only that ASN.1 was there before SSL, but even the certificate format was there before SSL. The certificate format comes from X.500, which is the "DAP" part of "LDAP", L as in "Lightweight" in "LDAP" refers mostly to LDAP not using public key certificates for client authentication in contrast to X.500 [1]. Bunch of other related stuff comes from RSA's PKCS series specifications, which also mostly use ASN.1.

[1] the somewhat ironic part is that when it was discovered that using just passwords for authentication is not enough, the so-called "lightweight" LDAP got arguably more complex than X.500. The same thing happened to SNMP (another IETF protocol using ASN.1) being "Simple" for similar reasons.

lukeh•3h ago
I worked on a Swift ASN.1 compiler [1] a while back (not swift-asn1; mine used Codable). I saved myself some time by using the Heimdal JSON compiler [2], which can transform ASN.1 into a much more parseable JSON AST.

[1] https://github.com/PADL/ASN1Codable

[2] https://github.com/heimdal/heimdal/tree/master/lib/asn1

BradleyChatha•2h ago
Not heard of either of those projects before, but I love how libasn1's README has a thinly veiled hint of disdain for ASN.1

> which can transform ASN.1 into a much more parseable JSON AST

The sign of a person who's been hurt, and doesn't want others to feel the same pain :D

marcosdumay•1h ago
Hey, I love how the author describes ASN.1 as a "syntax", in quotes.

Where I disagree is on the disdain being veiled. It seems very explicit to me.

Anyway, yeah, I hadn't heard about it before either, and it's great to know that somebody out there did solve that horrible problem already, and that we can use the library.

lepicz•3h ago
some people simply like pain :D

(i worked with asn1c (not sure which fork) and had to hack in a custom allocator and 64-bit support. i shiver every time something needs attention in there)

BradleyChatha•2h ago
:)

Honestly any compiler project in pure C is pretty hardcore in my eyes, ASN.1 must amplify the sheer horror.

usrbinenv•3h ago
I really love D, it's one of my favorite languages. I've started implementing a vim-like text editor in it from scratch (using only Raylib as a dependency) and was surprised how far I was able to get and how good my test coverage was for it. My personal favorite features of D:

* unit tests anywhere, so I usually write my methods/functions with unit tests following them immediately

* blocks like version(unittest) {} make it easy to exclude/include things that should only be compiled for testing

* enums, unions, asserts, contract programming are all great

I would say I didn't have to learn D much. Whatever I wanted to do with it, I would find in its docs or asked ChatGPT and there would always be a very nice way to do things.

gavinray•2h ago
D is a bittersweet topic for me.

From a philosophical/language-design standpoint, it ticks so many boxes. It had the potential to be wildly popular, had a few things gone differently.

If the language tooling and library ecosystem were on par with the titans of today, like Rust/Go, it really would be a powerhouse language.

binaryturtle•2h ago
Isn't D supported by the GNU compiler collection? I personally would prefer that type of tooling over what Rust and Go do (I can't even get their compilers to run on my old platform anymore; not to mention all these dependencies on remote resources typical Rust/Go projects seem to have, which seem to be enforced by the ecosystem?)
mhh__•2h ago
It is
pjmlp•1h ago
It is, however keeping LDC and GDC up to date is a volunteer effort with not enough people, so they are always a bit behind dmd.

Still much better than gccgo, which is kind of useless for anything beyond Go 1.18; no one is updating it any longer, and it may as well join gcj.

sfpotter•37m ago
Having written real code in D, I can say that the slight discrepancy between dmd, LDC, and gdc isn't a roadblock in practice.
BradleyChatha•2h ago
Yeah, the foundations of the language are incredible. It's just everything else around it that brings it down (and is unfortunately very hard to motivate people to solve).

D definitely missed a critical period, but I love it all the same.

usrbinenv•2h ago
So, I also write Go and I don't get the part about tooling. I don't need formatters or linters as I'm adult enough to know how to format my code (in fact I dislike tools doing it for me). D also has dub, which is fine, as far as package managers go. The ecosystem is the only issue and Go does arguably have a lot of very cool libraries for virtually anything, but outside of webdev, I can't see myself using them. This is why D works a lot better for projects where I don't need all those dependencies and would do better without them.
sfpotter•34m ago
I freely admit to not being a Go or Rust expert, but from what I can tell using C from D is even easier than in either of these languages. The C++ interop is also decently usable.

IMO, the bigger issue is language tooling.

axus•2h ago
SNMP MIB files are written in ASN.1. That is the extent of my knowledge about ASN.1; it was nice to learn a little more by reading this blog post.
YouAreWRONGtoo•2h ago
The only goal of such ridiculous standards is to act as a form of vendor lock-in for vendors implementing those standards; the vendors get to say to governments that it is a standard and the sellers of the standards also get some money.

Anyone designing a system around such standards is basically betraying their client.

I think, if you want to annoy these people maximally, you should write an annotated version of the standard in a mathematical formal language.

I read the table constraints, which try to do something simple, but it's written in the most convoluted way possible.

I think I considered ASN.1 for a system once, but rejected it because of a more modern, technically superior system.

If the parser for something like ASN.1 doesn't fit in 80 lines of Haskell, perhaps you just shouldn't use it.

I don't know who these assholes are that say "Sure, let's make things slow and buggy, since we all hail Satan after all".

talkingtab•2h ago
OMG ASN.1.

For those of you who missed this, there was a very interesting thing that happened in the growth of the internet.

At the time people were evolving the protocols through the IETF. So all the things that you rely on now - for the most part - just came into being. One day there was email. There was ftp. There was TCP. There were the Van Jacobson TCP mods.

At this time corporate types paid no attention to the internet. Academic types and the IETF were from what I saw the main developers.

Then one day the corporate world realized they might make money. But the development process of the protocols was incomprehensible (and incompatible) with the corporate culture. TCP was clearly a mess, all these protocols like DNS were a mess. From the corporate perspective.

So began the protocol wars https://en.wikipedia.org/wiki/Protocol_Wars.

Whether ASN.1 was a product of that war or just a product of the corporate mentality, it serves as a powerful instance of what the corporate world looks like vs. what the academic world looks like. You can find the wreckage from the war littered around. If you see an X.something protocol, it could well be one of the relics. There were a few X.things that were adopted and useful, but there were others that would haunt your dreams.

Although this is ancient history, and pretty much now told from the corporate perspective, it suggests to us that the corporate process for thinking is not as effective as the alternative: the IETF and academic one.

One is a sort of recipe culture. You write a recipe, everyone follows it and you are happy. The other is a sort of functional culture. If you can make bread and eat it you are happy. When the bread doesn't taste good you fix it.

Given the kind of bread that is commonly available in the US now, we can draw some conclusions about recipe thinking, recipe culture, corporate culture etc. One could even extend this paradigm of thinking to new things like AI. Or not.

gorgoiler•1h ago
My partner and I were re-watching Father of the Bride the other day (rest in peace, Diane Keaton) and during the early parents meeting the son-in-law to-be describes himself as a communications consultant, working on X.25 networking installations.

I had to pause the movie and explain to my partner just how close the world came to missing out on The Internet, and having instead to suffer the ignominy of visiting sites with addresses like “CN=wikipedia, OU=org, C=US” and god knows what other dreadful protocols underlying them. I think she was surprised how angry and distressed I sounded! It would have been awful!

Poor her!

EvanAnderson•1h ago
There would have been a network like the Internet if the "Bellheads" in the ITU won. It would have been pay-by-the-byte-transferred.
inejge•22m ago
> how close the world came to missing out on The Internet

Monday-morning-quarterbacking is an unproductive pastime, but I don't think it was very close, on account of the Internet side having developed a bunch of useful (if a bit ramshackle) protocols and applications much faster than the ISO team, because the specs were freely available (not to mention written in a much more understandable manner). I still rue the day the IETF dropped the "distribution of this memo is unlimited" phrase from the RFC preambles. Yeah I understand that it originally had more to do with classification than general availability, but it described the ethos perfectly.

It's not all roses and we're paying for the freewheeling approach to this day in some cases, cf. email spam and BGP hijacking. But it gave results and provided unstoppable momentum.

OhMeadhbh•1h ago
"OMG ASN.1" is the name of my next band.
p_l•8m ago
Protocol Wars are also a story of early enshittification of the Internet, where attempts to push forward with solutions to already-known problems were pushed back because they would require investment on the vendor side, instead of just carrying on with software largely delivered free of charge because DoD needed a quick replacement for their PDP-10 fleet. (Only slight hyperbole.)

A lot of issues also came from ISO standards refusing to ship until known anticipated issues were taken care of, or getting stuck with unextendable lock-in when an accidental temporary solution ended up a long-term one, while IETF protocols happily ran forward "because we will fix it later" only to find out that the installed base ossified things - one of the lessons is to add randomness to new protocols so that naive implementations will fail on day one.

Then there were accidental things, like a major ASN.1 implementation for C in 1990 apparently being shit (a tradition picked up in an even worse way by OpenSSL, and by close to most people playing with X.509, IMO), or even complaints about ASN.1 encodings being slow due to CPUs lacking barrel shifters (I figure it must refer to PER somehow).

nicce•1h ago
Normally, when implementing some standard, you could say you get 80% of the functionality in 20% of the planned time. But with ASN.1, the remaining 20% could take the rest of your life.
OhMeadhbh•1h ago
Ack. I wrote an ASN.1 compiler in Java in the 90s, mostly just to make sure I understood how it and BER/DER were used in X.509. I think the BER interpretation bits are still being used somewhere.

I'm sorry you had to waste a year of your life.

There are few things I dislike more in the computing world than ASN.1/BER. It seems to encourage over-specification and is suboptimal for loosely coupled systems.

But it looks like you had a decent time...

horizion2025•35m ago
I have also had to work with this in many contexts... deeply embedded systems with no parsers available, and where no "proper" ones would fit. So I have hand-written basic parsing and generation a few times.

Oh, and there are also non-compliant implementations. E.g. some passports (yes, the passports with chips use tons of ASN.1) even have incorrect encoding of big integers (supposed to be the minimal two's complement; as I recall, some passports used a fixed-length non-complement format yanked into the 0x02 INTEGER type... some libraries have special non-compliant parsing modes to deal with it).
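The rule being violated is small. Here is a sketch in Python of a compliant minimal DER INTEGER encoder next to the kind of fixed-width body some chips emit (illustrative only; the exact broken formats vary by vendor):

```python
def der_encode_integer(n: int) -> bytes:
    """Minimal two's-complement INTEGER body with tag 0x02.
    Sketch: non-negative n only, bodies shorter than 128 bytes."""
    length = max(1, (n.bit_length() + 8) // 8)  # round up, leave room for the sign bit
    body = n.to_bytes(length, "big", signed=True)
    return bytes([0x02, len(body)]) + body

def sloppy_encode_integer(n: int, width: int) -> bytes:
    """Non-compliant fixed-width body: the leading zero padding is
    exactly what strict DER parsers reject as non-minimal."""
    return bytes([0x02, width]) + n.to_bytes(width, "big")
```

For example, `der_encode_integer(127)` yields `02 01 7f`, while a fixed 4-byte encoder yields `02 04 00 00 00 7f`.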

cbondurant•30m ago
Every time I have ever had the displeasure of looking at an X.whatever spec, I always end up coming away with the same conclusion.

Somehow, despite these specifications being 90% metadata by weight, they seem to consistently forget the part of the spec that lets you actually know what something is. And that part is just left up to context.

I could well be missing something, but a majority of the time it feels to me like they set out to make a database schema, and accidentally wrote the sqlite file format spec instead.

Like, thanks, it's nice that I can parse this into a data structure :). It would be nicer, however, if doing so gave me any idea of what I can do with the data I've parsed.

Though to be fair, I feel the same way about the entire concept of XML schemas. The fact that you theoretically can validate an XML document against a schema is honestly completely useless. If I am parsing XML, it's because my code already knows what information it needs from the XML document, and really it should also know where that information is. I don't need a separate schema definition to tell me that! It's already expressed!! In the part where I am parsing out the data I need!!!

elcritch•11m ago
> The fact that you theoretically can validate an xml document against a schema is honestly completely useless. If I am parsing XML, its because my code already knows what information it needs from the XML document, and really it should also know where that information is.

You seem to miss the entire point of XML schemas, or any schema really. Validating a document against a schema isn’t really for your code. It’s for documentation of what can be in a given document and how it needs to be structured. So others don’t need to read your code to understand that.

It then allows editing tools to verify generated documents. Or developers to understand how they can structure XML output properly.

Your code could also use it to validate an XML document before handing it to your parsing logic. Then you can inform the user of an invalid document and why, instead of just crashing at a random point in code, without rolling your own validation. It can also verify an entire document, whereas code may only parse portions, leading to later corruption.
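A concrete (hypothetical) example of what a schema buys you — the required shape is visible to any consumer or editing tool without reading anyone's parser code:

```xml
<!-- Minimal illustrative XSD; element names are made up for the example -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="person">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="name" type="xs:string"/>
        <xs:element name="age" type="xs:nonNegativeInteger" minOccurs="0"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

A validator can reject `<person><age>-3</age></person>` up front (missing `name`, negative `age`) before any application code runs.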

galkk•10m ago
Thank you, now I'm much more disillusioned with ASN.1.
teleforce•9m ago
According to the ASN.1 Wikipedia entry, most tools supporting ASN.1 do the following:

1) parse the ASN.1 files, 2) generate the equivalent declarations in a programming language (like C or C++), 3) generate the encoding and decoding functions based on those declarations
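To make steps 2 and 3 concrete, here is a toy sketch in Python of what the generated declaration plus encoder might look like for a two-field SEQUENCE (names and shape are hypothetical; real tools usually emit C or C++ and differ in detail):

```python
# Hypothetical generated code for:
#   Person ::= SEQUENCE { name UTF8String, age INTEGER }
# Sketch assumes non-negative age and encodings shorter than 128 bytes.

class Person:
    def __init__(self, name: str, age: int):
        self.name = name
        self.age = age

    def encode_der(self) -> bytes:
        name = self.name.encode("utf-8")
        body = bytes([0x0C, len(name)]) + name             # UTF8String TLV
        n = max(1, (self.age.bit_length() + 8) // 8)       # minimal length
        body += bytes([0x02, n]) + self.age.to_bytes(n, "big", signed=True)
        return bytes([0x30, len(body)]) + body             # wrap in SEQUENCE
```

The decoder side is generated the same way: one function per type, mirroring the declaration.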

All of these exercises are apparently part of the data engineering process or lifecycle [1].

Back in the early 21st century, Python was just another interpreted general-purpose programming language: not for web (PHP), not for command tools (TCL), not for systems (C/C++), not for data wrangling (Perl), not for numerical work (Matlab/Fortran), not for statistics (R).

D will probably follow a similar trajectory to Python's, but it really needs a special kind of killer application to bring it to the fore.

I'm envisioning that real-time data streaming, processing, and engineering could be D's killer utility and defining moment.

[1] Fundamentals of Data Engineering:

https://www.oreilly.com/library/view/fundamentals-of-data/97...