1. I was a mobile dev, and I operated at the framework level with UIKit and later SwiftUI. So much of my team's code really was bookkeeping pointers (references) into other systems.
2. I was splitting my time across some tech stacks I had less confidence in, and they happened to omit Option types.
Since then I've worked with Dart (before and after null safety), C, C++, Rust, Go, TypeScript, Python (with and without type hints), and Odin. I have a hard time not seeing all of this as preference, but one where you really can't mix approaches to great effect. Swift was my introduction to Options, and there's a lot of support in the language syntax to combat the very real added friction, but that syntax support can become a kind of friction itself. Seeing `!` at the end of an expression (or `try!`) is a bit distressing, even when you know the unlikelihood (or impossibility) of that expression yielding `nil`.
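Rust (which I listed above) has the same tension in a testable form: `Option::unwrap` is the rough analogue of Swift's postfix `!`, and it reads as a latent crash site even when you know it can't fire. A minimal sketch (the `"42".parse()` example is mine, just for illustration):

```rust
fn main() {
    let parsed: Option<i32> = "42".parse().ok();

    // The distressing form: asserts the value is present, panics on None.
    // Equivalent in spirit to Swift's `value!`.
    let n = parsed.unwrap();
    assert_eq!(n, 42);

    // The syntax-supported alternatives the language nudges you toward:
    let with_default = parsed.unwrap_or(0); // fall back to a default on None
    if let Some(v) = parsed {
        // conditional unwrap, like Swift's `if let`
        println!("parsed {}", v);
    }
    let _ = with_default;
}
```

Even here, a reviewer seeing `unwrap()` has to stop and convince themselves it's safe, which is exactly the friction described above.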
I have come to really appreciate systems without this machinery. When I'm writing types in Odin (and other languages which "lack" Optionals), I focus on the data. When I'm writing types in languages which borrow more from ML, I see types differently: as containers with valid and invalid states, inseparably paired with the initializers that enforce them. That mental model for a more featureful type system takes more energy to produce working code. That can be a fine thing, but right now I'm enjoying the low-friction path Odin presents, where the data is dumb and I get right to writing procedures.
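To make the two mental models concrete, here's a sketch in Rust with a hypothetical `User` type (the names and validity rules are mine, purely illustrative):

```rust
// "Dumb data" style (Odin-like): every field always holds some value;
// validity is a convention, and construction is just filling in fields.
#[derive(Default)]
struct PlainUser {
    name: String, // "" means "not set" by convention
    age: u32,     // 0 means "unknown" by convention
}

// ML-influenced style: the type encodes valid/invalid states,
// and the initializer enforces them up front.
struct CheckedUser {
    name: String,
    age: Option<u32>, // absence is explicit in the type
}

impl CheckedUser {
    fn new(name: &str, age: Option<u32>) -> Option<CheckedUser> {
        if name.is_empty() {
            return None; // invalid state rejected at construction
        }
        Some(CheckedUser { name: name.to_string(), age })
    }
}

fn main() {
    let plain = PlainUser::default();         // compiles; "validity" is up to you
    let checked = CheckedUser::new("", None); // invalidity caught here
    assert_eq!(plain.age, 0);
    assert!(checked.is_none());
}
```

The first style gets you to procedures immediately; the second makes you design the type and its initializer as one unit before any working code exists, which is the extra energy I mean.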
gingerBill•1h ago
> *TL;DR* null pointer dereferences are empirically the easiest class of invalid memory addresses to catch at runtime, and are the least common kind of invalid memory addresses that happen in memory unsafe languages. The trivial solutions to remove the “problem” null pointers have numerous trade-offs which are not obvious, and the cause of why people think it is a “problem” comes from a specific kind of individual-element mindset.