https://blog.tartanllama.xyz/writing-a-linux-debugger-setup/
Eli Bendersky also wrote about debuggers (I think his post is a great place to start):
https://eli.thegreenplace.net/2011/01/23/how-debuggers-work-...
I was fascinated by debuggers a while back precisely because they were so mysterious to me. I then wrote a ptrace debugger myself [1]. It features pretty simple implementations of the most common things you'd expect in a debugger. Though I made the grave mistake of formatting all the code in GNU style.
There are added bonuses too--the author is evidently quite a skilled C++ programmer, so there are all sorts of little modern-C++isms in there that I was ignorant of as someone who's only ever written the language as a hobbyist.
Coolest feature of windbg is time travel debugging - https://learn.microsoft.com/en-us/windows-hardware/drivers/d...
(Warning: contains me trying to play Doom :)
I agree with this sentiment, yet I still wonder if it's fully justified. There has never been more bad software than right now, but there has also never been more good software, no?
It's not super relevant to the main contents of the article. Just a bit that caught my attention and made me think.
Right. It's incredible that something like Linux is free. For a more recent example, look at VS Code. For an even more recent example, look at how many open-weight LLMs are out there.
My machine has enough RAM that this doesn't matter to me, and VS Code lets me be very productive compared to, e.g., Vim.
HA! Comparing VSCode to Linux is like comparing an overweight, acne-ridden drug addict that lives in his mom's basement to an astronaut with 3 PhDs. They're barely even the same species.
There are some ways I could see the author being somewhat justified, especially when it comes to the need for debuggers. Software is getting more layers; the amount of it and the complexity of it are going up. Debuggers are super useful for helping me understand the libraries I use that I didn't write, and how my own code interacts with them. There are also a lot more people writing code than there used to be, and because that number has been growing, the distribution skews toward beginners. The number of languages in popular use also seems to be going up, and the diversity of coding environments is increasing. I don't know that I would frame all this as 'decay', but it does mean that we're exposed to higher volumes of iffy code and abstractions over time.
'good' as in performant--an area that game dev types (rightly, IMO) criticize and harp on? There's far less of it, video games aside.
Think of the perceptible slowness of many web applications you use daily, or Windows 11's, well, everything UI-related, etc.
Hell, my three-year-old iPhone can't scroll Uber Eats at 60fps consistently. Is Uber Eats 'good'? From a functionality standpoint, yeah, of course. But is displaying a list of images and text and expecting it to scroll smoothly too much to ask?
Software can be 'good' in terms of functionality offered and 'bad' at the same time, depending on your perspective.
IIRC, Mr. Fleury has a background in game dev, so his perspective is totally understandable. Modern games are remarkable feats of software.
Many gamers are unhappy with the performance of modern games, as fewer and fewer of them can manage 60 fps at release even on high-end hardware.
captn3m0•8mo ago
fileeditview•8mo ago
Anyway, I can recommend it even though I'm not finished with it yet.
ddelnano•8mo ago
I work on CNCF Pixie, which uses DWARF for eBPF uprobes and dynamic instrumentation. While I understood how our project uses DWARF, the book made many details much clearer for me.