Don’t reinvent the wheel on a tight deadline while the rest of the team does useful work. Don’t reinvent the wheel and then expect to be rewarded or respected as if you had really invented something.
I love my rabbit holes, but at work, it's often not viable to explore them given deadlines and other constraints. If you want your wheel to be used in production though, it better be a good wheel, better than the existing products.
Chesterton himself was using it as a religious metaphor, and I think most of us agree that software engineers are not literal gods.
The expense (time or otherwise) follows from how intimately you have to get to know the subject. Which is precisely why it's the best way to learn. It's not always viable, but when you can spare the expense nothing else compares.
Yea, reinventing the wheel is a great way to learn. You're not going to hear an educator tell you to not reinvent the wheel.
That means that almost by definition, if a library is popular, it contains huge amounts of code that just isn't relevant to your use case.
The tradeoff should be whether you can code your version quickly (assuming it's not a crypto library, never roll your own crypto), because if you can, you'll be more familiar with it and carry a smaller dependency.
> if a library is popular, it contains huge amounts of code that just isn't relevant to your use case.
It is true that many libraries do contain such code, whether or not they have dependencies. For example, SQLite does not have any dependencies but does have code that is not necessarily relevant to your use. However, some programs (including SQLite) have conditional compilation; that sometimes helps, but in many cases it is not suitable, since it is still the same program and conditional compilation does not change it into an entirely different one which is more suitable for your use.
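To illustrate the conditional-compilation point with a minimal sketch: the flag name `DEMO_OMIT_JSON` below is made up, loosely in the spirit of SQLite's real `SQLITE_OMIT_*` compile-time options. Building with `-DDEMO_OMIT_JSON` removes the feature's code from the binary entirely, rather than merely disabling it at runtime.

```c
/* Hypothetical feature flag, loosely modeled on SQLite's SQLITE_OMIT_*
   compile-time options. Building with -DDEMO_OMIT_JSON compiles the
   feature out; the default build keeps it. */
int json_feature_enabled(void) {
#ifndef DEMO_OMIT_JSON
    return 1;  /* the feature's code would live behind this guard */
#else
    return 0;  /* the feature's code is not even present in the binary */
#endif
}
```

As the comment says, though, it is still the same program: flipping flags prunes code paths, but it does not redesign the library around your use case.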
Also, I often find that programs include some features that I do not want and exclude many that I do, and existing programs may be difficult to change in that way. So that might be another reason to write my own, too.
So I had to write a simple queue, but since I wanted demos to work in the browser, it has an IndexedDB backend; I wanted demos to work in an Electron app, so there is a SQLite backend; and I’ll likely want a multi-user server-based one, so there is a Postgres backend.
And I wanted to use it for rate limiting, etc, so limiters were needed.
And then there is the graph stuff, and the task stuff.
There are a lot of wheels to create, actually, if you don’t want any dependencies.
I do have a branch that uses TypeBox to build and validate the input and output JSON schemas for the tasks, so the core may not stay dependency-free forever.
Also, dependencies often come with a lot of extra "baggage," and I may only want a tiny bit of the functionality. Why should I use an 18-wheeler when all I want to do is drive to the corner store?
Also, and this is really all on me (it tends to be a bit of a minority stance, but it's mine), I tend to distrust opaque code.
If I do use a dependency, it's usually something that I could write, myself, if I wanted to devote the time, and something that I can audit, before integrating it.
I won't use opaque executables, unless I pay for it. If it's no-money-cost, I expect to be able to see the source.
When Primeagen was once interviewed, he built out a whole Java architecture; the interviewer asked him, "Have you heard of grep?" And that started a journey.
If it were to happen to me, it would feel like coming full circle to go from glue and dependencies back to pointers and data structures. A welcome reverie.
You will also rarely build a better implementation of these things than whatever is in the standard library, or even some other library that already exists. If anything, if one has a better idea, it's better to contribute one's patches there.
A standard library data structure or algorithm has to be something for everyone, so it can't be truly great at a specific thing. If you understand your specific use case extremely well (and are competent...), it can be very easy to run circles around the standard library.
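As an illustrative sketch of that point (not from the comment itself): if you know up front that you need a single-threaded queue of ints with a small, fixed, power-of-two capacity, a hand-rolled ring buffer can drop nearly all the bookkeeping a general-purpose container carries.

```c
#include <stddef.h>

/* Fixed-capacity ring buffer of ints. CAP must be a power of two so the
   index wrap is a single mask instead of a modulo or a branch. */
#define CAP 8

typedef struct {
    int data[CAP];
    size_t head;  /* total pushes: next slot to write is head & (CAP-1) */
    size_t tail;  /* total pops: next slot to read is tail & (CAP-1) */
} ring;

void ring_init(ring *r) { r->head = r->tail = 0; }

int ring_push(ring *r, int v) {
    if (r->head - r->tail == CAP) return 0;  /* full */
    r->data[r->head++ & (CAP - 1)] = v;
    return 1;
}

int ring_pop(ring *r, int *out) {
    if (r->head == r->tail) return 0;        /* empty */
    *out = r->data[r->tail++ & (CAP - 1)];
    return 1;
}
```

Nothing here is clever; the point is that knowing the capacity, element type, and threading model up front eliminates allocation, resizing, and genericity, which is exactly where a specialized wheel can outrun a standard one.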
There is a lot of complexity that mature wheels have taken into account or have had to solve, and you are likely to miss a lot of it. Not that building your own doesn't help you understand it.
Still, I wouldn't replace the wheels on my car with ones I made myself from scratch... just like I wouldn't replace a reasonably complex library.
While reinventing the wheel is sometimes a useful exercise, as TFA lays out, it is often a symptom of a larger Not Invented Here mentality. In some organizations this is a harmful tendency that derails project deadlines and misdirects resources toward building and maintaining software that is not core to the company's mission.
So in most cases the advice to not reinvent the wheel is more helpful. Deciding to ignore it, especially within a corporate environment, should be backed by very good reasons.
Even if open source alternatives already exist, that does not necessarily mean they do what you want them to do. In some cases they can be fixed to do what you need (since it is open source, that is an advantage), but sometimes that cannot really be done without rewriting it and making a new one.
>3 million weekly downloads
Dear God.
The advice as I would give it is:
"Try to re-invent the things you are interested in".
"Do not underestimate the value of the continued interaction between reality and established solutions."
I would have agreed with this 15 or 20 years ago, when I saw some monstrosities by people who didn't know that frameworks or ORMs existed, but the pendulum has swung too far in the other direction today.
The "crappy square wheels" of 2025 are libraries that don't fully solve problems and are forced upon others by developers who are hostile to any approach that doesn't involve reaching out to a 20 github stars library from their package manager.
And this has become extremely difficult to discuss because the discussion has also become binary with people treating any criticism as a sign of NIH.
accept that there are compatibility boundaries such that it is sometimes quicker to create a new X than locate it on the market, or that X is too expensive and it's time to pursue vertical integration
but teams who can't do build vs buy properly are kind of doomed, sentenced to endless cycles of Not Invented Here syndrome which block other work.
if you're in a meeting and someone says 'we can't launch our website until we develop a platform-native way to host websites' you're in the wrong part of the curve
Against all odds, I not only succeeded (mostly thanks to ignorance and stubbornness), but my wheel turns out to be unbelievably good at what it does. Possibly even world-class. After further experimentation, it also enables feats that can only be described as pure heresy with troubling ease. Time passes and some people from that niche start picking up my wheel. They all hold it wrong at the beginning because it's so alien, but once they get the hang of it they never go back.
I get bug reports and feature requests from all over the world for the oddest of use-cases and workflows. I have deep, in-depth technical discussions with brilliant people I would've never met otherwise. I've witnessed achievements done by others with my wheel beyond my wildest dreams. I discover things that keep me awake at night. I get kicks out of melting down the brains of my uninitiated coworkers and colleagues explaining what my wheel does and what I can do with it.
Don't be afraid to reinvent the wheel. You never know what crazy, wild path it might roll you down.
It's a Ghidra extension that can export relocatable object files from any program selection. In other words, it reverses the work done by a linker.
I originally built this as part of a video game decompilation project, having rejected the matching decompilation process used by the community at large. I still needed a way to divide and conquer the problem, which is how I got the funny idea of dividing programs. That allows a particular style of decompilation project I call Ship of Theseus: reimplementing chunks of a program one piece at a time and letting the linker stitch everything back together at every step, until you've replaced all the original binary code with reimplemented source code.
It's an exquisitely deep and complex topic, chock-full of ABI tidbits and toolchains shenanigans. There's next to no literature on this and it's antithetical to anything one might learn in CS 101. The technique itself is as powerful as it is esoteric, but I like to think that any reverse-engineer can leverage it with my tooling.
In particular, resynthesizing relocations algorithmically is one of those problems subject to the Pareto principle, where getting 80% of them right is reasonably easy but whittling down the last 20% is punishingly hard. Since I refuse to manually annotate them, I've had to relentlessly improve my analyzers until they get every last corner case right. It's by far the most challenging and exacting software engineering problem I've ever tackled, one that suffers no hacks or shortcuts.
Once I got it working, I then proceeded in the name of science to commit countless crimes against computer science with it (some of those achievements are documented on my blog). Cross-delinking in particular, that is delinking an artifact to a different platform than the one it originates from, is particularly mind-bending; I've had some successes with it, but I sadly currently lack the tooling to bring this to its logical conclusion: Mad Max, but with program bits instead of car parts.
Ironically, most of my users are using it for matching decompilation projects: they delink object files from an artifact, then typically launch objdiff and try to create a source file that, when compiled, generates an object file that is equivalent to the one they ripped out of the artifact. I did not expect that to happen at all since I've built this tool to specifically not do this, but I guess when everything's a nail, people will manage to wield anything as a hammer.
A common challenge is decompiling and reverse engineering a compiled program, e.g. a game. The author realized that an interesting approach is to do a sort of reverse-linking (or I guess unlinking) of a program into pieces, then focus on reverse engineering or reimplementing those pieces instead of the whole.
When it comes to decompiling games, enthusiasts in that hobby want to be able to reproduce/obtain source code that compiles to exactly the same instructions as the original game. I think there is usually also a differentiation between instruction matching and byte for byte matching reproduction. But it definitely seems like a simpler and better approach to do this piece by piece and be able to test with each new piece of progress.
That's my layman understanding of it having only dabbled in decompiling stuff.
The key insight of delinking is that object files are relocatable and the process of linking them together (laying their sections in memory, computing all the symbol addresses, applying all the relocations) removes that property. But by reversing this process (creating a symbol table, unapplying the relocations/resynthesizing relocation tables, slicing new sections), code and data can be made relocatable again.
Since properly delinked object files are relocatable, the linker can process them and stitch everything back together seamlessly, even if the pieces are moved around or no longer fit where they used to be (this makes for a particularly powerful form of binary patching, as constraints coming from the original program's memory map no longer apply). Alternatively, they can be fed to anything that can process object files, like a disassembler for example.
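For a rough sense of what applying and unapplying a relocation means, here is a minimal sketch using a 32-bit absolute relocation in the style of ELF's R_386_32, where the patched field holds S + A (symbol address plus addend). The function names are made up for illustration; real delinking, as described above, first has to recover the symbol address and the relocation site by analysis.

```c
#include <stdint.h>

/* Linking: resolve the relocation by patching the field with S + A
   (symbol address plus addend), as an R_386_32-style linker would. */
uint32_t apply_abs32(uint32_t addend, uint32_t sym_addr) {
    return sym_addr + addend;
}

/* Delinking: given the patched field and the symbol address recovered by
   analysis, recompute the addend so the field is relocatable again. */
uint32_t unapply_abs32(uint32_t patched, uint32_t sym_addr) {
    return patched - sym_addr;
}
```

Unapplying is the exact inverse of applying, which is why a properly resynthesized relocation table lets the linker lay the delinked sections anywhere and re-resolve them seamlessly.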
Of course, the real fun begins when you start reusing delinked object files to make new programs. I like to say that it breaks down the linear flow of toolchain from compilation to assembly to linking into a big ball of wibbly-wobbly, bitsy wimey stuff. Especially if you start cross-delinking to a different platform than the original pieces of the program came from.
That was never satisfying to me because recompilation (decompiling to source code or IR that can be recompiled for another platform) is an academic field of study [1], so the problem of converting a shared library to a static one should be easier. After all, you have all the assembly and symbol information right there, it seemed like all you needed to do was massage it the right way.
[1] https://rev.ng/
But yeah, one recurring problem is that without literature on this esoteric and complex topic, any discussion remotely technical about it (especially without prior context) quickly bogs down, as it needs to be bootstrapped from first principles in order to make any sense.
I've been meaning to write the Necronomicon of delinking to at least be able to point people to a document that contains everything they need to know in order to understand or at least decently grasp this. At the very least it should make for a good horror bed time story for linker developers.
As for the reinvention part, I was referencing the wheel of video game decompilation specifically. By that point Super Mario 64 was already fully decompiled and several other projects were well underway. The established wheel was matching decompilation (either instruction or byte level) and the community was producing tooling for this sole purpose.
In a sense, I did reinvent the wheel of video game decompilation, in stark contrast to everything else that was out there at the time. It just turned out horribly right, because before I knew it I was making chimera executables out of bits of programs coming from different platforms.
https://people.scs.carleton.ca/~soma/pubs/bfoster-gecco-2010...
Why do you add a dependency? Because you need a certain functionality.
The alternative to adding this dependency is to write the code yourself.
That is totally feasible for a lot of dependencies.
That is totally infeasible for a lot of dependencies.
It's a trade-off, as always. The fact is that most of us who do this for a living need to ensure that our software runs, continues to run, and can be adapted in a timely manner when new requests come in from the client/owner.
Using dependencies or not using dependencies isn't gonna change that.
Now, granted, some ecosystems are a bit extreme on the "let's add a dependency for this 1 line of code."
On the other hand: should I really roll my own crypto? (Jk)
Should I really write those 5000 lines of well tested code myself, just because I can? And because MAYBE it's not touched in a month from now?
Every (later executed) line I add to the project, be it one written by myself or an external dependency, becomes part of the code. So I have to be able to maintain it. Might be easier if I write it myself. Might be way more difficult and time consuming if I write it myself...
I have so many tasks to do and while I enjoy coding, I have to make my top priority a working system.
So should I mindlessly add dependencies? Of course not!
Should I just reinvent the whole world? Of course not! (Unless the client absolutely wants me to. And pays me to do it)
When I give this advice, it usually means I don't think the output will be better than the existing thing, and that the dependency cost is better paid in the form of integration. I probably also don't think you'll really maintain your creation, or think about others using it when you do.
As long as we are throwing shitty incentives around.
But on a more neutral note, it's a tradeoff with many moving parts. Different choices for different scenarios.
It's always internet strangers who are too dogmatic even though there's zero context.
Your job as the decision-making engineer is to develop the expertise to be able to make the right choice more than 95% of the time.
I'm all for reusing frameworks, standard libraries, crypto. Sure, those are things we don't want to recklessly reinvent.
But there's a limit. There should be way more pushback against things like this: https://news.ycombinator.com/item?id=39019001
I’ve certainly done it at work because I didn’t have time (or desire) to learn a library.
But sometimes you have to understand the wheel to reinvent it.
I’m not gonna let someone roll their own user authentication and authorization system in a Rails application.
There’s so many moving pieces you won’t even think about and there’s gems that solve this extremely well. Study the gem and you will learn a lot more. Then reinvent the wheel.
especially in an industry context where you are being paid (there’s that pesky denominator again). R&D, which is what this is, is a very expensive and risky endeavor. 99% of startup projects fail, for example. They would not fail given infinite time, but you can’t have that, because time is not free:
Interest Is The Price Of Time
A great friend of mine once told me the following quote from an unknown author: "Reinvent the wheel, not because we need more wheels but because we need more inventors." That quote has brought my mind and heart some modicum of tranquility on various occasions when I wanted to learn some concept and resolved to write my own "toy version" of a software library, framework, et cetera. Later on, when I learned Feynman's quote “What I cannot create, I do not understand”, it reinforced the sentiment that it is okay to build something in order to learn one concept. I have found that every new journey to reinvent the wheel, so to speak, often led me down paths where my intuitions about the initial concept got stronger, and beyond that, I learned several other concepts.
Many of my side projects are just my own version of things that are specifically tailored to my preferences. I have a notes web app that backs up to S3. I have my own music streaming service. I have my own music discovery algorithm.
Sure these things exist, but because I made them, they do exactly what I want them to do. There is no one "wheel" - there are infinite permutations of what a "wheel" can be.
A huge benefit of not reinventing is that standardization carries a lot of value. Compare auto tires to phone batteries (back when they were replaceable): auto tires are standardized, whereas every phone used its own slightly differently shaped battery.
> those who tried to invent a wheel themselves and know how hard it is
> those who never tried to invent a wheel and blindly follow the advice
There's a third, and I think more common group: folks who know all that's involved with reinventing the wheel, and how to do it, and know the juice of doing it isn't worth the squeeze, and there's no value in doing it yourself, educational or otherwise.
How often do I see people metaphorically trying to use a car tire on a bicycle with the excuse of not reinventing the wheel. There can be great benefits to the parts of your system being tailor-made to work together.
Sometimes there are very good reasons to reinvent the wheel, where the business demands a faster X, or there are privacy concerns for Y. But most of the time it's just a bad idea to reinvent the wheel. Do that shit on your own time, and keep it out of our production codebase.
I know there will be lots of people moaning about curiosity and the like. I somewhat agree. However, imagine you are building a new table. Nothing interesting, just a metal frame with a composite wooden top. You are prototyping it, and suddenly one of your engineers decides to reinvent screws. Not because there is a need, but because they are bored.
It's not like you need hidden or fancy screws/attachment devices; the engineer is just curious about trying out a lathe. And you don't have an "automatic" lathe, so it's not like you can scale up production.
It's fucking stupid, right? Yes.
Sure, test out how to make your own fasteners. Even do it on company time. Just don't try to sneak it into production. It's a bad idea.
> A: While I am a strong believer in the future of the rotary dial as an input device, Linus Torvalds might disagree
* HHS -> FDA -> CBER
It's important IMO (IMO only, NOT AN EXPERT) because it helps you understand first principles better. As fundamentals change, it helps me to reevaluate these things, even though I know nothing will ever come of them.
I am 422 agencies in so far, hoping to finish in time for Juneteenth. Can't post here because........... but yea.
Re-invent the wheel!
moron4hire•6h ago
People who say "don't reinvent the wheel" should come up with something new to say.
But they can't, because they hate innovation so much.
mcnamaratw•5h ago
hoppp•5h ago
I think reinventing the wheel would be like creating a wheel that does not spin or something... some YouTubers have done it, actually. It's fun to mess around with, but hard to create something valuable.
Ekaros•5h ago