The joys of age plus extremely-poor autobiographical memory.
It doesn’t help that what I think of as actually hard, day to day, is shit like documentation from FAANG companies lying to me and wasting a whole fucking day of my time, terrible error messages, mismanagement of upstream projects, bureaucracy making a two-day task take four weeks, and, in some common ecosystems, awful design of core tools and major libraries, or horribly wasteful library churn. “Hard problems” are nothing next to bullshit problems.
Which is to say that usually the hard stuff is an accidental request that nobody actually wants to pay for, so the only thing “hard” about it is recognizing that the client/stakeholder has wandered into territory they really ought not.
The ordinary stuff I worked on last month? Don't bother asking any details; they're gone.
I'm curious, how can he be so sure? How does he know that he has never failed a competent programmer or that he has never passed a poor programmer?
Just because he hasn't seen it fail, it doesn't mean his process hasn't failed.
False negatives are harder to spot, but in small, close-knit industries you sometimes hear of a person you rejected doing well.
I switched to a battery of knowledge questions centered on the type of programmer I was looking for. It works much better, is even predictive of coding ability, and helps you learn this stuff yourself.
And for junior developers who only have personal or university projects, those are sometimes too shallow for the check described in this article.
Any skillset requires practice, and any invested programmer will code in their free time: the newbie tweaking CSS stylesheets, the veteran coding utilities for repetitive tasks, the pro writing a driver backend to bypass a resolution or buffer limit, the webdev building a customized dashboard and scriptlets.
Programmers have at least ONE hobby project, open source contribution, or tasked project that is not scoped under an NDA. If they don't, they certainly will not be productive and probably not creative.
We don't expect this from any other profession. Do you expect civil engineers or architects to work on personal projects in their free time? If they don't, would you conclude they lack the skillset or the critical thinking? That's just ridiculous.
The best of the best engineers have plenty of work to keep themselves fully occupied, especially in the prime of their career.
Hobby projects are a positive indicator for people straight out of college, though.
I don’t have kids (and can’t now thanks to a snip), so it’s relatively easy for me to find time to hack on something for fun, but it’s not wrong for a parent to want to, you know, be a parent.
My work and personal life are separate, and deliberately so.
> Asks questions with the goal of having the candidate teach him about his project.
I really like this. Asking candidates about their PhD thesis has not gone as well as I hoped; there's usually an increasing level of panic as they realise I read it before the interview. Asking about patches they've written to open source has the same effect.
Asking about something I can't verify changes the stress profile on it a lot. Going to change to this strategy. Thank you HN :)
I find Casey Muratori completely insufferable and this title riled me up, but the content is actually pretty good. My perception of him is that he is a good engineer, but generally overconfident and unwilling to approach other peoples' viewpoints with an open mind. The current title played right into my biases.
(At the time of writing, the HN title is "Casey Muratori: I can always tell a good programmer in an interview")
He preaches reasonable performance instead of completely shitting the bed or optimization to the maximum (although he also teaches that).
Look at the JetBrains Java challenge on YouTube where he coaches an (imo below-average) programmer. He is very kind. So your assessment just seems unfair to me.
> With this approach, he claims he can always understand if someone is a competent programmer, and he has never seen it fail.
In this kind of situation, never seeing something fail is not a good sign. Evaluation metrics are noisy. One has to accept reality: no process is perfect. Better to admit it and actively seek out failures.
Statistical evaluations matter. Getting some objective distance matters.
> The first principle is that you must not fool yourself and you are the easiest person to fool. - Richard Feynman
As this article touches on, I'm big into pair programming interviews. I was part of the pair programming step at a past company and we'd always use StringCalc [0]. It starts out super easy and gets progressively harder. The goal isn't to finish but just to see how people think. We would do pretty legit pairing where we'd help if they got stuck on anything or thought we had a better solution. This shows so much about how someone thinks, collaborates, and responds to feedback. Often within 10 mins I could tell how it was going to go. We always had to finish, though, just in case.
Of course it depends on what type of app you have and how your company works and so on and so forth.
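For anyone unfamiliar with the StringCalc kata mentioned above, its opening steps look roughly like this (my sketch, not the commenter's version; the real exercise goes on to add custom delimiters, negative-number errors, and so on):

```python
def add(numbers: str) -> int:
    """First steps of the String Calculator kata:
    an empty string is 0; otherwise sum the comma- or
    newline-separated integers."""
    if not numbers:
        return 0
    # Normalize newlines to commas, then split and sum.
    return sum(int(n) for n in numbers.replace("\n", ",").split(","))
```

The point, as described, is less the code than watching how a candidate grows the solution as each new rule is introduced.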
On the surface it's even more contrived than leetcode. But it has a few benefits:
1. Harder to memorise and prepare for.
2. Harder to ask LLMs.
3. Checks formal schooling or detailed interest in the topic.
Learning C with toy projects won't make you perform well in this quiz. Spending dedicated effort reading about the inner workings of malloc, RTOSes, chipset datasheets, and electronics may.
Many of the questions check understanding beyond the syntax, often one level of abstraction down. For embedded this works nicely. The higher-level system design thinking is not applicable to us. We look for people with the mindset and interest to debug the most absurd behavioral quirks of the hardware when the code misbehaves.
But for other fields I think this would fall apart. This particular approach works for bare-metal embedded.
(...I think our interview process may be selecting for ASD as a side effect)
Is this evidence that their hiring process is sound, or is it more a consequence of Linear being a rocketship? If their retention number includes bad hires being let go, the claim that they are meeting their standards becomes more believable.
I’ve only worked at small startups, but usually “retention” means that no one has left for somewhere better.
My problem with that is that when I get interviewed about a project, I will talk about the whole project, not just my personal contributions to it, which I believe are the primary focus of a job interview. Of course, knowing about the whole of the project is important too, but if I just have an overview and little to no contributions, I wouldn't be a valuable asset to the company I'm interviewing at.
(Not to dismiss my own contributions of course, I'm a competent engineer and can do anything - it's more a matter of what I'm enthusiastic and energised about. I wouldn't be energised by clicking around in the AWS console, but I know of it, for example).
I may be overlooking some really important concepts here, but something bugs me. It looks like the need is not for a "good programmer" (the article should define what a good programmer is, btw) but for a programmer who conforms to the standards, whatever that means. It is a bit chilling. And afaik those leetcode interviews tend to fade away.
> "How they navigate unknown codebases." That point seems very short-sighted. For how long is a codebase considered "unknown"?
Overall, "good" is not defined and the "scope" of the programmer's job isn't even touched, which I think changes the way you hire someone.
Anyway, it was a nice read, although I don't really know what to conclude. The pair programming concept is, for sure, the one I would most like to experience in an interview.
For instance, I found that the same scripted questions asked by different evaluators would regularly perform differently. But getting statistical relevance is hard!
So that’s the allure of leetcode. You can get a large population with standardization, relatively cheaply. That it’s actually a bad eval method gets lost in the wash, which is unfortunate, but I certainly understand it.
Conversely, “talk about your project” was a completely useless eval when I tried to use it. Good candidates failed, bad candidates passed, evaluators had all manner of biases to the point I started being suspicious that _time of day_ mattered more than the answer.
I’d 100% buy that an individual can accurately judge candidates with this approach, but I’d want heavy evidence if you claimed you could scale it.
You don't need a large amount of code; once you're crossing a few hundred or thousand lines, you're already in the territory where the software developer has trade-offs to make.
A skilled BSer can do well in interviews and survive for years.
I want to get a glimpse into how the candidate is going to handle our tech stack, our code base, not one of their old code bases.
I still think this is very possible to learn without LeetCode (which I also disdain and never use).
Most people are reasonably good, so luck doesn't seem that unlikely. Someone should draw resumes at random from their pile and make an offer to whoever wins; I'm curious how selecting for people who are lucky (or whom God approves of, if you want to go there) compares to your process. If you cannot show me data on why your process is better than that (or something else), I have to assume you don't really know.
PLEASE, when someone talks about how to interview can they at least put forth the effort to cite real research. If you cite someone else you can tell me you think it is invalid for whatever reason, but at least show me you care enough to read it before you tell me what I should do.
Personally I don't think this topic is exciting enough to dig into (scientific papers tend to be hard to read; I want "an executive summary"). But when I interview someone I limit myself to the questions my HR department tells me to ask, because they are scientifically validated to be useful.
Note that the above is the type of question my HR department tells me is scientifically validated. I have not read the research myself, nor do I know how to find it. As such, if someone responds "that isn't", they might be right; you will have to judge their expertise yourselves. I'm not qualified to know if they are right.
This approach is similar to what I have always said about open source contributions to credible projects, since these days Leetcode puzzles can be solved with AI tools.
I get what they are aiming for, but I foresee trouble:
For one, this kind of interview is way harder to set up than simply asking the candidate to solve an algorithmic question (which is flawed but way simpler).
Also, it can be hard to fine tune so that it's not unfair to the candidate. Some bugs can only be solved after days of looking at the problem, so you have to iterate over this interview setup to find the right difficulty, unfairly ditching candidates in the process. And it becomes "a project"; a lot of companies cannot afford to spend much time on this.
Finally, if you're pairing how do you refrain from helping too much? You're not the one being interviewed, the candidate is! If you allow AI, how do you tweak the problem so that it's both self-contained and reasonably easy, while keeping it impervious to being one-shot by AI?
Of course, traditional interview techniques share some of these problems, but they are way easier to set up. That's what's missing here: a cost/benefit analysis for interviewers too.
I think this is a great way to tell if someone knows their craft but it could also select for people with 1) really good memory 2) really good bs. I have made lots of technical decisions that I stand by but I have a hard time remembering what I made for breakfast. I kind of have to have the code in front of me to remember those kinds of details.
I think we also need to shift our vocabulary around "leetcode interviews", because that term covers two very different things; I think one is fine and the other is not, but because we use the same word for both, people end up talking past each other. Basically there is:
1. Fizzbuzz-level (and a little bit higher) programming problems where there's no real puzzle solving, it's just checking that you can literally code at all.
2. Hard puzzle-style algorithmic leetcode questions. The stuff on leetcode.com. For some reason like 80% of this is dynamic programming questions.
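For the unfamiliar, a typical specimen of category 2 is something like the classic climbing-stairs problem (my illustration, not the commenter's): count the distinct ways to climb n steps taking 1 or 2 at a time.

```python
def climb_stairs(n: int) -> int:
    """Classic bottom-up dynamic programming: the count for
    step n depends only on the counts for the previous two steps."""
    a, b = 1, 1  # ways to reach step 0 and step 1
    for _ in range(2, n + 1):
        a, b = b, a + b
    return b
```

The trick, as with most DP questions, is spotting the recurrence; once you know it, the code is a few lines, which is part of why these questions reward prior exposure so heavily.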
I think fizzbuzz level questions are fine and necessary (yeah I wouldn't have believed it either but there's no way I'm hiring someone who can't write a simple for loop).
At the other end hard leetcode questions are not very good - they are often so hard (in an interview context) that they just select for people who have seen them before (e.g. if you grind the top 100 leetcode.com questions), or have had a lot of recent practice (especially of dynamic programming, which is crazy because I've never used dynamic programming once in my life despite doing a very algorithm-heavy job for a few years).
Even worse, they are often selected because the interviewer read the solution, thought "ah yes, that solution looks easy", and then it makes them feel good about knowing the solution; they don't even realise how hard it actually is.
So, hard leet-code questions should be avoided, but that doesn't mean you should have no whiteboard coding questions.
Presumably I couldn't hire Casey for a team lead position paying £120k per year. Equally, I don't want to miss out on talent by trying to catch every edge case and making an interview process 3 weeks long.
I hired a very technically strong candidate once, he loved optimising games as a hobby. Unfortunately we were a SaaS startup and he seemed to be allergic to using prebuilt components (think SQS) because "we could build them more efficiently". It's impossible to catch every foible like this in an interview scenario.
For these reasons, ultimately there will be bad hires. The biggest mistake is leaders not being willing to fire people. Sometimes this is for fear of reducing head count, because it makes them feel like an arsehole or they aren't themselves capable/invested enough to care. It's painful but I've found it always better to fire fast.
1) Can this person do stuff.
2) Can they learn stuff.
3) Are they interested in learning and doing the things we need them to.
The author's approach of going deep in one area is good for determining #1. You weed out all the people who "were on the team that..." or similar work-adjacent activities and find those who actually did the work / wrote the code / designed the thing / did the testing. Anyone with several years should have worked on more than one thing. Variety of things they actually did indicates an ability to learn, which is #2 above. Finally, you'll have to gauge how well they'll take to whatever it is you need them to work on, and that's a toss-up. This is where going to a big-name school can be useful, since a big part of college is pushing through shit you don't care about for courses that aren't your thing. Otherwise a real interest in your project/product should be enough to motivate the learning and doing if they're capable.
The rest of the "team fit" stuff... You can figure out what they're like on the side while doing an hour long technical interview.
And that's my take on interviewing. I find it quite effective at finding good engineers.
https://caseymuratori.com/about
https://en.wikipedia.org/wiki/Casey_Muratori
He's also that guy in many of ThePrimeagen's YouTube videos.
And wow, that was a great rabbit hole to go down. Interesting guy.
> His opinion is that the second question is much harder to answer, and he doesn't know how to reliably do it, and whether there is a way to answer it.
In my experience, I look for signs that someone is a child trapped in an adult's body. (With a lot of leeway for younger candidates.)
IE, what I'm looking for is someone who isn't going to pick silly arguments, get into pissing matches in code reviews, or argue with facts of the requirements.
---
Years ago I used to use an interview question based around the differences between the XML libraries built into .NET. The point was for the candidate to demonstrate that they understood tradeoffs; but one candidate suddenly took the tone of a teenager and yelled, "I don't want to work with XML, I want to work with JSON."
They clearly failed #2. It wasn't because they didn't like XML, it was because they rejected the point of the question.
I am currently a cloud architect, but generally speaking when I've interviewed candidates, I prefer this approach.
If you can't answer detailed questions about something you've allegedly done, you're cooked. In real life, you're not going into a situation where you can't use reference information to help you remember all the details.
Their major complaint about the project approach is not getting signal on adaptability to new codebases. That has never been a first concern at any company I've worked at, and frankly, if engineers are touching a new codebase every month then I'm getting a bit worried.
Apparently, Fizz-Buzz is still a "difficult" coding question, considering the number of candidates who still squirm at the prospect of answering it.
What are you talking about? The question is basically just a gotcha to catch out folks that don't know about the modulo operator. Or is this satire and I completely missed the joke?
That you cannot find an elegant solution to what seems like it should have one is what makes it difficult. A good sign of a good programmer is that when you ask them about their code after it works, they will say they don't like this solution and would like to spend time making it cleaner. If you know there isn't an elegant solution, of course you won't bother spending that time; but if you don't know that, it seems like there should be a better answer if you could just restructure the code a little.
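For reference, here is the standard workmanlike FizzBuzz (my sketch): correct, but with the duplicated divisibility checks that make it feel like a cleaner version ought to exist.

```python
def fizzbuzz(n: int) -> str:
    # Check divisibility by 15 first; otherwise a multiple of both
    # 3 and 5 would be caught by one of the single cases below.
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

print("\n".join(fizzbuzz(i) for i in range(1, 101)))
```

Every restructuring (string concatenation, lookup tables, and so on) just moves the awkwardness around, which is exactly the "no elegant solution" property being described.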
An algorithm takes hours to pick up. A language may take weeks. A paradigm takes months. If you use, say, functional programming everywhere, you should probably check that the applicant knows how to write functional. It's not about map(), it's about whether you code with globals and side effects. If it's reactive programming, does it properly observe and collect? If you're doing declarative UI, how do you abstract the pieces, where is the button click handled?
Some things are learnable, but this is where you see people get defensive. If you're moving in from a similar paradigm but different language, you'll find a way and you'll even contribute your own insights. I think people tend to butt heads here and it's a good headbutting simulator if you do some live coding with the team you're working with.
The telltale sign of a padded resume is when the person acts like a cork -- you try to push down deeper and they always pop back up to surface level answers. Some give up the game quickly and admit that they were on the team that did the thing in question, despite their resume saying they did the thing. Other people are very practiced and unflappable and just bob back up over and over without shame.
For new engineers with, say, three years of experience or less, I can forgive the overstated experience and I'll shift to asking them to describe their part of that project. But for people with more than a few years on their resume, that behavior is definitely a deal breaker.
Another line of questioning I take for people with experience are questions like: what are the tradeoffs of directed testing vs random testing? When do you have enough confidence in your testing that you say the product is ready to ship? What are some examples of where bugs made it out the door, and what was your process after that happened? What are some reasons why every part of a design might pass their unit tests, but the total design has failures?
These should be easy to discuss for anyone with experience, but there are a surprising number of people who fall flat. Similarly, I am shocked at how often I ask someone to write some code in any language they want to, even pseudocode, for a simple problem and are unable to do it. Eg, given a stream of numbers, maintain the largest two numbers seen so far.
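That last problem has a compact single-pass answer; here is a sketch (mine, not the commenter's):

```python
def top_two(stream):
    """Track the two largest values seen so far in one pass
    over a stream of numbers."""
    largest = second = float("-inf")
    for x in stream:
        if x > largest:
            # New maximum: the old maximum becomes the runner-up.
            largest, second = x, largest
        elif x > second:
            second = x
    return largest, second
```

Nothing clever is required; the point of the exercise is presumably just seeing whether a candidate can keep two variables straight and handle the update order correctly.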
I'll also bounce around and go into depth on particular technologies or protocols, etc which may be relevant as well as other projects' problems and solutions. I also sprinkle in my current interests whatever they may be and have them make assessments, not to see if they agree but to see how they go about analyzing and weighing different factors.
Some company named Linear conducts paid work trials with candidates.
https://linear.app/now/why-and-how-we-do-work-trials-at-line...
--
FWIW, I've always done as Muratori advises.
But I accept any kind of prior writing, in lieu of code. (I've interviewed for DBA, QA/Test, tech writing, etc. I believe, but cannot prove, that anyone who can write an essay can also code.)
Also, when possible, I let teams choose their own members, from the qualified candidates in the funnel. Per advice of Luke Hohmann and others. IIRC Facebook does this too.
But The Correct Answer™ is some kind of (paid) probationary period, with no-fault divorce. Long enough to validate skills, get past the "honeymoon" period, and verify team fit.
Alas, that requires change in labor laws, universal health care, (very) cheap day care, and other misc reasonable social safety net infra.
2) Whether they will be productive at your company/project.
this is exactly what I do
also, if an interviewee doesn't know something that I think they should, I help them learn it
enough interviewers make the process unpleasant
I try to do the opposite
sgarland•3h ago
Another problem, which TFA hints at, is bias. System Design interviews are often terrible for this reason. If you present me with some scenario that I know is trivial to handle with a single server, I’m going to want to discuss that, but you are probably expecting me to talk about a message queue, a caching layer, etc. Both are valid depending on the situation, but if you’ve only ever known one type, you may dismiss others out of hand.
CoastalCoder•3h ago
OTOH, I'd hate to be in a position where I had to think carefully about every aspect of that stuff during an interview. Even if nobody noticed an inadvertent classified spill, it could still suck.
hrimfaxi•2h ago
Making the dependencies of "it depends" explicit is the whole point.
bluGill•2h ago
This also means there are 100 builders in my area who only build stick frame houses and won't even talk to you if you want something else and only 1 or 2 who will even think about those other options. (they do compete with the other builders so costs are not unreasonable)
thelittlenag•3h ago
The goal here is to see if the candidate understands the domain (generic distributed systems) well enough on their own. For more senior roles I look to make sure they can then communicate that understanding to a team, and then drive consensus around some approach.
MomsAVoxell•3h ago
Actually, this is a case of the Peter principle[1], clear as day.
Recruiters should not recruit programmers if they, the recruiters, have not worked in a programming context. The best software development recruiter is another programmer.
It also extends, of course, to technical managers and is another factor in how one should approach recruitment interviews - is this placement going to result in a high Peter principle, and if so - is the candidate going to be capable of rising above it, or dealing with it in a way that is conducive to the organizations goals?
Because the Peter principle is not always a "negative"; it's more just a condition that results from a lack of communication between parties who really should know each other's jobs better.
People who know how to understand other people's jobs as well as their own work great together.
[1] - not Flynn effect
gedy•2h ago
Ran into this the other day: a company reached out, and while I wasn't job hunting, the senior role they were trying to fill was basically a perfect fit for my skills and background on a Venn diagram.
The first two calls were young, non-technical women (they shared LinkedIn links), and they clearly did not understand what they were hiring for and couldn't answer questions. They insisted on their scripted questions and didn't want to talk about the companies where I did the exact role they were hiring for.
I was not rude or arrogant about this, but the next day got the "Unfortunately, we've..." email. It's actually pretty funny; I'm just glad I didn't really need the job.
Companies: stop letting HR be your first contact and screen before technical folks. It doesn't work. No wonder your pipelines are full of the fakers and liars many of you lament.
MomsAVoxell•2h ago
https://en.wikipedia.org/wiki/Peter_principle
rvz•3h ago
That is the entire point of the technical interview.
Someone who is already experienced in a particular subject is also at least qualified to ask the right questions to the candidate related to the subject.
pton_xd•3h ago
Overall shallow knowledge is not a positive signal, in my opinion. If they really are a firefighter who constantly jumps around, the interviewer should lean in to the organizational challenges they face when identifying and fixing problems across a variety of projects and domains. There's always a way to drill down with more specific questions.
faangguyindia•3h ago
It could all be solved by a system where one company hires programmers, and other companies can then rent a specific programmer by the hour and leave reviews.
This way a programmer doesn't need to interview; they're always employed and getting paid.
If no company bids on your time, they can fire you after some time.
walkabout•2h ago
It’d also make job hopping far less painful (see above: cheaper for the candidates) which is why they don’t.
So we may conclude: the point of leetcode interviews is wage suppression.
ModernMech•1h ago
Like you said, tech companies need candidates to feel like they barely passed a grueling interview, because it makes them wary of jumping ship and having to go through that again; not like they are well-qualified industry professionals with the credentials to move between jobs and work anywhere they are certified.
hansvm•2h ago
That's much less efficient for big software projects. The knowledge built up over time is one of the core things making you effective, so you have a big productivity loss if you don't retain the same people for a long time. With a continual middleman, the overhead from that "hiring" process is usually proportional to work done, so after a period of time it will have been more efficient for the employer to hire directly (even at a cost of tens of thousands of dollars) rather than to go through the middleman.
> leaves reviews
We've seen how well that works in every other part of the market.... Interviews are necessary because of a lack of trust. In your system, every party has an incentive to lie, and the incentives are strong enough that it's worth paying people to facilitate your lies. You see that with various contractors all the time, having somebody with good reviews take the contract and somebody with fewer skills actually do the work, the employer understating the work to be done, etc. Good people exist, but they're hard to find in a low-trust environment.
> always employed and getting paid
Interestingly, that usually benefits everyone except the programmer. As a rule of thumb, the person taking risks will have higher returns, so if you can afford those risks you should choose to take them yourself. Examples include various forms of device "warranties" (if you're paying, it's insurance, not a warranty, and I can afford the financial hit when a thumb drive fails, so I'm not going to pay a premium to insure against that possibility), some forms of actual insurance (full coverage on used cars is an example -- if my car is ever totaled I'll just buy another -- I have liability insurance out the wazoo, but full coverage isn't worth it), etc. Programmers often make gobs of money, so except for potentially the very beginning of your career you should be able to easily weather a few years without work. The exact contract details vary, but that would suggest then that programmers are better off on average cutting out the middleman providing those risk guarantees (and they're not really guarantees in the first place, right? people are currently without jobs because there are fewer programming jobs than there used to be, and no middleman scheme is going to fix that; you have to wait for this economic blip to blow over).
I could go on. Contracting is fine, but it's not a replacement for all salaried work.
Eridrus•2h ago
You run into questions of how well a candidate remembers a project, which may not be perfect. You may end up drilling into a project that is trivial. The candidate may simply parrot things that someone else on the team came up with. And when candidates say things, you really have no way to understand if what they're saying is true, particularly when internal systems are involved.
I have found system design interviews specifically much, much better at getting signal. I pick a real problem we had, start people with a simplified architecture diagram of our actual system, and ask them how they would solve it for us. I am explicitly not looking for people to over-design it. I give people the advice at the start of every skills interview to treat this as a real work problem at my startup, not a hypothetical exercise.
I have had a lot more luck identifying the boundaries of people's knowledge/abilities in this setting than when asking people about their projects.
And while everyone interviewing hates this fact, false positives are very expensive and can be particularly painful if the gap is "this person is not a terrible programmer, just more junior than we wanted" because now you have to either fire someone who would be fine in another role if you had the headcount for it or have a misshapen team.