Now, if they did answer with a constraint solver, I'd probably ask some followup whiteboard questions to make sure they do actually know how to code. But just giving a constraint solver as an answer definitely wouldn't be bad.
At this point, job interviews are so far removed from the actual work. Experience and aptitude still matter a lot, but too much experience at one employer can ground people in rigid and limiting ways of thinking and solving problems.
A trick, if you can't come up with a custom algorithm and using a library isn't allowed during the interview, is to be ready to roll your own DPLL-based solver (it can be done in ~30 LOC).
Less elegant, but it's a one-size-fits-all solution.
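For the curious, a minimal sketch of such a solver in Python (my own toy encoding, not any library's API: clauses are lists of nonzero ints, positive for a variable, negative for its negation):

    def dpll(clauses, assignment=None):
        # Minimal DPLL: unit propagation plus backtracking search.
        if assignment is None:
            assignment = {}
        changed = True
        while changed:  # unit propagation: assign literals that are forced
            changed = False
            for clause in clauses:
                if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                    continue  # clause already satisfied
                unassigned = [l for l in clause if abs(l) not in assignment]
                if not unassigned:
                    return None  # every literal false: conflict
                if len(unassigned) == 1:
                    lit = unassigned[0]
                    assignment[abs(lit)] = lit > 0
                    changed = True
        free = {abs(l) for c in clauses for l in c} - assignment.keys()
        if not free:
            return assignment  # all variables assigned: satisfiable
        var = next(iter(free))  # branch on an arbitrary free variable
        for value in (True, False):
            result = dpll(clauses, {**assignment, var: value})
            if result is not None:
                return result
        return None

    # (x1 or x2) and (not x1 or x2) and (not x2) is unsatisfiable:
    assert dpll([[1, 2], [-1, 2], [-2]]) is None

Of course, you still have to encode the interview problem as SAT on the spot, which is its own kind of cleverness.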
Otherwise penalizing interviewees for suggesting quick-and-dirty solutions reinforces bad habits. "Premature optimization is the root of all evil," after all.
There is some debate about what premature optimization is, but I take it to mean micro-optimizations that are often things a modern compiler will do for you better than you can. All too often such attempts result in unreadable code that is slower, because the optimizer would have done something different but now it cannot. Premature optimization is done without a profiler: if you have a profile of your code and can show a change really makes a difference, then it isn't premature.
On the other hand, job interviews imply time pressure. If someone isn't 100% sure how to implement the optimized algorithm without looking it up, brute force is faster and should be chosen. In the real world, if I'm asked to do something, I can spend days researching algorithms at times (though the vast majority of the time what I need is already in my language's standard library).
They can also be dreadfully slow (and typically are) compared to just a simple dynamic program.
Do you know how few people in this world even know what a constraint solver is, let alone how to correctly formulate a problem for one?
I used a constraint solver to solve a homework problem once in the third year of my CS degree. My god, just writing the damn constraints was a huge cognitive load!
I do hope you're exaggerating here, but in case you aren't: this is an extremely simplistic view of what (software) engineers have to do, and thus what hiring managers should optimize for. I'd put "ability to work in a team" above "raw academic/reasoning ability" for the vast majority of engineering roles, any day.
Not that the latter doesn't matter, of course, but it's by no means the one and only measure.
I should have said "if you deemed this a fail on the code interview, you are an idiot".
In this hypothetical, why do you do leetcode hard interviews?
In any real engineering situation I can solve 100% of these problems. That's because I can get a cup of coffee, read some papers, look in a textbook, go for a walk somewhere green and think hard about it... and yes, use tooling like a constraint solver. Or an LLM, which knows all these algorithms off by heart!
In an interview, I could solve 0% of these problems, because my brain just doesn't work that way. Or at least, that's my expectation: I've never actually considered working somewhere that does leetcode interviews.
A company's interview process tells you a lot about how the company thinks and operates. This one was surely a dumpster fire.
> It's easy to do in O(n^2) time, or if you are clever, you can do it in O(n). Or you could be not clever at all and just write it as a constraint problem
This nails it. The point of these problems is to test your cleverness. That's it. Presenting a not-clever solution of using constraint solvers shows that you have experience and your breadth of knowledge is great. It doesn't show any cleverness.
No, it's just memorization of 12 or so specific patterns. The stakes are so high that virtually no one going in will stake passing on their own inherent problem-solving ability. LeetCode has been so thoroughly gamified that it has lost all power to differentiate candidates beyond willingness to prepare.
In no case is it a useful signal of whether I can do my job better than someone else. Some people like this type of problem and are good at it anyway, which is a good signal compared to average - but there are also above-average people who don't enjoy this type of problem and so don't practice it. Note that in both cases the people I'm talking about did not memorize the problem and solution.
Like in race? Like in wealth? Like in defection willingness? Like in corruption?
Asking for a friend who is regularly identified as among the most skilled but feels their career has been significantly derailed by this social phenomenon.
If somebody grinds LeetCode while hating it, it signals they are really desperate for a job and willing to jump through hoops for you.
If somebody actually enjoys this kind of stuff, that is probably a signal that they are a rare premium nerd and you should hire them. But they probably play Project Euler as well (is that still up?).
If somebody figures out a one-trick to minmax their LeetCode score… I dunno, I guess it means they are aware of the game and want to solve it efficiently. That seems clever to me…
Interviewers learn nothing from an instant epiphany, and they learn next to nothing from someone being stumped.
Unfortunately, this is why we can't have nice things. Problem solving questions in interviews can be immensely useful tools that, sadly, are rarely usefully used.
100% and it's a shame that over time this has become completely lost knowledge, on both sides of the interview table, and "leetcode" is now seen as an arbitrary rote memorization hurdle/hazing ritual that software engineers have to pass to enter a lucrative FAANG career. Interviewees grind problems until they've memorized every question in the FAANG interview bank, and FAANG interviewers will watch a candidate spit out regurgitated code on a whiteboard in silence, shrug, and say "yep, they used the optimal dynamic programming solution, they pass."
Absolutely agree. When I interview, I start with a simple problem and add complexity as they go. Can they write X? Can they combine it with Y? Do they understand how Z is related?
Last round I did at Meta, it was clearly there to test that you had grinded their specific set of problems, over and over again, until you could reproduce them without thinking. It's clear because the interviewers are always a bit surprised when you answer with anything that is not the textbook approach from both LeetCode and the interview guide they studied.
Cleverness is definitely not high on the list of things they're looking for.
In my experience, interviewers love going to the Leetcode "Top Interview 150" list and using problems in the "Array String" category. I'm not a fan of these problems for the kind of jobs I've interviewed for (backend Python mostly), as they are almost always a "give me an O(n) runtime, O(1) memory algorithm over this array" type of challenge that really doesn't resemble my day-to-day work at all. I do not regularly do in-place array algorithms in Python, because those problems are almost always handled by other languages (C, Rust, etc.) where performance is critical.
I wish interviewers would go to the "Hashmap" section for interviews in Python, JavaScript, etc., type of languages. They are much less about cleverness and more about whether you can demonstrate using the appropriate tools in your language to solve problems that actually do resemble ones I encounter regularly.
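For instance, the classic two-sum is basically a "do you reach for a dict?" problem; a sketch:

    def two_sum(nums, target):
        # One pass: remember each value's index, and check whether the
        # complement of the current value has already been seen.
        seen = {}  # value -> index
        for i, x in enumerate(nums):
            if target - x in seen:
                return seen[target - x], i
            seen[x] = i
        return None

    # two_sum([2, 7, 11, 15], 9) == (0, 1)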
There's also the problem of difficulty tuning on some of these. Problem 169 (Majority Element) being rated "Easy" for getting an O(n) runtime, O(1) memory solution is hilarious to me. The algorithm first described in 1981 that does it (the Boyer-Moore majority vote algorithm) has its own Wikipedia page. It's not a difficult algorithm to implement or understand, but its correctness is not obvious until you think about it a bit, at which point you're at the level of "cleverness" that gets an algorithm named after you a Wikipedia page. Seems excessive for an "Easy" problem.
Meanwhile, the obvious Python one-liner:

    # (assuming: from collections import Counter)
    return Counter(nums).most_common(1)[0][0]
And that's 50th percentile for runtime and memory usage. Or do it with another one-liner that's 87th percentile for time (because it uses Python's built-in sort) but 20th percentile for memory:
    return sorted(nums)[len(nums) // 2]
But the interviewer might be looking for the best approach, which beats "100%" of other solutions in runtime per Leetcode's analysis:
    # Boyer-Moore majority vote: O(n) time, O(1) memory.
    # m is the current candidate, c its vote counter.
    m, c = -1, 0
    for x in nums:
        if not c:      # counter exhausted: adopt x as the new candidate
            m = x
            c = 1
        elif m == x:   # matches candidate: reinforce
            c += 1
        else:          # different value: cancel one vote
            c -= 1
    return m
If I were interviewing, I'd be happy with any of these except maybe the sorted() one, as it's only faster because of the native code doing the sort, which doesn't change that it's O(n log n) time and O(n) space. But I've had interviews where I gave answers that were "correct" to the assumptions and constraints I outlined, but they didn't like them because they weren't the one from their rubric. I still remember a Google interview, in which we're supposed to "design to scale to big data", where they wanted some fiddly array manipulation algorithm like this. I gave one that was O(n log n) but could be done in place with O(1) memory, and the interviewer said it was "incorrect" in favor of a much simpler one using dicts in Python that was O(n) time but also O(n) memory. Had the interviewer specified that O(n) memory was fine (not great for "big data", but OK), I would have given him the one-liner that did it with dicts, lol.

I guess my point is that interviewers should be flexible and view it as a dialogue rather than asking for the "right answer". I much prefer "identify the bug in this self-contained code snippet and fix it" type problems that can be completed in 15-30 minutes personally, but Leetcode ones can be fine if you choose the right problems for the job.
I would rather work with a flexible data type with suboptimal performance than a brittle data type that maybe squeezes out some extra performance.
Your example of in-place array mutation feels like a good example of such a thing. I feel like there should be a category of interviewing questions for "code-safety" not just performance.
You need to make sure a candidate can program, so asking programming questions makes sense. However, the candidate should not be judged on whether they finish or get an optimal, or even correct, answer. You need to know whether they write good code that you can understand, and whether they are on a path that, given a reasonable amount of time on a realistic story, would get it finished and correct. If someone has seen the problem before, they may get the correct answer; if they have not, they won't know it and shouldn't be expected to get the right answer in an hour.
All of the ones listed can be solved with a top-down dynamic programming algorithm, which just means: write the recursive solution, then add caching to memoize it.
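A minimal sketch of that recipe, using coin change as the example (the naming is mine):

    from functools import lru_cache

    def min_coins(coins, amount):
        # Top-down DP: write the obvious recursion, then memoize it.
        @lru_cache(maxsize=None)
        def solve(remaining):
            if remaining == 0:
                return 0
            best = float("inf")
            for coin in coins:
                if coin <= remaining:
                    best = min(best, 1 + solve(remaining - coin))
            return best

        result = solve(amount)
        return result if result != float("inf") else -1

    # min_coins((1, 5, 11), 15) == 3  (5+5+5; greedy would use five coins)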
For some of these, you can get cleverer. For example the coin change problem is better solved with an A* search.
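A sketch of that idea, treating each remaining amount as a search state and using ceil(remaining / largest coin) as an admissible heuristic (my own choice of heuristic):

    import heapq
    from math import ceil

    def min_coins_astar(coins, amount):
        biggest = max(coins)
        # Queue entries: (coins used + heuristic, coins used, remaining).
        frontier = [(ceil(amount / biggest), 0, amount)]
        done = set()
        while frontier:
            _, used, remaining = heapq.heappop(frontier)
            if remaining == 0:
                return used
            if remaining in done:
                continue
            done.add(remaining)
            for coin in coins:
                if coin <= remaining:
                    nxt = remaining - coin
                    heapq.heappush(
                        frontier, (used + 1 + ceil(nxt / biggest), used + 1, nxt))
        return -1  # amount not reachable with these coins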
Still, very few programmers will actually need these algorithms. The top thing we need is to recognize when we accidentally wrote a quadratic algorithm. A quick scan of https://accidentallyquadratic.tumblr.com/ shows that even good people on prominent projects make that mistake on a constant basis. So apparently being able to produce an algorithm on the test, doesn't translate to catching an algorithmic mistake in the wild.
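One everyday shape of that mistake, sketched in Python:

    def dedupe_slow(items):
        seen = []
        out = []
        for x in items:
            if x not in seen:  # linear scan of `seen` -> O(n^2) overall
                seen.append(x)
                out.append(x)
        return out

    def dedupe_fast(items):
        seen = set()
        out = []
        for x in items:
            if x not in seen:  # set membership is O(1) on average
                seen.add(x)
                out.append(x)
        return out

The two look nearly identical in a code review, which is exactly why these bugs keep shipping.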
If my wife's blood sugar is high, she takes insulin. If you need to solve a constraint problem, use a constraint solver.
If your company doesn't make and sell constraint solving software, why do you need me to presume that software doesn't exist and invent it from scratch?
At least that's been my experience. I'm sure there are exceptions.
> contractor
Do FAANG companies hire contractors in India?
Really? This kind of interview needs to go away.
However, coding interviews are useful. It's just that "knowing the trick" shouldn't be the point. The point is whether the candidate knows how to code (without AI), can explain themselves and walk through the problem, explain their thought processes, etc. If they do a good enough reasoning job but fail to solve the problem (they run out of time, or they go on an interesting tangent that ultimately proves fruitless) it's still a "passed the test" situation for me.
Failure would mean: "cannot code anything at all, not even a suboptimal solution. Cannot reason about the problem at all. Cannot describe a single pitfall. When told about a pitfall, doesn't understand it nor its implications. Cannot communicate their thoughts."
An interview shouldn't be a university exam.
> We can solve this with a constraint solver
Ok, using your favorite constraint solver, please write a solution for this.
> [half an hour later]
Ok, now how would you solve it if there were more than 100 data points? E.g. 10^12?
This doesn't mean they can't provide a constraint solver solution, but if they do, they'd better be prepared to address the obvious follow-ups. If they're prepared to give an efficient solution afterward in the time left, then more power to them.
Second of all, global optimization techniques exist and work. They give you a decent solution now and a better solution eventually. Techniques like Genetic Algorithms, Simulated Annealing, etc are all global optimizers which meet the requirements you mention.
I claim that gradient-free global optimization is superior to local optimization/gradient-based methods. I claim that we have criminally underutilized global optimizers, especially in the context of neural network weight optimization. GAs are amenable to current GPU design. It's really sad that those skilled at writing CUDA attention kernels haven't tried to build a fast distributed genetic algorithm for neural network weight training on a B200.
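For anyone who hasn't seen one, here's a generic gradient-free optimizer sketch (simulated annealing; the function names and defaults are mine):

    import math
    import random

    def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995, steps=10000):
        # `cost` maps a candidate to a float; `neighbor` proposes a random tweak.
        x, fx = x0, cost(x0)
        best, fbest = x, fx
        t = t0
        for _ in range(steps):
            y = neighbor(x)
            fy = cost(y)
            # Always accept improvements; accept worse moves with probability
            # exp(-delta/t) so the search can escape local minima.
            if fy < fx or random.random() < math.exp((fx - fy) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
            t *= cooling  # geometric cooling schedule
        return best, fbest

    # Minimize a bumpy 1-D function that traps plain gradient descent:
    # simulated_annealing(lambda x: x * x + 10 * math.sin(3 * x),
    #                     lambda x: x + random.gauss(0, 0.5), x0=5.0)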
Greedy algorithms tell you nearly nothing about the candidate's ability to code. What are you going to see? A single loop, some comparison, and an equality check. Nearly every problem that can be solved with a greedy algorithm is largely a math problem disguised as programming. The entire question hinges on the candidate finding the right comparison to make.
The author himself finds that these are largely math problems:
> Lots of similar interview questions are this kind of mathematical optimization problem
So we're not optimizing to find good coders, we're optimizing to find mathematicians who have 5 minutes of coding experience.
At the risk of self-promotion, I'm fairly opinionated on this subject. I have a podcast episode where I discuss exactly this problem (including discuss greedy algorithms), and make some suggestions where we could go as an industry to avoid these kind of bad-signal interviews:
https://socialengineering.fm/episodes/the-problem-with-techn...
- what tech you worked with, and some questions about decisions
- debugging an issue they encountered before
- talking about interests and cultural fit
Instant green flag for me. Too bad that after receiving my offer covid happened and they had a hiring freeze.
The mature, state-of-the-art software companies do not give me leetcode problems to solve. They give me interesting & challenging problems that force me to both a) apply best practices of varying kinds and yet b) be creative in some aspects of the solution. And these problems are very amenable to “talking through” what I’m doing, how I’m approaching the solution, etc. Overall, I feel like they are effective and give the company a good sense of how I develop software as an engineer. I have yet to “fail” one of these.
It is the smaller, less mature companies that give me stupid leetcode problems. These companies usually bluntly tell me their monolithic codebase (always in a not-statically-typed language), is a total mess and they are “working on domain boundaries”.
I fail about 50% of these leetcode things because I don’t know the one “trick” to yield the right answer. As a seasoned developer, I often push back on the framing and tell them how I would do a better solution by changing one of the constraints, where the change would actually better match the real world problem they’re modeling.
And they don’t seem to care at all. I wonder if they realize that their bullshit interviewing process has both a false positive and a false negative problem.
The false negatives exclude folks like myself who could actually help to improve their codebase with proper, incremental refactorings.
The false positives are the people who have memorized all the leetcode problems. They are hired and write more shitty monolithic hairball code.
Their interviewing process reinforces the shittiness of their codebase. It’s a spiral they might never get out of.
The next time I get one of these, I think I’m going to YOLO it, pull the ripcord early and politely tell them why they’re fucked.
That being said, from a stoicism point of view, the interview ends up becoming a meta-challenge on how you approach a problem that is not necessarily appropriately framed, and how you'd go about doing and/or gently correcting things as well.
And if they're not able to appreciate it, then success! You have found that it is not the right organization for you. No need to burn the door down on the way out, just feel relief in that you dodged a bullet (hopefully).
So when I say I’d politely tell them why they’re fucked, it’s actually out of a genuine desire to help the company.
But you’re right, I’m also thankful that they showed their red flag so visibly, early enough, and I’m happy to not move forward!
The solution is typically not just to fix their code. They got in over their heads by charging ahead and building something they'll regret, but their culture (and likely the interviewer's personal self-regard) depends on believing in their (current) tech leaders.
So yes, the interviewer is most comfortable if you chase and find the ball they're hiding.
But the leadership question is whether you can relieve them of their ignorance without also stripping their dignity and future prospects.
I've found (mostly with luck) that they often have a sneaking suspicion that something isn't right, but didn't have the tools or pull to isolate and address it. As a leader if you can elicit that, and then show some strategies for doing so, you'll improve them and the code in a way that encourages them that what was hard to them is solvable with you, which helps them rely on you for other knotty problems.
It's not really that you only live once; it's that this opportunity is here now and should have your full attention, and to be a leader you have to address it directly but from everyone's perspective.
Even if you find you'd never want to work with them, you'd still want to leave them feeling clearer about their code and situation.
Clarifying my "YOLO" usage: I was being a little flippant, in the sense that when ending an interview early with direct critical feedback, the most likely outcome is a "burned bridge" with that company (you're never coming back).
Which reminds me one of my favorite twisted idioms: We'll burn that bridge when we get to it!
I guess I've finally found an acceptable real-world use case for this phrase :)
I played it for a while when interest rates were really low and used the thing for my own rainy-day savings (I did get tired of changing accounts all the time).
(if you have enough time)
He wanted to make an app to help sports club owners schedule players for the day based on a couple of simple rules. I thought this was going to be easy, and I failed after not realizing what I was up against. At the time I didn't even know what I didn't know.
I often look back on that as a lesson of my own hubris. And it's helped me a lot when discussing estimates and timelines and expectations.
> Given a list of stock prices through the day, find maximum profit you can get by buying one stock and selling one stock later.
It was funny to see this, because I give that question in our interviews. If someone suggested a constraint solver... I don't know what I'd have done before reading this post (since I had only vaguely even heard of a constraint solver), but after reading it...
Yeah, I would still expect them to be able to produce a basic algorithm, but even if their solution was O(n^2) I would take it as a strong sign we should hire them, since I know there are several different use cases for our product that require generalized constraint solving (though I know it by other names) and having a diverse toolset on hand is more important in our domain than writing always-optimal code.
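For reference, the clever O(n) answer is a single pass that tracks the cheapest price seen so far (a sketch):

    def max_profit(prices):
        best = 0
        cheapest = float("inf")
        for p in prices:
            cheapest = min(cheapest, p)     # best day to have bought so far
            best = max(best, p - cheapest)  # profit if we sell today
        return best

    # max_profit([7, 1, 5, 3, 6, 4]) == 5  (buy at 1, sell at 6)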
Update... refactor... update... break off... etc. A lot of times, I'm looking at something where the tooling is 8+ years old, and the first order of business should be to get it working on a current and fully patched version of whatever is in place... replacing libraries that are no longer supported, etc. From there, refactor what you can, break off what makes sense into something new, refactor again. This process, in my experience, has been far more successful than ground up, new versions.
I say this while actively working on a "new" version of a piece of software. The new version is web-based; the "old" version is a WinForms VB.Net app from over a decade ago. The old version has bespoke auth; the new version will rely on Azure Entra... Sometimes starting over is the answer, but definitely not always.
Because of this, I've just started rejecting leetcode/AI interview steps outright... I'll do homework, shared screen, 1:1, etc., but won't do the above. I tend to fail them about half the time. It feels worse because I wouldn't even mind studying on leetcode-type sites if they actually had decent explainers for the questions and working answers to go through. I know this kind of defeats the challenge aspect, but learning is about 10x harder without it.
It's not a matter of skill; it's just that the way I take in certain types of problems doesn't work well there. Without any chance of additional info/questions, it's literally a setup to fail.
The worst ones I've had, though, had extra problems:
One I was only told about when I joined the interview, along with the fact that they would be watching live.
One where they wanted me streaming my face the whole time (maybe some people are fine with that).
And one that would count it against me if I tabbed to another page - so no documentation, because they assume I'm just googling it.
Still it's mostly on me to prepare and expect this stuff now.
1. People can be hired to take the test for you - surprise, surprise.
2. It is akin to deciding if someone can write a novel from reading a single sentence.
> It is akin to deciding if someone can write a novel from reading a single sentence.
For most decent companies, the hiring process involves multiple rounds of these challenges along with system design. So it's like judging writing ability by having candidates actually write and come up with sample plots. Not a bad test.
It's a learnable skill and better to pick it up now. Personally, I've solved LeetCode-style problems in interviews which I hadn't seen before, and some of them were dynamic programming problems.
These days it's a highly learnable skill, since GPT can solve many of the problems while also coming up with very good explanations of the solution. Better to pick it up than not.
And some people might say, well, you should know that anyway. The problem for me is (and I'm not speaking for every company, of course) that you never really use a lot of this stuff in most run-of-the-mill jobs. So of course you forget it, then have to study again pre-interview.
Problem solving is the best way to think of it, but it's awkward for me (and probably others) to spend minutes thinking, feeling pressured as someone just stares at you. And that's where memorizing the hows of typical problems helps.
That said, I just stopped doing them altogether. I'd passed a few by doing the "memorizing" described above, only to start the job and realize it wasn't at all relevant to the work we were actually doing. In that way I guess it's a bit of a two-way filter now.
Huh? Of course you can. If you're practicing on leetcode, there's a discussion thread for every question where you can ask questions till the cows come home. If you're in a job interview, ask the interviewer. It's supposed to be a conversation.
> I wouldn't even mind the studying on leetcode types sites if they actually had decent explainers
If you don't find the hundreds of free explanations for each question to be good enough, you can pay for Leetcode Pro and get access to editorial answers which explain everything. Or use ChatGPT for free.
> It's not a matter of skill, it's just my ability to take in certain types of problems doesn't work well.
I don't mean to be rude, but it is 100% a matter of skill. That's good news! It means if you put in the effort, you'll learn and improve, just like I did and just like thousands and thousands of other humans have.
> Without any chance of additional info/questions it's literally a setup to fail.
Well with that attitude you're guaranteed to fail! Put in the work and don't give up, and you'll succeed.
Yeah, this one confused me. Not asking clarifying questions is one of the surest ways to fail an interview. Kudos if a candidate asks something the interviewers haven't thought of, although it's rare, as most problems go through a vetting process (along with leak detection).
Maybe it's my graphics programmer brain firing on all cylinders, but isn't this just a linear scan, maintaining a list of open rectangles?
That's a bad algorithm, then, not a greedy algorithm. Wouldn't a properly-implemented greedy algorithm use as many coins as possible of a given large denomination before dropping back to the next-lower denomination?
If a candidate's only options are to either use a constraint solver or to implement a naïver-than-usual greedy algorithm, well, sorry, but that's a no-hire.
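For concreteness, a sketch of the properly-implemented greedy described above, and why even that is only correct for "canonical" coin systems:

    def greedy_change(coins, amount):
        # Take as many of each denomination as possible, largest first.
        count = 0
        for coin in sorted(coins, reverse=True):
            count += amount // coin
            amount %= coin
        return count if amount == 0 else -1

    # Optimal for US-style coins: greedy_change((25, 10, 5, 1), 41) == 4.
    # Not optimal in general: greedy_change((4, 3, 1), 6) == 3 (4+1+1),
    # but 3+3 does it with two coins -- hence the DP discussion upthread.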
It's interesting how powerful constraint solvers are (I've never used one).
But actually, all of these problems are fairly simple if we allow brute-force solutions. They just become stacked loops.
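E.g. the stock question upthread, brute-forced (a sketch):

    def max_profit_brute(prices):
        # Try every (buy day, sell day) pair: O(n^2), trivially correct.
        best = 0
        for i in range(len(prices)):
            for j in range(i + 1, len(prices)):
                best = max(best, prices[j] - prices[i])
        return best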