This is very insightful and applies to many fields, maybe especially within STEM
Take, for example, the water-flow analogy for electrical current. It's a great analogy that goes a long way toward explaining and modeling a lot of things about electricity, but it eventually breaks down
For 99%+ of the people and cases, the approximate analogy is perfectly useful
However, I think that's a broader thing than just "technical" knowledge: you should know a little about what your customers do, what your manager does, what the role is of systems peripheral to yours is, etc.
But, with the added context that it's from a personal blog, we should give the benefit of the doubt that the author is just not good at writing headlines and give the article a shot on its own merits...
Universities have their problems, but getting students to see the value in subjects on the fringe, or completely outside, of their primary field of study is not one of them. These are the places new and novel solutions are born. Even if someone isn’t an expert, knowing enough to bring in an expert and facilitating the conversation can pay dividends.
I was once tasked with getting a new team up to speed quickly at a new site we were standing up. The manager at the time wanted to forgo training entirely and just let them figure it out, in the name of speed. I dug my heels in and still ran everyone through it. With some of it, the pace was extremely fast, and there was no way they were going to absorb it all. But I wanted them to at least hear it, so that if something came up, they might not know what to do, but they would hopefully at least know enough to ask, and we could then dive deeper and show them the right way. The company had its own way of doing almost everything, so someone doing what they thought was right based on previous experience often led to a mess.
1) if they were effective then we wouldn't see millions of students jumping into 60k-a-year private institutions
2) I don't buy this at all, I think students mostly select based on prestige. Academic rigor and research opportunities are mildly correlated to this but def not what high schoolers and their parents index on
With the Enlightenment came Enlightenment values. During this time, the study of Greek and Latin was practically standard.
With the coming of industrialization, many adopted the German model of education and became glorified trade schools for the industrial age, churning out classics majors and engineers in equal measure.
Post WW2 with Vannevar Bush's influence, American universities became institutions of research and a crucial part of the military industrial complex.
Finally, with the advent of television, college football became an immensely profitable source of funding for many colleges and universities, as well as a huge draw for alumni donations.
You say Universities are a clusterfuck, but in reality they have simply evolved with the times, and hence carry a lot of cultural baggage. I don't think that's a bad thing.
And btw, I don't deny that there is value in intermingling vocational studies and cutting edge research - it very much makes sense for some STEM disciplines, but that's the exception not the rule imo.
The elite schools are still elite schools. They just do cutting-edge research now... research originally intended to win a thermonuclear WW3. The humanities departments of these schools may very well still be stuffed with academics influenced by Soviet active-measures campaigns from the Cold War, which actively sought to undermine the power of American institutions in order to win that long-settled conflict.
The agricultural colleges of the 19th century now do research and have football teams.
American universities are incredible powerhouses of research. Research impact is power-law distributed, just like YouTube influencers and American wealth. A lot of people in the tech industry seem to be jaded about universities because they've gotten outrageously expensive compared to the median income, because for the past 15 years you could make easier money doing web services than Greek literature, and because the majority of universities are not Harvard or MIT. But you can't deny the enormous contributions to society from American universities. Many people we hold in high regard started off as "mediocre" students.
And the truth is there are many colleges and universities in America that are vocational schools. They just don't want to admit it. How many community colleges in the States are really preparing kids to become Nobel laureates? The anti-university sentiment is just another offshoot of the anti-elitism (and perhaps anti-intellectualism) running through American society today... a natural consequence of a situation where you have massively skewed wealth distribution and massively skewed success outcomes from that power-law distribution I mentioned earlier.
Something extremely simple like a 1300-1400 SAT cutoff for university would get us halfway there. "University" here meaning an institution that qualifies for large research grants.
What universities were (and still are) really great at is being a place for intense learning. What 99% of their customers now want is vocational training to get a well paying job.
Exactly because of the status/prestige (+ government incentives + being very far from a free market), customers that don't really want what the university is selling are attending in droves.
My nephews are about to enter that same high school about 25 years later. From what I’ve gleaned, they still have the program for vocational training. I think the kids went on a field trip there in middle school.
Could they? This is a perennial HN complaint; however, the salaries for jobs that vocational schools would train you for show very little pressure to increase. That doesn't indicate some mythical pent-up demand that a bunch of new vocational students could slot into.
Most people are going to college because they want to avoid working at the equivalent of an Amazon warehouse. If you can give them a way to do that, they'll happily skip going to a university.
The pay seems pretty comparable to an Amazon warehouse until you pass at least 3 years of training.
And not everybody gets to wire up air conditioned data centers. A lot of wiring is outside in the hot sun climbing up and down ladders.
If the salaries were significantly better, I should think we'd see a lot of people beating down the doors to be electricians, but we don't ...
> however, the salaries for jobs that vocational schools would train you for show very little pressure to increase.
Aren't "Code Camps" vocational schools? I mean another perennial HN complaint is about how "off-topic" university CS courses are. Personally, I disagree. I'm with Knuth on this one:
> People who are more than casually interested in computers should have at least some idea of what the underlying hardware is like. Otherwise the programs they write will be pretty weird.
To be fair, sometimes it really doesn't matter if the program is weird or not. A machinist or tradesman is different from an engineer. But clearly there is some value to those classes even if you don't end up using that knowledge directly. That doesn't make them off-topic or useless, but we definitely frame things that way. I mean, the major thing I would change about code camps is to make them a 1- or 2-year process and actually get some depth and nuance, instead of being a cram school focused more on getting you to pass the test than on teaching you what the test is trying to test.
Are they? A lot of professors hate and/or are bad at teaching. Especially at R1 research universities.
There are lots of problems, specifically for undergrads:
- Large class sizes
- Professors that don't like teaching
- Professors that can't teach well
- Professors that hand off teaching to TAs, RAs, doctoral students, etc.
- Poor feedback on homework, essays, projects, and exams
- Unnecessary classes
- Outdated classes and curriculum
- Sports programs that serve as recruitment and distraction that are orthogonal to learning
- Admin structures that care more about facilities, growth, and recruitment than learning and research
- Systems that are "too big to fail"
- Student loans that are disconnected from bankruptcy, which feeds a recursively growing monster in such a way that it isn't exposed to evolutionary pressures. There is no risk, so malinvestment doesn't bear consequences.
> What 99% of their customers now want is vocational training to get a well paying job.
You can want a curriculum rich in theory and the vocational training for a well-paying job. The problem is that universities are full of perverse incentives - the admin and faculty are at odds, and often even the faculty itself isn't aligned with teaching.
It's a weird org structure steeped in tradition, nostalgia from alumni, and a faculty tenure system that doesn't always reward the right things.
As usual, it's not black and white: there's no single answer that fits everyone or every field. That said, I'll give an example from computer science where I've seen many people struggle if they haven't taken (and passed) a course on operating systems: topics like race conditions, mutexes, and concurrency.
While these aren't especially difficult concepts, they're not inherently tied to knowing a specific programming language. They transcend language syntax, even though some languages offer syntax for handling them. The problem I often see is twofold: either developers don't apply locking mechanisms at all (leading to unsafe behavior), or they use them excessively or incorrectly, resulting in deadlocks or significant performance issues. Granted, concurrency can get really hard, but I'm talking here about the basics, as in the sketch below.
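To make those basics concrete, here's a minimal sketch (Python's threading module, purely as an illustration; the names and iteration counts are made up): an unlocked read-modify-write on a shared counter loses updates, while guarding the same critical section with a mutex does not.

```python
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n):
    # Explicit read-modify-write without a lock: another thread can update
    # `counter` between our read and our write, so that update is lost.
    global counter
    for _ in range(n):
        tmp = counter
        counter = tmp + 1

def safe_increment(n):
    # Same loop, but the critical section is protected by a mutex.
    global counter
    for _ in range(n):
        with lock:
            counter += 1

def run(worker, n=200_000, threads=4):
    global counter
    counter = 0
    ts = [threading.Thread(target=worker, args=(n,)) for _ in range(threads)]
    for t in ts:
        t.start()
    for t in ts:
        t.join()
    return counter

print("without lock:", run(unsafe_increment))  # usually less than 800000
print("with lock:   ", run(safe_increment))    # always 800000
```

The point isn't the Python specifics; it's that knowing what a race condition and a mutex are tells you what to look for in whatever language you're actually using.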
Historically there was no publish-or-perish paradigm and you could do this. People did get kicked out after years of no output, but it wasn't uncommon for researchers to take a long time to publish anything. Usually it was "hey, just show us what you've been doing": getting researchers to communicate with the rest of the community. The problem wasn't people sitting around doing nothing, it was them being caught up in their work and not sharing their progress.
Now things have flipped upside down. You get fired if you don't publish fast enough. We've forgotten why we started measuring publication rates in the first place.
So now we have the inverse problem. People are trying to publish too early. It compounded though. We now changed the peer-review process. That used to be you publish and then peers... review... Responding with papers of their own and such. Journals were less picky, mostly rejecting works for major mistakes or plagiarism. Other than that... well... you can't really verify the correctness of a paper just by reading it... The purpose of journals was that we didn't have the internet and it would cost a lot of money to send every paper to every library. Before, you'd literally just take your "pre-print" and put it in a section of the library at your university where your university peers would make comments. Now we don't talk to the researcher who's next door.
And we now have this notion of novelty that's completely subjective, compounded by the fact that work seems obvious only after you've heard it, and highly dependent on how well it is communicated. Frankly, if I communicate something really well it should seem obvious, and you should think you could have come up with it yourself. That's because you understand it! But now that is a frequent reason to reject. We think we can predict the impact of work, but there are tons of examples where highly influential works got rejected for lack of novelty or for seeming trivial. Hell, a paper that got the Nobel prize in economics got rejected multiple times for these reasons, getting called both "too obvious" AND "obviously false"[0]. We're just really bad at this, lol. Those rejections don't make papers better; they just waste time as people resubmit and rewrite, trying to figure out how to win at some slot machine.
Academia still works and is still effective, but that doesn't mean there aren't issues. Certainly there are often fewer pressures in an academic setting to do research than at a company. The Uni only cares that you produce papers. Even if it is a misaligned metric, the incentive is to just pick easier problems. The business needs you to make something with __short term business value__. Bigger companies can give more freedom, but the goals are different.
Really, the problem can be seen through the Technology Readiness Level chart[1]. Businesses rarely want to research anything below TRL 5; really they want to be at 7 or 8. The problem with academia is that the incentives push you toward TRL 3 or 4, which leaves TRL 1 and 2 vacant. Work there still happens, just less of it. Tenure can't fix this if you've still got grad students who must publish or perish.
[0] https://en.wikipedia.org/wiki/The_Market_for_Lemons#Critical...
[1] https://en.wikipedia.org/wiki/Technology_readiness_level
What's most baffling to me is the rejection of research and theory (depth in knowledge). Claiming that the work isn't impactful. But that's like saying the ground you stand on doesn't matter...
I'm absolutely astounded this is such a common opinion among programmers and CS people. We're literally building the largest companies in the world and bringing about the information revolution and AI revolution on technology that isn't even 100 years old. It's rapidly getting better because of research and we're not waiting for a hundred years of return on investment here.
It's anti-intellectualism. Often spewed by those trying to prove their own genius, demonstrating the opposite. CS of all people should be able to recognize how complex even the simplest things are. For fuck's sake, we can't even get timezones right lol. We need to balance depth, not ignore it or reject it (I don't think the author argued that btw)
It feels excessively myopic. And honestly, the major problem with academia is the same myopia!
| How do you manage genius?
You don't
- Mervin Kelly (Bell Labs)
| What I would like to see is thousands of computer scientists let loose to do whatever they want. That's what really advances the field.
- Donald Knuth
I could quote a hundred more from a hundred fields. I think we have this weird image in our heads that researchers do nothing and, if left to their own devices, will just waste time and money. I write this with my pocket computer that sends signals across the world and into space, passing through machines moving so fast their clocks disagree. We aren't waiting centuries to benefit from our science. It rarely ever took decades.
Yet historically most science was done by the rich who had free time. Sure, we're moving faster now but we also have several orders of magnitude more scientists. Our continued growth doesn't mean we've become more efficient.
We seem to be really bad at attributing the causes of success. We fixate on those at the end of a long chain of work. I mean, even NVIDIA depends on TSMC, as do Apple, Google, Microsoft, and others. And TSMC is far from the root. I'm not trying to say how the economics should all fall out, but it's at least a helpful illustration of the biases in how we think.
I had a boss who let me do this for a while. He just told me to do whatever I wanted that would help the team. He didn’t talk to me for 2 years after that. For the first few weeks I was kind of stressing to find what to do and show some results, but after that the boredom set in, and that’s when things took off. It was the most productive I’ve ever been. I was regularly working 12+ hour days, because I was enjoying what I was working on. After 2 years I had so many projects and so much stuff going that they built a whole team around what I was doing to spread the load out a little. That actually helped me get bored again, so the ideas started flowing again. Those were the good ole days.
A lot of what I did started as research, then I applied what I learned. It was a nice balance to keep things interesting, rather than being in research mode or build mode all the time.
Most people want to work. They think "hey, I'm here, might as well do something." When we're talking about experts in a field (academic or work), usually what interests them the most is the things that matter the most. Giving free time to "play" allows for these larger challenges to be solved. Things that you could never pitch to a manager because it's highly technical, hard to measure, and difficult to prove. But expertise tends to fill in those gaps.
Obviously you can't and shouldn't do this with everyone. Juniors shouldn't have completely free rein. They need some freedom to be able to learn this process, but they need much more hand-holding. But a senior? That's a position with high levels of trust. They should know the core business, and you're literally hiring them to be an expert, right? And of course there are people who just want a paycheck. I think a surprising number of them will still work regardless, but maybe not as much or as effectively. Certainly, micromanaging won't get these people to do more work, and you risk becoming overburdened with people in administrative positions.
Usually, you can sniff out the people who should be given more free rein. You don't have to understand the technical details; you only have to detect passion. Some people will fool you, but passion is a pretty good litmus test. There's no globally optimal solution here, so we have to accept some losses. That doesn't prevent us from trying to minimize them, but I think we get overly concerned with the losses that are easy to detect. Removing those often just makes your losses harder to detect, not non-existent. It's like the survivorship-bias problem: you can't measure the hits on the planes that don't make it back. In our case, the losses come through employees (including managers) metric-hacking. Frankly, we want our losses to be visible, because that makes them treatable.
I can do this since I love reading about all sorts of random topics, a lot of which pop up here, and while I seldom recall the details, I can recall enough to know when something might be relevant and how to find it again.
Sooo many diverse topics have suddenly cropped up at work where everyone else is fairly stumped, but I can say "I'm sure I've heard of this before" and within a few minutes find the resource that details the solution, or something to that effect.
Thus I too prefer getting blasted with info when starting a new job or new project, so I can recall the relevant key words when they pop up.
I noticed colleagues calling me a lot less with questions that only I can answer. Several admitted to me that they now use AI for the same kind of “find me some obscure vaguely specified thing”. It is one of the few things the AIs do really well.
My other "superpower" is digging into documentation and figuring out how to actually use stuff I've never seen before. Another thing that might be under threat from AIs soon. I've certainly used AIs for my own hobby projects in this regard, sometimes with good results, so it's surely only a matter of time.
Though at least at my current job, my most valuable skill is being able to understand the customer's needs, and being able to come up with solutions that solve their problems well without breaking the bank. Part of that is finding out how to best utilize existing code, which means I like to work on varied parts of the code base. Part of it is probing the customers to understand how they operate, which limitations they have and so on.
I think part of that is thanks to the same drive that led me to all these obscure topics, which drives me to want to understand the existing code and the customer's domain, which in turn puts me in a much better position to help and guide our customers to a good solution.
Not sure if AI's will do that too soon, time will tell.
I'm not sure why we aren't trying to make this more complementary. I really don't want my LLM to be direct search just as I don't want my direct search to be an LLM. Frankly, the context of what I'm looking for matters. Don't make it an end-to-end thing. Just give me a fucking toggle switch for direct search or fuzzy search (hell, throw in an "I don't know" and make it the default)
I'm not worried about the AI replacing me here because the "superpower" (and I assume the gp's) here isn't having broad awareness. It is the ability to abstract and connect seemingly unconnected things. Once the AI is there, it's going to have already been able to replace a lot more stuff. The "superpower" is creativity combined with broad knowledge-base. LLMs seem more complementary here than replacing.
I have run into this countless times when using AI. I ask for ideas on how to solve a problem, and it misses a really good solution. I bring it up, and then it says something like, “oh yeah, that is much better.” On the flip side, if I lead it with some ideas, it has trouble breaking free of them and tells me I already have the best idea.
If the topics coming together are seemingly unrelated, it takes a good prompt to get the AI to link those ideas on the path toward a solution.
Just today I was asking Copilot about different ways to structure a new project. I laid out some pseudocode with my initial idea, and it gave it back to me with a more complex syntax. I asked why, and whether there were any advantages to the way it did it, and it told me no, my way was better, cleaner, and the preferred way for the language. After pushing it some more it did offer another alternative, which it tried to dismiss as worse, until I explained why it would actually be better. As far as I’ve seen, at least with Copilot (which is all I’m currently allowed to use at work), it’s no match for a person with some experience and knowledge when it comes to more abstract thinking.
I mean it's on full display with social media, people these days are willing to chime in on things they have no understanding of and come to the wrong conclusions.
The problem is we make these low order approximations, recognize that they (ideally) help and congratulate ourselves. It's just a matter of stopping too early. You see people say "don't let perfection get in the way of good enough." I don't think perfection is usually the issue, rather a disagreement about what's good enough. So sayings like that just become thought terminating cliches[0].
[0] https://en.wikipedia.org/wiki/Thought-terminating_clich%C3%A...
Example: Someone just handed me a USB key with some old DOS software on it called SAGE made by a company which disappeared in 2011 (which apparently still powers a number of law offices... shocker), and they encountered a date handling issue (any case entered for 2026 borks some views and reports) and I'm facing either a decompilation or some other heavy analysis (none of which I'm familiar with) since the source code is obviously unobtainable (score another for open source? Seriously, when companies sink, they should be required to release their source code...). I'm not doing this for free, of course (I did tell them that failure was a distinct possibility), but I'm going to attack it with AI and rely on dev intuition for the rest and see what happens... (and use some kind of test harness, of course... to prevent regressions and prove the fixes)
https://news.ycombinator.com/item?id=26766722 (307 pts 60 comments)
https://news.ycombinator.com/item?id=32612931 (341 pts 39 comments)
To pick up on one of his examples, a few people at work understand Postgres very, very well. But some of them have trouble discussing topics with developers, because they have no knowledge of how application servers interact with Postgres (usually via some pooling, sometimes not), how different kinds of applications have different query patterns (think REST-based applications that are heavily indexed for low-quantity retrievals vs. ETL-based applications), and so on. I can't write a production-ready app in Rails, Spring, or Django right now, or a data analysis system in Flink, Spark, or whatever, but I tend to have an idea of what the dev needs to do.
On the flip side, if you have a motivated customer or provider, I find it very valuable to spend some time showing and teaching them how our systems want to be touched, one way or another. Some "idle" or "non-productive" time with some senior-ish devs, just sharing ideas and knowledge about how our Postgres works and wants to be worked with at a somewhat shallow level, including somewhat unintuitive concepts like index selectivity, has paid off a lot at work.
Suddenly people ask good questions about the Postgres ecosystem before starting their project, so they don't have to spend time rebuilding some Postgres functionality, in a worse way, inside their application. How silly is that.
I also think it should be fairly obvious that some jobs can be accomplished with only shallow or ad-hoc knowledge because the work is low-performance/low-impact, but it's still needed by someone and thus requires an employee.
What has not been obvious is why we can't differentiate roles that require deep vs. shallow knowledge officially, because there is still quite a lot of ambiguity in the actual work demands of "Software Engineer" (or "Software Developer") which makes this kind of defense in the OP necessary.
I find it hard to believe anyone does serious development work without some deep understanding of a piece of what they work on, the folks I've met that did that didn't last multiple review cycles.
What will remain important or grow in importance is general curiosity. Connecting completely disparate ideas or ways of thinking will lead you to new creative thoughts or solutions that AI would never produce because everyone else is working from the same standard ideas.
I was an English major in college and took classes in politics, philosophy, math, language, etc. based on personal interest. And I ended up as an engineer (with my trusty CS minor). I've met several developers who have had a similar background, and they tend to become the most well-rounded and business-aware ones on the team. I worry that this shift to higher-cost/higher-stakes/higher-competition education is making this approach to learning feel untenable, and my approach of 20 years ago comes across as totally irresponsible. But I would argue American education is leading to a factory approach at exactly the time when "structured thinking" is being fully replaced by AI. What is the value of crushing leetcode nowadays? Better to have a dev who has some intuition as to why people aren't clicking that new button.
However, I think those unique engineers with vertical, deep knowledge of a tech stack (e.g. C, Java, the maths under NNs) are still very much needed in the world, because they are capable of building and repairing the fundamentals that everything else gets built upon. So, if you are interested in such a fundamental stack, hack it, crash it; it won't be wasted time, and the world will need you :)
In addition, there will be times when X cannot directly solve the version of Y you have, but there are simple ways to tweak A or B such that now you do have a solution to the problem. So you can become much, much more effective at solving Y-like problems by understanding the building blocks behind standard solutions.
For example, the author talks very confidently about indexes, and makes a few conclusions, but they aren't as correct as his confidence suggests.
> That an index is only useful if it matches the actual query your application is making. If you index on “name plus email” and then start querying on “name plus address”, your index won’t be used and you’re back to the full table scan described in (1)
Not true. If you have single-column indexes on both name and email, it could use the two indexes, though not as efficiently as a single two-column index. And if you query "name plus email" and the index is "name, email, age", it can still use that index.
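Rather than guessing, you can ask the engine what it plans to do. A minimal sketch (using Python's built-in sqlite3 purely for illustration; the table, columns, and index name are made up, and planner details differ across SQLite, Postgres, and MySQL):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (name TEXT, email TEXT, age INT, address TEXT)")
con.execute("CREATE INDEX idx_name_email_age ON users (name, email, age)")

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail); keep the detail.
    return [row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql)]

# Leftmost-prefix match: the (name, email, age) index serves a name+email query.
print(plan("SELECT * FROM users WHERE name = 'a' AND email = 'b'"))

# name+address only matches the leading 'name' column; the planner can still
# use the index for 'name' and filter on 'address', rather than scanning everything.
print(plan("SELECT * FROM users WHERE name = 'a' AND address = 'c'"))
```

The habit that settles arguments like this one is checking the plan output on the database you actually run.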
> That indexes massively speed up reads, but must slow down database writes and updates, because each change must also be made again in the index “dictionary”
Must? No. The overhead might be imperceptible and not material at all. If you have a ton of indexes, sure, but not if you have a reasonable number.
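And whether the write penalty matters for your workload is easy to measure rather than assume. A rough sketch (again sqlite3 in memory; row counts, column names, and the absolute numbers are illustrative only and will vary wildly by engine and workload):

```python
import sqlite3, time, random, string

def bench_inserts(index_count, rows=50_000):
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE t (a TEXT, b TEXT, c TEXT, d TEXT)")
    # Create 0..4 single-column indexes, one per column.
    for col in "abcd"[:index_count]:
        con.execute(f"CREATE INDEX idx_{col} ON t ({col})")
    data = [tuple("".join(random.choices(string.ascii_lowercase, k=8))
                  for _ in range(4)) for _ in range(rows)]
    start = time.perf_counter()
    with con:  # single transaction keeps per-statement overhead out of the timing
        con.executemany("INSERT INTO t VALUES (?, ?, ?, ?)", data)
    return time.perf_counter() - start

for n in (0, 1, 4):
    print(f"{n} index(es): {bench_inserts(n):.3f}s")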
Shallow technical knowledge is fine but you should also have the humility to acknowledge that when you're dispensing said shallow knowledge. It can also lead to pretty bad engineering decisions if you think your shallow knowledge is enough.
See, there are databases that implement clever optimizations like this, but those are going to vary widely by database and you would need some domain expertise with that system to know if such optimizations are working. By contrast, this mental model does help you ensure that you can create indexes that are actually helpful in the vast majority of databases.
So I think the author's mental model is working out pretty well for him here, honestly.
I think the interesting question is like, if I have X amount of time and mental bandwidth to learn about a technology, what's the most helpful lossy compression of concepts that fits?
You cannot possibly know if your condensed version is accurate or sufficient if you can't point to the author of it and definitively state that they knew the original material well enough to summarize it.
I also continue to push back on the idea that [backend] devs shouldn't need to know SQL extremely well, and as an extension, their RDBMS vendor's implementation of the spec. You have to know your main programming language to get the job; why should the part of it that stores and retrieves all information for your application somehow be lesser? If you don't want to, then you don't get to write queries and design schemas, period. Access them via an API that's been designed by domain experts, otherwise you're putting everyone at risk.
I don't understand how this relates to the article. The author is not presenting himself as an expert on database indices, nor is the purpose of the article to educate people on database indices. If anything, he's illustrating techniques for dealing with technology when you're not an expert.
> I also continue to push back on the idea that [backend] devs shouldn't need to know SQL extremely well
The article isn't about any particular technology or type of software engineer, either -- this is just an example. We all have to use technologies that we're not world-class experts in, and part of being a professional engineer is learning ways to deal with that sad reality.
--
Edit to add: I do feel a certain sympathy/resonance for your claim that people should be really competent at the tools/tech they're using, and it's strictly "better" for everyone. But we also live in a world where the complexity and depth of software stacks is increasing rapidly, and developers often have to prioritize breadth over depth. (And yes I have seen a lot of people shoot their foot off with poorly-informed use of databases :/)
> We all have to use technologies that we're not world-class experts in
I’m not asking people to be experts, just to know how it works. If you write software that communicates over TCP, you should know how TCP works. If you write software that uses a deque, you should know how a deque works. Etc.
Re: the real world, perhaps that’s a good indication that we shouldn’t be unnecessarily increasing complexity, and use boring technology.
But if you "know how something works" in detail, such that you fully understand its workings and behavior, you're pretty much an expert. To really know how a database works is a project that takes hundreds of hours of dedicated study, and the deeper you look, the more nuance you find. Otherwise, you'll inevitably make the kinds of flawed generalizations that you dislike about the OP's mental model.
As I say, I have sympathy for your argument. I have spent a lot of time studying databases, I've contributed some patches to Postgres, I like understanding how things really work. But the reality is: full-stack development today is fractally complex. There are MANY components that each might require hundreds of hours to understand, and it's actually not economically valuable for you or your employer to rabbit-hole down each one before you start using it. You need to be able to pick up the key idea of a technology, using the appropriate resources, without fully studying it out.
--
I think that perhaps we understand the article differently. I think you understand it as a tradeoff between "understanding a system a little bit" vs. "understanding deeply," in which case, sure, it's easy to argue that we should all understand technologies deeply. But I think the real tradeoff is for beginners -- "understanding only the apparent outer workings of a system" vs. "having a first-order model of the components that lead to that behavior." Going one level down is the first step to going all the way, and there is a big difference in even going one level down.
This is precisely why I maintain that the entire notion of full stack engineering is flawed. It’s absurd to think that one person should be able to meaningfully understand front end, backend, networking, and infra. Even if you abstract away networking and infra (spoiler, you’ve just kicked the can down the road), I’d argue that expecting someone to be good at frontend and backend is ridiculous. Maybe if the industry didn’t have such insane abstractions and frameworks, it would be doable, but that’s not how it is.
I think he's wrong, but more importantly I think he needs to be more humble that it's not "good" engineering, it's probably bad. You can get by if you're a startup and just need to get stuff done, but don't start teaching people and writing blog posts on the topic when you have shallow knowledge.
More to the point, this lack of knowledge will almost certainly drive people to over-index, which harms performance.
| If you find yourself doing only theory, do some practical work. It'll help with your theory.
| If you find yourself doing only practical, do some theory. It'll help with your practical work.
The two complement one another. Theory informs you of what is possible, while hands-on work informs you of what is feasible. They are different kinds of intuitions. Theory gets you thinking in high abstractions, understanding the building blocks and how things are put together. The practice forces you to deal with everything that slips through the cracks, teaching you how to do things efficiently and effectively. It's like making a building. You technically only need the construction skills to make a building. But the engineering designs make you able to build bigger and safer. The theory lets you find new novel designs and materials. You need all of this working together to build something great.
I've always found theory and practice to go hand in hand. Maybe that's because I came to CS from physics, where there are hard divisions between the two. I focused on experimental physics, but I was known for strong math skills, which allowed me to bridge those two sides. The major reason I did it is that I couldn't see how they were different. So when I later heard Knuth, it really resonated with me.
But truthfully, I've struggled sometimes working with others, getting pulled in different directions. I left physics to become an engineer and all they saw me as was a theorist despite most of my work for them being practical (running simulations and then building physical devices and the necessary test infrastructure).
Now I'm about to defend my PhD in CS (ML) and the biggest struggle I've had is that it feels like any time I spend on theory is seen as wasted. I do so much less than before, but it's been the most helpful tool to my work. So I'm hoping others can help me understand this culture. We definitely see theory differently but I'm not sure why or how. Everything seems hyper focused on making products. To ship as fast as possible. No time to think about what we're making. For me, that slows me down! Just because lines of code aren't appearing on screen doesn't mean I'm not hard at work. A few hours or even a day of thought has saved me weeks worth of work. I'm not the quickest to first result[1] but the result is I can do more with less and quickly adapt to all the changes that happen as the project evolves. New experiments, measurements, features and such get integrated quickly because I expect this to happen. But still, considered slow, and I can't find out why.
So I'm wondering how others have dealt with this. I know I'm not alone here. I'm probably not going to change much because the end result works. But why is CS so focused on short term and how do I highlight longer term work when we're just measuring weekly updates. The strategy means more weeks with "not much to show" but other weeks that look like extreme productivity. But it's all the same. Because the truth is, the earlier work is just difficult or impossible to measure. I'm pretty sure this is why CS is the way it is, but I've got to say, I've got a lot of experience with measures and analysis and the fact is not everything is measurable. We only can use proxies and those easily become unaligned.
[0] I didn't copy paste it so it might be a little off but the message is right
[1] after proof of concept. Early on I try to be quick to fail. I'm talking about once that's going and we're doing more serious building.
I know that some birds migrate depending on the season and they fly in certain formations for efficiency. I'd never, ever think I could have any serious conversation with a biologist or ornithologist.
There have been several times that my exposure has been wrapped into production applications.
I know what I need to know on that platform, and try to maintain little more.
Is this what everybody does?
Honestly, the author couldn't have chosen a worse example if they tried. RDBMS are so absurdly complex and have so many edge cases for everything, that they are my canonical answer to why it's a bad idea to have dev teams managing them, or designing schema. Disclaimer: I'm a DBRE.
Look at MySQL's ORDER BY [0], GROUP BY [1], and index merge [2] optimizations. Postgres' ORDER BY optimization [3] doesn't have nearly as many gotchas, but it more than makes up for it by having many more types of indices [4], as well as many types of operator classes [5] within each of those. While you're at it, you should probably have a handle on collations, AKA "why did this ORDER BY do something odd?"
> You could learn about the data structures that represent the index
You mean a B+ tree (for standard types)? I would hope you covered that in DS&A already.
> about all the tricks that go into updating them efficiently
> But it will very rarely inform the database migration or query you’re writing.
I disagree. For example, if you didn't know that updating any indexed column in a Postgres table means the entire row is re-written (hope you haven't done anything silly, like storing medium-sized JSON blobs that aren't quite big enough to trigger TOAST), you would likely be much more cavalier with your indexing decisions.
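If you want to see how much this actually bites in your own system, Postgres tracks it. A minimal sketch (assumes psycopg2 and a table of your own; the connection string and the "events" table name are placeholders): HOT ("heap-only tuple") updates are the ones that skipped index maintenance because no indexed column changed.

```python
import psycopg2

# Placeholder connection string; point it at your own database.
conn = psycopg2.connect("dbname=mydb")

with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT relname, n_tup_upd, n_tup_hot_upd
        FROM pg_stat_user_tables
        WHERE relname = %s
        """,
        ("events",),  # hypothetical table name
    )
    relname, total_updates, hot_updates = cur.fetchone()
    # A low HOT ratio means most updates touched an indexed column (or the page
    # was full) and paid the new-row-version plus index-maintenance cost.
    print(f"{relname}: {hot_updates}/{total_updates} updates were HOT")
```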
I'm so very tired of the "I don't need to know how the things I use work" attitude. Do you need to be able to write your own database to effectively use one? No, of course not. Should you have read the official docs? Yes. You'll learn far more than from skimming random blogs and YouTube videos.
[0]: https://dev.mysql.com/doc/refman/8.0/en/order-by-optimizatio...
[1]: https://dev.mysql.com/doc/refman/8.0/en/group-by-optimizatio...
[2]: https://dev.mysql.com/doc/refman/8.0/en/index-merge-optimiza...
[3]: https://www.postgresql.org/docs/current/indexes-ordering.htm...
[4]: https://www.postgresql.org/docs/current/indexes-types.html
[5]: https://www.postgresql.org/docs/current/indexes-opclass.html
When I'm listening to someone suggest an idea, I can tell
- if they're just working off something that they heard about (i.e. let's implement Redis, I heard it was fast),
- if they've done it before but don't really know how it works (i.e. MongoDB worked for me once, don't need to consider Postgres),
- or if they did both