In computer science courses, that's as simple as a println().
In machine learning courses, that's training on the MNIST dataset to do character recognition.
In electrical engineering, that's buying a Raspberry Pi to blink an LED.
In chip design ... ChatGPT says to design a 1-bit full adder in Verilog?
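That "hello world" really is only a handful of lines. A minimal sketch in Verilog (module and port names here are my own choice, not from the article):

    // 1-bit full adder: the "hello world" of chip design
    module full_adder (
        input  a,
        input  b,
        input  cin,   // carry in
        output sum,
        output cout   // carry out
    );
        assign sum  = a ^ b ^ cin;
        assign cout = (a & b) | (cin & (a ^ b));
    endmodule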
...
I understand why the article thinks the market is looking for graduate education. Designing even a simple chip requires an *initial investment* (as with all hardware startups, really). This is different from software, where one can simply launch a web app in a container hosted on one's preferred cloud provider...
... That said, with LLMs lowering the barrier to entry for software even further (e.g. vibe coding), might we see more hardware startups and innovation?
Getting to your first wafer costs something like $250k and upwards in fab costs, depending on what process you're using. Hence much of chip design effort is already spent on verification; it's probably over 50% by now. This is the exact opposite of vibes, because mistakes are expensive.
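To give a sense of what "verification" means even at toy scale, here's a minimal self-checking testbench (a sketch of my own, assuming the full_adder module sketched earlier in the thread; real verification adds constrained-random stimulus, coverage collection, and formal proofs):

    // Exhaustively checks all 8 input combinations of the adder.
    module full_adder_tb;
        reg  a, b, cin;
        wire sum, cout;
        integer i;

        full_adder dut (.a(a), .b(b), .cin(cin), .sum(sum), .cout(cout));

        initial begin
            for (i = 0; i < 8; i = i + 1) begin
                {a, b, cin} = i[2:0];
                #1;  // let combinational logic settle
                if ({cout, sum} !== a + b + cin)
                    $display("FAIL: a=%b b=%b cin=%b -> cout=%b sum=%b",
                             a, b, cin, cout, sum);
            end
            $display("Exhaustive check done.");
            $finish;
        end
    endmodule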
Business-wise it's quite a tough B2B sale because you're selling into other people's product development pipelines. They need to trust you, because you can sink their project, way over and above the cost of the actual parts.
Edit: I cannot emphasise enough how much more conservative the culture is in chip design and EE more broadly. It belongs to a world not just before "vibe coding" but before "web 2.0". It's full of weird, closed-source, very expensive tooling, and it's built on a graveyard of expensive mistakes. You've got to get the product 100% right on the first go.
Well, maybe the second go; production silicon is usually a "B" rev. But that's it. Economics dictate that you then need to be able to sell that run for a few years before replacing it with an upgraded product line.
And so my point: the place where people best know how to make chips competitively in a cutthroat industry is NOT in schools, but in private companies that have signed all the NDAs. The information is literally locked away, unable to diffuse into the open where universities efficiently operate. Professors cannot teach what they don’t know or cannot legally share.
Chip design is a journeyman industry. Building fault-tolerant, fast, power-efficient, correct, debuggable, and manufacturable designs is table stakes, because if you can't, there are already a ton of chip varieties available. Don't reinvent the wheel: the intersection of logic, supply chain logistics, circuit design, large-scale multi-objective optimization, chemistry, physics, materials science, and mathematical verification is unforgiving.
You are obviously not going to etch silicon at home, but the design part is rather accessible as far as hardware goes.
They're not used in high-volume manufacturing (you’re not replacing ASML), but they’re solid for prototyping, research, and niche builds.
Just don’t underestimate the safety aspect—some of these chemicals (like HF) are genuinely nasty, and DIY high voltage setups can bite hard.
You're not hitting nanometer nodes, but for MEMS, sensors, and basic ICs, it’s totally within reach if you know what you’re doing.
The article shows some pay figures, but those are American, where everything pays insanely well compared to here, so I'm not sure how relevant they are.
It's odd that it doesn't discuss the size of the industry, though. I always thought of it as a small, relatively niche industry compared to software dev, and while there is probably less competition for that smaller number of jobs, there's still a smaller number of jobs.
And as a Brit who got a green card in the US a couple of years ago, I'm not sure the UK is a great comparison point. Brexit decimated that level of engineering in the UK: the sort of companies that need that expertise operate at the multinational level or have fabs, neither of which is really true for a UK isolated from Europe. Every company I worked for or with in the UK has either moved offshore or completely closed down since. And low demand causes low wages.
If it's economically vital, why doesn't it pay as well as industries competing for similar graduates? (Not just programming; finance also sucks up a lot of mathematically inclined people.)
I'm reminded of COVID, where the most "essential" workers were inevitably the most expendable.
People make value arguments, but these are incredibly naive. Look at the companies with the top market caps: they'd tumble if TSMC or ASML went under.
So there's something else to how price gets defined. Be careful not to assume the status quo is "rational". Sure, there's an explanation, but an explanation is not necessarily a good explanation. Usually it's a simple explanation for an incredibly complex topic which no one can really explain.
No they wouldn't because their competitors would be in the same situation as them. If nobody has advanced chips, nobody has a competitive advantage.
If I had to guess, it would be that every business needs (or thinks it needs) a custom website, often some custom software processes, definitely its own custom network and IT, and likely a slightly customized ERP. But rarely, very very rarely, does any business want custom chips or computers such that a CE is required. The demand for software is just that high.
What is interesting in these kinds of discussions is that, intuitively, nobody really believes in the law of supply and demand. On an emotional level, people always think that effort and difficulty should be rewarded on their own, which is actually correct from a social and ethical POV, and yet:
markets are markets operating under the iron law of supply and demand.
These articles crop up from time to time, and I really don't think the framing meets a high standard of thoughtfulness. Or it is a surreptitious attempt to influence government policy, in which case fair enough. The problem with the chip industry is that Asia has a comparative advantage at it, probably because that is where all the advanced industrial capital investment is happening. It has nothing to do with people; it is usual for these matters to turn out to be a regulatory problem once the superficial issues are peeled back.
Asia has a comparative advantage because labor is much cheaper there. PhDs working at TSMC in Taiwan make less than fresh grads working in software in the US.
You started the sentence with comparative advantage, then transitioned smoothly to an absolute advantage. If you want to see low wages, look to Africa; relative wage levels don't directly cause comparative advantages. I also note you're implicitly disagreeing with the article and suggesting that the compensation gap is not a myth.
This is THE limiting factor in many fields: bioscience, etc., even car mechanics for that matter.
It all starts to click when you have a practical and physical application. A garage is readily available and not expensive.
A lab, on the other hand… IMO there should be more low-cost options for producing chips. There are a few projects trying to do that, and some universities have cooperation agreements with fabs for dead/spare space.
Especially 1 and 3
EE education has absolute dog-crap didacticism.
EEs also have an awful "holier-than-thou" attitude in Engineering.
And then you make it only worse by requiring "Masters or above". Well, guess why: because your undergraduate years were spent going through stuff that goes from nowhere to nowhere else.
And then the article goes and shows how a SW engineer makes 30% more. Where's the myth? Especially given that EE requires a lot more work, so that 30% gap is actually worse.
"It is hard to make any sweeping conclusions that software pays more than hardware, or vice versa. The myth of software always being more lucrative may be unfounded, but the sentiment does exist among young professionals looking to choose career paths."
I.e. there surely exist hardware positions that pay better than some software positions.
One has to be careful with statistics; you can't start from the aggregate and predict the individual case.
Yes, designing chips is hard; it takes a lot of knowledge. This is why medical doctors need to go through all that schooling... designing a tiny chip with more transistors, running software that does amazing things, is very difficult.
My Ph.D. is in computer engineering, specifically VLSI and chip design. This was from a few years ago. I _probably_ should have gone into industry; I mean, after all, it is what I went to school for and wanted to do. However, the starting salary for a chip designer (Intel / AMD / HP / IBM) was literally less than I was making at a side job (I worked my way through my Ph.D.) as an IT sysadmin. Not only that, people I knew well who graduated before me would call me up and tell me it was worse than hell itself. 80-hour weeks? Completely normal, outside of the 2 hours of commute time. Barely make rent because you live in California? Check. Pages / calls at all hours of the day outside of work? Check. 80 hours? You mean 100 hours a week leading up to a release, right? Check.
Looking back on it, it seems this was "the challenge", and if you made it past it (something like 5 years on) things calmed down for a chip designer and you moved into a more "modest" 60-80 hour a week role with less pressure and somewhat of a pay increase.
Yes, how do you attract talent under those conditions? It is not flashy work, it takes a lot of schooling, and the rewards are low. At least medical doctors can kind of look forward to "well, I can make _real_ money doing this", and have the satisfaction of "I helped a lot of people".
Why is everybody outside music, movies, crypto & pizza struggling to attract talent?
Snow Crash (1992) might turn out not to be so precisely prescient, due to upcoming dedollarization, AI democratization/bubble burst (the exact option depends on your personality type), & the solid-state battery boom:

> When it gets down to it — talking trade balances here — once we've brain-drained all our technology into other countries, once things have evened out, they're making cars in Bolivia and microwave ovens in Tadzhikistan and selling them here — once our edge in natural resources has been made irrelevant by giant Hong Kong ships and dirigibles that can ship North Dakota all the way to New Zealand for a nickel — once the Invisible Hand has taken away all those historical inequities and smeared them out into a broad global layer of what a Pakistani brickmaker would consider to be prosperity — y'know what? There's only four things we do better than anyone else:
music
movies
microcode (software)
high-speed pizza delivery
You hire hundreds of interns and entry-level workers and let them fight in the bloodbath for 100h a week. Pay peanuts. Let them do all the work.
The ones who survive get slightly bigger salaries. Those who persist through the upper-level bloodbaths are upgraded into millionaires. And paying them millions looks acceptable because it is so hard to reach the top.
Whereas you could clearly share all those millions between entry-level staff and paid internships, avoid the 100h weeks, and have a healthy industry.
Your concerns about horribly long hours and lower than IT/software pay are the most concerning part to me. But, if there's really a shortage of engineers who know how to do chip design, hopefully the market will take care of that via supply/demand at least once things get really out of whack.
1) If chip design (or X, anything) is so vital, so important to national security, why do universities insist that a degree in X include a lot of unrelated courses? You can argue that universities are not just for employment (yeah, as if most people go to university just for fun), but in the name of God, I really hate that my university forced me to go through all those BS elective courses to reach 120 credits. If you ask me, it's just money grabbing.
2) Why can't students go straight to a fab or wherever after their bachelor's and do their master's THERE? Isn't industry a much better place for that? Actually, why doesn't the industry simply hire high school students and go from there? Companies used to do that in the 50s/60s. I don't know if they still do, but I think it's rare.
Intel has been the semiconductor industry's standard for comp for years, and they've lagged in this department for a decade. This has depressed the whole industry's pay, as so many chip companies view Intel as the market linchpin to set comp off of.
The exceptions here are Apple and Nvidia, where the work is still pretty grueling but the pay is excellent.
You gotta be the cream of the crop to get in either of those places. The hiring bar for both is high, and getting higher.
Source: me. I run a job board for FPGA and RTL engineers.
Theory-first education: In an effort to build from fundamentals, there is too much emphasis on theory rather than a focus on applications.
Compensation myth: There is a feeling that software pays more than hardware. Reality is not so cut and dried.
Graduate degrees: A lot more employers ask for graduate-level degrees to enter chip design, creating bottlenecks in talent supply.
Early specialization: Highly niche skillsets are less marketable and career-limiting.
Documentation shortages: Hardware design is entirely tribal knowledge and hard to self-learn.
Chip design culture: Hardware companies have a retro feel to them, deadlines are tight, and mistakes are deadly.
And in hardware mistakes are more costly, while in software most developers work on completely useless projects that are doomed to disappear soon.