And CS folks should design hardware because they understand concurrency better?!
A whole lot of my coursework could be described as UML diagramming but using glyphs for resistors and ground.
Robots handle much of the assembly work these days. Most of the human work is jotting down arbitrary notation to represent a loop or when to cache state (use a capacitor).
Software engineers have come up with a whole lot of euphemistic notations for "store this value and transform it when these signals/events occur". It's more of a psychosis that long ago quit serving humanity and became a fetish for screen addicts.
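For what it's worth, the parallel runs both ways: that sentence is literally a clocked register. A minimal Verilog sketch (module and signal names are mine, purely illustrative):

    // "Store this value and transform it when these signals/events occur."
    module accumulate (
        input  wire       clk,  // the event
        input  wire       en,   // the qualifying signal
        input  wire [7:0] din,
        output reg  [7:0] acc   // the cached state
    );
        always @(posedge clk)
            if (en)
                acc <= acc + din;  // store and transform
    endmodule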
Learning KiCad took me a few evenings with YT videos (greetings to Phil!).
Soldering takes much more practice. Soldering QFN with a stencil, paste, and an oven (or just a pre-heater) can only be learned by failing many times.
Having a huge stock of good components (sorted nicely with PartsDB!) lowers the barrier for starting projects dramatically.
But as always: the better your gear gets, the more fun it becomes.
The exception was cutting-edge motherboards that had to be released alongside a new Intel chipset, but that project had at least a dozen engineers working in shifts.
The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just because hardware design is a niche field that most people a) don't need to do, b) can't access because almost all the tools are proprietary and c) can't afford, outside of tiny FPGAs.
If Intel or AMD ever release a CPU range that comes with an eFPGA as standard that's fully documented with free tooling then you'll suddenly see a lot more talent appear as if by magic.
Which doesn't pay as well as jobs in software do, unfortunately.
The problem, I think, is that there are many competent hardware engineers available abroad, and since hardware is usually designed with very rigorous specs, tests, etc., it's easy to outsource.
The notable exceptions are:
* Formal verification, which is very widely used in hardware and barely used in software (not really software's fault; there are good reasons for it). See the sketch after this list.
* What the software guys now call "deterministic system testing", which is just called "testing" in the hardware world because that's how it has always been done.
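For anyone who hasn't seen it, day-to-day hardware formal verification mostly looks like writing properties such as this minimal SystemVerilog sketch (the FIFO signal names are invented), which a formal tool then tries to prove for every reachable state, not just the stimuli you happened to write:

    module fifo_props (
        input logic clk, rst_n,
        input logic fifo_push, fifo_full, fifo_error
    );
        // Pushing while full must raise the error flag.
        property no_silent_overflow;
            @(posedge clk) disable iff (!rst_n)
                (fifo_push && fifo_full) |-> fifo_error;
        endproperty
        assert property (no_silent_overflow);
    endmodule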
I know them. Especially the older folks. Cramming all parts onto one huge sheet instead of separating them by function. Refusing to use buses. Refusing to put part numbers into schematics so the BoM could just be exported directly, writing the BoM by hand instead.
Watching these guys is like watching the lowest-rung office worker typing values from Excel into a calculator so he can write the result back into the same Excel table.
Mostly B. Even if you work at a company that does both, you'll rarely get a chance to touch the hardware as a software developer, because all the EDA tools are seat-licensed, making it an expensive gamble to let someone without domain experience take a crack at it. If you work at a Verilog shop you can sneak in Verilator (sketch below), but the digital designers tend to push back in favor of vendor tools.
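For the curious, "sneaking in Verilator" can be as small as this: a self-checking testbench compiled straight to a native binary, no seat license involved. A rough sketch assuming a recent Verilator (5.x, with --timing support); the counter design is a made-up stand-in:

    // Build and run:
    //   verilator --binary --timing tb_counter.sv && ./obj_dir/Vtb_counter
    module counter (input logic clk, rst, output logic [7:0] count);
        always_ff @(posedge clk)
            if (rst) count <= '0;
            else     count <= count + 8'd1;
    endmodule

    module tb_counter;
        logic clk = 0, rst = 1;
        logic [7:0] count;
        counter dut (.clk, .rst, .count);

        always #5 clk = ~clk;          // free-running clock
        initial begin
            #12 rst = 0;               // release reset between edges
            #100;                      // exactly 10 rising edges pass
            if (count == 8'd10) $display("PASS");
            else                $fatal(1, "FAIL: count=%0d", count);
            $finish;
        end
    endmodule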
Hardware people go to software because it is lower-stress and can pay better (well, at least you have a higher chance of getting rich, start-ups and all that).
I'm guessing this isn't part of most curricula anymore?
The supercomputer thing... never happened. And I turned out to have a CE career anyway.
RF design, radars, etc. are more an art than a science in many aspects.
I would expect a Physics-trained student to be more adaptable to that type of EE work than a CS student...
I used Vivado (from Xilinx) a bit during my undergrad in computer engineering and was constantly surprised at how much of a complete disaster the toolchain was. Crashes that would erase all your work. Strange errors.
I briefly worked at a few hardware companies and was always taken aback by the poor state of the tooling, which was highly correlated with the license terms dictated by the EDA vendors. Software development seemed much more interesting and portable. Working in hardware meant you would almost always be choosing between Intel, Arm, AMD, and maybe Nvidia if you were a rockstar.
Software by comparison offered plentiful opportunities and a skill set that could be used at an insurance firm or any of the Fortune 100. I've always loved hardware, but the opaque datasheets and IP rules kill my interest every time.
Also, I would argue software devs make better hardware engineers. Look at Oxide Computer: they have fixed bugs in AMD's hardware datasheets because of their insane attention to detail. Software has eaten the world, and EEs should not be writing the software that brings up UEFI. We would have much more powerful hardware systems if we were able to shine a light on the inner workings of most hardware.
Here's an example of my implementation of the original Tamagotchi: https://news.ycombinator.com/item?id=45737872 (https://github.com/agg23/fpga-tamagotchi)
The vagaries of analog electronics, RF, noise, and the rest are another matter. While it's possible that a CS graduate might have a hint of how much they don't know, it's unreasonable to expect them to cover that territory as well.
Simple example: did you know that it's possible for two otherwise identical resistors to differ by more than 20 dB in their noise generation? [1] I've been messing with electronics and ham radio for 50+ years, and it was news to me. I'm not sure even an EE graduate would be aware of that.
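To put a rough number on why that's surprising: the thermal (Johnson-Nyquist) floor depends only on resistance, temperature, and bandwidth, so on paper two identical parts are identical. Back of the envelope, for a 10 kΩ resistor at 300 K over a 10 kHz bandwidth (my numbers, just for scale):

    \[
      v_n = \sqrt{4 k_B T R \,\Delta f}
          = \sqrt{4 \cdot (1.38\times10^{-23}) \cdot 300 \cdot 10^{4} \cdot 10^{4}}
          \approx 1.3\,\mu\mathrm{V}
    \]

The 20 dB spread (a 10x voltage ratio) is excess 1/f and current noise, which depends on construction (carbon composition vs. metal film) rather than the printed value, which is presumably why it surprises even people with EE training.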
* Chip design pays better than software in many cases and many places (US and UK included; but excluding comparisons to Finance/FinTech software, unless you happen to be in hardware for those two sectors)
* Software engineers make great digital logic verification engineers (see the sketch after this list). They can also gradually be trained to do design. There are significant and valuable skill and knowledge crossovers.
* Software engineers lack the knowledge to learn analogue design / verification, and there’s little to no knowledge-crossover.
* We have a shortage of engineers in the chip industry, particularly in chip design and verification, but also architecture, modelling/simulation, and low-level software. Unfortunately, the decline in hardware courses in academia is very long standing, and AI Software is just the latest fuel on the fire. AI Hardware has inspired some new people to join the industry but nothing like the tidal wave of new software engineers.
* The lack of open source hardware tools, workflows, high-quality examples, relative to the gross abundance of open source software, doesn’t help the situation, but I think it is more a symptom than it is a cause.
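To illustrate the verification crossover mentioned above: the code a digital verification engineer writes all day is ordinary object-oriented software that happens to be in SystemVerilog. A toy scoreboard (class and signal names invented):

    // Predict expected DUT outputs, then compare against what the
    // simulation actually produced.
    class scoreboard;
        int unsigned expected[$];   // queue of predictions

        function void predict(int unsigned v);
            expected.push_back(v);
        endfunction

        function void check(int unsigned actual);
            int unsigned exp_v;
            if (expected.size() == 0) begin
                $error("Unexpected output: %0d", actual);
                return;
            end
            exp_v = expected.pop_front();
            if (actual != exp_v)
                $error("Mismatch: got %0d, expected %0d", actual, exp_v);
        endfunction
    endclass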
The article is more in the area of chip design and verification than PCB hardware, so I kinda understand where it's coming from.
"Electrical and Computer Engineering" (ECE) departments already exist and already have such a major: "Computer Engineering".