This is why I did an EE degree, didn't get paid much, went into software and used that to pay for a mathematics degree.
If you want to get paid in software don't do something utterly commoditised and popular or you're just a fungible meat flavoured work unit. Get really damn good at something with some longevity in a stable niche.
Reminds me of Hank Hill "I sell hammers and hammer accessories."
In that analogy it also works that the level of cognitive difficulty is most challenging @ theoretical physics work --> engineering --> software, inversely proportional to paycheck size. Though a physicist can probably figure out software, whereas the other way is a tougher slog.
We're making electric super cars that still use a steering wheel and gas pedal. Just because it's old doesn't mean it's deficient. Humans haven't fundamentally changed in the past 100 years, so it is probably the most intuitive way to manually control a car.
The typewriter/keyboard is probably the most intuitive method to input alphabet characters. Of course that doesn't preclude entirely new ways to control our computers. But if you set out to simply replace the mouse and keyboard without fundamentally changing how we interact with a computer, then you're setting yourself up for failure.
And then circuit analysis is just a big exercise in building "castles in the sky" and worrying whether your upside-down staircase has railings
(Among other several pet peeves about EE that I could go on about)
I worked in EE for a while and building stuff was very boring.
It basically took me changing careers to SWE and working for a games company to finally use the math part of my EE degree.
I ended up building my guitar amp years later.
They're different jobs.
The Lego level is more like being a technician. You can slap a few ready-made building blocks together, maybe tweak them a little using basic algebra, and you've got your design.
That's fine for guitar amps and simple synth circuits and such.
But if you use that approach while designing the control circuitry for a power plant or a rocket motor, in the best case failure will be very expensive, worst case people will die.
That's where the real engineering happens. You're modelling systems from first principles and you know enough to be fairly confident that the equations you create to characterise a complex design with multiple inputs and outputs accurately predict its behaviour.
If you start with hobby electronics you have zero experience or insight into that level. So when you begin your course you're completely blindsided by how much math there is, and have no idea what it's for.
And some domains, like robotics, have even more math. You can use plain old EE control theory, but you can extend it into modelling systems using Lie groups and Lie algebras - which are more often used in quantum physics.
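To make that concrete, here's a minimal sketch in Python (numpy only) of the kind of object robotics adds: a 3D rotation built as an element of the Lie group SO(3) via the exponential map from its Lie algebra so(3), i.e. Rodrigues' formula. The function names are mine, purely illustrative.

    import numpy as np

    def hat(w):
        # Map a 3-vector in the Lie algebra so(3) to a skew-symmetric matrix.
        return np.array([[0, -w[2], w[1]],
                         [w[2], 0, -w[0]],
                         [-w[1], w[0], 0]])

    def exp_so3(w):
        # Exponential map so(3) -> SO(3), a.k.a. Rodrigues' formula.
        theta = np.linalg.norm(w)
        if theta < 1e-12:
            return np.eye(3)
        K = hat(w / theta)
        return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

    # Rotate a point 90 degrees about the z-axis.
    R = exp_so3(np.array([0.0, 0.0, np.pi / 2]))
    print(R @ np.array([1.0, 0.0, 0.0]))  # ~ [0, 1, 0]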
Sounds like if you start with the math before you're old enough to pick up a soldering iron it might be a little different.
Early classes on circuits in EE will usually take shortcuts using known circuit structures and simplified models. The abstraction underneath the field of analog circuits is extremely leaky, so you often learn to ignore it unless you absolutely need to pay attention.
Hobbyist and undergrad projects thus usually consist of cargo culting combinations of simple circuit building blocks connected to a microcontroller of some kind. A lot of research (not in EE) needs this kind of work, but it's not necessarily glamorous. This is the same as pulling software libraries off the shelf to do software work ("showing my advisor docker"), but the software work gets more credit in modern academia because the skills are rarer and the building blocks are newer.
Plenty of cutting-edge science needs hobbyist-level EE, it's just not work in EE. Actual CS research is largely the same as EE research: very, very heavy on math and very difficult to do without studying a lot. If you compare hard EE research to basic software engineering, it makes sense that you think there's a "wall," but you're ignoring the easy EE and the hard CS.
Fabs are specific to the manufacturing of integrated circuits.
EE encompasses more than just manufacturing of ICs, for example research and applications in radio propagation and EM/wireless, signal integrity, antenna design, coexistence/desense, advanced power electronics, control systems, simulation/solvers, etc.
You can do a lot of actual original work without a fab.
Assume beginner knowledge of relevant mathematics/electronics and good software skills.
Am interested in both the practical side (e.g. build an SDR from components) and the theoretical side, i.e. the physics/mathematics to explain it.
Agree that complex EE work can be expensive for individual and smaller companies, indeed :)
A comment on the application side:
> "[..] for wireless applications you can follow the recommendations of the IC vendor and the remainder of the work is RF-engineering"
Zoom out to the system level, and you cannot just rely on IC vendor recommendations, and this kind of engineering can still require access to $$ labs.
Similar to complex software systems: for example take a large scale distributed system made out of many individual frameworks and services. The system as a whole may now exhibit emergent behaviour, and have failure modes due to the complexity of the system.
Same happens in complex EE designs: your design might pack in multiple cutting-edge RF radios such as mmWave and UWB, with bespoke power amplifier, detection, and antenna designs. Add in EM from multiple clock domains, high-power distribution circuits, digital noise from FPGAs/CPUs, and EM from nearby sources, and you can easily have noise couple from one source and cause unintended issues in other subsystems. The vendor may say "keep away from sources of noise", but your job may still be to engineer a solution that fits in the design envelope of a modern smartphone. The system-level design needs to be engineered for EMC and coexistence/desense, and validated, which takes a ton of lab simulation and measurement/characterization work.
In EE the factories that produce PCBs are also called fabs.
I think the explosion in availability of inexpensive microcontrollers and FPGA dev boards has made it much easier for people to get into hardware design without spending a ton of money. Test equipment has gotten cheaper too: you don't need to buy a $3k Keysight oscope when a cheap Chinese USB oscope works just as well, with plenty of features built in for free. Obviously a proper academic or corporate research lab is going to be a lot different than a well-equipped hobbyist lab, but the difference is not as stark as you'd imagine.
I say all this as a recovering semiconductor engineer: EE is a huge field. I can’t think of a subdiscipline where we’ve run out of new ideas to explore, and most of them don’t require bucketfuls of HF. The real problem is that the financial rewards are relatively small, the math is ferocious, and there are so few practitioners, let alone experts doing research.
Might just be me, but I found it all clicked when we started learning the fundamentals underneath these abstractions. For me it was harder in the first classes because it's about memorizing poorly understood concepts, my brain prefers logically deriving complex concepts as a learning method.
The gaps between the analogy and the real world actually make it harder to understand the fundamentals and just confuse people when you get to a deeper level understanding. It requires more unlearning than is worth it for the slight benefit of making the concepts slightly more intuitive to understand at the beginning.
Electricity behaves in many ways just like water (just at a significantly faster time scale) but I don't think it actually helped me learn how it all worked to start with.
Which is why I also don't generally like analogies and the like
And if you managed to move ahead in hard mode, you shouldn't feel insulted.
Is it an analogy or are both models expressions of some underlying model of potentials and flows, and we happen to have more hands-on experience with water?
Voltage drops across components look a lot like head drops across pipe fittings. Losses along a pipe are similar to wires. Head and flow rate are very similar to voltage and current across multiple paths. Kirchhoff can apply to both, etc.
Many of the quantities have direct parallels and derive from each other in similar ways.
Obviously there are limits. But my middling DC circuit knowledge helped a lot when learning hydraulics from a mathematical engineering perspective.
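To make the parallel concrete, a toy example using nothing beyond Ohm's and Kirchhoff's laws: current splitting between two parallel resistors behaves just like flow splitting between two parallel pipes, in inverse proportion to resistance.

    # Two resistors in parallel across a 12 V source. The "head" (voltage)
    # across both branches is equal; the "flow" (current) splits in inverse
    # proportion to resistance, like water through two parallel pipes.
    V = 12.0                 # volts (analogue: head / pressure drop)
    R1, R2 = 100.0, 300.0    # ohms (analogue: hydraulic resistances)

    I1, I2 = V / R1, V / R2  # Ohm's law per branch
    I_total = I1 + I2        # Kirchhoff's current law at the node
    print(I1, I2, I_total)   # 0.12 A, 0.04 A, 0.16 A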
I think it would have helped me if we talked about the motor or other examples first, and then did some math to show how the resonant behavior can be useful.
If you start with easy circuit models, at least the labs can put together something tangible in the first couple semesters, to keep people interested.
And, I mean, a lot of engineering students end up going into sort of technician-y jobs, so keeping the hands-on spark alive has a lot of value, IMO.
1. Learn soldering
2. Treat circuits like black boxes. If I need X amount of Y, e.g. I need a circuit to smooth the voltage, I pick one black box with adequate attributes (see the sketch below).
However this is pretty introductory and I have no idea how to learn to fix old consoles. Sometimes it’s just a broken capacitor but I first need to figure out which part is broken.
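For the voltage-smoothing black box above, the attribute that usually matters is the corner frequency of an RC low-pass filter. A minimal sketch (component values are just examples):

    import math

    def rc_cutoff_hz(R_ohms, C_farads):
        # f_c = 1 / (2*pi*R*C): ripple above this frequency gets attenuated.
        return 1.0 / (2.0 * math.pi * R_ohms * C_farads)

    print(rc_cutoff_hz(1_000, 100e-6))  # 1 kohm, 100 uF -> ~1.6 Hz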
a) inspect for obviously damaged components. Capacitors that leaked, chips that released the magic smoke, etc.
b) confirm voltages are good
c) inspect the inputs and outputs of the ics to see if they're doing what you expect
d) depending on the boards involved, a lot of checking if pin A is electrically connected to pin B when it should be. Sometimes traces get broken and need to be fixed up.
I knew a number of folks in the first year who were very good at practical electronics, having come in from a technician side, but simply gave up due to the heavy maths load.
It got more complex when doing Control Theory, what with Laplace and Z transforms, frequency-domain analysis, and the infamous Poles and Zeros.
Further culling ensued at that point.
However, control theory turned out to be my favorite class. Learning how negative feedback loops are everywhere was an eye opener.
Also learning Laplace transforms was one of my first “holy shit this is freaking clever and cool” moments. Just like how parity bits in data streams can be used to detect AND correct errors.
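That parity trick is worth seeing concretely. A minimal sketch of a Hamming(7,4) code: three parity bits each cover an overlapping subset of the bit positions, and recomputing them on receipt produces a syndrome that is the 1-based position of the flipped bit.

    def hamming74_encode(d):
        # Encode 4 data bits as a Hamming(7,4) codeword (positions 1..7).
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4          # covers positions 1,3,5,7
        p2 = d1 ^ d3 ^ d4          # covers positions 2,3,6,7
        p3 = d2 ^ d3 ^ d4          # covers positions 4,5,6,7
        return [p1, p2, d1, p3, d2, d3, d4]

    def hamming74_correct(c):
        # Recompute each parity check; the syndrome is the error position.
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        pos = s1 + 2 * s2 + 4 * s3
        if pos:
            c[pos - 1] ^= 1        # flip the bad bit back
        return c

    word = hamming74_encode([1, 0, 1, 1])
    word[4] ^= 1                   # corrupt one bit "in transit"
    print(hamming74_correct(word)) # original codeword restored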
I wonder, how much control theory is there in a CPU?
One minor caveat is that most CPUs nowadays contain phase-locked loop (PLL) clock multipliers. Those fall into the domain of control theory but strictly speaking they're not part of the logic.
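For intuition, here's a toy software model of that idea (not any real PLL design; constants are arbitrary): a phase detector compares the reference against the local oscillator, a proportional term damps the phase error, and an integral term pulls the oscillator frequency onto the reference.

    import math

    ref_freq = 1.0              # Hz, the reference we want to track
    osc_freq = 0.9              # local oscillator starts off-frequency
    dt, kp, ki = 0.001, 10.0, 20.0
    ref_phase = osc_phase = 0.0

    for _ in range(50_000):
        ref_phase += 2 * math.pi * ref_freq * dt
        osc_phase += 2 * math.pi * osc_freq * dt
        err = math.sin(ref_phase - osc_phase)  # phase detector
        osc_phase += kp * err * dt             # proportional: damps the error
        osc_freq += ki * err * dt              # integral: tracks the frequency

    print(round(osc_freq, 3))   # ~1.0 once the loop has locked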
I maybe had the most trouble just figuring out which instantiated PLL in the chip belonged to which PLL design, and where someone stuck the documentation in the giant repo. Especially since a hardware designer may think, oh we don’t need to update the docs, “nothing changed,” but the PLLs did change names because of a process change and their parameters may have changed slightly, even if they’re essentially the same. And chasing down small changes and documenting them ends up being a lot of the job in software.
I just think this gives much better results. The model can be as simple or complex as you need, and we aren’t trapped in the linear response range. PID is good enough for many tasks, but it’s never good.
If you can model your problem with linear differential equations then control theory replaces the need for tuning. The coefficients you need just pop directly out of the analysis.
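A minimal sketch of what "pop directly out" means, using the simplest possible plant, a first-order linear ODE:

    # Plant: dx/dt = a*x + b*u. Feedback u = -k*x gives a closed-loop
    # pole at (a - b*k). No tuning loop: choose the pole, solve for k.
    a, b = 2.0, 1.0          # an unstable plant (open-loop pole at +2)
    desired_pole = -5.0      # the closed-loop dynamics we want
    k = (a - desired_pole) / b
    print(k)                 # 7.0, straight out of the algebra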
Eventually, when if statements stopped working, I found that decision trees work great, and XGBoost continues to be a great iteration of the decision tree.
[1]: I was an early hire at a tech unicorn and we built an autoscaler pretty early in the company's tenure. While it was a great success for a long time, once k8s became established in the industry we had a really hard time training new talent on it, and I left as we began a massive company-wide effort to move our workloads onto k8s.
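For anyone who hasn't made that if-statements-to-trees jump, a minimal sketch using xgboost's sklearn-style API (the features and labels are made-up toys):

    from xgboost import XGBClassifier  # pip install xgboost

    X = [[20, 0], [25, 1], [60, 0], [65, 1]]  # toy features
    y = [0, 0, 1, 1]                          # toy labels

    # The trees learn the split thresholds the if statements hardcoded.
    model = XGBClassifier(n_estimators=10, max_depth=2)
    model.fit(X, y)
    print(model.predict([[30, 1], [70, 0]]))  # [0 1]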
It should be pretty obvious that you cannot overcome constraints by moving even harder in the direction of the constraint, which is what the integral term does.
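One common fix is conditional integration; a sketch (gains and limits are placeholders):

    def pi_step(error, integral, kp=1.0, ki=0.5, dt=0.01, u_max=1.0):
        # PI controller with anti-windup: if the actuator is saturated,
        # undo the accumulation so the integrator can't keep pushing
        # harder into the constraint.
        integral += error * dt
        u = kp * error + ki * integral
        if abs(u) > u_max:
            integral -= error * dt               # anti-windup
            u = max(-u_max, min(u_max, u))       # clamp to actuator limit
        return u, integral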
I was working as an SW Engr and taking a set of courses towards a Mechatronics certificate (my employer did a lot of motion control work) and I had to basically take updated versions of the same classes. The lab instructor was an about-to-retire engineer from Schunk, and he did an amazing job of explaining what the theory meant in terms of real-world behavior. That's when it all finally sunk in and I could look at the math and "see" how a mechanism would respond.
To me EE = heavy math and that’s what makes it so fun.
I actually do software now but it’s completely different. There’s like no math in most applications of it. Putting something together with a Rasp Pi or Arduino feels like 98% software and 2% EE.
There might be a structural issue if you have a bunch of guys coming in from the technician side, as you say, who almost all get filtered out. You might need remedial classes, a different curriculum progression, something. Or else recruitment standards/expectation-setting are wacked-out.
But aren't there a lot of actual hardware products that are "simple circuit blocks connected to a microcontroller"? Like a toaster, shaver, keyboard, etc. If that's not "work in EE" then what is it classified under? It's not CS either.
Most of the orgs I worked in building simple circuit blocks connected to a microcontroller either farmed out the actual EE work to contractors or design houses or had 1 EE for like 20 different projects.
Another commenter pointed this out, but those products take about 1-2 days of engineering time.
At most levels, software will be in there somewhere, even those fake flickering candle LEDs have RAM, ROM, and a processor these days.
The Perseus Cluster of galaxies is estimated[0] at something like 816,592 light years in diameter, so that's roughly 8x10^21 meters, and on the other end 2008 TC3[1] is an asteroid 4.1m across.
That is largely true of academic research. A critical difference, though, is that you don't need big expensive hardware or the like to follow along with large portions of cutting-edge CS research. There are some exceptions, like cutting-edge AI training, which needs super expensive equipment or large cloud expenditures, but tons of other cutting-edge CS research can run just fine on a fairly low-end laptop.
It is also true that plenty of software innovation is not even tied to CS-style academic research. Experimenting with what sort of perf becomes possible by implementing a new kernel feature can be very important research, but isn't always super closely tied to academic CS research.
Even the more hobbyist level cutting edge research for EE will have more costs, simply because components and PCBs are not exactly free, and you cannot just keep using the same boards for every project for several years like you can with a PC.
I understand why a lot of people bail out of EE, and why a lot go to web dev specifically. EE relies so heavily on simple calculus that there's a distinct moment where you have to go "what the heck am I actually learning?". And seeing that software has this apparent depth (design patterns, OOP principles, Haskell, ORMs, Fieldingian REST, GraphQL, 10,000-word blog posts on vim vs emacs, etc.), they naturally get drawn there.
Maybe one day I will actually understand signal integrity but so far my experience has been "check return paths, match impedances and pray to the EE gods".
Hardware and software are called different things for a reason? I do agree that tinkering with the hardware always needs to be in-step with the lesson at hand. You can't just state KVL/KCL and move-on, you need to have the student build a circuit and play with it for a day or two.
Obviously yes, if you're doing heavy analog/power/RF stuff you're going to be pretty far from software, but EE is a really broad field.
I started college in EE, I absolutely loved my digital logic class. I talked to one of the lab TA's in Circuits 2 and said this wasn't as interesting as digital logic. He told me to switch to computer engineering. I looked at the courses and there were only 5 classes different. I switched that week.
I've been designing computer chips the last 30 years.
> Maybe I Am Just a Software Guy
That's the TL;DR version of the article.
The TL;DR version of my post is "I like programming but I find hardware far more interesting."
edit: +1 on the "I just started bruteforcing" part of getting frustrated with everything. It was not a good way of learning, but even after switching programs I found myself preferring to just bruteforce problems I had lost hope of thinking through to completion without hitting a mistake that would force me to start over from the top, with 200 more of the same type of problem waiting after. So much mental effort was wasted trying to "get" things I just wasn't getting that I started taking more satisfaction from reaching the solution without doing it "right" (ignoring that my methods of bruteforcing probably took far more time and energy, it was at least something that didn't hurt me spiritually on every failure).
There are plenty of "applied" electronics technician or electrician's apprenticeship programs that are more like your software education. Take an induction motor, a variable frequency drive, a few sensors, and a programmable logic controller, and hook them together according to the manufacturer's instructions, and you can be off to the races operating a pump or a conveyor on day 1. But will you understand how the insulated gate bipolar transistors and filters in that variable frequency drive turn the rectified high-voltage DC bus into three-phase AC that generates a rotating magnetic field and induces a current in the motor armature? No, you don't need to know any of that to make the pump work.
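For a flavor of what those IGBTs are being asked to do, a sketch (frequencies and modulation depth are illustrative): sine-triangle PWM boils down to computing three duty cycles 120 degrees apart and letting each half-bridge chop the DC bus accordingly.

    import math

    def phase_duties(t, f_out=50.0, m=0.8):
        # Duty cycles (0..1) for phases U, V, W at time t seconds.
        # Each half-bridge switches the DC bus with this duty cycle, and
        # the filtered result is a sinusoidal phase voltage.
        return tuple(
            0.5 + 0.5 * m * math.sin(2 * math.pi * f_out * t - k * 2 * math.pi / 3)
            for k in range(3)
        )

    print(phase_duties(0.005))  # three duties sweeping a rotating field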
You wrote:
> I couldn’t imagine ... a toy CPU implemented in SystemVerilog being ... useful
No, it's really not, but your work on real CPUs depends on registers and combinatorial logic and ALUs and MMUs. End users can typically just download Python and treat everything behind the screen as a black box, but if you really want to call what you're doing "engineering" or a "science", then developing an understanding for what happens behind the curtains is incredibly useful. If you've implemented a toy 8-bit CPU with load, store, compare, jump, and a few basic math instructions, you can write some assembly or a toy interpreter for it and you will have an understanding of how real CPUs work that can enable you to write better code later. Add some interrupts to that CPU and build a multitasking operating system, and you'll understand parallelism better.
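To make "toy CPU" concrete, here's a minimal sketch of one as a Python interpreter; the instruction set and encoding are invented for illustration.

    def run(program, mem):
        # Four registers, a program counter, and one comparison flag.
        regs, pc, flag = [0] * 4, 0, False
        while pc < len(program):
            op, a, b = program[pc]
            pc += 1
            if op == "LOAD":    regs[a] = mem[b]              # reg <- memory
            elif op == "STORE": mem[b] = regs[a]              # memory <- reg
            elif op == "ADD":   regs[a] = (regs[a] + regs[b]) & 0xFF
            elif op == "SUB":   regs[a] = (regs[a] - regs[b]) & 0xFF
            elif op == "CMP":   flag = regs[a] == b           # compare to immediate
            elif op == "JNE":   pc = b if not flag else pc    # jump if not equal
        return mem

    # Multiply mem[0] * mem[1] by repeated addition; result lands in mem[2].
    mem = [3, 5, 0, 1]
    prog = [
        ("LOAD", 0, 0),   # r0 = multiplier (3)
        ("LOAD", 1, 1),   # r1 = multiplicand (5)
        ("LOAD", 3, 3),   # r3 = constant 1
        ("ADD", 2, 1),    # loop: r2 += r1
        ("SUB", 0, 3),    #       r0 -= 1
        ("CMP", 0, 0),    #       done when r0 == 0
        ("JNE", 0, 3),    #       else jump back to the ADD
        ("STORE", 2, 2),  # mem[2] = r2
    ]
    print(run(prog, mem)[2])  # 15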
All of modern technology is a pyramid. At the point of that pyramid is just a single doped semiconductor with a P-N junction. We build that junction into transistors, and transistors into gates, and gates into CPUs, and on those CPUs we execute assembly, and we write low-level languages that compile into assembly, and build operating systems and syscalls with those low-level languages, and access those systems with high-level languages, and connect those computers together with networks, and write applications that operate on those networks, and at the broad base of the pyramid there are billions of people using those applications.
In 2025, no one human brain comprehends the full stack anymore, we all become our own individual bricks in a particular layer. But to do the best work at any point in the pyramid, you ought to know a bit of how it works above and below you!
It's a nice way of putting it. The blunt thing that everyone is sort of dodging here is this: I think it's less learning style, and more IQ. EE is THAT much harder.
Software development for the most part is extremely easy. It's one of the few "engineering" fields where you can go to a bootcamp and learn it in 6 months. You won't see this kind of thing for quantum mechanics or electrical engineering.
Also the gap between theory and application in software is miniscule. Instantaneous even. You basically learn software via application.
A lot of software engineers take pride in their jobs and in their intelligence, but they don't fully understand just how easy software is. Like, you guys (to be accurate: most of you guys, not all) have an easy job and you learned an easy field. EE is challenging. You don't like it because it's harder, and the intellectual capacity to handle it isn't there for everyone.
There's a reason why all hardware development moved to Asia. Software is just too attractively easy, and the internet boom made it lucrative. Asians took what's available while the West took what's most fun. And now the skill gap is more evident and we can't go back.
But the cool thing about software is that it has paid so well despite its relative ease, it’s a well paying field that still secures a middle class life that’s resilient to inflation.
Once that’s gone, a whole lot of people are going to find themselves condemned to dropping an economic class or two.
But oh how much workload the 6.002x course was... I needed 10-15 hours per week for all the reading, problems, and labs, and doing that while working full time and commuting 2 hours a day was a grueling pace to survive!
For the same mental effort, you get orders of magnitude more "end product" from software than hardware, with greatly less overhead and greatly more flexibility.
Hardware is extremely punishing and "complexity friction" kicks in almost immediately. A multi-feature door alarm on a microcontroller is a one-hour affair that a newbie could finagle. With a pure hardware implementation it's a multi-day effort, plus another day of reworking the board to dial it in. And if you aren't copying a design, you likely need a degree as well.
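To be fair to the comparison, here's roughly what that one-hour microcontroller version looks like, sketched in MicroPython (pin numbers and timings invented):

    from machine import Pin   # MicroPython-only module
    import time

    door = Pin(14, Pin.IN, Pin.PULL_UP)   # reed switch: 0 = closed
    buzzer = Pin(15, Pin.OUT)
    open_since = None

    while True:
        if door.value() == 1:                  # door open
            open_since = open_since or time.ticks_ms()
            if time.ticks_diff(time.ticks_ms(), open_since) > 10_000:
                buzzer.value(1)                # 10 s grace period, then alarm
        else:
            open_since = None
            buzzer.value(0)
        time.sleep_ms(50)                      # crude debounce / poll interval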
There is also the fact that software pays much more than hardware, can be done remotely from just about anywhere, doesn't involve working in labs full of lead and solvents, and like the author noted, has a much higher "wow!" factor from people in general. Software makes you feel very powerful, hardware will humble you into the ground.
Career-wise and financially, it's worked out great. Even most of my EE friends didn't do much in EE after the first 5 years. Everyone just migrated to where the jobs were: software development, IT, and cybersecurity.
I had an eye opening experience when I had my first taste of programming when I took C programming in my second year of university. What do you mean I can run a command and see instant output? Amazing! This was not the case for my electronics and power engineering lab sessions. We were using equipment that had been around since the 80s with little to no supervision. Just a bunch of routine "experiments" which I can barely remember any of. In my third year, I took Digital Computer Design (a C.E elective) and I realized I had been wasting my time learning about how the power grid in my country works. I tried my best to salvage as much as I could by picking more C.E electives, albeit not many available, did as best I could.
Every day I wonder how different my life would have been if I had studied CS or even CE; I do not know. But I appreciate the little this journey taught me: that you can always squeeze lemonade out of whatever lemons life gives you. I see my old EE notes now and they don't make sense to me, but I appreciate the happy chills solving circuit problem sets gave me. I work in software now, and I get that 1000x more, and that is how I know I made the right choice.
Contrast this:
> One of the EECS professors was kind enough to offer a RC car kit to his students to program it. I decided to give it a try. Maybe the toy car wasn’t exciting or maybe I was pre-occupied with other course work during that summer, I didn’t even open the box.
With this:
> Writing web applications blew my mind. I can just write some code, click a few buttons and boom all my friends and family across the globe can just see it! This feels like magic.
Anecdotally, I saw this a lot in college. Students would start out in electrical engineering because they thought hardware was really cool, but when the time came to do the hard work they didn't have much motivation. They wanted to be a hardware engineer, but putting in the work was unappealing. Software has a wider range of job opportunities from intense FAANG-level jobs down to being the person who pokes at a company's old PHP website long enough to keep it serving pages. You can jump in and find a level that matches your motivation. With hardware, you have to clear some hurdles to begin being useful at all.
To my surprise, I think Arduino and Raspberry Pi have made this worse. I talk to a lot of people who see themselves as amateur EEs because they bought an Arduino and used some jumper wires to connect some sensors to the right pins. It's exciting. Then they lose motivation when they encounter a problem that requires doing anything more complex or custom. These people often overlap with the CS students who think the entire world of software engineering is writing CRUD apps composed of APIs connected together.
A key purpose of the repeated exercises in circuit analysis is to build up the student's intuition for how electricity works. Mathematically, it's "simple" -- just systems of (possibly complex) equations and basic diff eq. But for sophomores, all that is still new, and most students don't enjoy going deep into derivations.
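Concretely, the whole exercise reduces to a (complex-valued) linear system. A one-node example in Python: an RC divider driven at 1 kHz, solved from Kirchhoff's current law (component values are arbitrary).

    import numpy as np

    f = 1e3                              # drive frequency, Hz
    R = 1e3                              # 1 kohm in series from the source
    Zc = 1 / (2j * np.pi * f * 100e-9)   # impedance of 100 nF to ground
    Vs = 5.0                             # source phasor

    # KCL at the node: (Vn - Vs)/R + Vn/Zc = 0  ->  Vn = Vs * Zc / (R + Zc)
    Vn = Vs * Zc / (R + Zc)
    print(abs(Vn), np.angle(Vn, deg=True))  # magnitude (~4.2 V), phase (~-32 deg)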
Building kits and plugging pre-made modules into microcontroller development boards is fun, but it's not really engineering. You don't hire an EE to plug off-the-shelf components together, you hire an EE to do design work, to make sure everything is going to work under all operating conditions, and to diagnose problems when something goes wrong.
Finally, software is just easier[1] than hardware. Modern software is a mathematical idealization that runs on top of decades of high-level tools and abstractions. That's why it's so cheap and popular!
[1] This does not mean that everything in software development is easy, just that you don't need to deal with physics or chemistry or manufacturing or the procurement of physical goods in order to create new software.
It's definitely the case that there's a bigger jump from school project to actually useful product for EE than for CS. But now that we have affordable but decently featured FPGA boards, the barrier is much lower than before, at least for digital design.
For me, a senior-level circuits class using Horowitz and Hill "Art of Electronics" was the game-changer (in 1981!). 3rd edition (2015) still looks great to me (although yes it is large and expensive).
You might love mechanical engineering and machines so you get into machining parts. By hand it is exactly what you thought, but once you hit CNC you're back to a desk job, spending most of your time in CAD software not even touching the machine.
But, yes, probably half of my classes were a real drag to get through. It all depended on who the lecturer was, and how enthusiastic they were.
Repair is definitely not the gateway it used to be, though.
Custom PCBs have been $5/square inch for a set of 3 from OSHPark for many years.
You can buy a usable hot air station on Amazon for the price of a DoorDash meal.
My junior project in EE was a guitar fx pedal with a shielded breadboard on top. I won’t be bashful, that was the most popular project in the room.
Then… I got divorced and never finished my EE degree. I already had a degree in CS, and had pursued a second degree because I thought software was too limiting. Now, here I am, all limited.
The reason I never subsequently finished my degree was that I didn't really want to work on CMOS, transmission lines, or microwave, and graduating with an ECE degree from U of Utah offered those as your career paths.