* The fact that there are comments misunderstanding the article, talking about PCB Design rather than (Silicon) Chip Design, speaks to the problem facing the chip industry: a total lack of wider awareness and many misunderstandings.
* Chip design pays better than software in many cases and many places (US and UK included; but excluding comparisons to Finance/FinTech software, unless you happen to be in hardware for those two sectors)
* Software engineers make great digital logic verification engineers. They can also gradually be trained to do design too. There are significant and valuable skill and knowledge crossovers.
* Software engineers generally lack the background to move into analogue design/verification, and there's little to no knowledge crossover.
* We have a shortage of engineers in the chip industry, particularly in chip design and verification, but also architecture, modelling/simulation, and low-level software. Unfortunately, the decline in hardware courses in academia is very long standing, and AI Software is just the latest fuel on the fire. AI Hardware has inspired some new people to join the industry but nothing like the tidal wave of new software engineers.
* The lack of open source hardware tools, workflows, and high-quality examples, relative to the gross abundance of open source software, doesn't help the situation, but I think it is more a symptom than a cause.
> * Chip design pays better than software in many cases and many places (US and UK included;
Where are these companies? All you ever hear from the hardware side of things is that the tools suck, everyone makes you sign NDAs for everything, and the pay is around 30% less. You can come up with counterexamples like Nvidia, I suppose, but that's a bit like saying just work for a startup that becomes a billion-dollar unicorn.
If these well-paying jobs truly exist (which, I'm going to be honest, I doubt quite a bit), the companies offering them seem to be doing a horrendous job of advertising that fact.
The same seems to apply to software jobs in the embedded world, which consistently pay less than web development despite arguably being more difficult.
How does one pivot? It seems to me that demand in the hardware job market is probably even more concentrated than in the software market?
From Software into Hardware? Your fastest route in is to learn Python and find one of the many startups hiring for CocoTB-based verification roles. Depends a bit on what country you're in - I'm happy to give recommendations for the UK!
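To give a flavour of what that work looks like, here's a minimal cocotb-style testbench sketch. The DUT and its ports (an "adder" with a, b, sum) are hypothetical, purely for illustration; real verification environments are far bigger, but the shape is the same: drive inputs from Python, wait, check outputs against a reference.

    # Minimal cocotb testbench sketch (assumes cocotb is installed and a hypothetical
    # combinational DUT named "adder" with ports a, b and sum).
    import random

    import cocotb
    from cocotb.triggers import Timer


    @cocotb.test()
    async def adder_random_test(dut):
        """Drive random inputs and check the sum against a Python reference."""
        for _ in range(100):
            a = random.randint(0, 255)
            b = random.randint(0, 255)
            dut.a.value = a
            dut.b.value = b
            await Timer(2, units="ns")  # let the combinational logic settle
            assert dut.sum.value == a + b, f"{a} + {b} gave {dut.sum.value}"

You'd run that through one of the simulators cocotb supports (Icarus, Verilator, or a commercial one); the point is that all the checking logic is ordinary Python.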
If you feel like learning SystemVerilog instead, learn the Universal Verification Methodology (UVM) to get into the verification end.
If you want to stay in software but be involved in chip design, then you need to learn C, C++ or Rust (though really C and C++ still dominate!). Then dabble in some particular application of those languages, such as embedded software (think: Arduino), firmware (play with any microcontroller or RPi - maybe even write your own bootloader), compilers (GCC/LLVM), etc.
The other route into the software end of chip design is entry-level roles in functional or performance modelling teams, or via creating and running benchmarks. One, the other, or both. This is largely all C/C++ (and some Python, some Rust) software that models how a chip works at some abstract level. At one level, it's just high-performance software. At another, you have to start to learn something of how a chip is designed to create a realistic model.
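As a taste of the modelling side, here's a toy functional model of a made-up accumulator machine. Everything about it (the ISA, the mnemonics) is invented for illustration; real functional and performance models, usually in C++, follow the same fetch/decode/execute shape, just with vastly more detail.

    # Toy functional model of a hypothetical accumulator machine (illustration only).
    PROGRAM = [
        ("LOADI", 5),   # acc = 5
        ("ADDI", 7),    # acc = acc + 7
        ("STORE", 0),   # mem[0] = acc
        ("HALT", 0),
    ]

    def run(program):
        acc, pc, mem = 0, 0, {}
        while True:
            op, arg = program[pc]   # fetch + decode
            pc += 1
            if op == "LOADI":       # execute
                acc = arg
            elif op == "ADDI":
                acc += arg
            elif op == "STORE":
                mem[arg] = acc
            elif op == "HALT":
                return acc, mem

    print(run(PROGRAM))  # -> (12, {0: 12})

A performance model then layers timing onto that skeleton (pipeline stages, cache hits and misses, and so on), which is where the "learn how the chip is actually designed" part comes in.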
And if you're really really stuck for "How on earth does a computer actually work", then feel free to check out my YouTube series that teaches 1st-year undergraduate computer architecture, along with building the same processor design in Minecraft (ye know, just for fun. All the taught material is the same!). [Shameless plug ;) ]
I think there are two separate areas of concern here: hardware, and computation. I strongly believe that a Computer Science program that only includes variants of the von Neumann model of computation is severely lacking. While it's interesting to think about Turing Machines, Church numerals, etc., the practical use of FPGAs and other non-CPU-based logic should definitely be part of a modern CS education.
The vagaries of analog electronics, RF, noise, and the rest are another matter. While it's possible that a CS graduate might have a hint of how much they don't know, it's unreasonable to expect them to cover that territory as well.
Simple example: did you know that it's possible for 2 otherwise identical resistors to have more than a 20 dB difference in their noise generation?[1] I've been messing with electronics and ham radio for 50+ years, and it was news to me. I'm not sure even an EE graduate would be aware of that.
the design of FPGAs was certainly part of my CS degree!
they even made us use them in practical labs, and connect them up to an ARM chip
I lived in both worlds (hardware/software) throughout my career. In school, I learned (in order): Analog electronics (including RF), Digital electronics, Microprocessors, Software, Systems. I've always thought that it's strange how few software people know hardware, and vice versa. In the software domain, when I began referencing hardware elements while explaining something, the software audience would usually just glaze over and act like they were incapable of understanding. Same goes for the hardware people when I would reference software elements.
I learned Ada sometime around 1991. Counting assembly for various platforms, I had already learned about a dozen other languages by then, and would later learn many more.
Sometime around 2000 I learned VHDL. In all of the material (two textbooks and numerous handouts) there was no mention of the obvious similarities to Ada. I wish somebody had just produced a textbook describing the additional features and nomenclatures that VHDL added to Ada -- That would have made learning it even easier. The obvious reason that nobody had done that is that I was among a very small minority of hardware people who already knew Ada, and it just wouldn't be useful to most people.
In all of my work, but especially in systems integration work, I've found that my knowledge of multiple domains has really helped me outperform my peers. Having an understanding of what the computer is doing at the machine level, as well as what the software is doing (or trying to do) can make the integration work easy.
More on-topic: I think it would be a great improvement to add some basic hardware elements to CS software courses, and to add some basic CS elements to EE courses. It would benefit everyone.
I'm a hardware designer. An EE. But over the last umpteen years I've gradually switched over to software because that's where I was needed. What I've found is that I became a very good software programmer, but I still lack the fundamentals of software engineering. There are things I won't or can't use because they would require too much study for me to get good at them, or even understand them.
I would bet that a CS guy would have similar problems switching to hardware engineering.
Not all hardware is digital.
RF design, radars, etc. are more art than science in many respects.
I would expect a Physics-trained student to be more adaptable to that type of EE work than a CS student...
Someone with a physics background might be better prepared for the analog world than someone with a digital background.
I have trouble believing there's a talent shortage in the chip industry. Lots of ECE grads I know never really found jobs and moved on to other things (including SWE). Others took major detours to eventually get jobs at places like Intel.
No shortage of talent. It's just that the big players are used to cheap, near-minimum-wage Taiwanese salaries and refuse to pay the full price of an EE.
I had written a whole big thing that could be summarized as "yes, of course" but then I read the article and realized that it is very specifically about designing silicon, not devices.
I understand that it makes sense for a blog called Semiconductor Engineering to be focused on semiconductor engineering, but I was caught off guard because I have been working on the reasonable assumption that "hardware designer" could be someone who... designs hardware, as in devices containing PCBs.
In the same way that not all software developers want to build libraries and/or compilers, surely not all hardware designers want to get hired at [big chip company] to design chips.
It is funny how "hardware design" is commonly used in the chip industry to describe what semiconductor design/verification engineers do. And then there's PCB designers using those same chips in their _hardware designs_.
Also there's computer architects being like "So, are we hardware design? Interface design? Software? Something else?"...
Meanwhile, all the mechanical engineers are looking from the outside saying "The slightest scratch and your 'hard'ware is dead. Not so 'hard' really, eh?" ;) ;)
Every sector has its nomenclature and sometimes sectors bump into each other. SemiEngineering is very much in the chip design space.
I wasn't taught directly (and don't know what I'm doing still), but I've had a lot of fun learning about retro hardware design as a software engineer. I've made a few of my own reverse engineered designs, trying to synthesize how the real designers would have built the chip at the time, and ported others for the Analogue Pocket and MiSTer project.
Here's an example of my implementation of the original Tamagotchi: https://news.ycombinator.com/item?id=45737872 (https://github.com/agg23/fpga-tamagotchi)
Why would they? Pay is just much lower, despite the fact that there's way more responsibility. I personally know more people who switched from hardware to software than vice versa.
I'd do anything short of murder to get out of software. If I could find a career that paid enough to live somewhere nice and didn't have the horrible working conditions that software does (stack rank, fake agile, unrealistic deadlines, stack rank, etc.) I'd do it in a heartbeat.
UIUC CS grad from the late 80s. CS students had to take a track of electrical engineering courses. Physics E&M, intro EE, digital circuits, microprocessor/ALU design, microprocessor interfacing.... It paid off immensely in my embedded development career.
I'm guessing this isn't part of most curricula anymore?
Where I studied, they reduced that, at least the workload and class time, in favor of more math and informatics.
Definitely no ALU design on the curriculum, no interfacing or busses, very little physics. They don't even put a multimeter in your hand.
Informatics is considered a branch of logic. If you want to know how to design a computer, you should have studied EE, is their thinking.
I had to take computer architecture. We made a 4 bit CPU... or maybe it was 8 bit. I can't remember. But it was all in a software breadboard simulator thing. LogicWorks.
That curriculum is often more specifically called "Computer Engineering". CS students, meanwhile, usually aren't bothered by anything below the compiler.
I actually started Illinois as a Computer Engineering major and switched to Computer Science because I thought I'd get to use all the cool supercomputers at the Beckman Institute. Those electrical courses were all part of my CS requirements. Illinois CS was big on architecture, having designed Illiac and all of that stuff. Hennessy/Patterson for life.
The supercomputer thing... never happened. And I turned out to have a CE career anyway.
As a computer engineer I usually copy reference schematics and board layouts from the datasheets the vendors offer. 95% of my hardware problems can be solved with them.
Learning KiCad took me a few evenings with YT videos (greetings to Phil!).
Soldering needs much more practice. Soldering QFN with a stencil, paste and oven (or only a pre-heater) can only be learned by failing many times.
Having a huge stock of good components (sorted nicely with PartsDB!) lowers the barrier for starting projects dramatically.
But as always: the better your gear gets, the more fun it becomes.
Even as a professional EE working on high speed digital and mixed signal designs (smartphones and motherboards), I used reference designs all the time, for almost every major part in a design. We had to rip up the schematics to fit our needs and follow manufacturer routing guidelines rather than copying the layout wholesale, but unless simulations told us otherwise we followed them religiously. When I started I was surprised how much of the industry is just doing the tedious work of footprint verification and PCB routing after copying existing designs and using calculators like the Saturn toolkit.
The exception was cutting edge motherboards that had to be released alongside a new Intel chipset but that project had at least a dozen engineers working in shifts.
The subheading to this article seems a little extreme: "To fill the talent gap, CS majors could be taught to design hardware, and the EE curriculum could be adapted or even shortened."
The article is more in the area of chip design and verification than PCB hardware, so I kinda understand where it's coming from.
Weird article, came to it hoping to see if I could train into a new job. But instead it went on and on about AI for almost the entire piece. Never learned what classes I might need to take or what the job prospects are.
> "CS majors could be taught to design hardware, and the EE curriculum"
"Electrical and Computer Engineering" (ECE) departments already exist and already have such a major: "Computer Engineering".
Is this not what electrical engineers are for?
EEs design components and new materials that may (or may not) end up in computers; CS engineers should be able to design CPUs.
Is the idea here that the code-generation apocalypse will leave us with a huge surplus of software folks? Enabling software people to go over to hardware seems to be putting the cart before the horse, otherwise.
Hardware people go to software because it is lower-stress and can pay better (well, at least you have a higher chance of getting rich, start-ups and all that).
Hilarious to see Cadence and Synopsys in this article. They are arguably the cause. The complete lack of open source tooling and their aggressive tool pricing are exactly why this ecosystem continues to be an absolute dumpster fire.
I used Vivado (from Xilinx) a bit during my undergrad in computer engineering and was constantly surprised at how much of a complete disaster the tool chain was. Crashes that would erase all your work. Strange errors.
I briefly worked at a few hardware companies and was always taken aback by the poor state of the tooling, which was highly correlated with the license terms dictated by the EDA vendors. Software dev seemed much more interesting and portable. Working in hardware meant you would almost always be choosing between Intel, Arm, AMD, and maybe Nvidia if you were a rockstar.
Software by comparison offered plentiful opportunities and a skill set that could be used at an insurance firm or any of the Fortune 100s. I've always loved hardware, but the opaque datasheets and IP rules kill my interest every time.
Also, I would argue software devs make better hardware engineers. Look at Oxide Computer. They have fixed bugs in AMD's hardware datasheets because of their insane attention to detail. Software has eaten the world, and EEs should not be writing the software that brings up UEFI. We would have much more powerful hardware systems if we were able to shine a light on the inner workings of most hardware.
EE folks should design languages because they understand hardware better?!
And CS folks should design hardware because they understand concurrency better?!
I know you said it in jest, but there is a strong justification for cross-feeding the two disciplines - on one side, we might get hardware that's easier to program and, on the other, we might get software that's better tuned to the hardware it runs on.
Having worked in EE from '99 to '06 after a BSc in EE: it's pretty much CS, plus knowing how to breadboard and solder if absolutely necessary.
A whole lot of my coursework could be described as UML diagramming but using glyphs for resistors and ground.
Robots handle much of the assembly work these days. Most of the human work is jotting down arbitrary notation to represent a loop or when to cache state (use a capacitor).
Software engineers have come up with a whole lot of euphemistic notations for "store this value and transform it when these signals/events occur". It's more of a psychosis that long ago quit serving humanity and became a fetish for screen addicts.
In Europe, in order to get a CS degree and be an actual "Engineer", you must be able to do so at least at a basic level.
Obviously. Hardware designers absolutely love to think that hardware design is totally different from software design and only they have the skills, but in reality it's barely different. Stuff runs in parallel. You occasionally have to know about genuinely hardware-specific things like timing and metastability. But the Venn diagram of hardware/software design skills is pretty much two identical circles.
The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just because hardware design is a niche field that most people a) don't need to do, b) can't access because almost all the tools are proprietary and c) can't afford, outside of tiny FPGAs.
If Intel or AMD ever release a CPU range that comes with an eFPGA as standard that's fully documented with free tooling then you'll suddenly see a lot more talent appear as if by magic.
> The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just because hardware design is a niche field that most people a) don't need to do, b) can't access because almost all the tools are proprietary and c) can't afford, outside of tiny FPGAs.
Mostly B. Even if you work at a company that does both, you'll rarely get a chance to touch the hardware as a software developer, because all the EDA tools are seat-licensed, making it an expensive gamble to let someone without domain experience take a crack at it. If you work at a Verilog shop you can sneak in Verilator, but the digital designers tend to push back in favor of vendor tools.
> is really just because hardware design is a niche field
Which doesn't pay as well as jobs in software do, unfortunately.
Exactly, money is the problem. I am a hardware designer by trade. I have no problem sitting down, creating a PCB in KiCad, and having it come out perfect on the first try. But I do this only as a hobby because it does not pay much. SWE just pays better, even with the AI scarecrow behind it.
Really? In my experience in the UK it pays ~20% better. We're talking about silicon hardware design. Not PCBs.
At least in the US, yes. Check out general1465's reply to me.
The problem, I think, is that there are many competent hardware design engineers available abroad and since hardware is usually designed with very rigorous specs, tests, etc. it's easy to outsource. You can test if the hardware design engineer(s) came up with an adequate design and, if not, refuse payment or demand reimbursement, depending on how the contract is written. It's all very clear-cut and measurable.
Software is still the "Wild West", even with LLMs. It's nebulous, fast-moving, and requires a lot of communication to get close to reaching the maintenance stage.
PCB Design != Chip Design.
The article was about chip design.
Not trying to stop you debating the merits and shortcomings of PCB Design roles, just pointing out you may be discussing very very different jobs.
I'm talking about chip design: Verilog, VHDL, et al.
Very specifications-driven and easily tested. Very easy to outsource if you have a domestic engineer write the spec and test suite.
Mind you, I am not talking about IP-sensitive chip design or anything novel. I am talking about iterative improvements to well-known and solved problems e.g., a next generation ADC with slightly less output ripple.
Sure, so, yeah "general1465" seemed to be talking about PCB Design.
And from what I know of SemiEngineering's focus, they're talking about chip design in the sense of processor design (like Tenstorrent, Ampere, Ventana, SiFive, Rivos, Graphcore, Arm, Intel, AMD, Nvidia, etc.) rather than the kind of IP you're referring to. Although, I think there's still an argument to be made for the skill shortage in the broader semiconductor design areas.
Anyway, I agree with you that the commoditized IP that's incrementally improving, while very important, isn't going to pay as well as the "novel stuff" in processor design, or even in things like photonics.
In fact I'll go further - in my experience people with a software background make much better hardware designers than people with an EE background because they are aware of modern software best practices. Many hardware designers are happy to hack whatever together with duct tape and glue. As a result, most of the hardware industry is decades behind the software industry in many ways, e.g. still relying on hacky Perl and TCL scripts to cobble things together.
The notable exceptions are:
* Formal verification, which is very widely used in hardware and barely used in software (not really software's fault - there are good reasons for it). There's a toy sketch of the idea after this list.
* What the software guys now call "deterministic system testing", which is just called "testing" in the hardware world because that's how it has always been done.
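For anyone curious what the formal side even looks like, here's a toy combinational equivalence check using the z3-solver Python package (my choice for the sketch; production flows use dedicated formal tools with properties written against the RTL). It proves that two formulations of 8-bit addition agree for every possible input, which is the basic flavour of equivalence checking.

    # Toy equivalence check with the z3-solver Python package (illustration only).
    from z3 import BitVec, prove

    a = BitVec("a", 8)
    b = BitVec("b", 8)

    spec = a + b         # reference behaviour
    impl = a - (-b)      # "implementation": subtract the two's-complement negation

    prove(spec == impl)  # prints "proved": they match for all 2^16 input pairs

The solver explores the entire input space symbolically, which is exactly what distinguishes this from throwing random vectors at a design in simulation.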
Side note: Formal theorem proving is even rarer than formal model checking!
> in my experience people with a software background make much better hardware designers than people with an EE background because they are aware of modern software best practices.
I know them. Especially the older folks. Cramming all the parts onto one huge sheet instead of separating them by function. Refusing to use buses. Refusing to put part numbers into the schematic so the BoM could just be exported directly, and writing the BoM by hand instead.
Watching these guys is like watching the lowest-level office worker typing values from Excel into a calculator so he can then write the result back into the same Excel table.
Age has an effect, no matter if it's software or electronics. These types learned their trade once, some decades ago, and keep driving like that.
If you want old dogs to learn new tricks, teach them. No company has the money to spend, nor the inclination, to even suggest education to their workers. Companies usually consider that a waste of time and money. I don't know why. Probably because "investing" in your workforce is considered stupid when they'll fire you the moment a quarterly earnings call looks less than stellar.