Ah yes, the famous last words of expecting testing to take less than two weeks, and that all tests will pass...
Especially in RF hardware design, you have to plan for each hardware revision to inevitably have problems. And in hardware, a new revision means at least another week before the new prototype arrives.
OP is on rev 5, so I'm assuming the schematic itself has already been validated; if the schematic hasn't changed between v4 and v5, it's not unrealistic to subtract the schematic validation part from the planning.
However, OP also mentions having made many routing/placement changes and moving components under a heatsink, and that's where all sorts of unforeseen problems can arise. Especially with high-speed, impedance-matched RF design you can run into no end of RF black-magic problems. Trust me, I've been there.
In hardware, especially when RF is involved, it's not about how long the testing/validation itself takes, but about the turnaround time to get a new prototype produced.
It does seem like the schedule question here is not whether testing takes two weeks; it's whether rev 5.1 actually fixes the issues, and how long testing revs 5.2 and 5.3 will inevitably take.
(Actually, a device that can measure bit error rates would be great too).
If you're diagnosing signal quality you're going to want to look at the analog signals, which means sampling at a rate significantly faster than the baud rate, and at 8-bit or higher resolution to actually see analog behaviour. Suddenly you're dealing with 400 Gbps of incoming sample data - and you have to do realtime analysis on that to trigger at the right time, and be able to store at least a few tens of thousands of samples for display.
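To put rough numbers on that figure (a back-of-the-envelope sketch; the 5 Gbaud USB3-class link and 10x oversampling are my own assumptions, not from the comment):

    # Back-of-the-envelope for the 400 Gbps figure above.
    baud_rate = 5e9            # symbols/s, a USB 3.0 Gen 1-class link (assumed)
    oversampling = 10          # samples per unit interval (assumed)
    resolution_bits = 8        # ADC resolution

    sample_rate = baud_rate * oversampling        # samples/s
    data_rate = sample_rate * resolution_bits     # bits/s of raw sample data
    print(f"{sample_rate / 1e9:.0f} GS/s -> {data_rate / 1e9:.0f} Gbps")
    # prints: 50 GS/s -> 400 Gbps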
Yes, I mean sampling. I want to see eye diagrams, preferably using a DIY device. It should be possible, perhaps using delay lines (as on the HN frontpage right now).
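Once you have the samples, folding them into an eye is the easy part, and it works the same whether they come from a realtime ADC or a delay-line/equivalent-time sampler. A toy sketch with synthetic NRZ data (all rates are made-up illustrative numbers):

    import numpy as np
    import matplotlib.pyplot as plt

    fs = 50e9             # sample rate, 10x the baud rate below (assumed)
    ui = 1 / 5e9          # unit interval of a 5 Gbaud NRZ signal (assumed)
    spui = int(fs * ui)   # samples per unit interval (10)

    # Fake capture: random NRZ bits held for one UI each, plus noise.
    n = 20_000
    bits = np.random.randint(0, 2, size=n // spui + 1)
    wave = np.repeat(bits, spui)[:n] + 0.05 * np.random.randn(n)

    # The eye diagram is just the waveform chopped into 2-UI slices,
    # all overlaid on the same time axis.
    span = 2 * spui
    traces = wave[: len(wave) // span * span].reshape(-1, span)
    plt.plot(traces.T, color="tab:blue", alpha=0.05)
    plt.xlabel("sample within 2 UI")
    plt.ylabel("level")
    plt.show()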
Also, if a device like this exists, then maybe someone can write an open-source tool to compute the bit-error-rate from digital inputs. Or write some Wireshark extension to do decoding of raw signals.
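The BER computation itself really is simple once you trust the bit capture: the core of such a tool is a comparison against the known transmitted pattern. A sketch with simulated data standing in for a real capture (the 1e-4 flip rate is arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    tx_bits = rng.integers(0, 2, size=1_000_000)  # known test pattern (e.g. a PRBS)
    flips = rng.random(tx_bits.size) < 1e-4       # channel errors, simulated here
    rx_bits = tx_bits ^ flips                     # what the receiver captured

    errors = np.count_nonzero(rx_bits != tx_bits)
    ber = errors / tx_bits.size
    print(f"{errors} errors in {tx_bits.size} bits -> BER = {ber:.2e}")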
Keysight makes nice tools, but they are out of reach of hobbyists and small companies, and cheaper tools should be possible since we all have USB3 devices in our computers already (digital ones, OK).
Unfortunately, the author burned out on it and the project is dead. But the presentation is still worth watching.
The title doesn't match the article title though, so unless the author and OP are the same person it's a bit weird.
ChuckMcM•9mo ago
Teaching KiCad a New Trick - Matching Delays
At the time of writing, KiCad only understands the length of traces and pins. When length matching, it treats length as a single number added up across every layer. This leads to delay mismatches, as signals on the inner layers are slower than signals on the outer layers. When assigning pin lengths, you need to arbitrarily choose a layer to convert a delay value (given by the manufacturer) to a length. This also results in delay mismatches.
I wanted to do this right, just like Altium does, but I didn’t want to have to calculate and add up all the delay values by hand in a spreadsheet. So I made a script to rewrite custom design rules to try to get KiCad’s length matching to be delay matching (including pad delays).
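A minimal sketch of the conversion such a script has to automate (not the author's actual code; the eps_eff values and the 7 ps pad delay are made-up but typical numbers):

    # One delay maps to a different length on every layer, because
    # propagation delay per unit length is t = sqrt(eps_eff) / c and
    # eps_eff differs between microstrip (outer) and stripline (inner).
    C_MM_PER_S = 299_792_458e3  # speed of light in mm/s

    def delay_per_mm(eps_eff: float) -> float:
        return eps_eff ** 0.5 / C_MM_PER_S

    layers = {"outer (microstrip)": 2.9, "inner (stripline)": 4.1}
    pad_delay = 7e-12  # hypothetical 7 ps pad delay from a datasheet

    for name, eps in layers.items():
        mm = pad_delay / delay_per_mm(eps)
        print(f"{name}: {delay_per_mm(eps) * 1e12:.2f} ps/mm, "
              f"{pad_delay * 1e12:.0f} ps = {mm:.2f} mm")

Run it and the same 7 ps comes out to roughly 1.2 mm on an outer layer but only about 1.0 mm on an inner one - exactly the kind of mismatch that arbitrarily picking one layer bakes in.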
Closed source design tools leave you stuck, and often when a need like this surfaces you end up paying a lot of money for an "option pack" that adds the capability. If you have ever wondered whether KiCad is up to doing any kind of design, this should assure you that it works just fine and you can kick that $10,000 Altium license to the curb.
bsder•9mo ago
Sure, if you're routing 8+ layer boards with blind vias and PCIe x16 and DDR5 buses every day, go buy an Allegro or Expedition licence for 6 figures. It's absolutely worth the money.
For Altium, I find that the "showstopper bug that Altium has":"feature that KiCad doesn't have" ratio is almost always strongly in favor of KiCad.
explodingwaffle•9mo ago
The rate of development since V6 is crazy fast IMO. Very much an OSS success story.
crote•9mo ago
We saw something similar with Blender. At a certain point it gets good enough that, for some professionals, it becomes a viable alternative to its obscenely expensive proprietary competition. If those companies are willing to donate $500/seat/year to OSS instead of spending $1,500/seat/year on proprietary licensing, they can get a developer to fix the main issues they run into. This in turn means the OSS variant gets even better, which means even more companies are willing to consider switching, which means even more budget for development. Let this continue for a few years, and the OSS alternative has suddenly become best-in-class.