In order to be able to design equipment, the instrumentation generally needs to outperform the equipment, sometimes by a significant margin. If I'm looking at the eye of a digital signal, I need to capture much faster than the signal.
It'd be fun to have a book of tricks from this era; at some point it will fade into obscurity. The state of the art today relies on a whole different bag of tricks, and those feel more textbook and less clever.
On the other hand, what's nice is that in 2025, decent equipment is cheap. There's a breakpoint around 100 MHz: below it you can't do much basic work, above it you can. That's roughly where FM pickup and a lot of common oscillations sit. Equipment at that level used to cost a lot, but as technology has progressed, a decent home lab can now be had for well under a grand.
I think you'll get a kick out of:
"Analog Circuit Design: Art, Science and Personalities"
https://www.amazon.com/Analog-Circuit-Design-Personalities-E...
> In order to be able to design equipment, the instrumentation generally needs to outperform the equipment, sometimes by a significant margin.
Flashback to my days as a beginning TLP engineer. I was subjecting ESD protection structures to kV pulses with ~nanosecond rise times. The oscilloscope measures the pulse as it enters and reflects. You'd increase the voltage until the device breaks and do a wafer mapping. I remember a conversation where I showed the setup to a colleague from a different department, telling him we were developing next-gen protection against static discharges. To which he replied: why don't we just use what the oscilloscope guys are using?
Though it isn't a book, the Hewlett-Packard Journal is a gold mine for this type of content: https://web.archive.org/web/20120526151653/http://www.hpl.hp...
E.g. An 8-Gigasample-per-Second, 8-Bit Data Acquisition System for a Sampling Digital Oscilloscope (October 1993): https://web.archive.org/web/20120526151653/http://www.hpl.hp...
I mean, I think this would be a very nice project for someone with hardware skills and some time on their hands, and it would be useful too.
I've also looked into CCD memory, but it doesn't seem to be a thing anymore; I couldn't find any modern chips like that.
I think they can be used with analog signals too. After all, they are (I suppose) just a chain of transistors holding charge, like in a CCD; it's just the most efficient implementation, since doing it with logic gates would bring more overhead. EDIT: maybe not; it seems at least some of these chips have digital input/output stages. Maybe it could work if you put a very fast 8-bit ADC in front of the delay line and used 8 delay lines, one for each bit? :)
Anyway, I have totally zero experience with these chips.
But I can imagine you clock the signal in using a fast clock (maybe the internal clock), and then clock it out using a slow clock (slow enough for a subsequent ADC chip).
Also, perhaps you can put a bunch of these combinations in parallel to increase bandwidth, or to increase sample depth.
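If it helps, here's a minimal Python sketch of that fast-in / slow-out idea as a pure simulation; the rates, depth, and 8-bit readout are made-up numbers, not taken from any real delay-line chip:

```python
import numpy as np

# Hypothetical numbers, purely to illustrate the idea (not from any real chip).
FAST_RATE = 1e9   # 1 GSa/s: rate at which charge packets are clocked in
SLOW_RATE = 1e6   # 1 MSa/s: rate at which a modest ADC can digitize them
DEPTH = 512       # number of buckets in the analog delay line

def capture(signal_fn):
    """Clock DEPTH analog samples into the delay line at the fast rate."""
    t = np.arange(DEPTH) / FAST_RATE
    return signal_fn(t)                     # each element = one stored charge packet

def read_out(buckets, bits=8):
    """Clock the packets out at the slow rate and digitize them with a slow ADC."""
    # The readout takes DEPTH / SLOW_RATE seconds, but that's fine: the waveform
    # was already frozen in the buckets at capture time.
    lo, span = buckets.min(), np.ptp(buckets)
    return np.round((buckets - lo) / span * (2**bits - 1)).astype(int)

burst = capture(lambda t: np.sin(2 * np.pi * 50e6 * t))   # 50 MHz test tone
print(read_out(burst)[:16])
```

Running several of these capture/readout pairs in parallel, as you suggest, would then just be a matter of interleaving or concatenating their outputs on the digital side.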
Not quite the same, but similarly novel.
If you're designing a board, not being able to look at its signals is a major limitation. Is something wrong with the transmitter, receiver, cable, connector, PCB, firmware, driver? Who knows! It doesn't work, and that's all you're going to get. Have fun randomly tweaking stuff in the hope that it magically starts working.
The HP logic analyzers back then had a really neat touchscreen interface based on criss-crossing infrared beams in front of the CRT face. The only thing that I've ever used that felt even better than a capacitive touchscreen, though obviously lower resolution.
As for the HP touch screen, I tore down a bunch of HP 16500A logic analyzers and reverse engineered the touch screen PCB. It uses a pretty simple LED/photo sensor matrix. You can see the PCB in one of the pictures here: https://tomverbeure.github.io/2022/06/17/HP16500a-teardown.h....
I remember pulling a 486 out of its socket in the 1990s and putting it back with the wrong orientation. There was a pop and a bit of smoke. Something on the mainboard had burnt and it wasn't working anymore.
I used smell to locate the fault: a big trace on the PCB, which I soldered back, and, like magic, it all worked again...
Uh oh. We needed that board. What to do? Well, it can't hurt to try. We had "freeze spray" for debugging purposes, so we got a bottle of white-out handy (what's that?), frosted the board up really well on the component side, powered it up, and quickly marked the devices that defrosted notably faster than the rest.
Got the solder station lady to replace all those parts and it worked again.
Old days...
I have an entry-level standalone oscilloscope that I got but never used. I once looked for tutorials and unpacked it, ready to test, but:
It's covered in that kind of plastic that goes all gooey if left unattended for a long time.
Any hints on how I can clean it up so I can touch it again?
The biggest use case for this is sensor interfaces where the signal is still analog (not passed through an ADC yet). Voice recognition is a typical example where analog neural networks are used with a certain level of success, and now people are pushing for image recognition, but the architecture of a digital camera isn't compatible with that, so I don't see much happening there.
Funny fact: these kinds of circuits have been used heavily in the analog portions of chips since the early 90s to implement rather complex calibration/training loops (correlation, LMS optimization, pattern recognition, etc.). There's a lot of analog computing happening in every SoC.
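For anyone who hasn't seen it, here's a tiny digital sketch in Python of the LMS loop mentioned above; in a real SoC this would be done with analog multipliers and integrators, and the signal, weights, and step size here are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown "channel" the calibration loop has to learn (e.g. a gain/offset error).
true_w = np.array([0.8, -0.3])

x = rng.standard_normal((10_000, 2))                   # reference inputs
d = x @ true_w + 0.01 * rng.standard_normal(10_000)    # observed (noisy) output

w = np.zeros(2)   # adaptive weights
mu = 0.01         # LMS step size

for xi, di in zip(x, d):
    e = di - xi @ w    # error between observed and predicted output
    w += mu * e * xi   # LMS update: nudge the weights along the error gradient

print(w)   # converges toward true_w
```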
Thinking about it, I might still have the device somewhere in the attic.
I recently got a TDS684A and also made the same discovery, and wrote half a blog post about it which remains unfinished/unpublished. I don't have much of an EE background (at least, not on this level) so my article was certainly worse. It's also my only decent scope, so I don't have a good way to take measurements of the scope itself.
Relatedly, I dumped the firmware of mine (referencing Tom's notes on a similar model) and started writing an emulator for it: https://github.com/DavidBuchanan314/TDS684A
It boots as far as the VxWorks system prompt, and can start booting the Smalltalk "userspace" but crashes at some point during hardware initialization (since my emulation is very incomplete!) - some screenshots/logs: https://bsky.app/profile/retr0.id/post/3ljzzkwiy622d
Edit: heh, I just realised I'm cited in the article, regarding the ADG286D
Because that's more than enough for scanning a screen-width's-worth of samples from the analog CCD snapshot.
In a digital camera, the CCD columns basically capture the image instantaneously. It has an infinite sample rate!
Then the data is shifted out of the CCD at some rate that basically doesn't matter, as long as it isn't so slow that it takes seconds.
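Back-of-the-envelope, with made-up numbers (not the TDS684A's actual record length or readout ADC rate):

```python
record_length = 500      # hypothetical: one screen width's worth of samples
readout_rate = 1e6       # hypothetical: 1 MSa/s ADC behind the CCD

readout_time = record_length / readout_rate
print(f"{readout_time * 1e3:.1f} ms")   # 0.5 ms: far too quick to notice
```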
formerly_proven•4h ago
Is this maybe using some form of correlated double sampling?
> It looks like the signal doesn’t scan out of the CCD memory in the order it was received, hence the signal discontinuity in the middle.
Or maybe the samples are also interleaved in the low-order bits in some way. This could be because the organization of the CCD isn't symmetric for the input and output paths, perhaps to reduce area or power, since only one path has to be fast. That would make sense: if you implement the CCD memory as n parallel bucket brigades, you only have to put a fast S&H and multiplexer in front of them; the brigades themselves can then be driven at a fraction of the actual sample rate, and the capacitive load on each of those clock phases is much lower as well.
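Here's a hedged Python sketch of what such an interleaved scan-out might look like and how you'd undo it digitally; the brigade count and round-robin ordering are guesses for illustration only, since the article only observes that the scan-out order differs from the capture order:

```python
import numpy as np

N_BRIGADES = 4   # hypothetical number of parallel CCD bucket brigades
DEPTH = 16       # buckets per brigade

samples = np.arange(N_BRIGADES * DEPTH)   # pretend capture order 0, 1, 2, ...

# Capture: the fast S&H + mux deals samples round-robin into the brigades,
# so brigade b ends up holding samples b, b+N, b+2N, ...
brigades = samples.reshape(DEPTH, N_BRIGADES).T

# Readout: each brigade is shifted out in full before the next one,
# which is what produces an out-of-order (interleaved) record.
scanned_out = brigades.reshape(-1)

# De-interleave on the digital side to recover the original time order.
restored = scanned_out.reshape(N_BRIGADES, DEPTH).T.reshape(-1)

assert np.array_equal(restored, samples)
```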
dp-hackernews•4h ago
Oversampling Versus Upsampling: Differences Explained https://www.soundstagenetwork.com/gettingtechnical/gettingte...
tverbeure•3h ago
Some people have also suggested deliberate addition of a pseudo-random signal that gets removed after sampling to counteract some CCD issues. But I don't know how that would work either.
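I don't know either, but for what it's worth, this is the textbook subtractive-dither idea sketched in Python (not necessarily what anyone proposed for this scope):

```python
import numpy as np

rng = np.random.default_rng(1)
STEP = 0.1   # quantizer step size

def quantize(x):
    """Stand-in for the CCD + ADC path and whatever coarseness it adds."""
    return np.round(x / STEP) * STEP

signal = 0.05 * np.sin(np.linspace(0, 2 * np.pi, 1000))   # small test signal

# Plain quantization: the error is strongly correlated with the signal itself.
plain_err = quantize(signal) - signal

# Subtractive dither: add a known pseudo-random sequence before the imperfect
# sampling path, then subtract the exact same sequence afterwards.
dither = rng.uniform(-STEP / 2, STEP / 2, size=signal.shape)
dithered_err = (quantize(signal + dither) - dither) - signal

print(np.corrcoef(signal, plain_err)[0, 1])      # magnitude near 1: error tracks the signal
print(np.corrcoef(signal, dithered_err)[0, 1])   # near zero: error decorrelated from the signal
```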
crote•1h ago
I wouldn't be surprised if the CCD has all sorts of funky analog stuff going on internally which has different impacts on different samples, which would be incredibly hard to deal with on its own.
However, if this behaviour is merely a fixed offset, it would be fairly easy to compensate for this on the digital side: just do a calibration with a known signal, and the measured offset can be used to reverse its effect in future sampling.
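A minimal sketch of that per-bucket calibration in Python; the offsets, noise level, and test signal are synthetic, it just shows the subtract-a-measured-offset idea:

```python
import numpy as np

rng = np.random.default_rng(2)
DEPTH = 256

# Hypothetical fixed per-bucket offsets baked into the CCD readout path.
bucket_offsets = 0.02 * rng.standard_normal(DEPTH)

def acquire(signal):
    """Model one acquisition: each sample picks up its bucket's fixed offset plus noise."""
    return signal + bucket_offsets + 0.001 * rng.standard_normal(DEPTH)

# Calibration: acquire a known input (here simply 0 V) several times and
# average to estimate the per-bucket offset.
estimated_offsets = np.mean([acquire(np.zeros(DEPTH)) for _ in range(16)], axis=0)

# Normal use: subtract the estimated offsets from every later acquisition.
raw = acquire(np.sin(np.linspace(0, 4 * np.pi, DEPTH)))
corrected = raw - estimated_offsets
```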
tverbeure•50m ago
Another possibility is that there's some charge decay which you could calibrate for.
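A similar hedged sketch for decay: assuming, purely for illustration, that every extra shift multiplies the stored charge by a fixed factor, you could calibrate with a flat input and divide the measured gain back out:

```python
import numpy as np

DEPTH = 256
decay_per_shift = 0.999   # hypothetical charge loss per CCD shift

# A bucket that sits through k shifts before readout retains decay_per_shift**k
# of its charge, so later-read samples come out more attenuated.
retention = decay_per_shift ** np.arange(DEPTH)

def acquire(signal):
    return signal * retention

# Calibrate with a known flat (DC) input, then divide the measured gain out of real captures.
flat = acquire(np.ones(DEPTH))
gain_correction = 1.0 / flat

raw = acquire(np.sin(np.linspace(0, 4 * np.pi, DEPTH)))
corrected = raw * gain_correction
```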