It had an amazing selection of ports, all unshielded and designed for flat ribbon cables. But that wouldn't fly in the USA.
Another reason to use dark mode I guess
If the FCC hadn't been so strict, I think there's a good chance we'd be using computers with a lineage going back to Atari rather than IBM today.
Commodore ate Atari's lunch with the C64 and pricing, but Atari could have launched the 400/800 at lower price points with more lax emission standards. They would have had lower peripheral price points, too, since the SIO bus and "smart peripheral" design was also an emissions strategy.
On the home computer front, the Atari 8-bits beat the pants off the PET for graphics and sound capabilities. More success in the late 70s might have enabled Atari to build down to a price point that would have prevented the C64 from even happening.
On the business side Atari R&D had interesting stuff going on (a Unix workstation, Transputer machines). Alan Kay even worked there! They were thinking about business computing. If the 8-bits had had more success I think more interesting future products could have been brought to market on the revenue they generated.
And/or many of the other manufacturers of that era. I have encountered execs from those days who still believe the whole thing was some sort of shrouded protectionism.
And you make a good point about the SIO bus - this was when every other machine had unshielded ribbon cables everywhere. Their devotion to daisy-chained serial really crippled them in terms of speed, and when USB finally arrived, I initially scorned it due to the prejudice formed by my experience with the Atari peripherals! It turns out they were on the right track all along!
On the other hand, I have been struggling to get my IP KVM at home working and it turned out that the cause of its failure was some cheap PoE extractors that spew noise across the spectrum, especially interfering with VGA sync.
Modern equipment, assuming you aren't buying bargain-basement AliExpress junk (which I do, from time to time), is surprisingly good at RF rejection.
And, amusingly, this just popped up on Twitter: https://x.com/hakluke/status/1980479234159398989
I kept reading "must accept" as a technical requirement, something like "must not be shielded against" or "must not use technical means to protect against", rather than what I now think is the intended legal sense: "does not have any legal recourse against".
It's weird that they phrased it in terms of how the device itself must "accept" the interference, rather than the owner accepting it.
"Tempest-LoRa: Cross-Technology Covert Communication via HDMI RF Emissions", https://news.ycombinator.com/item?id=44483262
When I would fire up my KIM-1, the TV would turn to snow.
There was a programmable ATV toy called the "Big Trak". If you ran it underneath the desk with a TRS-80 on it, the computer would crash.
The TRS-80 Model 1 was notorious for this, as you connected the computer to the expansion interface with a bare, 40(ish)-pin ribbon connector. It was a beautiful broadcast antenna for computer signals.
The FCC was an impetus for the Model 3.
I guess there are two ways to look at it. Either the regulation was wildly successful, so the problems persist only in the less-regulated spaces. Or we spend a lot of effort chasing the wrong problem.
https://www.nytimes.com/2019/05/04/us/key-fobs-north-olmsted...
Apparently the regulations work well enough to provoke an official response when garage door openers stop working over the area of a few houses… a level of reliability I’d long taken for granted
However, once designers were aware of the potential problems, it wasn't too hard or even very expensive to build hardware that avoided the most serious ones. Properly grounding components and a little light shielding here and there would generally suffice to ensure most devices wouldn't cause noticeable issues more than two walls and 30 feet away. I think by the 90s the vast majority of hardware designers knew how to mitigate these issues, while the evolution of consumer device speeds and designs reduced the risk of actual interference on both the 'interferor' and 'interferee' sides.
Unfortunately, the FCC's regulatory testing requirements didn't evolve along with this. Hardware designers I worked with described an opaque process of submitting a product for FCC testing only to receive a "Pass/Fail" with no transparency into how it was tested. Sometimes the exact same physical product could be resubmitted a month later with zero changes and pass. This made things unpredictable and slow, which could be a lethal combination for small hardware startups. So there emerged a sub-industry of "independent RF testing labs" you could pay to use their pricey gear and claimed expertise: they would test your device, tell you why it failed, let you make a change right there, and retest until you passed. This made things more predictable, but it could cost upwards of $10K (in 90s dollars), which was a serious hardship for garage startups. I was told a lot of the minor hardware changes made during such interactive testing probably did nothing to decrease actual interference in the real world and only served to pass the test.
Then came the era of "weaponizing" FCC certification. Small startups could avoid the cost and delay of FCC testing by filing their product as a "Class A" device (meant only for use in industrial/scientific environments) instead of as a "Class B" (consumer) device. The devices were still required not to interfere, but their makers could self-certify based on internal tests without going through FCC testing. When a new hardware startup threatened a large, established company's product with a cheaper, better product shipped as "Class A", BigCo would report it as either interfering or simply being used in consumer environments - despite the device very likely not interfering with anything. This created a lot of problems for such startups: if their cool new product ended up even once in an arguably "retail distribution channel", they could get hit with big fines - all without ever causing any actual interference, and even if the device could have passed FCC testing and been certified as Class B.

It got especially ridiculous when a lot of cheaper products were simply generic designs, like a modem using the standard Rockwell chip set and reference design. These were often made on the same production line as other products that had all passed FCC testing, and even used the same circuit board in a different case. But if you didn't have your official "FCC Cert", you could get busted.
I left the hardware space in the early 2000s so I never heard if these regs were ever modernized, but it sure seemed like they were in need of it.
superkuh•4h ago
And really, it's not the consumer's place to be aware of these things. It's the regulators'. And they've dropped the ball.
transpute•3h ago
Would be nice to have more metal cases for SBCs, like the one on R4S, https://www.androidpimp.com/embedded/nanopi-r4s-review
KKSB makes metal cases for some SBCs, https://kksb-cases.com