If the all-in-one design had been used from the off (saving on moulds and on shipping disc drives), they could have done a pro conversion kit:
https://youtu.be/atw3FYKzog4 Also, move the joystick ports to the right rather than under the keyboard.
A few tweaks here and there would have pushed it a lot more:
Unified clocks for genlock and scrolling: https://youtu.be/yexNdSLEpIY?si=pa46sJOr_9Fin4LC
Stereo output like the CPC464: https://m.youtube.com/watch?v=0yY4BlPfLf4
AMY included, along with a 1bit PWM on the DMA chip for DMA sound.
The ST had a janky RTC anyway: https://atariage.com/forums/topic/303859-battery-pack-inside...
3 button joypads from the start, using U+D and L+R combinations for 2 more buttons
Double-sided drive from the start.
Finally, they should have included the blitter socket and a full 2x32pin expansion instead of the cartridge port. The blitter socket especially would be handy to drive a T212 transputer in 87, when the blitter was available, instead of producing the ATW.
I saw Atari ST in music studios well into the late 90s/early 2000s because back then quiet beige PCs weren't a thing yet: PCs virtually all came with super noisy fans, which was a big no-no for music studios.
A buddy would bring his Korg synth to my neighbour's house and hook it to their Atari ST. Another dude I remember would play the drum parts of Dire Straits songs from his Atari ST hooked to some MIDI gear, and then he'd take his guitar and play Dire Straits songs.
These were the days.
I'm not surprised some musicians still use them. If I'm not mistaken, Kavinsky (who became famous after the movie Drive came out, but recently saw renewed interest because he performed at the Olympic Games' ceremony) started making music at a late age, on an Atari ST a friend of his gave him.
As an anecdote, PCs were so noisy that I asked my neighbour (an electrical engineer) if it was possible to come up with a system where the fan would slow down when the CPU wasn't too hot: and sure enough, we were soon modding our PSUs with thermistors and calibrating our tiny hack, no shit, with boiling water in the kitchen (ah, clueless teenagers). Funnily enough, about 10 years later every single PSU had variable fan speed.
That's the thing: we were used to quiet 8-bit and then 16-bit computers, and when we had to move to these piece-of-shit PCs (admittedly with fast CPUs/FPUs, and upgradeable), we had to endure these painful, ultra noisy CPU/PSU fans (and HDDs).
So the Atari ST just made sense. You could have these super fast (compared to the Atari ST) PCs, but they were noisy, fugly, unbearable pieces of shit that the cool guys in music studios simply wouldn't tolerate back then.
Now of course at some point PCs became just too good and several brands started focusing on quietness and it was then possible to have a totally silent PC, both looking cool and being literally cool (big heatsink, quiet fans, etc.).
But yeah the Atari ST was and still is certainly for some a thing for creating music.
Lots of respect to the Atari ST for its MIDI ports (and that comes from a Commodore Amiga owner and fan).
DOS was crap when you had GEM and Amiga OS.
Windows 1 and 2 were beyond terrible.
They were shit for games.
They were bulky.
They were slow.
They crashed all the time.
They were ugly.
They were noisy.
They were hard to manage (autoexec.bat, no OS in ROM, stupidly tricky partitioning tools, incompatible drivers for the same hardware but in different applications, etc)
But IBM lost control of the hardware market so they became cheap, ubiquitous crap.
And that’s literally the only reason we got stuck with PCs.
We had affordable windowing and GUIs on the Atari and the Amiga, with instant boot from ROM and tons of RAM. The Amiga had the beginnings of multitasking and hardware acceleration for graphics and sound.
Then suddenly the industry decided to go back to a cut-down version of late 70s S-100 computing, with insanely unaffordable prices, crippled specs, bizarre semi-manual memory management, ugly hardware, and a command line interface that was basically CP/M but not as good.
Infuriating.
Just like today's GUIs. They all look like Windows 1.0
It was basically just the old 8-bit micros that kept IBM compatibles looking good.
I hate to go to bat for MS-DOS, but it had at least one real advantage over CP/M: a single disk format. As doomed to failure as the various non-PC DOS machines (e.g. Tandy 2000 and DEC Rainbow) were, they could at least share disks.
The Olivetti had a B&W 640x400 monitor and a Logitech mouse that plugged into the back of the keyboard. You could replace the 8086 CPU with an NEC V30 for a bit more speed.
Please don't take it too harshly, but your list of grievances is almost radically different from my experiences of personal computing in the late Eighties to mid-Nineties... to me somewhat of a Faszinosum (a source of fascination) all its own. In my little corner of the world the Amiga was virtually nonexistent [1], largely undesirable, and prophesied to be a corpse as early as 1991.
I'll give you one thing, tho: A mostly non-plastic, compact keyboard computer case (Amiga 600-like) for a suitably powerful IBM compatible would've been awfully nice. Still would be, for a vintage bird that is. We only got "Schneiderplastik", the Euro-PC to be more precise [2], and that one wasn't updated in a satisfying fashion.
1. The only people I knew that had Commodores were two of my best friends, one with a Commodore 64, the other with a 128. The demosceners I grew up with were Atari ST guys, all of them (becoming) musicians.
People got accustomed to whatever personal computer they used every day, and many grew fond of it. After all, the power and capability of a desktop computer in the 80s was unprecedented and, for many, revelatory. That said, in the mid-to-late 80s the PC platform was generally under-powered dollar for dollar compared to most of its leading competitors, most of which were based on Motorola 680x0 CPUs. The strength of the PC during this time was its rapidly expanding library of business software applications and hardware expansion options in the form of add-in cards (something which Apple had with the Apple II but abandoned for a while with the Mac, which the Atari ST never really had, and which only the "professional" series Amigas had: the A2000, A3000 and A4000).
Being underpowered per dollar doesn't mean the PC couldn't be extremely useful or the best platform for a given scenario and it certainly doesn't mean there weren't hobbyists who used and loved their late 80s PCs as dearly as any other 80s personal computer owner. Of course, this power balance was largely addressed by the mid-90s - which is why the PC juggernaut then extinguished the Amiga, Atari and (very nearly) the Mac.
Anyway, off to some specifics:
> "The strength of the PC during this time was it's rapidly expanding library of business software applications and hardware expansion options in the form of add-in cards [...]".
A standardized general-purpose computing platform "for the future". Exactly what spoke to me, as disseminated in the publications I read as a kid in 1991.
> "Of course, this power balance was largely addressed by the mid-90s - which is why the PC juggernaut then extinguished the Amiga, Atari and (very nearly) the Mac."
"Power balance"? I didn't think in such abstracts when I made my choice, and conducted the long and merciless attrition-lobbying campaign for financial support, to buy a PC. The Amigas and the Ataris were simply not a factor for a variety of different, but very tangible and practical reasons:
Atari ST? I was not (on my way to become) a musician with the need for a precise and affordable backpack-portable computer instrument.
Amigas? The big birds were seen, outside of some specialist-niches, as uneconomical compared to their IBM-compatible brethren.
The vanilla home computers were seen as affordable but extremely limited, racking up (hidden) secondary costs to make them more usable. Often enough they carried a certain cultural stigma as well, being perceived by our financiers as gaming toys and therefore time wasters. And most importantly? No one I personally knew had an Amiga. Who to swap software with, where to find a mentor? Yeah...
The Atari guys I befriended used their machines almost exclusively for dabbling in electronic music, later as part of the emerging East German EBM and hard techno scene.
Games? The titles I was interested in either didn't exist on M68k platforms (flight simulations à la Aces of the Pacific, wargames such as the Harpoon series, or adventures like The Lost Files of Sherlock Holmes), were practically unplayable (e. g. Red Baron), considered inferior (e. g. Wing Commander)... or just came out too late.
By listening to stories of some Britons of my age, it only recently came to my attention how privileged I have actually been. Some of these people told me stories of buying their first A600s or A1200s only in 1993! At that time it was hard to separate me from my trusty, second-hand PC... a machine with a CPU-type nearing its eighth (!) birthday (386DX-25).
But in the late 1980s, oh my. An Amiga 500 in 1987 was really a lot better than a PC of the time for many things. It was also a lot cheaper. Maybe half the price. The Amiga and the Atari ST didn't improve enough by 1991. By then a PC was better.
But by 1988 the PC was so far outselling everything else that the writing was on the wall.
This article has a graph of market share by year.
https://arstechnica.com/features/2005/12/total-share/
People who had Amigas and Atari STs couldn't quite understand how their machines, which they perceived as so much better, were being outsold by PCs running MS-DOS. On an Amiga 500 in 1987 you had a decent GUI. Until Windows 3, PCs didn't.
For example, Pro-Write on the Amiga had real-time spell checking and was WYSIWYG in the late 1980s. It wasn't until Word 6 in 1993 that Word was really much better.
If you’ve only used a PC in the 90s then it’s easy to see the Atari and Amiga crowd as rose-tinted fanboys. But they’re comparing 90s IBM PCs with 80s competitors.
Really, that says more about how IBM PCs were 10 years behind the competition than about how great IBM-compatibles were.
Yes, the "bang for the buck" made all the difference. For a while.
PCs in the 80s were so bad that most homes still ran 8-bit micros with a BASIC ROM.
Windows 3.0 wasn’t even released until 1990.
And let’s be honest, the problems with autoexec.bat, config.sys and different incompatible DOS drivers for games existed throughout. It was only really the uptake of DirectX that fixed that (and to an extent, Rage / OpenGL, but you could technically get DOS drivers for them too).
But that was a whole generation of computers away from the 3.x era of PCs, and another generation again from the 80s.
But by the mid-90s, we had a whole new set of problems with PCs: the OS silently corrupting itself. It was so bad that best-practice advice was to reformat and reinstall Windows every 6 months (I did it much less regularly than that, though). And this was a common idiom throughout the entire life of the 9x era of Windows too. But to be fair, that was also a common idiom with pre-OSX Macs. Apple had become painfully shit at that point too.
If the ST and Amiga had still been evolving like PCs were, then by the late 90s I’m sure Amigas might have suffered from the same longevity problems too. But in the 80s, they (and STs, Macintoshes, and Acorn Archimedes) stood head and shoulders above what PCs could do at that point in almost every regard.
I'm East German; you people got a head start. My relevant hands-on experience is centered around a 386DX system, technology introduced in the mid-Eighties. 1987 brought VGA, AdLib, and MT-32s to the table, with games support gearing up in '89, the year the Sound Blaster was released. Fall 1990 saw the release of Wing Commander. Of course, that's just technology; economic realities tell a different story.
> Windows 3.0 wasn’t even released until 1990.
Windows was as relevant to me as an Amiga. GUIs didn't do much for me until much later. Still prefer CLIs, TUIs (and minimal GUIs that come as close to the latter as possible).
> And let’s be honest, the problems with autoexec.bat, config.sys and different incompatible DOS drivers for games existed throughout.
I never experienced serious troubles in DOS. The first two, and only, games I could not get to work were two infamously bug-ridden Windows titles of the late 90s: Falcon 4.0 and Privateer: The Darkening. By the time they fixed 'em with a litany of patches I was busy with other things.
> But by the mid-90s, we had a whole new set of problems with PCs: the OS silently corrupting itself.
News to me. How bizarre!
> But in the 80s, they (and STs, Macintoshes, and Acorn Archimedes) were stood head and shoulders above what PCs could do at that point in almost every regard.
Hardware? Until '87. Games? Until late '90 I'd say, at the earliest, accounting for a strong genre bias. [1] Then, outside of niches (Video Toaster, cheap DTP, music production) and certain "creature comforts", it was over; the ecosystem began to atrophy.
1. The first two DOS-platformers that wowed me visually were Prince of Persia 2 ('93) and Aladdin ('93/'94); all my other genre preferences were, to put it diplomatically, underserved on 16-bit home computers.
That doesn’t mean PCs were somehow more capable in the 80s though ;)
> Windows was as relevant to me as an Amiga.
Same point as above.
> I never experienced serious troubles in DOS.
I think that’s probably rose-tinted glasses on your part then. The pain was real. Games seldom “just worked” and you often had to be really knowledgeable about your system to get stuff working.
To this day, I’ve never heard the music in Transport Tycoon because that game refused to work with whatever midi drivers I threw at it.
> > But by the mid-90s, we had a whole new set of problems with PCs: the OS silently corrupting itself.
> News to me. How bizarre!
I’d be amazed if you’ve never once heard about the old problem of computers getting slower or buggier over time and a reinstall fixing things.
> Hardware? Until '87.
You’re comparing theoretical top of the line PC hardware (which nobody actually owned and no one had written software for yet) with commodity Amigas and STs.
And even those top of the line PCs still missed a few tricks that made some genres of games better on Amigas and Atari STs, like fast blitters.
It wasn’t until the era of ray-casting 2.5D first-person shooters that PCs started looking better than their counterparts. And then things really accelerated (no pun intended) with 3D hardware graphics acceleration. Which, to be fair, was available for Amigas too, but the only software that targeted it was for 3D rendering farms.
It clarifies specifics relating to my personal experiences with the discussion matter, addressing (perceived) realities of a local market. How people use computers is of the utmost relevance; a fact which you, given your lamentations here, certainly must have internalized.
> I think that’s probably rose tinted glasses on your part then. The pain was real. Games seldom “just worked” and you often had to be really knowledgeable in your system to get stuff working.
No rose-tinted glasses here. And I believe you that your and others' pain was real. Many people could not wrap their heads around a PC; many of 'em fell for cheap SX clunkers with other substandard components, ffs. That's obviously an inherent problem of such an open platform: PCs are highly individual in many subtle ways; a trade-off one had, and still has, to negotiate in one fashion or another.
> You’re comparing theoretical top of the line PC hardware (which nobody actually owned and no one had written software for yet) with commodity Amigas and STs.
I'm comparing hardware available on the market (with key system components coming together in 1987/88, and games supporting such top-of-the-line hardware showing up in numbers from '88 onwards). I also spoke to economic realities in nearly every post in this discussion; I am well aware that 16-bit home birds had a technical lead for a short while, and were an even better value proposition for many people a while longer. For some, just as validly, this still holds true.
> And even those top of the line PCs still missed a few tricks that made some genres of games better on Amigas and Atari STs, like fast blitters.
Yes, already addressed by referring to Prince of Persia 2 and Aladdin (1993/94!).
> It wasn’t until the era of ray casting 2.5D 1st person shooter that PCs started looking better than their counterparts.
So, your stylistic (genre) preference maps it into the time between 1991 (with Hovertank 3D in April as well as Catacomb 3-D in November) and Wolfenstein 3D (May 1992). Okay.
With mine it begins earlier, largely because of proper 3D-titles: Deathtrack (1989, PC-exclusive), LHX: Attack Chopper (1990, no Amiga/Atari port), and Red Baron (1990, got the Amiga slideshow in 1992), as well as the odd non-3D action title here and there, e. g. Silpheed (1989, no Amiga/Atari port).
One can probably go even back to 1988, for at least parity in certain markets and their segments, if one compares the technological edge in an intellectually honest fashion, i. e. what the platform, hardware and software, was really technically capable of.
And productivity software, part of the deal, is of course its very own world.
I remember my PC1 fondly. Well, I still have it. I learned to code in GW-BASIC, Turbo Pascal and C (in that order) with it. I was using it for a long time, until 1997, for serious work (coding and university assignments), when I finally had the money to upgrade to a Pentium PC.
As much as my world was PC-centric, the first time I saw an Atari ST and what it could do, my jaw dropped. I knew of the Amiga from magazines, but the first time I actually saw one was several years later, after I acquired my Pentium PC and I admit it wasn't that impressive then. But still I couldn't help but think: "wow, you had that in 1989?". There was no comparison with the PCs of its time.
> But still I couldn't help but think: "wow, you had that in 1989?". There was no comparison with the PCs of its time.
This speaks more to local market realities, e. g. "demo software" running on hardware in an actual computer shop or, as in your example, at a friend's home, especially in the form of 2D arcade action games, then at the peak of their popularity on 8- and 16-bit home computers... and yet to shine on PCs (as opposed to the glossies and the like, where the first 486 machines stepped onto the stage around the turn of 1989/90).
But at that time I wasn't thinking about computers that much, still digesting the beginning of the end of the GDR.
Every time I got to play on non-PC home computers I’d be blown away by how much better those machines were.
These days I collect retro hardware and the Atari STs and Amigas are still easier to maintain.
So my opinions aren’t that of a Amiga fanboy. PCs in the 80s were really that shit.
I do think a lot of the problem was Microsoft though. I never liked any of Microsoft’s software even back then. And that was long before they gained the reputation that so many older timers like myself still remind people about. I actually wrote my own windowing system in Pascal because I was fed up with early Windows. It wasn’t a patch on GEM but back then I didn’t know you could run GEM on a PC.
They'd have to have been a bit more careful about it than IBM were.
I am confident it would still feel like everything is terrible.
Apple did survive in that era, though not unassisted, and the differentiation they landed on(particularly after Jobs came back) was to market a premium experience as an entry point. I think that is probably going to be the exit from today's slop.
In this era, spec is not a barrier - and you can make <$100 integrated boards that are competent small computers, albeit light on I/O - and that means that there's a lot more leeway to return to the kinds of specialty, task-specific boxes that the PC had converged. There's demand for them, at least at a hobbyist level.
For example, instead of an ST and outboard synths for music, you could now get an open-source device like the Shorepine Tulip - an ESP32 touchscreen board set up with Micropython and some polished DSP code for synths and effects. It's not powerful enough to compete with a DAW for recording, but as an instrument for live use, it smashes the PC and its nefarious complexities.
I never really understood why people thought this was a big deal. I had my Amiga hooked to a DX7 synth with a serial to MIDI cable that had a couple active parts in it. MIDI is a serial protocol and the Amigas had full RS232 ports with hardware interrupts, +12v, -12v, as well as audio in and out on unused pins. The serial to MIDI In/Out cable cost around $15 more than two MIDI cables. You can still buy them today: https://retroready.one/products/ka12-serial-port-midi-interf....
While the Amiga could do multi-tasking, applications could also optionally take over the entire machine, which many games obviously did but so did real-time video applications like the Video Toaster, animation, audio and music apps. Lots of Amiga games and video apps "raced the beam" in real-time without ever dropping a field of 60 fps video. So, at least in principle, the Amiga hardware was as capable as the Atari ST in terms of ability to respond to interrupts at MIDI rates. The Amiga also had video editing software, which back then, involved controlling three professional VTRs in real-time over that single serial port to do A/B roll editing on precise timecodes 29.97 times a second.
So, yeah, I agree that the Atari totally won the MIDI production music market because it had much better software applications. If you were primarily a music producer or studio, there was certainly no reason to pay more for an Amiga's custom graphics capabilities - and if you were serious, the Amiga's more limited music app selection made the choice for you. My only quibble is that, IMHO, the claims of 'MIDI jitter' somehow being endemic to the Amiga were more Atari marketing FUD than reality. I don't doubt that some user at some point did something on an Amiga that caused MIDI timing issues, but it wasn't because of some fundamental hardware limit. It was due to configuration, some app incompatibility, bug or some other solvable issue - because similar timing issues could occasionally happen on the ST too - and then would be solved.
There were ways to do non-interlaced video on the Amiga, but just like having to buy external MIDI adapter ... more $$, more hassle.
That and floppy format compatibility with MS-DOS made it easier to move data around.
I think they’d be better on the back, unless you're supposed to unplug them all the time.
> Finally, they should have included the blitter socket
That would be hard without having a functioning one first. The blitter would be also handy for a number of things, from PCM sound to network and disk transfers.
I agree it would be difficult to design a correct socket, but from interviews it was always the plan to have a blitter, and a socket as standard would have helped adoption.
The main thing is that the T212 is a great coprocessor, faster than the 68881 FPU and with a 2k cache. Introducing the transputer as a coprocessor would potentially have changed the computing landscape.
Obviously it'd be better to have a protocol like the Megadrive's, but given the setup in the ST, this is a hack that works without having to change the ST hardware.
If the up+down button was used for firing the main weapon, this wouldn't work, especially when you've got the machine gun and you can just hold the button down. You wouldn't be able to duck or aim up while shooting, or it would be unreliable.
If you were trying to use it for a three button fighting game, I think you'd have problems too. Especially if you are doing 'negative edge' techniques where you would hold a button while inputting the directional sequence and release the button to trigger the special.
The 9 pin pinout has 2 spare button inputs anyway. Maybe it'd be feasible to use those.
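For what it's worth, here's a minimal sketch of how such a pad could be decoded in software. The bit layout and register handling are made up for illustration (not the actual ST hardware map); only the U+D/L+R idea itself comes from the thread. It also shows why the objection above holds: while the pad is signalling an extra button, it cannot simultaneously report up/down (or left/right).

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical active-low bit layout for one joystick port read
   (illustrative only, not the real ST register map):
   bit0=up, bit1=down, bit2=left, bit3=right, bit4=fire. */
#define JOY_UP    0x01u
#define JOY_DOWN  0x02u
#define JOY_LEFT  0x04u
#define JOY_RIGHT 0x08u
#define JOY_FIRE  0x10u

typedef struct {
    int up, down, left, right;
    int fire, button2, button3;   /* extra buttons signalled via U+D / L+R */
} pad_state;

/* A real stick can never close up+down or left+right at once, so a
   3-button pad could ground both lines to mean "extra button pressed". */
static pad_state decode_pad(uint8_t raw)
{
    pad_state s = {0};
    int u = !(raw & JOY_UP),   d = !(raw & JOY_DOWN);
    int l = !(raw & JOY_LEFT), r = !(raw & JOY_RIGHT);

    if (u && d) s.button2 = 1;              /* U+D -> button 2, direction lost */
    else        { s.up = u; s.down = d; }

    if (l && r) s.button3 = 1;              /* L+R -> button 3, direction lost */
    else        { s.left = l; s.right = r; }

    s.fire = !(raw & JOY_FIRE);
    return s;
}

int main(void)
{
    /* Both the up and down lines pulled low at once: reads as button 2. */
    pad_state s = decode_pad((uint8_t)~(JOY_UP | JOY_DOWN));
    printf("button2=%d up=%d down=%d\n", s.button2, s.up, s.down); /* 1 0 0 */
    return 0;
}
```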
Beating the Amiga to market, and beating it on price were super important.
But I do think there was a serious problem with follow-through. The Blitter, GDOS, and then the STe all took too long to arrive. The Blitter never became standard, and the games and software suffered for it. And updates to the operating system were slow and thin until it was way too late.
I do agree that the cartridge port thing -- it being limited to 128kB expansion -- was needless. One more pin, even, would at least allow for a proper OS upgrade via cartridge port! Definitely one of the stupidest design decisions on the machine.
Realistically it's amazing the ST was as good as it was, given the 6 month development time and the kings of penny pinching at the helm :)
I quite liked the STe. The mono monitor was great, RAM upgrades were easy, and they'd improved some of the display hardware's worst limitations. Even though TOS was never especially good, they'd fixed all the worst bits by that point.
Still could have benefited from some other extra hardware and OS tweaks though I think.
- 800 KB disk format supported directly by the OS
- blitter is not as useful as it could be, due to sharing bus time with the CPU. It should be able to use ACSI bandwidth if not in use/Shifter bandwidth during non-display periods, so it can run in parallel with the CPU
- 256 px 5 bitplane mode (so still 40 words per line), probably an EHB kind of affair if 32 palette entries would be too much
- something to improve endless scrolling? No carry out of bit 15 when computing Shifter address? You'd end up wrapping before the display period was finished if increasing the display stride, but you can work around that in software...
- put the 9 pin joystick ports on the side
- write signal for that stupid cartridge port that is almost (but not quite) useful for general purpose expansion
Things I remember about the 520ST:
- Those horrible diagonal function keys. There was no reason for them to be diagonal, rather than normal keys as they were on the IBM. But I've always hated function keys.
- Games like Dungeon Master (really still quite a good game today).
- Not a bad C compiler, but I can't remember who by - LightSomething?
- The GEM GUI was not so bad, but using it with a floppy disk was.
But all-in-all I was quite happy to get my PC-compatible to do serious work with.
I guess the idea was to have a clean design with cables out of the way, but it really was a bad place for them.
I don't know if they were consistent with the other keys in terms of feel, but they were a striking, unique design feature that instantly identified the machine as being Atari without compromising practicality.
When I see generations that grew up with game consoles talking about the current uptake of desktop games, they really have no idea what they missed out on in home computing and the first wave of indie game development from bedroom coders.
Tangent: the older I get, the more it annoys me that this expression kind of implies a failure of young people to study history, when I feel like it's more the responsibility of previous generations to preserve and pass down history for them to learn from. Especially because it's usually people in power in some form who are trying to keep the newer generations naive here so they can be fooled again.
Not saying that this interpretation was your intent (in fact I suspect it's the opposite), just directly expressing my annoyance at the expression itself.
However curiosity also plays a big role.
If I know so much about computing history since the 1950s, it's because I do my research and take advantage of all the archives that have been placed online; I certainly wasn't around to live through all of it.
But everything has been preserved and passed down. The entire home computing phenomenon has been archived and is available on the internet thanks to the rampant 'software piracy' which was common at the time, and detailed schematics and manuals coming with the computers (which have all been digitized and are available on the internet). Even my obscure KC85 games I wrote as a teenager and 'distributed' on cassette tapes by snail mail are available as download because some kind person(s) digitized all that stuff during the early 90s and put it on some 'underground' game download portals.
The 80s and early 90s home computer era will be better preserved than anything that came after it.
Indeed. Sadly, many more recent games will probably be lost to time forever due to DRM, online service components going offline or never being distributed on physical media in the first place. As someone into vintage computers and preservation, I worry that future generations may look back and see the late 2010s and certainly the 2020s as a 'dark age' when surveying the history and evolution of digital gaming. All we'll have are YouTube videos (assuming those survive future business model tectonic shifts) but no ability to actually experience the game play first-hand.
Recently I've been exploring the back catalog of more obscure PS3 and X360 games via emulation and have found some absolutely terrific titles I never even knew existed. Some of them were only ever sold through the console's online store and never available on physical media. With the Xbox 360 and Nintendo Wii stores now long offline, only the PS3 store remains available - and who knows for how much longer, since Sony already announced its closure once and then changed their mind. There's now a race to preserve many of these titles.
The good news is that not only was almost all of it preserved, teenagers today are really interested in retro gaming. My 15 year-old daughter, who's not into computers more than any other 15 year-old girl, just asked if she could go with me to the vintage computer festival this Summer. She tells me her friends at school are all interested in running emulators to play classic games from arcade to SNES to PS2 and N64.
I guess the 'dark lining' to that silver cloud is that this interest from teens in retro gaming is partly thanks to the increasing downsides of modern gaming (cost, DLC, ads, hour-long downloads/installs, etc). While game graphics continue to get more and more impressive, stuff like real-time path tracing doesn't seem to excite teens as much as it does me. Ultimately, it's about game play more than visuals. Lately I've been exploring the immense back catalog of N64, PS2, PS3 and X360 games via emulation and there are some incredible gems I never even heard about back in the day. It's especially great now thanks to the huge variety of mods, enhancements, texture packs, decompilations/recompilations and fan translations. And current emulators can upscale and anti-alias those games even on a potato desktop or laptop with a low-end discrete GPU.
The Mega STe had a funkier case, VMEbus, and upgraded specs, but a mushy rubber-dome keyboard and more brittle plastics.
I like to collect the Mega as the best of the bunch, personally.
I liked how the keyboard was detachable and the hard drive was the same size as the motherboard case, so you could stack them.
Is this in any way related to the general "speed is going up but latency is getting worse" phenomenon of hardware in the last decades?
The Atari 2600 for instance was known for "racing the beam", updating the image while it was being drawn on the CRT monitor. A latency measured in pixels rather than frames! It was necessary because the console didn't have enough memory for a framebuffer.
The Atari ST is special for its built-in MIDI ports, and it was made cheaply, which at the time meant direct connections, and that resulted in low latency.
We like being able to plug everything anywhere. And I admit it is damn cool being able to connect a display, most kinds of storage devices, keyboard and mouse, all while charging my laptop on a single port, at any time. I may even be able to disconnect my laptop and put my phone instead and it will do something sensible. If you did that back in the day, there was a good chance for one of the devices to turn into a smoke machine.
It comes at a cost though.
Back in the day, you would not have been able to do any of this with one port. Each type of device had its own uniquely shaped connector/pin combo. You were not going to connect your SCSI devices to the VGA monitor port accidentally. The closest I ever saw was someone attempting to plug a Mac ADB cable into the S-Video port, but that just resulted in bent pins. It just so happened those pins were on an Avid Film Composer dongle instead of a replaceable cable.
I think there's a very strong future in emulation of achieving FPGA-like latency by using a Raspberry Pi Pico/Pico2 to emulate each of the target machine's subsystems/chips.
antirez mentioned running some of these on RP2040's
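As a rough illustration of that chip-per-Pico idea, here's a minimal sketch under my own assumptions: the pin masks, the bus convention, and step_chip_cycle() are placeholders for illustration, not anything from antirez or the projects mentioned above. The point is simply that one RP2040 core can busy-poll the bus pins for a single emulated chip, so latency is bounded by a tight loop rather than an OS scheduler.

```c
#include "pico/stdlib.h"
#include "pico/multicore.h"
#include "hardware/gpio.h"

/* Placeholder pin assignment: low GPIOs sampled as the emulated chip's
   inputs, a byte of higher GPIOs driven as its outputs. */
#define BUS_IN_MASK   0x0000FFFFu
#define BUS_OUT_MASK  0x00FF0000u

/* Stub for illustration: a real emulator would model the target chip
   (a sound chip, a glue/MMU-style part, etc.) cycle by cycle here. */
static uint32_t step_chip_cycle(uint32_t inputs)
{
    return (inputs << 16) & BUS_OUT_MASK;
}

/* Core 1 does nothing but emulate the chip in a tight loop. */
static void chip_core(void)
{
    for (;;) {
        uint32_t in  = gpio_get_all() & BUS_IN_MASK;  /* sample bus pins   */
        uint32_t out = step_chip_cycle(in);           /* one emulated step */
        gpio_put_masked(BUS_OUT_MASK, out);           /* drive outputs     */
    }
}

int main(void)
{
    gpio_init_mask(BUS_IN_MASK | BUS_OUT_MASK);
    gpio_set_dir_masked(BUS_IN_MASK | BUS_OUT_MASK, BUS_OUT_MASK);
    multicore_launch_core1(chip_core);  /* hand the chip to core 1 */
    for (;;) tight_loop_contents();     /* core 0 left free        */
}
```

In practice the RP2040's PIO blocks and DMA would carry most of the actual bus protocol, but the split itself (one dedicated core per emulated chip, polling instead of interrupts) is what pushes latency toward FPGA territory.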
Oh wow! I remember hearing that oculus were doing this on their devices and thinking it was new.
Latency would be a Basic Feature. Once you get under 7 ms (or 5 ms, or even 3 ms if you absolutely insist) you're happy; above that, everything is absolutely unusable.
The Atari has absolutely stable timing with extremely low jitter. Some guy measured it at 1 µs. Can't find the link though, sorry.
So the Atari has low latency, around 2-4 ms, with extremely low jitter. This is exactly what you want from a MIDI clock and sequencer driving multiple MIDI devices.
Look, I'm not trying to convince you to get rid of your Ataris, quite the contrary. I'm just disagreeing that it's impossible to have low jitter nowadays, but I fully agree that things used to be simpler before everything was done via USB.
Here's a review from a nice Scotsman explaining how this works:
https://www.youtube.com/watch?v=XCZqkSH9peI
or a walkthrough from the creator:
https://www.youtube.com/watch?v=hkw9dmLfkZQ
Note that this is an old version; I just saw that there's now the "Nome II", and at least for Mac, he has actually developed a USB protocol to provide a stable clock (which, as you've already written, is totally possible via USB, it's just that nobody cared enough):
https://midi.org/innovation-award/u-sync
For Windows, the sync is still done via audio through a special VST.
The creator's YT channel has much more interesting stuff; he has also done very precise jitter measurements, see for instance:
Regarding "midi notes" Sim'n Tonic himself is saying this to the Midronome: "Note that only these MIDI messages are simply forwarded when they are received, their timing is not changed. So if your DAW sends them with a lot of latency and/or jitter, the Midronome will forward them with the same latency/jitter. Actually this is a problem I plan on tackling as well [...]"
So the Midronome does not solve the problem of inaccurate MIDI notes coming from a modern DAW. The USAMO does, by the way, but only with one MIDI channel at a time. And of course, coming back to the actual topic, the Atari doesn't have a problem at all with accurate MIDI notes; it is absolutely tight on all 16 channels. So it seems there is indeed nothing comparable to the Atari nowadays. Maybe there will be in the future.
Can Nome II send MIDI Notes?
Nome II is like a MIDI hub, you can ask it to forward any MIDI sent over USB to one of its MIDI outputs. It will not only forward these instantly but merge them smartly with the MIDI Clock, without affecting it.
This is simply not true. Many performers use Windows laptops and MIDI to control their stage equipment without issue.
MIDI is a serial protocol.
At any given time only one message can be sent down the wire. [1]
So on the beat, an implementation can send either the clock pulse or note on or something else. [2]
If you send the clock everything else has to wait. If you send something else, the clock has to wait.
Now with modern computers, you are also dealing with USB, which is a polled, packet-based protocol and has to coordinate with everything else a modern kernel does (rough numbers below, after the footnotes).
Music is hard.
[1] Premium hardware sequencers sometimes have two or more MIDI Outs to reduce contention.
[2] MIDI Time Code solves this by encoding monotonic time into MIDI and is how serious sync is done over MIDI, e.g. in Hollywood.
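To put rough numbers on that contention (nothing Atari- or DAW-specific here, just the MIDI wire format: 31,250 baud, 10 bits per byte on the wire counting start and stop bits), a quick back-of-the-envelope sketch:

```c
#include <stdio.h>

/* MIDI wire timing: 31,250 baud, 10 bits per byte (start + 8 data + stop). */
#define MIDI_BAUD     31250.0
#define BITS_PER_BYTE 10.0

static double ms_on_wire(int nbytes)
{
    return nbytes * BITS_PER_BYTE / MIDI_BAUD * 1000.0;
}

int main(void)
{
    printf("Clock pulse (0xF8, 1 byte): %.3f ms\n", ms_on_wire(1)); /* ~0.320 ms */
    printf("Note On (3 bytes):          %.3f ms\n", ms_on_wire(3)); /* ~0.960 ms */

    /* Worst case on one cable: a clock byte queued behind a Note On
       arrives almost 1 ms late -- that is the jitter being argued about. */
    printf("Clock stuck behind a Note On: up to %.3f ms late\n", ms_on_wire(3));
    return 0;
}
```

So even before any computer or USB stack gets involved, a single busy DIN cable can smear the clock by close to a millisecond, which is why premium gear splits traffic across multiple MIDI Outs.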
You are concerned about a 31,250 baud protocol.
There is zero 'shame' in the 'present future' when it comes to music production tools. It is one of the hugest bright spots and biggest equalizers. The best thing I did was go ITB. No headaches. No maintenance on obscure hardware. No MIDI limitations or even having to think about my MIDI chains. Just music making.
As an aside, all-digital workflows take the joy out of music being made in the moment, by ear and by feel. There is no replacement, for example, for a professional sound engineer adjusting a mix strictly by the sound in their headphones and the feel of the sliders under their fingers.
No they cannot.
Regarding jitter: this is the worst, because the brain cannot adapt to the changes, whereas constant latency can be compensated for by the brain to some extent.
I’ve got a full studio at home, but tbh I never know what people mean by this.
[0] https://upload.wikimedia.org/wikipedia/commons/5/54/Atari_10...
It's funny how some young producers today wonder "how did people do it without a computer before the 2000s?"... well, guess what, we did use computers! I cannot however remember what software sequencer I was using; I know it had MIDI effects (like MIDI echo), that's all I remember.
And by 1998, Logic was fairly advanced anyway and even had plenty of plugins.
Possibly/probably Cubase. Anyone remember the Mike Hunt version? I'm still using Cubase on a nice PC, but I miss the stability of the Atari.
Honestly, I don't think even Apple could touch the best of Atari and Commodore industrial design in the back half of the 1980s. To be blunt, the early Macintoshes simply weren't practical in their design: for starters, a tiny monitor - which was originally black and white (already kind of a joke in 1984) - very limited upgradeability, relatively poor multimedia capabilities (speech synthesis was no more than a gimmick that was also available on other platforms), and then the whole aesthetic just wasn't that pleasant.
And I say this as someone who, personally, has only owned Apple machines for the past 15ish years, so I'm not exactly coming at this from a "not a fanboi" perspective. I'd still take 1980s Atari or Commodore aesthetic over modern Apple, or modern anything else for that matter[0].
Also, as an aside, I really enjoyed seeing "Atari Means Business with the Mega ST" as the top headline on Hacker News in 2025. Even on a Sunday when content typically tends to be more varied and interesting this was still an entertaining surprise.
[0] I suspect the reality may be that I'm an "anything but Wintel" kind of person, although not at any cost, because I did run PCs exclusively for 11 or 12 years. They never really helped me enjoy computing in the way the other machines have though.
For example: I cannot think of any desktop models that lacked internal expansion. They may have used a riser card to stack two or three slots sideways, but the slots were there. The design may have been crude, but at least your desktop wasn't turned into a disaster every time the technological landscape shifted: when hard drives became affordable, when the world switched to 3.5" floppies, when you decided to use online services or send faxes directly from your computer, to get a CD-ROM, or to get cable Internet.
This says that the keyboard on the Mega ST was better. And yet still not good enough. Egads, that ST mess was a terrible keyboard.
Still liked the Speccy better…
I have an adapter on mine that converts it to USB and I can use it on a modern computer.
Though I never do. Mainly because it's got Ctrl/Alt[Meta] but nothing I could map to Hyper/Super.
The Mega STe and TT reverted to a terrible mushy rubber dome.
I was astonished to find about 22 distinct C compilers, including their own libraries, assemblers, linkers etc. for the Atari ST and its successors. That's not counting separate versions, just distinct products from different vendors.
From what I can see now looking at archive sites, there was a huge amount of activity in developer tools on the ST back in the day. Much more than I thought at the time. It might have been a serious contender for the dominant architecture (along with the m68k CPU), if IBM PC-compatibles and x86 hadn't won.
Recently I looked for Atari ST C compilers, out of curiosity to test portability of a C program I'm working on.
I've been testing C code for diverse Unix systems.
As I used to own an Atari 520ST (with 1MB RAM beautifully piggy-backed and hand-soldered on the existing RAM chips :-), it seemed like a good idea to peek at C on an ST emulator. I didn't use C when I had a real Atari ST (no C books in my local library), so I expected to find one or two C compilers, not 22!
If I recall, Lattice C was popular. Mark Williams was another one. "Alcyon C" was included I think in the ST development kit, but was considered poor.
I think people use "Pure C" these days, but of course also GCC is likely best:
http://vincent.riviere.free.fr/soft/m68k-atari-mintelf/
It's maintained by Vincent Rivière, who is a major contributor to EmuTOS.
Try and get a compiler and linker to fit in 360k these days!
There wasn’t such a thing as a general developer market.
When you didn’t have internet and cloud services and free Unix, how could you develop for something else than a specific platform and device?
If you bought a Mega ST to write programs, your target audience were still only the people who had a regular ST. You couldn’t reach anyone else. So the advantage was minimal.
The idea that there can be a developer market separate from the baseline end-user platform is quite new. It emerged around 2007-2010 when web apps became a realistic option and you didn’t have to be on Windows to target the 90% of people who are on Windows.
And with emulated VME graphics, with an HDMI output and a USB-C port. And 3-button mouse. Able to run Atari Unix.