I own 4 Mega STes and use one of them almost daily; the rest are spare parts. I produce music with it. The Atari is my MIDI master clock and the centerpiece of my MIDI sequencing, together with Cubase 3.1 for the Atari. Seriously the MIDI timing is unbeaten until today! The MIDI ports are directly attached to the ACIA chip, which in turn is directly connected to the Motorola 68k CPU. It runs absolutely stable even 35 years later. No crashes whatsoever and also no distractions from updates or "phone home applications". It just works, distraction-free! Shame on the "present future".
vanderZwan 2 days ago [-]
> Seriously the MIDI timing is unbeaten until today!
Is this in any way related to the general "speed is going up but latency is getting worse" phenomenon of hardware in the last decades?
GuB-42 2 days ago [-]
Yes. Back in the day, memory (and therefore buffering) was expensive and gate counts were limited, which meant more direct connections, which meant low latency.
The Atari 2600 for instance was known for "racing the beam", updating the image while it was being drawn on the CRT monitor. A latency measured in pixels rather than frames! It was necessary because the console didn't have enough memory for a framebuffer.
The Atari ST was special for its built-in MIDI ports, and it was made cheaply, which at the time meant direct connections, and that resulted in low latency.
zozbot234 2 days ago [-]
You can have low latency and low jitter today, but you will need to use a microcontroller, not a general-purpose CPU. The old 16/32-bit retro machines are essentially microcontroller-architecture devices with general-purpose computer peripherals, for pretty much the reasons you mention. But there are many cheap microcontrollers available today, such as the Raspberry Pi Pico series.
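To make that concrete, here's a minimal sketch of a bare-metal MIDI clock on a Pico, assuming the Pico SDK (the pin choice is my own, and a real implementation would use a hardware timer alarm rather than sleep_us to avoid drift):

    #include "pico/stdlib.h"
    #include "hardware/uart.h"

    #define MIDI_UART   uart0
    #define MIDI_TX_PIN 0      // assumption: GP0 feeding a MIDI out circuit

    int main(void) {
        // MIDI is plain 31,250 baud 8-N-1 serial, so a bare UART drives it
        // directly: no USB stack, no scheduler between you and the wire.
        uart_init(MIDI_UART, 31250);
        gpio_set_function(MIDI_TX_PIN, GPIO_FUNC_UART);

        while (true) {
            uart_putc_raw(MIDI_UART, 0xF8);  // MIDI real-time clock byte
            sleep_us(20833);                 // 24 ppqn at 120 BPM
        }
    }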
GuB-42 2 days ago [-]
And when you factor in FPGAs, you can get down to the microsecond or less. Low latency is possible, it is just that priorities are often elsewhere.
We like being able to plug everything anywhere. And I admit it is damn cool being able to connect a display, most kinds of storage devices, keyboard and mouse, all while charging my laptop on a single port, at any time. I may even be able to disconnect my laptop and put my phone instead and it will do something sensible. If you did that back in the day, there was a good chance for one of the devices to turn into a smoke machine.
It comes at a cost though.
dylan604 2 days ago [-]
> If you did that back in the day, there was a good chance for one of the devices to turn into a smoke machine.
Back in the day, you would not have been able to do any of this with one port. Each type of device had its own uniquely shaped connector/pin combo. You were not going to accidentally connect your SCSI devices to the VGA monitor port. The closest I ever saw was someone attempting to plug a Mac ADB cable into the S-Video port, but that just resulted in bent pins. It just so happened those pins were on an Avid Film Composer dongle instead of a replaceable cable.
YZF 2 days ago [-]
I think modern general-purpose CPUs are perfectly capable of low latency and low jitter. The problem isn't the CPU; the problem is the stuff around the CPU (mostly the operating system). The less deterministic aspects of modern CPUs (branch prediction, speculative execution, caches, etc.) happen at timescales much smaller than what you usually care about (and possibly smaller than the jitter specs on microcontrollers).
bbarnett 2 days ago [-]
PS/2 keyboards trigger an interrupt. USB devices are polled.
Lerc 2 days ago [-]
An RP2350 with PSRAM and microSD could probably do a commendable job of pretending to be an entire Atari ST while providing a bootload of extra low-latency goodies at the same time.
bitwize 2 days ago [-]
I refer to the RP2xxx chips as "headless Amigas" because their PIO modules essentially function like Coppers: they are simple state machines that offload I/O functionality from the CPU.
I think there's a very strong future in emulation of achieving FPGA-like latency by using a Raspberry Pi Pico/Pico2 to emulate each of the target machine's subsystems/chips.
Lerc 2 days ago [-]
Have you seen https://github.com/floooh/chips ? A bunch of PIO-linked chips using these interfaces would feel like a weird blend of software and hardware that stands apart from the FPGA world. I have wondered if it would actually work as a larger-scale paradigm. Imagine a single piece of silicon with a bunch of RP2xxx-level processor/RAM blocks with PIO links between them all. I'm not sure how it would come out compared to FPGAs for balance of flexibility/cost/power consumption/etc., but I suspect it could find a niche.
antirez mentioned running some of these on RP2040s.
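Each chip in that library exposes its pins as a single 64-bit mask that you tick manually, which is exactly the kind of interface that would map onto PIO links. A minimal sketch using its cycle-stepped Z80 (the flat-RAM bus handling here is mine; see the repo's examples for real systems):

    #include <stdint.h>
    #include <stdbool.h>
    #define CHIPS_IMPL
    #include "z80.h"

    static uint8_t mem[1 << 16];   // 64 KB of flat RAM as the whole "bus"

    int main(void) {
        z80_t cpu;
        uint64_t pins = z80_init(&cpu);
        for (int tick = 0; tick < 1000; tick++) {
            pins = z80_tick(&cpu, pins);    // advance one clock cycle
            if (pins & Z80_MREQ) {          // CPU requests a memory access
                const uint16_t addr = Z80_GET_ADDR(pins);
                if (pins & Z80_RD) {
                    Z80_SET_DATA(pins, mem[addr]);
                } else if (pins & Z80_WR) {
                    mem[addr] = Z80_GET_DATA(pins);
                }
            }
        }
        return 0;
    }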
kjs3 2 days ago [-]
The 68000 is "essentially microcontroller architecture"? I don't think there are many people who understand architecture that would agree with that statement.
nailer 2 days ago [-]
> The Atari 2600 for instance was known for "racing the beam", updating the image while it was being drawn on the CRT monitor. A latency measured in pixels rather than frames!
Oh wow! I remember hearing that Oculus was doing this on their devices and thinking it was new.
snickerbockers 1 days ago [-]
FWIW, you can get excellent latency on a modern device, but only if you run everything in real mode and forgo complicated buses like USB, which are effectively network link layers.
weinzierl 2 days ago [-]
This is true, but in my opinion also misleading. Speed and latency are fundamentally different. Speed would be a Performance Feature in the Kano model, meaning there is usually a linear relationship between speed and user satisfaction.
Latency would be a Basic Feature.
Once you get below 7 ms (or 5 ms, or even 3 ms if you absolutely insist) you're happy; above that, everything is absolutely unusable.
Aldipower 2 days ago [-]
You are leaving out jitter. This is often the worst part of modern implementations. If there is 4 ms of jitter, sometimes peaking at 20 ms, then a 5 ms latency is still bad. Such an implementation is basically unusable. Like many modern USB ones...
The Atari has absolutely stable and extremely low jitter. Someone measured it at 1 µs. I cannot find the link though, sorry.
So the Atari has low latency, around 2-4 ms, with extremely low jitter. This is exactly what you want from a MIDI clock and sequencer driving multiple MIDI devices.
deng 2 days ago [-]
How do you think any professional works nowadays with MIDI? A good, modern USB interface (from Focusrite or similar) has a jitter well below 1ms, usually in the range of 200µs. If that is too much, simply sync your DAW with an external, dedicated clock, which will usually give you a jitter in the single µs range.
Aldipower 2 days ago [-]
I have a Focusrite and the MIDI timing is terrible. Sure, there is more to it than just the interface. With USB you just cannot guarantee stable MIDI timing, because there is no good MIDI buffering implementation for it. Technically it would be possible, but no one cares... Professionals use something like MIDI-to-audio converters: a VSTi plugin takes the MIDI signals and modulates them onto an audio signal (which can be easily buffered), and some dedicated outboard equipment converts this back to MIDI. If you are working with hardware synths etc., this is the only option you have nowadays with non-vintage hardware. A lot of producers do not work with MIDI anyway; they use plugins. That's why it is something of a niche problem and there's not much talk about it.
deng 2 days ago [-]
First off, I'm assuming of course we are talking Mac here, because Windows is unusable for MIDI. If you have terrible MIDI timing with a Mac, then yes indeed, you'll need to sync via audio, but there are nice and inexpensive solutions for this, for instance the Midronome.
Look, I'm not trying to convince you to get rid of your Ataris, quite the contrary. I'm just disagreeing that it's impossible to have low jitter nowadays, but I fully agree that things used to be simpler before everything was done via USB.
Aldipower 2 days ago [-]
Agreed. It is of course not impossible, but it is almost impossible out of the box (literally ;-)). I have a USAMO (Universal Sample-Accurate MIDI Output) device, but do not use it because, as I said, the Atari is king here. :-) Not sure how the Midronome can solve the problem of MIDI notes arriving inaccurately from a modern DAW, but maybe I don't understand it completely. Need to have a deeper look. For some years now I have been using Linux with a Focusrite for mastering and audio tracking. MIDI has been bad on Linux and Windows ever since I got my first USB interface and moved away from PCI interfaces. But this shouldn't matter too much. :-)
deng 1 days ago [-]
> Not sure how the Midronome can solve the problem of midi notes coming inaccurate from a modern DAW?
Here's a review from a nice Scotsman explaining how this works:
Note that this is an old version; I just saw that there's now the "Nome II", and at least for Mac he has actually developed a USB protocol to provide a stable clock (which, as you've already written, is totally possible via USB; it's just that nobody cared enough):
Aldipower 1 day ago [-]
Thanks a lot!
The Scotsman is cool, and his t-shirt too. :-D The t-shirt says "little pig" in German.
Regarding "MIDI notes", Sim'n Tonic himself says this about the Midronome:
"Note that only these MIDI messages are simply forwarded when they are received, their timing is not changed. So if your DAW sends them with a lot of latency and/or jitter, the Midronome will forward them with the same latency/jitter. Actually this is a problem I plan on tackling as well [...]"
So the Midronome does not solve the problem of inaccurate MIDI notes coming from a modern DAW. The USAMO does, by the way, but only with one MIDI channel at a time. And of course, coming back to the actual topic, the Atari has no problem at all with accurate MIDI notes; it is absolutely tight on all 16 channels. So it seems there is indeed nothing comparable to the Atari nowadays. Maybe there will be in the future.
deng 1 days ago [-]
Not sure if that is still accurate. This might only be available for Mac, but on the FAQ for Nome II it says this:
Can Nome II send MIDI Notes?
Nome II is like a MIDI hub, you can ask it to forward any MIDI sent over USB to one of its MIDI outputs. It will not only forward these instantly but merge them smartly with the MIDI Clock, without affecting it.
handbanana_ 1 days ago [-]
> because Windows is unusable for MIDI
This is simply not true. Many performers use Windows laptops and MIDI to control their stage equipment without issue.
deng 1 days ago [-]
The Windows MIDI/USB stack adds a considerable amount of jitter to the MIDI clock, compared to the much superior stack in macOS. I will fully admit that "unusable" is a personal opinion based on my experience. Of course performers also use Windows, but I heavily doubt you are able to see which device in their rack acts as a master clock and how they sync their devices, apart from the fact that most performers nowadays don't use MIDI at all.
handbanana_ 14 hours ago [-]
MIDI is used heavily for guitar patch and lighting automation, as well as for triggering backing tracks in a DAW running on stage. The use of MIDI (over USB) has only increased on stages.
deng 9 hours ago [-]
This is getting ridiculous. We are talking about making music, i.e. triggering notes from different devices in sync. You know, what MIDI was originally designed for, not triggering some lights, guitar patches or a backing track. You are exactly proving my point: MIDI nowadays is pretty much reduced to SysEx for doing simple automations. None of that is seriously affected by jitter in the ms range. You sound like you have no idea how electronic music was done before VSTs were a thing.
Aldipower 2 days ago [-]
Yes, I think this is an on-point statement. :-)
brudgers 2 days ago [-]
Not really.
MIDI is a serial protocol.
At any given time only one message can be sent down the wire. [1]
So on the beat, an implementation can send either the clock pulse or note on or something else. [2]
If you send the clock everything else has to wait. If you send something else, the clock has to wait.
Now with modern computers, you are also dealing with USB, which is a low-priority parallel protocol and has to coordinate with everything else a modern kernel does.
Music is hard.
[1] Premium hardware sequencers sometimes have two or more MIDI Outs to reduce contention.
[2] MIDI Time Code solves this by encoding monotonic time into MIDI and is how serious sync is done over MIDI, e.g. in Hollywood.
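To put rough numbers on the contention: MIDI runs at 31,250 baud with one start and one stop bit, so a byte takes 10 / 31,250 s = 320 µs on the wire. A clock pulse is a single byte (0xF8) and a Note On is three bytes, so a clock tick queued behind just one note message already waits nearly a millisecond.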
genewitch 1 days ago [-]
What's the S in USB?
an_aparallel 2 days ago [-]
I really wish you could easily get something like the MiSTer ST clones... supply seems spotty, and prices seem pretty high. I'd love an original too if they were less marked up...
khazhoux 2 days ago [-]
> Seriously the MIDI timing is unbeaten until today!
I've got a full studio at home, but tbh I never know what people mean by this.
_DeadFred_ 2 days ago [-]
I have the equivalent of a $500,000+ studio from my childhood, all in my laptop.
You are concerned about a 9600 baud protocol.
There is zero 'shame' on the 'present future' when it comes to music production tools. It is one of the biggest bright spots and equalizers. Best thing I did was go ITB. No headaches. No hardware maintenance on obscure hardware. No MIDI limitations, or even having to think about my MIDI chains. Just music making.
deng 1 days ago [-]
There's not one word in his post where he looks down on VSTs or anything. It's just how he likes to make music, and he is unhappy with the state of modern MIDI implementations. In fact, it's the exact opposite: you are shaming him for still using MIDI.
_DeadFred_ 1 days ago [-]
His Shame on the "present future" comment disagrees with you.
deng 8 hours ago [-]
Like C++, this cannot be parsed with a context-free grammar. The "present future" refers to
No crashes whatsoever and also no distractions from updates or "phone home applications"
which is something I would guess most people would indeed see as shameful regarding our present future in software, but OTOH, this is HN, so who knows.
_DeadFred_ 4 minutes ago [-]
My old synths crashed and required physical maintenance. In addition, I lost songs that failed to read off the crude tape backup that my quantizer/sequencer used.
I am much happier with my current setup, which I could never have afforded back then, than with my much lesser previous setup. This being HN, I'm sure there are people who can afford to spend much more than me on gear, so they might prioritize the minor differences you list over no access at all, but I much prefer having access.
bitwize 2 days ago [-]
Some musicians still like to play instruments -- for them and their listeners, ITB production is seen as a cheat and not real musicianship -- and for them the lack of a stable MIDI clock on today's hardware absolutely does matter. A trained musician can feel time differences as small as 1 ms. Any latency or jitter greater than that, and a perfect track could be ruined.
As an aside, all-digital workflows take the joy out of music being made in the moment, by ear and by feel. There is no replacement, for example, for a professional sound engineer adjusting a mix strictly by the sound in their headphones and the feel of the sliders under their fingers.
_DeadFred_ 1 days ago [-]
I have a Novation SL MkII as my controller keyboard. It is a much more tactile experience than, say, a DX7 or other menu-diving synth. It has faders built in for mixing. As someone who has done both, I have so much more joy being all digital. I have access to so much that I never did before.
handbanana_ 1 days ago [-]
> A trained musician can feel time difference as small as 1 ms
Great paper. So the average psychophysical gap threshold is at ~2 ms! This is lower than I expected tbh, but I always suspected it.
Aldipower 1 days ago [-]
You always want latency or jitter as low as possible. Latency adds up in the chain. Instrument/Sequencer (2ms) -> Some digital effect (3ms) -> Digital Mixer (3ms) -> In-Ear monitor bridges an air gap with processing (6ms)
Suddenly you have 14ms latency. Bad, but reality. So, every ms less is better.
Regarding jitter, this is the worst, because the brain cannot adapt to the changes, whereas constant latency can be compensated for somewhat by the brain.
aa-jv 1 days ago [-]
MIDI baud rate is 31250, not 9600.
alexisread 2 days ago [-]
So many missed opportunities with the ST, given the breakneck development speed.
If the all-in-one design had been used from the off (saving on moulds and on shipping disk drives), they could have done a pro conversion kit:
3 button joypads from the start, using U+D and L+R combinations for 2 more buttons
Double-sided drive from the start.
Finally, they should have included the blitter socket and a full 2x32pin expansion instead of the cartridge port.
The blitter socket especially would have been handy to drive a T212 transputer in '87, when the blitter became available, instead of producing the ATW.
tom_ 2 days ago [-]
The tiny ST with the external disk drive had the joystick ports at the side - a far superior design.
I quite liked the STe. The mono monitor was great, RAM upgrades were easy, and they'd improved some of the display hardware's worst limitations. Even though TOS was never especially good, they'd fixed all the worst bits by that point.
Still could have benefited from some other extra hardware and OS tweaks though I think.
- 800 KB disk format supported directly by the OS
- blitter is not as useful as it could be, due to sharing bus time with the CPU. It should be able to use ACSI bandwidth if not in use/Shifter bandwidth during non-display periods, so it can run in parallel with the CPU
- 256 px 5 bitplane mode (so still 40 words per line), probably an EHB kind of affair if 32 palette entries would be too much
- something to improve endless scrolling? No carry out of bit 15 when computing Shifter address? You'd end up wrapping before the display period was finished if increasing the display stride, but you can work around that in software...
- put the 9 pin joystick ports on the side
- write signal for that stupid cartridge port that is almost (but not quite) useful for general purpose expansion
bravesoul2 2 days ago [-]
It's good enough for Fatboy Slim
TacticalCoder 2 days ago [-]
The Atari ST had a MIDI port, something notoriously lacking on the Commodore Amiga (I think an Amiga with a stock MIDI port would have been a home run).
I saw Atari ST in music studios well into the late 90s/early 2000s because back then quiet beige PCs weren't a thing yet: PCs virtually all came with super noisy fans, which was a big no-no for music studios.
A buddy would bring his Korg synth to my neighbour's house and hook it to their Atari ST. Another dude I remember would play the drums from Dire Straits songs on his Atari ST hooked to some MIDI gear, and then he'd take his guitar and play Dire Straits songs.
These were the days.
I'm not surprised some musicians still use them. If I'm not mistaken, Kavinsky (who became famous after the movie Drive came out, but recently saw renewed interest after performing at the Olympic Games ceremony) started making music at a late age, on an Atari ST a friend of his gave him.
As an anecdote PCs were so noisy that I asked my neighbour (an electrical engineer) if it was possible to come up with a system where the fan would slow down when the CPU wasn't too hot: and sure enough we were then modding our PSUs with "thermistors" and we'd be calibrating our tiny hack, no shit, with boiling water in the kitchen (ah clueless teenagers). Funnily enough about 10 years later every single PSU had variable fan speed.
That's the thing: we were used to quiet 8-bit and then 16-bit computers and when we had to move to these piece-of-shit PCs (but with fast CPUs / FPUs and that were upgradeable), we had to endure these painful ultra noisy CPU/PSU fans (and HDDs).
So the Atari ST just made sense. You could have these super fast (compared to the Atari ST) PCs, but they were noisy, fugly, unbearable pieces of shit that the cool guys in music studios simply wouldn't tolerate back then.
Now of course at some point PCs became just too good and several brands started focusing on quietness and it was then possible to have a totally silent PC, both looking cool and being literally cool (big heatsink, quiet fans, etc.).
But yeah the Atari ST was and still is certainly for some a thing for creating music.
Lots of respect to the Atari ST for its MIDI port (and that comes from a Commodore Amiga owner and fan).
Aldipower 2 days ago [-]
And, just to add a third point, the Atari runs stable! I just tried to sequence with a SoundBlaster AWE32 and the Voyetra MIDI Orchestra sequencer under Windows 95b. For fun. I had already recorded some MIDI tracks when suddenly, after 60 minutes, Windows presented me with the famous bluescreen. Everything I had just recorded and hadn't autosaved was lost. Haha.
mrandish 2 days ago [-]
> The Atari ST had a MIDI port: that notoriously lacked on the Commodore Amiga
I never really understood why people thought this was a big deal. I had my Amiga hooked to a DX7 synth with a serial-to-MIDI cable that had a couple of active parts in it. MIDI is a serial protocol, and the Amigas had full RS232 ports with hardware interrupts, +12V, -12V, as well as audio in and out on unused pins. The serial-to-MIDI In/Out cable cost around $15 more than two MIDI cables. You can still buy them today: https://retroready.one/products/ka12-serial-port-midi-interf....
Aldipower 2 days ago [-]
The big deal was latency and jitter: the Amiga could not provide them as well as the Atari this way, over the serial interface. I remember such discussions back in the day. Also, the multitasking capabilities of AmigaOS were too good! Yes, this hindered precise timing. That is why Steinberg and other sequencer software producers bet on the built-in and rock-solid Atari MIDI implementation. Which then also meant a lack of good software for the Amiga, on top of it.
mrandish 2 days ago [-]
While it's true the Amiga never got the breadth of serious music sequencing software that was on the ST, I never experienced any issues with MIDI jitter or timing.
While the Amiga could do multi-tasking, applications could also optionally take over the entire machine, which many games obviously did but so did real-time video applications like the Video Toaster, animation, audio and music apps. Lots of Amiga games and video apps "raced the beam" in real-time without ever dropping a field of 60 fps video. So, at least in principle, the Amiga hardware was as capable as the Atari ST in terms of ability to respond to interrupts at MIDI rates. The Amiga also had video editing software, which back then, involved controlling three professional VTRs in real-time over that single serial port to do A/B roll editing on precise timecodes 29.97 times a second.
So, yeah, I agree that the Atari totally won the MIDI production music market because it had much better software applications. If you were primarily a music producer or studio, there was certainly no reason to pay more for an Amiga's custom graphics capabilities - and if you were serious, the Amiga's more limited music app selection made the choice for you. My only quibble is that, IMHO, the claims of 'MIDI jitter' somehow being endemic to the Amiga were more Atari marketing FUD than reality. I don't doubt that some user at some point did something on an Amiga that caused MIDI timing issues, but it wasn't because of some fundamental hardware limit. It was due to configuration, some app incompatibility, bug or some other solvable issue - because similar timing issues could occasionally happen on the ST too - and then would be solved.
cmrdporcupine 2 days ago [-]
The other story with the ST and music studios and productivity software generally was the excellent monochrome 640x400 monitor, which was far better than dealing with interlace on the Amiga. And was cheaper than an Amiga colour monitor.
There were ways to do non-interlaced video on the Amiga, but just like having to buy an external MIDI adapter... more $$, more hassle.
That and floppy format compatibility with MS-DOS made it easier to move data around.
gramie 2 days ago [-]
That reminds me of using Twister, which optimized the position of sectors on the floppy disk to minimize seek times and speed up loading dramatically (and I think they squeezed a few more sectors onto the disk so that it could hold more -- maybe more sectors on the outer tracks of the disk?).
schlupa 1 days ago [-]
Normal 720K floppies were very generous with the sectors on a track. It was easy to format a floppy with 10 sectors per track, even without reducing the gap between sectors. On the Atari it was almost standard practice, which meant floppies generally had 800K capacity. It was even possible to squeeze in 11 sectors per track by reducing the inter-sector gap to a minimum. Furthermore, most floppies allowed writing on tracks 81 and 82 (sometimes even 83). So it was possible to have floppies with up to 902K capacity (not a good idea in the long run: I recently tested such a floppy I had made 30 years ago and it had a lot of read errors, something that 720K and 800K disks do not).
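For reference, the arithmetic behind those capacities, assuming the usual 512-byte sectors: 80 tracks × 2 sides × 9 sectors × 512 bytes = 720K; with 10 sectors per track, 80 × 2 × 10 × 512 = 800K; and 11 sectors across 82 tracks gives 82 × 2 × 11 × 512 = 902K.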
Aldipower 2 days ago [-]
Yeah, thanks for the good write-up. Maybe you're right and there's also some Atari marketing FUD in it. The Amiga was/is definitely an impressive machine too. That is a given. :-)
hnlmorg 2 days ago [-]
To be fair, everything about PCs back then sucked.
DOS was crap when you had GEM and AmigaOS.
Windows 1 and 2 were beyond terrible.
They were shit for games
They were bulky
They were slow
They crashed all the time.
They were ugly
They were noisy.
They were hard to manage (autoexec.bat, no OS in ROM, stupidly tricky partitioning tools, incompatible drivers for the same hardware but in different applications, etc)
But IBM lost control of the hardware market so they became cheap, ubiquitous crap.
And that’s literally the only reason we got stuck with PCs.
spankibalt 2 days ago [-]
I hope someone will write an entertaining satire about the sometimes almost PTSD-like bitterness and bizarre selective perceptions of the "anti-PC" crowd, especially in the Amiga space. :D
Please don't take it too harshly, but your list of grievances is almost radically different from my experience of personal computing in the late Eighties to mid-Nineties... to me somewhat of a fascinating phenomenon all its own. In my little corner of the world the Amiga was virtually nonexistent [1], largely undesirable, and prophesied to be a corpse as early as 1991.
I'll give you one thing, tho: A mostly non-plastic, compact keyboard computer case (Amiga 600-like) for a suitably powerful IBM compatible would've been awfully nice. Still would be, for a vintage bird that is. We only got "Schneiderplastik", the Euro-PC to be more precise [2], and that one wasn't updated in a satisfying fashion.
1. The only people I knew that had Commodores were two of my best friends, one with a Commodore 64, the other with a 128. The demosceners I grew up with were Atari ST guys, all of them (becoming) musicians.
mrandish 2 days ago [-]
Sure, there is clearly some "rose-colored memory" effect going on - how could there not be? As someone who used Amigas, Atari STs, Macs and PCs back in the day - and who still owns over a hundred unique models of vintage 80s and 90s computers - I can say they ALL suck in many annoying ways AND most had some unique strengths. We all learned to live with what we had and how to make it do what we needed done.
People got accustomed to whatever personal computer they used every day and many grew fond of it. After all, the power and capability of a desktop computer in the 80s was unprecedented and, for many, revelatory. That said, in the mid-to-late 80s, the PC platform was generally under-powered dollar for dollar compared to most of its leading competitors, most of which were based on Motorola 680x0 CPUs. The strength of the PC during this time was its rapidly expanding library of business software applications and hardware expansion options in the form of add-in cards (something Apple had with the Apple II but abandoned for a while with the Mac, which the Atari ST never really had, and which only the "professional" series Amigas had: the A2000, A3000, A4000).
Being underpowered per dollar doesn't mean the PC couldn't be extremely useful or the best platform for a given scenario and it certainly doesn't mean there weren't hobbyists who used and loved their late 80s PCs as dearly as any other 80s personal computer owner. Of course, this power balance was largely addressed by the mid-90s - which is why the PC juggernaut then extinguished the Amiga, Atari and (very nearly) the Mac.
spankibalt 2 days ago [-]
I don't think we negotiate the same phenomenon. You seem to describe the harmless, almost romantic indulgences of nostalgics. I talk about the bizarre, often enough toxic distortions of a certain breed of user, who still fights, after all these years, their Platform War. The sort of fan who blames "the PC" for the "ills of the industry".
Anyway, off to some specifics:
> "The strength of the PC during this time was its rapidly expanding library of business software applications and hardware expansion options in the form of add-in cards [...]".
A standardized general-purpose computing platform "for the future". Exactly what spoke to me, as disseminated in the publications I read as a kid in 1991.
> "Of course, this power balance was largely addressed by the mid-90s - which is why the PC juggernaut then extinguished the Amiga, Atari and (very nearly) the Mac."
"Power balance"? I didn't think in such abstracts when I made my choice, and conducted the long and merciless attrition-lobbying campaign for financial support, to buy a PC. The Amigas and the Ataris were simply not a factor for a variety of different, but very tangible and practical reasons:
Atari ST? I was not (on my way to become) a musician with the need for a precise and affordable backpack-portable computer instrument.
Amigas? The big birds were seen, outside of some specialist-niches, as uneconomical compared to their IBM-compatible brethren.
The vanilla home computers were seen as affordable, but extremely limited, racking-up (hidden) secondary costs to make them more usable. Often enough they carried a certain cultural stigma as well, being perceived by our financiers as gaming toys and therefore time wasters. And most importantly? No one I personally knew had an Amiga. Who to swap software with, where to find a mentor? Yeah...
The Atari guys I befriended used their machines almost exclusively for dabbling in electronic music, later as part of the emerging East German EBM and hard techno scene.
Games? The titles I was interested in either didn't exist on M68k platforms (flight simulations à la Aces of the Pacific, wargames such as the Harpoon series, or adventures like The Lost Files of Sherlock Holmes), were practically unplayable (e. g. Red Baron), considered inferior (e. g. Wing Commander)... or just came out too late.
By listening to stories of some Britons of my age, it only recently came to my attention how privileged I have actually been. Some of these people told me stories of buying their first A600s or A1200s only in 1993! At that time it was hard to separate me from my trusty, second-hand PC... a machine with a CPU-type nearing its eighth (!) birthday (386DX-25).
hnlmorg 1 days ago [-]
You're talking about the 90s though. That's actually several generations of PC later.
PCs in the 80s were so bad that most homes still ran 8-bit micros with a BASIC ROM.
Windows 3.0 wasn’t even released until 1990.
And let’s be honest, the problems with autoexec.bat, config.sys and different incompatible DOS drivers for games existed throughout. It was only really the uptake of DirectX that fixed that (and to an extent, Rage / OpenGL, but you could technically get DOS drivers for them too).
But that was a whole generation of computers away from the 3.x era of PCs, and another generation again from the 80s.
But by the mid-90s, we had a whole new set of problems with PCs: the OS silently corrupting itself. It was so bad that best-practice advice was to reformat and reinstall Windows every 6 months (I did it much less regularly than that though). And this was a common idiom throughout the entire life of the 9x era of Windows too. But to be fair, that was also a common idiom with pre-OSX Macs; Apple had become painfully shit at that point too.
If the ST and Amiga had still been evolving like PCs were, then by the late 90s I'm sure Amigas might have suffered from the same longevity problems too. But in the 80s, they (and STs, Macintoshes, and Acorn Archimedes) stood head and shoulders above what PCs could do at that point, in almost every regard.
spankibalt 1 days ago [-]
> You’re talking about the 90s though. Thats actually several generations of PC later.
I'm East German; you people got a head start. My relevant hands-on experience is centered around a 386DX system, technology introduced in the mid-Eighties. 1987 brought VGA, AdLib, and the MT-32 to the table, with games support gearing up in '89, the year the Sound Blaster was released. Fall 1990 saw the release of Wing Commander. Of course, that's just technology; economic realities tell a different story.
> Windows 3.0 wasn’t even released until 1990.
Windows was as relevant to me as an Amiga. GUIs didn't do much for me until much later. Still prefer CLIs, TUIs (and minimal GUIs that come as close to the latter as possible).
> And let’s be honest, the problems with autoexec.bat, config.sys and different incompatible DOS drivers for games existed throughout.
I never experienced serious troubles in DOS. The first two, and only, games I could not get to work were two infamously bug-ridden Windows titles of the late 90s: Falcon 4.0 and Privateer: The Darkening. By the time they fixed 'em with a litany of patches I was busy with other things.
> But by the mid-90s, we had a whole new set of problems with PCs: the OS silently corrupting itself.
News to me. How bizarre!
> But in the 80s, they (and STs, Macintoshes, and Acorn Archimedes) were stood head and shoulders above what PCs could do at that point in almost every regard.
Hardware? Until '87. Games? Until late '90 I'd say, at the earliest, accounting for a strong genre bias. [1] Then, outside of niches (Video Toaster, cheap DTP, music production) and certain "creature comforts", it was over; the ecosystem began to atrophy.
1. The first two DOS-platformers that wowed me visually were Prince of Persia 2 ('93) and Aladdin ('93/'94); all my other genre preferences were, to put it diplomatically, underserved on 16-bit home computers.
hnlmorg 24 hours ago [-]
> I'm East German; you people got a headstart
That doesn’t mean PCs were somehow more capable in the 80s though ;)
> Windows was as relevant to me as an Amiga.
Same point as above.
> I never experienced serious troubles in DOS.
I think that’s probably rose tinted glasses on your part then. The pain was real. Games seldom “just worked” and you often had to be really knowledgeable in your system to get stuff working.
To this day, I've never heard the music in Transport Tycoon because that game refused to work with whatever MIDI drivers I threw at it.
> > But by the mid-90s, we had a whole new set of problems with PCs: the OS silently corrupting itself.
> News to me. How bizarre!
I’d be amazed if you’ve never once heard about the old problem of computers getting slower or buggier over time and a reinstall fixing things.
> Hardware? Until '87.
You’re comparing theoretical top of the line PC hardware (which nobody actually owned and no one had written software for yet) with commodity Amigas and STs.
And even those top of the line PCs still missed a few tricks that made some genres of games better on Amigas and Atari STs, like fast blitters.
It wasn’t until the era of ray casting 2.5D 1st person shooter that PCs started looking better than their counterparts. And then things really accelerated (no pun intended) with 3D hardware graphics acceleration. Which, to be fair, was available for Amigas too, but the only software that targeted them were 3D rendering farms.
spankibalt 17 hours ago [-]
> That doesn’t mean PCs were somehow more capable in the 80s though ;)
It clarifies specifics relating to my personal experiences with the discussion matter, addressing (perceived) realities of a local market. How people use computers is of the utmost relevance; a fact which you, given your lamentations here, certainly must have internalized.
> I think that’s probably rose tinted glasses on your part then. The pain was real. Games seldom “just worked” and you often had to be really knowledgeable in your system to get stuff working.
No rose-tinted glasses here. And I believe you that your and others' pain was real. Many people could not wrap their heads around a PC; many of 'em fell for cheap SX clunkers with other substandard components, ffs. That's obviously an inherent problem of such an open platform: PCs are highly individual in many subtle ways; a trade-off one had, and still has, to negotiate in one fashion or another.
> You’re comparing theoretical top of the line PC hardware (which nobody actually owned and no one had written software for yet) with commodity Amigas and STs.
I'm comparing hardware available on the market (with key system components coming together in 1987/88, and games supporting such top-of-the-line hardware showing up in numbers from '88 onwards). I also spoke to economic realities in nearly every post in this discussion; I am well aware that the 16-bit home birds had a technical lead for a short while, and were an even better value proposition for many people a while longer. For some, just as valid, this still holds true.
> And even those top of the line PCs still missed a few tricks that made some genres of games better on Amigas and Atari STs, like fast blitters.
Yes, already addressed by referring to Prince of Persia 2 and Aladdin (1993/94!).
> It wasn’t until the era of ray casting 2.5D 1st person shooter that PCs started looking better than their counterparts.
So, your stylistic (genre) preference maps it into the time between 1991 (with Hovertank 3D in April as well as Catacomb 3-D in November) and Wolfenstein 3D (May 1992). Okay.
With mine it begins earlier, largely because of proper 3D-titles: Deathtrack (1989, PC-exclusive), LHX: Attack Chopper (1990, no Amiga/Atari port), and Red Baron (1990, got the Amiga slideshow in 1992), as well as the odd non-3D action title here and there, e. g. Silpheed (1989, no Amiga/Atari port).
One can probably go even back to 1988, for at least parity in certain markets and their segments, if one compares the technological edge in an intellectually honest fashion, i. e. what the platform, hardware and software, was really technically capable of.
And productivity software, part of the deal, is of course its very own world.
sien 1 days ago [-]
Your point totally stands.
But in the late 1980s, oh my. An Amiga 500 in 1987 was really a lot better than a PC of the time for many things. It was also a lot cheaper. Maybe half the price. The Amiga and the Atari ST didn't improve enough by 1991. By then a PC was better.
But by 1988 the PC was so far outselling everything else that the writing was on the wall.
People who had Amigas and Atari STs couldn't quite understand how their machines, which they perceived as so much better, were being outcompeted by PCs running MS-DOS. On an Amiga 500 in 1987 you had a decent GUI. Until Windows 3, PCs didn't.
For example, Pro-Write on the Amiga had real-time spell checking and was WYSIWYG in the late 1980s. It wasn't until Word 6 in 1993 that Word was really much better.
schlupa 1 days ago [-]
The big advantage we had on Atari and Amiga was that the 68000 could address more than 640K without breaking a sweat. PCs had this annoying limit up until the 90s, and the complexity it introduced was mind-blowing (EMM, EMS, XMS, etc.).
In '87, when I was a student at university, I managed to write all my software on the Mega ST2 and print my papers with Signum! on my 9-pin matrix printer, in a quality my PC colleagues were absolutely jealous of. As said, the advantage was then quickly lost, even if I could still use my 1991-acquired TT up until the mid-90s. But by then the PC was indeed already in another category (CD-ROM, SVGA, sound cards, Win95 and/or NT or OS/2, the beginnings of Linux, etc.). Our poor niche computers couldn't keep up against the sheer mass of the market.
spankibalt 10 hours ago [-]
> PC's had this annoying limit up until the 90s and the complexity that it introduced was mind blowing (EMM, EMS, XMS etc.).
Competent enough people on both ends, end-users and programmers alike, simply worked around that. In the end, it still allowed for a platform of industry-leading applications and games, many of them not available on Amigas or Ataris.
hnlmorg 1 days ago [-]
Exactly.
If you've only used a PC in the 90s, then it's easy to see the Atari and Amiga crowd as rose-tinted fanboys. But they're comparing 90s IBM PCs with 80s competitors.
Really, that says more about how IBM PCs were 10 years behind the competition than about how great IBM-compatibles were.
spankibalt 1 days ago [-]
> An Amiga 500 in 1987 was really a lot better than a PC of the time for many things. It was also a lot cheaper.
Yes, the "bang for the buck" made all the difference. For a while.
gapan 2 days ago [-]
There were others too. At least the Olivetti Prodest PC1 [1], which I had, and the Sinclair PC200 [2], which a close friend had. Other friends had the EuroPC, some had the Amstrad 1512 and other different PC compatible boxes.
I remember my PC1 fondly. Well, I still have it. I learned to code in GW-BASIC, Turbo Pascal and C (in that order) with it. I was using it for a long time, until 1997, for serious work (coding and university assignments), when I finally had the money to upgrade to a Pentium PC.
As much as my world was PC-centric, the first time I saw an Atari ST and what it could do, my jaw dropped. I knew of the Amiga from magazines, but the first time I actually saw one was several years later, after I acquired my Pentium PC and I admit it wasn't that impressive then. But still I couldn't help but think: "wow, you had that in 1989?". There was no comparison with the PCs of its time.
spankibalt 2 days ago [-]
Thank you; interesting machines. A modern, compact, industrial-grade metal keyboard-case PC with an external PSU, and some quality-of-life goodies, built around a Pentium MMX or Pentium II CPU, is really something I would fork over money for. Essentially something along the lines of C64EVO's planned MODLR case. [1]
> But still I couldn't help but think: "wow, you had that in 1989?". There was no comparison with the PCs of its time.
This speaks more of local market realities, e. g. "demo software" on hardware in an actual computer shop or, as in your example, at a friend's home, especially in the form of 2D arcade action games, then at its peak of popularity on 8- and 16-bit home computers... and yet to shine on PCs (and as opposed to glossies and the like, where the first 486 machines stepped onto the stage turn of the year 89/90).
But at that time I wasn't thinking about computers that much, still digesting the beginning of the end of the GDR.
OP's grievances are spot on for the period 1985 to 1990. After that, PCs did indeed gain enough power (386s were mainstream and the 486 had just come out), and VGA started to become widespread. This means that your perception, built especially after 1990, is right, but it doesn't contradict OP's list, as Ataris and Amigas were indeed much, much more advanced and useful than XTs and 286s under MS-DOS/CGA.
hnlmorg 2 days ago [-]
I was predominantly a PC user. In fact I had a side hustle repairing PCs in the 80s and 90s. It wasn’t fun but it was tax free pocket money.
Every time I got to play on non-PC home computers I’d be blown away by how much better those machines were.
These days I collect retro hardware and the Atari STs and Amigas are still easier to maintain.
So my opinions aren’t that of a Amiga fanboy. PCs in the 80s were really that shit.
I do think a lot of the problem was Microsoft though. I never liked any of Microsoft's software, even back then. And that was long before they gained the reputation that so many old-timers like myself still remind people about. I actually wrote my own windowing system in Pascal because I was fed up with early Windows. It wasn't a patch on GEM, but back then I didn't know you could run GEM on a PC.
spankibalt 2 days ago [-]
Fair enough. I never had serious problems with my PCs... or Microsoft's OS offerings. And all that maintenance song and dance around these machines, before Plug & Play became reliable, came almost naturally to us. No surprise here, we were insatiably curious, able to read the manuals, and not afraid to ask.
TheOtherHobbes 2 days ago [-]
The PC industry of the late 80s was a disaster.
We had affordable windowing and GUIs on the Atari and the Amiga, with instant boot from ROM and tons of RAM. The Amiga had the beginnings of multitasking and hardware acceleration for graphics and sound.
Then suddenly the industry decided to go back to a cut-down version of late 70s S-100 computing, with insanely unaffordable prices, crippled specs, bizarre semi-manual memory management, ugly hardware, and a command line interface that was basically CP/M but not as good.
Infuriating.
os2warpman 4 hours ago [-]
> with instant boot from ROM
This was a mistake. It slowed upgrades. The several seconds' worth of speed increase was irrelevant when, to upgrade an Amiga, you had to ship out 2x 40-pin DIPs and floppy disks, as opposed to just floppies.
And due to incompatibilities it was common to install a ROM switcher in your system, especially in Amigaland when kickstart 2.0 came out and you wanted to keep 1.3 so your games would still run. So you had to buy and install a switcher like a MultiStart AND buy and install new ROMs and manage two sets of floppies. This led to a schism in the market where normal people were stuck with an A500 running 1.2/1.3 and its 1980s featureset and power users who wanted space-age luxuries like IDE hard drives were running 2.0.
Practically every word uttered or printed by Commodore about compatibility between the two was an outright lie.
Microsoft had an obsessive focus on backwards compatibility so MS-DOS 5.0 was adopted by practically everybody: just insert the disk, switch to a:, type "install". Done. There were compatibility issues, but they were on a scale that was irrelevant compared to the Kickstart disaster.
Could you imagine in 1992 having a PC and having to install a PCB in the BIOS sockets so you could have both versions 3.02.111 and 4.84.1932 of your BIOS and keep DOS 3.3 and 5.0 boot disks on hand so that you could run Commander Keen and use the newest version of WordPerfect?
I did all of that on the non-PC side of the house, and many others did too. I had (and still have) an A2000 with hard card, flicker fixer, accelerator, RAM expansion, ROM switcher, and other upgrades. I spent thousands of hours tinkering and having fun with the system.
Tinkerers don't make for successful ecosystems.
Fun, yes.
Sustainable, no.
danieldk 2 days ago [-]
My cousin had an Amiga in the late 80s/early 90s. As a kid, I was so incredibly jealous. Everything was better: sound, graphics, the OS, the hardware itself. Even well into the mid-nineties we were messing with tire fires like VESA Local Bus on the PC side.
hnlmorg 2 days ago [-]
Not just Atari and Amiga. Apple and Acorn had excellent machines too.
It was basically just the old 8-bit micros that kept IBM compatibles looking good.
badc0ffee 2 days ago [-]
> a command line interface that was basically CP/M but not as good.
I hate to go to bat for MS-DOS, but it had at least one real advantage over CP/M: a single disk format. As doomed to failure as the various non-PC DOS machines (e.g. Tandy 2000 and DEC Rainbow) were, they could at least share disks.
badc0ffee 2 days ago [-]
I just remembered that the Rainbow only supported single sided disks. While a PC could format and use those, people normally used double sided disks. So, kind of technically compatible but not completely.
rjsw 2 days ago [-]
You could have a GUI on a PC as well. I developed GEM applications on an Olivetti M24 before I got my first Atari ST.
The Olivetti had a B&W 640x400 monitor and a Logitech mouse that plugged into the back of the keyboard. You could replace the 8086 CPU with an NEC V30 for a bit more speed.
hulitu 2 days ago [-]
> Then suddenly the industry decided to go back to a cut-down version of late 70s S-100 computing,
Just like today's GUIs. They all look like Windows 1.0
tom_ 2 days ago [-]
Cheap and ubiquitous is what people want, and if PCs became successful because people other than IBM started producing the same hardware, then perhaps Commodore should have done that too. Software seems to have proven itself the better moat, and so maybe AmigaOS could have been the thing that would tie this hypothetical Amiga-compatible market together, keeping Commodore alive.
They'd have to have been a bit more careful about it than IBM were.
I am confident it would still feel like everything is terrible.
crq-yml 1 days ago [-]
The thing that really drove the PC era was that the commodity desktop spec was rapidly gaining capability, compilers for it were getting good enough to depend on high-level languages, and the most affordable way to meet the market where it was, was not to build custom (as had been the case in the 80s when looking at stuff like video editing systems, CGI, digital audio, etc.) but to use Microsoft as a go-between. All the heralded architectures of the 80s were doing things more custom, but it amounted to a few years of advantage before that convergence machinery came along and stripped away both customers and developers.
Apple did survive in that era, though not unassisted, and the differentiation they landed on (particularly after Jobs came back) was to market a premium experience as an entry point. I think that is probably going to be the exit from today's slop.
In this era, spec is not a barrier - you can make <$100 integrated boards that are competent small computers, albeit light on I/O - and that means there's a lot more leeway to return to the kinds of specialty, task-specific boxes that the PC's convergence swept away. There's demand for them, at least at a hobbyist level.
For example, instead of an ST and outboard synths for music, you could now get an open-source device like the Shorepine Tulip - an ESP32 touchscreen board set up with Micropython and some polished DSP code for synths and effects. It's not powerful enough to compete with a DAW for recording, but as an instrument for live use, it smashes the PC and its nefarious complexities.
msk-lywenn 2 days ago [-]
U+D for a button means you can't go up or down at the same time, or is there a trick?
tom_ 2 days ago [-]
You can't go up and down simultaneously on a joystick anyway! The stick can't go both ways at once. So this gives you the option of reusing these signal combinations for other kinds of input.
msk-lywenn 2 days ago [-]
I meant pressing the u+d button and actually going up (pressing just u, which means not d) at the same time.
tom_ 2 days ago [-]
Oh, good point, yes. It's just a combination of inputs rather than some kind of stream of states, isn't it? I'm rusty.
The 9 pin pinout has 2 spare button inputs anyway. Maybe it'd be feasible to use those.
msephton 2 days ago [-]
I think what was meant was using fire+directions to trigger additional virtual buttons. Many games did this in software anyway.
alexisread 2 days ago [-]
Just that you can't go up and down at the same time, so it's a valid electrical signal for another button, and importantly standard so games would take advantage of it.
toast0 2 days ago [-]
Sure, but if U+D is button 2, you can't register up or down and button 2 together. You need another wire, a serial protocol, or whatever Sega did for the 6-button Genesis/Mega Drive controller (I think a toggle?).
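In decode terms the clash is easy to see. A hypothetical sketch (the bit layout and masking are mine, not the actual ST port registers):

    #include <stdbool.h>
    #include <stdio.h>

    enum { UP = 1, DOWN = 2, LEFT = 4, RIGHT = 8 };

    int main(void) {
        // Pad asserts the impossible U+D combination to signal button 2.
        unsigned port = UP | DOWN;
        bool button2 = (port & (UP | DOWN)) == (UP | DOWN);
        // While button 2 is held, genuine up/down input is masked out:
        bool up   = !button2 && (port & UP);
        bool down = !button2 && (port & DOWN);
        printf("button2=%d up=%d down=%d\n", button2, up, down);
        return 0;
    }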
brulard 2 days ago [-]
This would be my question as well. If I go up while pressing this button 2, then the output signal cannot tell whether I'm going up or down or neither.
Obviously it'd be better to have a protocol like the Mega Drive's, but given the setup in the ST, this is a hack that avoids having to change the ST hardware.
toast0 23 hours ago [-]
That works for Metal Slug's grenade because a) you don't hold down the grenade button, and b) you don't have a lot of grenades, so you don't use them that often, so you won't really notice any weirdness -- like, if you duck and throw a grenade simultaneously, the duck doesn't happen until you let go of the grenade button.
If the up+down button was used for firing the main weapon, this wouldn't work, especially when you've got the machine gun and you can just hold the button down. You wouldn't be able to duck or aim up while shooting, or it would be unreliable.
If you were trying to use it for a three button fighting game, I think you'd have problems too. Especially if you are doing 'negative edge' techniques where you would hold a button while inputting the directional sequence and release the button to trigger the special.
cmrdporcupine 2 days ago [-]
I mean the whole point of the ST was "rock bottom price" and a lot of the things you're talking about would have raised the BOM significantly, or delayed its introduction by precious few weeks or months.
Beating the Amiga to market, and beating it on price were super important.
But I do think there was a serious problem with follow-through. The Blitter and GDOS, and then the STe, took too long to arrive. The Blitter never became standard, and the games and software suffered for it. And updates to the operating system were slow and thin until it was way too late.
I do agree that the cartridge port thing -- it being limited to 128kB expansion -- was needless. One more pin, even, would at least allow for a proper OS upgrade via cartridge port! Definitely one of the stupidest design decisions on the machine.
alexisread 2 days ago [-]
You have a point, but the ST bottleneck during development appeared to be the software, so there was possibly space for hardware tweaks. The BOM would go up slightly but remember they would have saved on developing and shipping a separate disk drive which would cover a lot of these changes.
Realistically it's amazing the ST was as good as it was, given the 6 month development time and the kings of penny pinching at the helm :)
rbanffy 2 days ago [-]
> move the joystick ports to the right
I think they'd be better on the back, unless you are supposed to plug and unplug them all the time.
> Finally, they should have included the blitter socket
That would be hard without having a functioning one first. The blitter would also be handy for a number of things, from PCM sound to network and disk transfers.
alexisread 2 days ago [-]
On the right (i.e. mirroring the STE's 15-pin ports) would mean they could keep the electrical layout, i.e. connected to the keyboard, and hence allow a separate keyboard unit. For 2-player games you would be swapping it a lot; there were adapters just to help with this.
I agree it would be difficult to design a correct socket, but from interviews it was always the plan to have a blitter, and a socket as standard would have helped adoption.
The main thing is that the T212 is a great coprocessor, faster than the 68881 FPU and with a 2K cache. Introducing the transputer as a coprocessor would potentially have changed the computing landscape.
jlokier 2 days ago [-]
Wow, there are a lot of C compilers for Atari ST!
I was astonished to find about 22 distinct C compilers, including their own libraries, assemblers, linkers etc. for the Atari ST and its successors. That's not counting separate versions, just distinct products from different vendors.
From what I can see now looking at archive sites, there was a huge amount of activity in developer tools on the ST back in the day. Much more than I thought at the time. It might have been a serious contender for the dominant architecture (along with the m68k CPU), if IBM PC-compatibles and x86 hadn't won.
Recently I looked for Atari ST C compilers, out of curiosity to test portability of a C program I'm working on.
I've been testing C code for diverse Unix systems.
As I used to own an Atari 520ST (with 1MB RAM beautifully piggy-backed and hand-soldered on the existing RAM chips :-), it seemed like a good idea to peek at C on an ST emulator. I didn't use C when I had a real Atari ST (no C books in my local library), so I expected to find one or two C compilers, not 22!
gramie 2 days ago [-]
I think that I used the Megamax C compiler back in 1987-8. I was just messing around and experimenting, not programming professionally, but it worked well for me.
chiph 1 day ago [-]
I had that. Later renamed Laser C. Being a poor college student I used a Radio Shack floppy disk drive for a second 3.5", hand-soldering the DIN connector the ST used. Afterwards I could have my compiler tools on one and source/object code on the other - a huge time saver.
Try and get a compiler and linker to fit in 360k these days!
cmrdporcupine 2 days ago [-]
There were plenty of C compilers but only a couple were in common use.
If I recall, Lattice C was popular. Mark Williams was another one. "Alcyon C" was included I think in the ST development kit, but was considered poor.
I think people use "Pure C" these days, but of course also GCC is likely best:
http://vincent.riviere.free.fr/soft/m68k-atari-mintelf/
It is maintained by Vincent Rivière, who is a major contributor to EmuTOS.
pjmlp 2 days ago [-]
Atari vs Amiga was such an interesting time in computing history.
When I see generations that grew up with game consoles talk about the current uptake of desktop games, they really have no idea what they missed out on in home computing and the first wave of indie game development, from bedroom coders.
vanderZwan 2 days ago [-]
> they really have no idea what they missed out
Tangent: the older I get, the more it annoys me that this expression kind of implies a failure of young people to study history, when I feel like it's more the responsibility of previous generations to preserve and pass down history for them to learn from. Especially because it's usually people in power in some form who are trying to keep the newer generations naive here so they can be fooled again.
Not saying that this interpretation was your intent (in fact I suspect it's the opposite), just directly expressing my annoyance at the expression itself.
flohofwoe 2 days ago [-]
> when I feel like it's more the responsibility of previous generations to preserve and pass down history for them to learn from
But everything has been preserved and passed down. The entire home computing phenomenon has been archived and is available on the internet thanks to the rampant 'software piracy' which was common at the time, and the detailed schematics and manuals coming with the computers (which have all been digitized and are available on the internet). Even my obscure KC85 games I wrote as a teenager and 'distributed' on cassette tapes by snail mail are available as downloads, because some kind person(s) digitized all that stuff during the early 90s and put it on some 'underground' game download portals.
The 80s and early 90s home computer era will be better preserved than anything that came after it.
mrandish 2 days ago [-]
> The 80s and early 90s home computer era will be better preserved than anything that came after it.
Indeed. Sadly, many more recent games will probably be lost to time forever due to DRM, online service components going offline or never being distributed on physical media in the first place. As someone into vintage computers and preservation, I worry that future generations may look back and see the late 2010s and certainly the 2020s as a 'dark age' when surveying the history and evolution of digital gaming. All we'll have are YouTube videos (assuming those survive future business model tectonic shifts) but no ability to actually experience the game play first-hand.
Recently I've been exploring the back catalog of more obscure PS3 and X360 games via emulation and have found some absolutely terrific titles I never even knew existed. Some of them were only ever sold through the console's online store and never available on physical media. With the Xbox 360 and Nintendo Wii stores now long offline, only the PS3 store remains available - and who knows for how much longer, since Sony already announced its closure once and then changed their mind. There's now a race to preserve many of these titles.
mrandish 2 days ago [-]
> a failure of young people to study history
The good news is that not only was almost all of it preserved, teenagers today are really interested in retro gaming. My 15-year-old daughter, who's not into computers any more than any other 15-year-old girl, just asked if she could go with me to the vintage computer festival this summer. She tells me her friends at school are all interested in running emulators to play classic games, from arcade to SNES to PS2 and N64.
I guess the 'dark lining' to that silver cloud is that this interest from teens in retro gaming is partly thanks to the increasing downsides of modern gaming (cost, DLC, ads, hour-long download/installs, etc). While game graphics continue to get more and more impressive, stuff like real-time path tracing doesn't seem to excite teens as much as it does me. Ultimately, it's about game play more than visuals. Lately I've been exploring the immense back catalog of N64, PS2, PS3 and X360 games via emulation and there are some incredible gems I never even heard about back in the day. It's especially great now thanks to the huge variety of mods, enhancements, texture packs, decompilations/recompilations and fan translations. And current emulators can upscale and anti-alias those games even on a potato desktop or laptop with a low-end discrete GPU.
pjmlp 2 days ago [-]
Understandable, hence why many of my comments kind of look like mini-history lessons, and I tend to be pedantic.
However curiosity also plays a big role.
If I know so much about computing history since the 1950s, it is because I do my research and take advantage of all the archives that have been placed online; I certainly wasn't around to live through all of it.
kenjackson 2 days ago [-]
I never thought that statement was actually about having an “idea”, but more about not actually having lived through the experience. Quite the opposite of your belief: no amount of study would allow them to understand what it was like.
msephton 2 days ago [-]
The Atari ST was foundational for me. I loved it so much, learned a great deal, discovered and played many great video games, designed page layouts, made my own software and games. In December 2023 one of my recent games was listed in Ars Technica's "Best Games of 2023" alongside Super Mario Wonder and I can draw the line right the way back to my time on the Atari ST. https://news.ycombinator.com/item?id=38372936
zabzonk 2 days ago [-]
I had a 520ST back in the mid 80s. I would have killed for a Mega ST, but I couldn't afford one and realistically needed an IBM-compatible PC, which I eventually got.
Things I remember about the 520ST:
- Those horrible diagonal function keys. There was no reason for them to be diagonal, rather than normal keys as they were on the IBM. But I've always hated function keys.
- Games like Dungeon Master (really still quite a good game today).
- Not a bad C compiler, but I can't remember who by - LightSomething?
- The GEM GUI was not so bad, but using it with a floppy disk was.
But all-in-all I was quite happy to get my PC-compatible to do serious work with.
jjbinx007 2 days ago [-]
Those function keys were bad but why have the joystick and mouse ports underneath in that location? Awful.
robinsonb5 2 days ago [-]
Ironically, speaking as an Amiga guy, those diagonal function keys were an aspect of the ST I really liked!
I don't know if they were consistent with the other keys in terms of feel, but they were a striking, unique design feature that instantly identified the machine as being Atari without compromising practicality.
zabzonk 2 days ago [-]
Yeah, I forgot about that. But I suppose you didn't need to replug them very often, and it wasn't much worse than plugging into an IBM PC before USB came along. And at least the Atari had lots of useful ports.
cmrdporcupine 2 days ago [-]
Mouse cables and ports routinely broke on those machines because of the poor design.
I guess the idea was to have a clean design with cables out of the way, but it really was a bad place for them.
doop 2 days ago [-]
Laser C or Lattice C maybe?
throw_m239339 2 days ago [-]
I did prefer the Amiga, but I still got an Atari ST for a very obvious reason: it had MIDI DIN ports and was way cheaper than most digital sequencers at the time.
It's funny how some young producers today wonder "how did people do it without a computer before the 2000s?"... well guess what, we did use computers! I cannot however remember what software sequencer I was using; I know it had MIDI effects (like MIDI echo), that's all I remember.
And by 1998, Logic was fairly advanced anyway and even had plenty of plugins.
Ylpertnodi 2 days ago [-]
>I cannot however remember what software sequencer I was using, I know it had MIDI effects (like MIDI echo),
Possibly/probably Cubase.
Anyone remember the Mike Hunt version?
I'm still using cubase on a nice pc, but I miss the stability of the atari.
cmrdporcupine 2 days ago [-]
The Mega ST (not the Mega STe) is the best quality machine of the series. Mechanical cherry keyswitches on the keyboard. Easy access in the case to expand (though the Mega STe/TT had standardized VMEbus and the Mega ST was its own custom thing).
The Mega STe had a funkier case, VMEbus, and upgraded specs, but mushy rubber dome keyboard, more brittle plastics.
I like to collect the Mega as the best of the bunch, personally.
schlupa 24 hours ago [-]
The Mega ST keyboard is indeed the best keyboard of the whole Atari family. My TT keyboard has its rubber dome getting stiff with age. This said, the Mega ST keyboard has one big flaw: its plastic is getting extremely brittle and fragile with age. I had one keyboard drop 1m (3ft) to the floor and it exploded like a glass vase. So if you have a Mega ST keyboard, be very careful to handle it gently.
cmrdporcupine 19 hours ago [-]
The Mega ST keyboard I have is German QWERTZ unfortunately, so despite having built a USB adapter for it, I never use it.
schlupa 24 hours ago [-]
The expansion slot of the Mega ST is just 2 rows of 32 pins that are connected 1:1 to the CPU pins. Any extension that was supposed to be soldered on the CPU could be put in the slot with a simple adapter (see e.g. the Volksfarben ISA slot adapter for ET4000 VGA cards).
b800h 2 days ago [-]
Every time I see anything about the varieties of ST, it gives me PTSD over the woeful keyboard. It was like a trampoline made of fudge.
This says that the keyboard on the Mega ST was better. And yet still not good enough. Egads, that ST mess was a terrible keyboard.
hashmash 2 days ago [-]
The main problem with the keyboard was the non-standard size of the keycaps. The standard distance between keycaps is 0.75 inches, and the standard top width is 0.5 inches. The Atari ST keycap distance is standard, but the top width is 0.625 inches. Because of this, if your finger isn't exactly centered over the top of the key, it hits the adjacent key too, leading to key jam.
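A quick check of the arithmetic, using only the dimensions above (the clearance between adjacent keytops is the pitch minus the top width):

    pitch, std_top, st_top = 0.75, 0.5, 0.625   # inches, from the figures above
    print(pitch - std_top)   # 0.25 in of clearance between standard keytops
    print(pitch - st_top)    # 0.125 in on the ST -- half the finger clearance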
whobre 2 days ago [-]
Well, my first computer was a 48k ZX Spectrum, and after that experience Atari’s keyboard looked like heaven.
Still liked the Speccy better…
rjsw 2 days ago [-]
The Mega keyboard was good, comparable to a current mechanical one.
cmrdporcupine 2 days ago [-]
Mega ST keyboard is Cherry MX switches. Good quality.
I have an adapter on mine that converts it to USB and I can use it on a modern computer.
Though I never do. Mainly because it's got Ctrl/Alt[Meta] but nothing I could map to Hyper/Super.
Mega STe and TT reverted to terrible mushy rubberdome.
Ozarkian 2 days ago [-]
I owned one of those! I sold my 1040ST to my friend to get some of the money for a Mega 2.
I liked how the keyboard was detachable and the hard drive was the same size as the motherboard case, so you could stack them.
layer8 2 days ago [-]
The positioning of the cursor keys on the Atari STs is interesting [0]. It arguably makes sense for the cursor block to be located more in the vertical middle rather than at the bottom edge of the keyboard.
[0] https://upload.wikimedia.org/wikipedia/commons/5/54/Atari_10...
I once played Ultima 6 from a RAM disk on an ST with 4MB RAM. The game installed from 'Hard disk' to the RAM disk - it didn't realise the difference. Then I used bigger-capacity floppy disks (940KB I think) and a fast copy utility to get those 3 disks to start the game; when done, I'd save the game and copy it all back off to the 3 floppies. It was totally fast!
pavlov 2 days ago [-]
”I don’t recall seeing Atari specifically market the Mega ST to developers, but I suspect a lot of developers found the better keyboard and extra RAM to be worth the upgrade.”
There wasn’t such a thing as a general developer market.
When you didn’t have internet and cloud services and free Unix, how could you develop for anything other than a specific platform and device?
If you bought a Mega ST to write programs, your target audience were still only the people who had a regular ST. You couldn’t reach anyone else. So the advantage was minimal.
The idea that there can be a developer market separate from the baseline end-user platform is quite new. It emerged around 2007-2010 when web apps became a realistic option and you didn’t have to be on Windows to target the 90% of people who are on Windows.
schlupa 24 hours ago [-]
I bought a Mega ST2 because I studied CS and wanted to become a developer. I sold the Amiga 500 my father had bought me. The ST was cheaper for programming than the Amiga 500, which would have needed at least a second floppy drive and a lot more memory. Furthermore, I hated Workbench, the GUI of the Amiga (for the same reason I also hated Windows 3.1: you had to use a special program to access the files on the drives, and you had icons in the windows only if someone had specifically drawn a special icon. I preferred how on GEM and the Mac Finder, windows would directly show what's on the disk).
rocky1138 2 days ago [-]
As a counterpoint, Doom was developed on a different platform than the target. Let's say that the lead dev was ahead of the curve.
bdcravens 2 days ago [-]
To say that John Carmack was ahead of the curve is quite an understatement lol
bartread 2 days ago [-]
I have to say, I love the industrial design of these 80s machines from Atari, and also from contemporary Commodore offerings. As an example, the original Amiga 1000 is beautiful, and I'd be incredibly happy to have a machine with that form factor, but equipped with modern internals, today.
Honestly, I don't think even Apple could touch the best of Atari and Commodore industrial design in the back half of the 1980s. To be blunt, the early Macintoshes simply weren't practical in their design: for starters, a tiny monitor - that was originally black and white (which in 1984 was already kind of a joke) - and very limited upgradeability, relatively poor multimedia capabilities (speech synthesis was no more than a gimmick that was also available on other platforms), and then the whole aesthetic just wasn't that pleasant.
And I say this as someone who, personally, has only owned Apple machines for the past 15ish years, so I'm not exactly coming at this from a "not a fanboi" perspective. I'd still take 1980s Atari or Commodore aesthetic over modern Apple, or modern anything else for that matter[0].
Also, as an aside, I really enjoyed seeing "Atari Means Business with the Mega ST" as the top headline on Hacker News in 2025. Even on a Sunday when content typically tends to be more varied and interesting this was still an entertaining surprise.
[0] I suspect the reality may be that I'm an "anything but Wintel" kind of person, although not at any cost, because I did run PCs exclusively for 11 or 12 years. They never really helped me enjoy computing in the way the other machines have though.
icedchai 2 days ago [-]
I always felt the Amiga 3000 was Commodore's high point in terms of industrial design. Still, the keyboard garage in the 1000 was neat!
II2II 2 days ago [-]
The industrial design of PCs may have been lacking in beauty, but it was almost always practical.
For example: I cannot think of any desktop models that lacked internal expansion. They may have used a riser card to stack in two or three slots sideways, but the slots were there. The design may have been crude, but at least your desktop wasn't turned into a disaster every time the technological landscape shifted: when hard drives became affordable, when the world switched to 3.5" floppies, when you decided to use online services or send faxes directly from your computer, or when you wanted a CD-ROM drive or cable Internet.
ggm 1 day ago [-]
My memory is that one of the Atari units was capable of being treated as a Jerq or Gnot; I recall people running core-wars on it. The keyboard looks familiar, but I believe there was a packaging which had the CPU inside that unit, not just in a pizza box.
rbanffy 2 days ago [-]
Someone should make the Atari ST Max built into that keyboard.
And with emulated VME graphics, with an HDMI output and a USB-C port. And 3-button mouse. Able to run Atari Unix.
j45 2 days ago [-]
The Ataris still look like a great first computer for kids.
Beijinger 1 day ago [-]
But does it run MidiMace?
djmips 1 day ago [-]
(1987) ;-)
sheepscreek 2 days ago [-]
Atari “means” business..? I can’t get over the present tense in the heading. I double-checked if they had actually released something new. It feels like clickbait, and I wish it weren’t.
I think there's a very strong future in emulation for achieving FPGA-like latency: using a Raspberry Pi Pico/Pico2 to emulate each of the target machine's subsystems/chips.
antirez mentioned running some of these on RP2040s
Oh wow! I remember hearing that Oculus were doing this on their devices and thinking it was new.
Latency would be a Basic Feature. Once you get below 7 ms (or 5 ms, or even 3 ms if you absolutely insist) you're happy; above that, everything is absolutely unusable.
The Atari has absolutely stable timing with extremely low jitter. Some guy measured it at 1µs. Cannot find the link though, sorry.
So the Atari has low latency, around 2-4ms, with extremely low jitter. This is exactly what you want from a MIDI clock and sequencer driving multiple MIDI devices.
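For a sense of scale, a back-of-the-envelope calculation using only figures from the MIDI spec (24 clock messages per quarter note; 31250 baud; 10 bits per byte on the wire):

    # MIDI clock timing at 120 BPM (24 PPQN per the MIDI spec)
    bpm = 120
    tick_ms = 60_000 / (bpm * 24)       # ms between clock messages: ~20.833
    byte_us = 10 / 31250 * 1_000_000    # wire time for one byte: 320 us
    print(f"tick every {tick_ms:.3f} ms, one byte takes {byte_us:.0f} us")
    # A ~1 us jitter is roughly 0.005% of the tick interval at this tempo.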
Look, I'm not trying to convince you to get rid of your Ataris, quite the contrary. I'm just disagreeing that it's impossible to have low jitter nowadays, but I fully agree that things used to be simpler before everything was done via USB.
Here's a review from a nice Scotsman explaining how this works:
https://www.youtube.com/watch?v=XCZqkSH9peI
or a walkthrough from the creator:
https://www.youtube.com/watch?v=hkw9dmLfkZQ
Note that this is an old version; I just saw that there's now the "Nome II", and at least for Mac, he has actually developed a USB protocol to provide a stable clock (which, as you've already written, is totally possible via USB, it's just that nobody cared enough):
https://midi.org/innovation-award/u-sync
For Windows, the sync is still done via audio through a special VST.
The YT channel by the creator has much more interesting stuff; he has also done very precise jitter measurements, see for instance:
https://www.youtube.com/watch?v=zGU336yKyEM
Regarding "midi notes" Sim'n Tonic himself is saying this to the Midronome: "Note that only these MIDI messages are simply forwarded when they are received, their timing is not changed. So if your DAW sends them with a lot of latency and/or jitter, the Midronome will forward them with the same latency/jitter. Actually this is a problem I plan on tackling as well [...]"
So the Midronome does not solve the problem of inaccurate midi notes coming from a modern DAW. The USAMO does by the way.. But only with one midi channel at once. And of course, coming back to the actual topic, the Atari hasn't a problem at all with accurate midi notes, it is absolutely tight at all 16 channels. So it seems there is indeed nothing comparable to the Atari nowadays. Maybe it will in the future.
Can Nome II send MIDI Notes?
Nome II is like a MIDI hub, you can ask it to forward any MIDI sent over USB to one of its MIDI outputs. It will not only forward these instantly but merge them smartly with the MIDI Clock, without affecting it.
This is simply not true. Many performers use Windows laptops and MIDI to control their stage equipment without issue.
MIDI is a serial protocol.
At any given time only one message can be sent down the wire. [1]
So on the beat, an implementation can send either the clock pulse or note on or something else. [2]
If you send the clock everything else has to wait. If you send something else, the clock has to wait.
Now with modern computers, you are also dealing with USB, which is a polled, shared-bus protocol and has to coordinate with everything else a modern kernel does.
Music is hard.
[1] premium hardware sequencers sometimes have two or more Midi Out to reduce contention.
[2] Midi Time Code solves this by encoding monotonic time into Midi and is how serious sync is done over Midi, e.g. in Hollywood
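A toy model of that contention, under the standard wire format (31250 baud, 10 bits per byte, so 320 µs per byte): if a clock byte and a three-byte note-on fall due at the same instant, whichever is queued second waits out the other's wire time.

    # Toy model of the shared MIDI wire: one message at a time, 320 us per byte.
    BYTE_S = 10 / 31250   # 1 start + 8 data + 1 stop bit at 31250 baud

    def transmit(messages):
        """messages: (due_time_s, payload_bytes) pairs; returns per-message delay."""
        wire_free, delays = 0.0, []
        for due, payload in sorted(messages, key=lambda m: m[0]):
            start = max(due, wire_free)            # wait for the wire to go idle
            delays.append((payload.hex(), (start - due) * 1e6))
            wire_free = start + len(payload) * BYTE_S
        return delays

    # A clock byte (0xF8) and a 3-byte note-on fall due at the same instant:
    for name, delay_us in transmit([(0.0, b"\xf8"), (0.0, b"\x90\x3c\x64")]):
        print(name, f"delayed {delay_us:.0f} us")
    # -> f8 delayed 0 us; 903c64 delayed 320 us: the note waits for the clock.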
I’ve got a full studio at home, but tbh I never know what people mean by this
You are concerned about a 31250 baud protocol.
There is zero 'shame' on the 'present future' when it comes to music production tools. It is like one of the hugest bright spots/biggest equalizers. Best thing I did was go ITB. No headaches. No hardware maintenance on obscure hardware. No MIDI limitations or even considering of my MIDI chains. Just music making.
No crashes what so ever and also no distractions by updates or "phone home applications"
which is something I would guess most people would indeed see as shameful regarding our present future in software, but OTOH, this is HN, so who knows.
I am much happier with my setup - one I could never have afforded in hardware - than with my much lesser previous setup. This being HN, I'm sure there are people who can afford to spend much more than me on gear, so they might prioritize the minor differences you list over no access at all, but I much prefer having access.
As an aside, all-digital workflows take the joy out of music being made in the moment, by ear and by feel. There is no replacement, for example, for a professional sound engineer adjusting a mix strictly by the sound in their headphones and the feel of the sliders under their fingers.
No they cannot.
Regarding jitter: this is the worst, because the brain cannot adapt to the changes, whereas constant latency can be compensated for somehow by the brain.
If the all-in-one design was used from the off (saving on moulds and on shipping disc drives) they could have done a pro conversion kit:
https://youtu.be/atw3FYKzog4 Also, move the joystick ports to the right rather than under the keyboard.
A few tweaks here and there would have pushed it a lot more:
Unified clocks for genlock and scrolling: https://youtu.be/yexNdSLEpIY?si=pa46sJOr_9Fin4LC
Stereo output like the CPC464: https://m.youtube.com/watch?v=0yY4BlPfLf4
AMY included, along with a 1bit PWM on the DMA chip for DMA sound.
The ST had a janky RTC anyway: https://atariage.com/forums/topic/303859-battery-pack-inside...
3 button joypads from the start, using U+D and L+R combinations for 2 more buttons
Double-sided drive from the start.
Finally, they should have included the blitter socket and a full 2x32pin expansion instead of the cartridge port. The blitter socket especially would be handy to drive a T212 transputer in '87, when the blitter was available, instead of producing the ATW.
I quite liked the STe. The mono monitor was great, RAM upgrades were easy, and they'd improved some of the display hardware's worst limitations. Even though TOS was never especially good, they'd fixed all the worst bits by that point.
Still could have benefited from some other extra hardware and OS tweaks though I think.
- 800 KB disk format supported directly by the OS
- blitter is not as useful as it could be, due to sharing bus time with the CPU. It should be able to use ACSI bandwidth if not in use/Shifter bandwidth during non-display periods, so it can run in parallel with the CPU
- 256 px 5 bitplane mode (so still 40 words per line), probably an EHB kind of affair if 32 palette entries would be too much
- something to improve endless scrolling? No carry out of bit 15 when computing Shifter address? You'd end up wrapping before the display period was finished if increasing the display stride, but you can work around that in software...
- put the 9 pin joystick ports on the side
- write signal for that stupid cartridge port that is almost (but not quite) useful for general purpose expansion
I saw Atari ST in music studios well into the late 90s/early 2000s because back then quiet beige PCs weren't a thing yet: PCs virtually all came with super noisy fans, which was a big no-no for music studios.
A buddy would bring his Korg synth to my neighbour's house and hook it to their Atari ST. Another dude I remember would play the drums from Dire Straits songs from his Atari ST hooked to some MIDI gear, and then he'd take his guitar and play the Dire Straits songs.
These were the days.
I'm not surprised some musicians still use them. If I'm not mistaken, Kavinsky (who became famous after the movie Drive came out, but recently saw renewed interest after he performed at the Olympic Games' ceremony) started making music at a late age, on an Atari ST a friend of his gave him.
As an anecdote PCs were so noisy that I asked my neighbour (an electrical engineer) if it was possible to come up with a system where the fan would slow down when the CPU wasn't too hot: and sure enough we were then modding our PSUs with "thermistors" and we'd be calibrating our tiny hack, no shit, with boiling water in the kitchen (ah clueless teenagers). Funnily enough about 10 years later every single PSU had variable fan speed.
That's the thing: we were used to quiet 8-bit and then 16-bit computers and when we had to move to these piece-of-shit PCs (but with fast CPUs / FPUs and that were upgradeable), we had to endure these painful ultra noisy CPU/PSU fans (and HDDs).
So the Atari ST just made sense. You could have these super fast (compared to the Atari ST) PCs, but they were noisy, fugly, unbearable pieces of shit that the cool guys in music studios simply wouldn't tolerate back then.
Now of course at some point PCs became just too good and several brands started focusing on quietness and it was then possible to have a totally silent PC, both looking cool and being literally cool (big heatsink, quiet fans, etc.).
But yeah the Atari ST was and still is certainly for some a thing for creating music.
Lots of respect to the Atari ST for its MIDI ports (and that comes from a Commodore Amiga owner and fan).
I never really understood why people thought this was a big deal. I had my Amiga hooked to a DX7 synth with a serial-to-MIDI cable that had a couple of active parts in it. MIDI is a serial protocol and the Amigas had full RS232 ports with hardware interrupts, +12v, -12v, as well as audio in and out on unused pins. The serial-to-MIDI In/Out cable cost around $15 more than two MIDI cables. You can still buy them today: https://retroready.one/products/ka12-serial-port-midi-interf....
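The software side of such a cable is equally simple, since MIDI is just a 31250-baud byte stream. A rough modern equivalent using the pyserial library (the port name is an assumption, and the serial driver has to accept the non-standard baud rate):

    import serial  # pip install pyserial

    # MIDI is an ordinary 31250-baud serial stream; a serial-to-MIDI cable
    # only adapts the electrical (current loop) side, not the data.
    port = serial.Serial("/dev/ttyUSB0", baudrate=31250)  # hypothetical device

    port.write(bytes([0x90, 60, 100]))  # note on: channel 1, middle C, velocity 100
    port.write(bytes([0x80, 60, 0]))    # note off for the same note
    port.close()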
While the Amiga could do multi-tasking, applications could also optionally take over the entire machine, which many games obviously did but so did real-time video applications like the Video Toaster, animation, audio and music apps. Lots of Amiga games and video apps "raced the beam" in real-time without ever dropping a field of 60 fps video. So, at least in principle, the Amiga hardware was as capable as the Atari ST in terms of ability to respond to interrupts at MIDI rates. The Amiga also had video editing software, which back then, involved controlling three professional VTRs in real-time over that single serial port to do A/B roll editing on precise timecodes 29.97 times a second.
So, yeah, I agree that the Atari totally won the MIDI production music market because it had much better software applications. If you were primarily a music producer or studio, there was certainly no reason to pay more for an Amiga's custom graphics capabilities - and if you were serious, the Amiga's more limited music app selection made the choice for you. My only quibble is that, IMHO, the claims of 'MIDI jitter' somehow being endemic to the Amiga were more Atari marketing FUD than reality. I don't doubt that some user at some point did something on an Amiga that caused MIDI timing issues, but it wasn't because of some fundamental hardware limit. It was due to configuration, some app incompatibility, bug or some other solvable issue - because similar timing issues could occasionally happen on the ST too - and then would be solved.
There were ways to do non-interlaced video on the Amiga, but just like having to buy external MIDI adapter ... more $$, more hassle.
That and floppy format compatibility with MS-DOS made it easier to move data around.
DOS was crap, when you had GEM and AmigaOS.
Windows 1 and 2 were beyond terrible.
They were shit for games
They were bulky
They were slow
They crashed all the time.
They were ugly
They were noisy.
They were hard to manage (autoexec.bat, no OS in ROM, stupidly tricky partitioning tools, incompatible drivers for the same hardware but in different applications, etc)
But IBM lost control of the hardware market so they became cheap, ubiquitous crap.
And that’s literally the only reason we got stuck with PCs.
Please don't take it too harshly, but your list of grievances is almost radically different from my experiences of personal computing in the late Eighties to mid-Nineties... to me something of a fascination all of its own. In my little corner of the world the Amiga was virtually nonexistent [1], largely undesirable, and prophesied to be a corpse as early as 1991.
I'll give you one thing, tho: A mostly non-plastic, compact keyboard computer case (Amiga 600-like) for a suitably powerful IBM compatible would've been awfully nice. Still would be, for a vintage bird that is. We only got "Schneiderplastik", the Euro-PC to be more precise [2], and that one wasn't updated in a satisfying fashion.
1. The only people I knew that had Commodores were two of my best friends, one with a Commodore 64, the other with a 128. The demosceners I grew up with were Atari ST guys, all of them (becoming) musicians.
2. [https://de.wikipedia.org/wiki/Schneider_Euro_PC]
People got accustomed to whatever personal computer they used every day and many grew fond of it. After all, the power and capability of a desktop computer in the 80s was unprecedented and, for many, revelatory. That said, in the mid-to-late 80s, the PC platform was generally under-powered dollar for dollar compared to most of its leading competitors, most of which were based on Motorola 680x0 CPUs. The strength of the PC during this time was its rapidly expanding library of business software applications and hardware expansion options in the form of add-in cards (something which Apple had with the Apple II but abandoned for a while with the Mac, which the Atari ST never really had, and which only the "professional" series Amigas had: the A2000, A3000, A4000).
Being underpowered per dollar doesn't mean the PC couldn't be extremely useful or the best platform for a given scenario and it certainly doesn't mean there weren't hobbyists who used and loved their late 80s PCs as dearly as any other 80s personal computer owner. Of course, this power balance was largely addressed by the mid-90s - which is why the PC juggernaut then extinguished the Amiga, Atari and (very nearly) the Mac.
Anyway, off to some specifics:
> "The strength of the PC during this time was it's rapidly expanding library of business software applications and hardware expansion options in the form of add-in cards [...]".
A standardized general-purpose computing platform "for the future". Exactly what spoke to me, as disseminated in the publications I read as a kid in 1991.
> "Of course, this power balance was largely addressed by the mid-90s - which is why the PC juggernaut then extinguished the Amiga, Atari and (very nearly) the Mac."
"Power balance"? I didn't think in such abstracts when I made my choice, and conducted the long and merciless attrition-lobbying campaign for financial support, to buy a PC. The Amigas and the Ataris were simply not a factor for a variety of different, but very tangible and practical reasons:
Atari ST? I was not (on my way to become) a musician with the need for a precise and affordable backpack-portable computer instrument.
Amigas? The big birds were seen, outside of some specialist-niches, as uneconomical compared to their IBM-compatible brethren.
The vanilla home computers were seen as affordable, but extremely limited, racking-up (hidden) secondary costs to make them more usable. Often enough they carried a certain cultural stigma as well, being perceived by our financiers as gaming toys and therefore time wasters. And most importantly? No one I personally knew had an Amiga. Who to swap software with, where to find a mentor? Yeah...
The Atari guys I befriended used their machines almost exclusively for dabbling in electronic music, later as part of the emerging East German EBM and hard techno scene.
Games? The titles I was interested in either didn't exist on M68k platforms (flight simulations à la Aces of the Pacific, wargames such as the Harpoon series, or adventures like The Lost Files of Sherlock Holmes), were practically unplayable (e. g. Red Baron), considered inferior (e. g. Wing Commander)... or just came out too late.
By listening to stories of some Britons of my age, it only recently came to my attention how privileged I have actually been. Some of these people told me stories of buying their first A600s or A1200s only in 1993! At that time it was hard to separate me from my trusty, second-hand PC... a machine with a CPU-type nearing its eighth (!) birthday (386DX-25).
PCs in the 80s were so bad that most homes still ran 8-bit micros with a BASIC ROM.
Windows 3.0 wasn’t even released until 1990.
And let’s be honest, the problems with autoexec.bat, config.sys and different incompatible DOS drivers for games existed throughout. It was only really the uptake of DirectX that fixed that (and to an extent, Rage / OpenGL, but you could technically get DOS drivers for them too).
But that was a whole generation of computers away from the 3.x era of PCs, and another generation again from the 80s.
But by the mid-90s, we had a whole new set of problems with PCs: the OS silently corrupting itself. It was so bad that best practice advice was to reformat and reinstall Windows every 6 months (I did it much less regularly than that though). And this was a common idiom throughout the entire life of the 9x era of Windows too. But to be fair, that was also a common idiom with pre-OSX Macs. Apple had become painfully shit at that point too.
If the ST and Amiga had still been evolving like PCs were, then by the late 90s I'm sure Amigas would have suffered from the same longevity problems too. But in the 80s, they (and STs, Macintoshes, and Acorn Archimedes) stood head and shoulders above what PCs could do at that point in almost every regard.
I'm East German; you people got a head start. My relevant hands-on experience is centered around a 386DX system, technology introduced in the mid-Eighties. 1987 brought VGA, AdLib, and MT-32s to the table, with games support gearing up in '89, the year the Sound Blaster was released. Fall 1990 saw the release of Wing Commander. Of course that's just technology; economic realities tell a different story.
> Windows 3.0 wasn’t even released until 1990.
Windows was as relevant to me as an Amiga. GUIs didn't do much for me until much later. Still prefer CLIs, TUIs (and minimal GUIs that come as close to the latter as possible).
> And let’s be honest, the problems with autoexec.bat, config.sys and different incompatible DOS drivers for games existed throughout.
I never experienced serious troubles in DOS. The first two, and only, games I could not get to work were two infamously bug-ridden Windows titles of the late 90s: Falcon 4.0 and Privateer: The Darkening. By the time they fixed 'em with a litany of patches I was busy with other things.
> But by the mid-90s, we had a whole new set of problems with PCs: the OS silently corrupting itself.
News to me. How bizarre!
> But in the 80s, they (and STs, Macintoshes, and Acorn Archimedes) were stood head and shoulders above what PCs could do at that point in almost every regard.
Hardware? Until '87. Games? Until late '90 I'd say, at the earliest, accounting for a strong genre bias. [1] Then, outside of niches (Video Toaster, cheap DTP, music production) and certain "creature comforts", it was over; the ecosystem began to atrophy.
1. The first two DOS-platformers that wowed me visually were Prince of Persia 2 ('93) and Aladdin ('93/'94); all my other genre preferences were, to put it diplomatically, underserved on 16-bit home computers.
That doesn’t mean PCs were somehow more capable in the 80s though ;)
> Windows was as relevant to me as an Amiga.
Same point as above.
> I never experienced serious troubles in DOS.
I think that’s probably rose tinted glasses on your part then. The pain was real. Games seldom “just worked” and you often had to be really knowledgeable in your system to get stuff working.
To this day, I’ve never heard the music in Transport Tycoon because that game refused to work with whatever midi drivers I threw at it.
> > But by the mid-90s, we had a whole new set of problems with PCs: the OS silently corrupting itself.
> News to me. How bizarre!
I’d be amazed if you’ve never once heard about the old problem of computers getting slower or buggier over time and a reinstall fixing things.
> Hardware? Until '87.
You’re comparing theoretical top of the line PC hardware (which nobody actually owned and no one had written software for yet) with commodity Amigas and STs.
And even those top of the line PCs still missed a few tricks that made some genres of games better on Amigas and Atari STs, like fast blitters.
It wasn’t until the era of ray-casting 2.5D first-person shooters that PCs started looking better than their counterparts. And then things really accelerated (no pun intended) with 3D hardware graphics acceleration. Which, to be fair, was available for Amigas too, but the only software that targeted them was for 3D rendering farms.
It clarifies specifics relating to my personal experiences with the discussion matter, addressing (perceived) realities of a local market. How people use computers is of the utmost relevance; a fact which you, given your lamentations here, certainly must have internalized.
> I think that’s probably rose tinted glasses on your part then. The pain was real. Games seldom “just worked” and you often had to be really knowledgeable in your system to get stuff working.
No rose-tinted glasses here. And I believe you that your and others' pain was real. Many people could not get their heads around a PC; many of 'em fell for cheap SX clunkers with other substandard components, ffs. That's obviously an inherent problem of such an open platform: PCs are highly individual in many subtle ways; a trade-off one had, and still has, to negotiate in one fashion or another.
> You’re comparing theoretical top of the line PC hardware (which nobody actually owned and no one had written software for yet) with commodity Amigas and STs.
I'm comparing hardware available on the market (with key system components coming together in 1987/88, and games supporting such top-of-the-line hardware showing up in numbers from '88 onwards). I also spoke to economic realities in nearly every post in this discussion; I am well aware that 16-bit home birds had a technical lead for a short while, and were an even better value proposition for many people a while longer. For some, just as valid, this still holds true.
> And even those top of the line PCs still missed a few tricks that made some genres of games better on Amigas and Atari STs, like fast blitters.
Yes, already addressed by referring to Prince of Persia 2 and Aladdin (1993/94!).
> It wasn’t until the era of ray casting 2.5D 1st person shooter that PCs started looking better than their counterparts.
So, your stylistic (genre) preference maps it into the time between 1991 (with Hovertank 3D in April as well as Catacomb 3-D in November) and Wolfenstein 3D (May 1992). Okay.
With mine it begins earlier, largely because of proper 3D-titles: Deathtrack (1989, PC-exclusive), LHX: Attack Chopper (1990, no Amiga/Atari port), and Red Baron (1990, got the Amiga slideshow in 1992), as well as the odd non-3D action title here and there, e. g. Silpheed (1989, no Amiga/Atari port).
One can probably even go back to 1988, for at least parity in certain markets and their segments, if one compares the technological edge in an intellectually honest fashion, i.e. what the platform, hardware and software, was really technically capable of.
And productivity software, part of the deal, is of course its very own world.
But in the late 1980s, oh my. An Amiga 500 in 1987 was really a lot better than a PC of the time for many things. It was also a lot cheaper. Maybe half the price. The Amiga and the Atari ST didn't improve enough by 1991. By then a PC was better.
But by 1988 the PC was so far outselling everything else that the writing was on the wall.
This article has a graph of market share by year.
https://arstechnica.com/features/2005/12/total-share/
People who had Amigas and Atari STs couldn't quite understand how their machines, which they perceived as so much better, were being outsold by PCs running MS-DOS. On an Amiga 500 in 1987 you had a decent GUI. Until Windows 3, PCs didn't.
For example, Pro-Write on the Amiga had real-time spell checking and was WYSIWYG in the late 1980s. It wasn't until Word 6 in 1993 that Word was really much better.
Competent enough people on both ends, end-users and programmers alike, simply worked around that. In the end, it still allowed for a platform of industry-leading applications and games, many of them not available on Amigas or Ataris.
If you've only used a PC in the 90s then it's easy to see the Atari and Amiga crowd as rose-tinted fanboys. But they're comparing 90s IBM PCs with 80s competitors.
Really, that says more about how IBM PCs were 10 years behind the competition than about how great IBM-compatibles were.
Yes, the "bang for the buck" made all the difference. For a while.
I remember my PC1 [1] fondly. Well, I still have it. I learned to code in GW-BASIC, Turbo Pascal and C (in that order) with it. I was using it for a long time, until 1997, for serious work (coding and university assignments), when I finally had the money to upgrade to a Pentium PC.
As much as my world was PC-centric, the first time I saw an Atari ST and what it could do, my jaw dropped. I knew of the Amiga from magazines, but the first time I actually saw one was several years later, after I acquired my Pentium PC and I admit it wasn't that impressive then. But still I couldn't help but think: "wow, you had that in 1989?". There was no comparison with the PCs of its time.
[1] https://www.seasip.info/VintagePC/prodestpc1.html
[2] https://en.wikipedia.org/wiki/Sinclair_PC200
> But still I couldn't help but think: "wow, you had that in 1989?". There was no comparison with the PCs of its time.
This speaks more to local market realities, e.g. "demo software" running on hardware in an actual computer shop or, as in your example, at a friend's home, especially in the form of 2D arcade action games, then at the peak of their popularity on 8- and 16-bit home computers... and yet to shine on PCs (as opposed to glossies and the like, where the first 486 machines stepped onto the stage around the turn of the year 89/90).
But at that time I wasn't thinking about computers that much, still digesting the beginning of the end of the GDR.
1. [https://xcancel.com/C64EVO/status/1893158033917669785]
Every time I got to play on non-PC home computers I’d be blown away by how much better those machines were.
These days I collect retro hardware and the Atari STs and Amigas are still easier to maintain.
So my opinions aren't those of an Amiga fanboy. PCs in the 80s were really that shit.
I do think a lot of the problem was Microsoft though. I never liked any of Microsoft’s software even back then. And that was long before they gained the reputation that so many older timers like myself still remind people about. I actually wrote my own windowing system in Pascal because I was fed up with early Windows. It wasn’t a patch on GEM but back then I didn’t know you could run GEM on a PC.
We had affordable windowing and GUIs on the Atari and the Amiga, with instant boot from ROM and tons of RAM. The Amiga had the beginnings of multitasking and hardware acceleration for graphics and sound.
Then suddenly the industry decided to go back to a cut-down version of late 70s S-100 computing, with insanely unaffordable prices, crippled specs, bizarre semi-manual memory management, ugly hardware, and a command line interface that was basically CP/M but not as good.
Infuriating.
This was a mistake. It slowed upgrades. The few seconds' worth of boot-speed increase was irrelevant when, to upgrade an Amiga, you had to ship out 2x 40-pin DIPs and floppy disks, as opposed to just floppies.
And due to incompatibilities it was common to install a ROM switcher in your system, especially in Amigaland when Kickstart 2.0 came out and you wanted to keep 1.3 so your games would still run. So you had to buy and install a switcher like a MultiStart AND buy and install new ROMs and manage two sets of floppies. This led to a schism in the market where normal people were stuck with an A500 running 1.2/1.3 and its 1980s featureset, and power users who wanted space-age luxuries like IDE hard drives were running 2.0.
Practically every word uttered or printed by Commodore about compatibility between the two was an outright lie.
Microsoft had an obsessive focus on backwards compatibility so MS-DOS 5.0 was adopted by practically everybody: just insert the disk, switch to a:, type "install". Done. There were compatibility issues, but they were on a scale that was irrelevant compared to the Kickstart disaster.
Could you imagine in 1992 having a PC and having to install a PCB in the BIOS sockets so you could have both versions 3.02.111 and 4.84.1932 of your BIOS and keep DOS 3.3 and 5.0 boot disks on hand so that you could run Commander Keen and use the newest version of WordPerfect?
I did all of that on the non-PC side of the house, and many others did too. I had (and still have) an A2000 with hard card, flicker fixer, accelerator, ram expansion, rom switcher, and other upgrades. I spent thousands of hours tinkering and having fun with the system.
Tinkerers don't make for successful ecosystems.
Fun, yes.
Sustainable, no.
It was basically just the old 8-bit micros that kept IBM compatibles looking good.
I hate to go to bat for MS-DOS, but it had at least one real advantage over CP/M: a single disk format. As doomed to failure as the various non-PC DOS machines (e.g. Tandy 2000 and DEC Rainbow) were, they could at least share disks.
The Olivetti had a B&W 640x400 monitor and a Logitech mouse that plugged into the back of the keyboard. You could replace the 8086 CPU with an NEC V30 for a bit more speed.
Just like today's GUIs. They all look like Windows 1.0
They'd have to have been a bit more careful about it than IBM were.
I am confident it would still feel like everything is terrible.
Apple did survive in that era, though not unassisted, and the differentiation they landed on (particularly after Jobs came back) was to market a premium experience as an entry point. I think that is probably going to be the exit from today's slop.
In this era, spec is not a barrier - you can make <$100 integrated boards that are competent small computers, albeit light on I/O - and that means there's a lot more leeway to return to the kinds of specialty, task-specific boxes that the PC had converged away from. There's demand for them, at least at a hobbyist level.
For example, instead of an ST and outboard synths for music, you could now get an open-source device like the Shorepine Tulip - an ESP32 touchscreen board set up with Micropython and some polished DSP code for synths and effects. It's not powerful enough to compete with a DAW for recording, but as an instrument for live use, it smashes the PC and its nefarious complexities.
The 9 pin pinout has 2 spare button inputs anyway. Maybe it'd be feasible to use those.
Obviously it'd be better to have a protocol like the Mega Drive's, but given the setup in the ST, this is a hack that doesn't require changing the ST hardware.
If the up+down combination was used for firing the main weapon, this wouldn't work, especially when you've got the machine gun and can just hold the button down. You wouldn't be able to duck or aim up while shooting, or it would be unreliable.
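For concreteness, a sketch of how such decoding might look on the reading side (the bit layout is an assumption for illustration, not the ST's actual joystick register format) - including, per the objection above, how the real directions become unreadable while a combo 'button' is held:

    # Hypothetical decoding of 'impossible' direction pairs as extra buttons.
    # Assumed bit layout: bit0=up, bit1=down, bit2=left, bit3=right, bit4=fire.
    UP, DOWN, LEFT, RIGHT, FIRE = 1, 2, 4, 8, 16

    def decode(raw):
        btn2 = bool(raw & UP and raw & DOWN)     # U+D: impossible on a real stick
        btn3 = bool(raw & LEFT and raw & RIGHT)  # L+R: likewise
        if btn2: raw &= ~(UP | DOWN)     # directions are lost while btn2 is held,
        if btn3: raw &= ~(LEFT | RIGHT)  # which is exactly the objection above
        return raw, btn2, btn3

    print(decode(UP | DOWN | FIRE))   # (16, True, False): fire + button 2, no up/down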
If you were trying to use it for a three button fighting game, I think you'd have problems too. Especially if you are doing 'negative edge' techniques where you would hold a button while inputting the directional sequence and release the button to trigger the special.
Beating the Amiga to market, and beating it on price were super important.
But I do think there was a serious problem with follow through. The Blitter and GDOS and then the STe came too long to come after. The Blitter never became standard, and the games and software suffered for it. And updates on the operating system were slow and thin until it was way too late.
I do agree that the cartridge port thing -- it being limited to 128kB expansion -- was needless. One more pin, even, would at least allow for a proper OS upgrade via cartridge port! Definitely one of the stupidest design decisions on the machine.
Realistically it's amazing the ST was as good as it was, given the 6 month development time and the kings of penny pinching at the helm :)
I think they’d be better on the back unless you are supposed to plug them out all the time.
> Finally, they should have included the blitter socket
That would be hard without having a functioning one first. The blitter would be also handy for a number of things, from PCM sound to network and disk transfers.
I agree it would be difficult to design a correct socket, but from interviews it was always the plan to have a blitter, and a socket as standard would have helped adoption.
The main thing is that the T212 is a great coprocessor, faster than the 68881 fpu and with a 2k cache. Introducing the transputer as a coprocessor would potentially have changed the computing landscape
I was astonished to find about 22 distinct C compilers, including their own libraries, assemblers, linkers etc. for the Atari ST and its successors. That's not counting separate versions, just distinct products from different vendors.
From what I can see now looking at archive sites, there was a huge amount of activity in developer tools on the ST back in the day. Much more than I thought at the time. It might have been a serious contender for the dominant architecture (along with the m68k CPU), if IBM PC-compatibles and x86 hadn't won.
Recently I looked for Atari ST C compilers, out of curiosity to test portability of a C program I'm working on.
I've been testing C code for diverse Unix systems.
As I used to own an Atari 520ST (with 1MB RAM beautifully piggy-backed and hand-soldered on the existing RAM chips :-), it seemed like a good idea to peek at C on an ST emulator. I didn't use C when I had a real Atari ST (no C books in my local library), so I expected to find one or two C compilers, not 22!
Try and get a compiler and linker to fit in 360k these days!
If I recall, Lattice C was popular. Mark Williams was another one. "Alcyon C" was included I think in the ST development kit, but was considered poor.
I think people use "Pure C" these days, but of course also GCC is likely best:
http://vincent.riviere.free.fr/soft/m68k-atari-mintelf/
Is maintained by Vincent Rivière, who is a major contributor on EmuTOS.
When I see generations that grew up with game consoles, and talk about the current uptake on desktop games, they really have no idea what they missed out in home computing and the first wave of indie game development, from bedroom coders .
Tangent: the older I get, the more it annoys me that this expression kind of implies a failure of young people to study history, when I feel like it's more the responsibility of previous generations to preserve and pass down history for them to learn from. Especially because it's usually people in power in some form who are trying to keep the newer generations naive here so they can be fooled again.
Not saying that this interpretation was your intent (in fact I suspect it's the opposite), just directly expressing my annoyance at the expression itself.
But everything has been preserved and passed down. The entire home computing phenomenon has been archived and is available on the internet thanks to the rampant 'software piracy' which was common at the time, and detailed schematics and manuals coming with the computers (which have all been digitized and are available on the internet). Even my obscure KC85 games I wrote as a teenager and 'distributed' on cassette tapes by snail mail are available as download because some kind person(s) digitized all that stuff during the early 90s and put it on some 'underground' game download portals.
The 80s and early 90s home computer era will be better preserved than anything that came after it.
Indeed. Sadly, many more recent games will probably be lost to time forever due to DRM, online service components going offline or never being distributed on physical media in the first place. As someone into vintage computers and preservation, I worry that future generations may look back and see the late 2010s and certainly the 2020s as a 'dark age' when surveying the history and evolution of digital gaming. All we'll have are YouTube videos (assuming those survive future business model tectonic shifts) but no ability to actually experience the game play first-hand.
Recently I've been exploring the back catalog of more obscure PS3 and X360 games via emulation and have found some absolutely terrific titles I never even knew existed. Some of them were only ever sold through the console's online store and never available on physical media. With the XBox 360 and Nintendo Wii stores now long offline, only the PS3 store remains available - and who knows for how much longer, since Sony already announced its closure once and then changed their mind. There's now a race to preserve many of these titles
The good news is that not only was almost all of it preserved, teenagers today are really interested in retro gaming. My 15 year-old daughter, who's not into computers more than any other 15 year-old girl, just asked if she could go with me to the vintage computer festival this Summer. She tells me her friends at school are all interested in running emulators to play classic games from arcade to SNES to PS2 and N64.
I guess the 'dark lining' to that silver cloud is that this interest from teens in retro gaming is partly thanks to the increasing downsides of modern gaming (cost, DLC, ads, hour-long download/installs, etc). While game graphics continue to get more and more impressive, stuff like real-time path tracing doesn't seem to excite teens as much as does me. Ultimately, it's about game play more than visuals. Lately I've been exploring the immense back catalog of N64, PS2, PS3 and x360 games via emulation and there are some incredible gems I never even heard about back in the day. It's especially great now thanks to the huge variety of mods, enhancements, texture packs, decompilations/recompilations and fan translations. And current emulators can upscale and anti-alias those games even on a potato desktop or laptop with a low-end discrete GPU.
However, curiosity also plays a big role.
If I know so much about computing history since the 1950s, it's because I do my research and take advantage of all the archives that have been placed online; I certainly wasn't around to live through all of it.
Things I remember about the 520ST:
- Those horrible diagonal function keys. There was no reason for them to be diagonal, rather than normal keys as they were on the IBM. But I've always hated function keys.
- Games like Dungeon Master (really still quite a good game today).
- Not a bad C compiler, but I can't remember who by - LightSomething?
- The GEM GUI was not so bad, but using it with a floppy disk was.
But all-in-all I was quite happy to get my PC-compatible to do serious work with.
I don't know if they were consistent with the other keys in terms of feel, but they were a striking, unique design feature that instantly identified the machine as being Atari without compromising practicality.
I guess the idea was to have a clean design with cables out of the way, but it really was a bad place for them.
It's funny how some young producers today wonder "how did people do it without a computer before the 2000s?"... well guess what, we did use computers! I can't remember, however, which software sequencer I was using; I know it had MIDI effects (like MIDI echo), and that's all I remember.
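(Aside, in case "MIDI echo" is unfamiliar: the effect simply re-sends each incoming note after a delay, usually at decreasing velocity. Here's a minimal sketch of the idea in Python with the mido library; the port names and timing constants are placeholder assumptions, and a real sequencer would schedule the echoes rather than block like this:)

    # Minimal MIDI echo sketch using the mido library
    # (pip install mido python-rtmidi). Port names 'IN'/'OUT'
    # are placeholders; list real ones with mido.get_input_names().
    import time
    import mido

    DELAY = 0.25   # seconds between echoes
    REPEATS = 3    # number of echoes per note
    DECAY = 0.6    # velocity multiplier per echo
    GATE = 0.1     # how long each echoed note is held

    with mido.open_input('IN') as inp, mido.open_output('OUT') as out:
        for msg in inp:
            out.send(msg)  # pass everything through unchanged
            if msg.type != 'note_on' or msg.velocity == 0:
                continue
            vel = msg.velocity
            for _ in range(REPEATS):
                time.sleep(DELAY)       # naive blocking delay; a real
                vel = int(vel * DECAY)  # sequencer schedules these instead
                if vel == 0:
                    break
                out.send(msg.copy(velocity=vel))
                time.sleep(GATE)
                out.send(mido.Message('note_off', note=msg.note,
                                      channel=msg.channel))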
And by 1998, Logic was fairly advanced anyway and even had plenty of plugins.
Possibly/probably Cubase. Anyone remember the Mike Hunt version? I'm still using Cubase on a nice PC, but I miss the stability of the Atari.
The Mega STe had a funkier case, VMEbus, and upgraded specs, but a mushy rubber-dome keyboard and more brittle plastics.
I like to collect the Mega as the best of the bunch, personally.
This says that the keyboard on the Mega ST was better. And yet still not good enough. Egads, that ST mess was a terrible keyboard.
Still liked the Speccy better…
I have an adapter on mine that converts it to USB and I can use it on a modern computer.
Though I never do. Mainly because it's got Ctrl/Alt[Meta] but nothing I could map to Hyper/Super.
The Mega STe and TT reverted to terrible mushy rubber domes.
I liked how the keyboard was detachable and the hard drive was the same size as the motherboard case, so you could stack them. [0]
[0] https://upload.wikimedia.org/wikipedia/commons/5/54/Atari_10...
There wasn’t such a thing as a general developer market.
When you didn’t have internet and cloud services and free Unix, how could you develop for something else than a specific platform and device?
If you bought a Mega ST to write programs, your target audience were still only the people who had a regular ST. You couldn’t reach anyone else. So the advantage was minimal.
The idea that there can be a developer market separate from the baseline end-user platform is quite new. It emerged around 2007-2010 when web apps became a realistic option and you didn’t have to be on Windows to target the 90% of people who are on Windows.
Honestly, I don't think even Apple could touch the best of Atari and Commodore industrial design in the back half of the 1980s. To be blunt, the early Macintoshes simply weren't practical in their design: for starters, a tiny monitor that was originally black and white (which in 1984 was already kind of a joke), very limited upgradeability, relatively poor multimedia capabilities (speech synthesis was no more than a gimmick that was also available on other platforms), and then the whole aesthetic just wasn't that pleasant.
And I say this as someone who, personally, has only owned Apple machines for the past 15ish years, so I'm hardly coming at this from an anti-Apple perspective. I'd still take 1980s Atari or Commodore aesthetic over modern Apple, or modern anything else for that matter[0].
Also, as an aside, I really enjoyed seeing "Atari Means Business with the Mega ST" as the top headline on Hacker News in 2025. Even on a Sunday, when content typically tends to be more varied and interesting, this was still an entertaining surprise.
[0] I suspect the reality may be that I'm an "anything but Wintel" kind of person, although not at any cost, because I did run PCs exclusively for 11 or 12 years. They never really helped me enjoy computing in the way the other machines have though.
For example: I can't think of any desktop models that lacked internal expansion. They may have used a riser card to stack two or three slots sideways, but the slots were there. The design may have been crude, but at least your desktop wasn't turned into a disaster every time the technological landscape shifted: when hard drives became affordable, when the world switched to 3.5" floppies, when you decided to use online services, send faxes directly from your computer, get a CD-ROM drive, or sign up for cable Internet.
And with emulated VME graphics, an HDMI output, a USB-C port, and a 3-button mouse. Able to run Atari Unix.