
Were the Atari ST's big for gaming or just the 8 bit line?



Once again, excuse my ignorance; having grown up a Commodore fan, I'm now trying to come to grips with all things Atari. I am getting there, having recently acquired a 5200, lol! In regards to the Atari line of computers, though, it seems, at least to me, that most of the "buzz" and conversation about gaming on Atari computers revolves around the 8-bit line? Since I loved Amiga gaming back in the day, I would have thought that 16-bit Atari gaming would have been pretty cool. However, I don't hear much about gaming on Atari STs; was it not a big thing? Were the STs more for applications or music? Did anyone particularly love or prefer gaming on the ST? Please enlighten me: did I miss anything by not gaming on an ST back in the day, instead of or in addition to my beloved Amiga? Your opinions and thoughts welcomed as always! :thumbsup:

For many years the ST was much larger and outsold the Amiga 2:1; it wasn't until Atari reallocated supplies to Europe and Commodore brought out the A500 that the situation changed. People wanted STs, but when they weren't available the Amiga 500 was there with a good price point and was relatively reliable.

 

The Amiga 500 struggled badly in 1987-89ish. The problem was it was £499 + VAT, AND you had to buy a TV modulator on top for £25 or so. That made it nearly £200 more than the ST. Very few games were developed with the Amiga in mind, either; most were more like a recompile of the ST code. What happened was that in the EU Atari had lowered the price of the ST to £299 and Commodore the A500 to £499 (inc VAT sales tax), BUT when the Japanese factory burned down and RAM prices rose, Atari had to hike the ST back up to £399, and then Commodore went for the jugular and dropped the A500 to £399 as well. Around this time Shadow of the Beast came out, and well, that was that really. The ST went back down to £299 soon enough, but the damage was done and people invested the extra £100 in the Amiga.

 

I have an issue of ACE (Advanced Computer Entertainment) magazine where the £399 A500 price cut is in the news section but the ST Silica Shop adverts state that the ST is going up to £399 from £299 next month.


I think we should all go out and buy a Dragon 32. It never really took off here in the UK despite being a decent system. :ponder:

 

Either that or a SAM Coupé, which was a car with a built-in Spectrum computer system for its onboard computer (fuel consumption, parking aid, etc.) :D

 

The Dragon was not very good compared to the A8/C64 or Amstrad. And it was loosely based on the TRS-80 Color Computer, BUT, naturally being a PAL machine, it didn't have the NTSC artifact screen mode, and this was the screen mode that Donkey King/The King used to be such a fantastic rip-off of the arcade machine ;) There was a poke that would overclock it slightly on early models (before it fried itself), I think.


As far as I'm concerned, the only mistake or bad decision was not including scrolling in the ST design. (Having a packed-pixel mode rather than bitplanes, like the Apple IIGS, would have been nice as well.) No scrolling meant that the CPU had to waste a lot of time moving the background; in many cases a 50fps game on the Amiga would run at least twice as slow on an ST, as almost a whole frame of time was needed to repaint the background to simulate the scroll.
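To put rough numbers on the "almost one frame" claim (the figures here are back-of-envelope approximations, not measured ST timings):

```python
# Rough estimate: cost of software-"scrolling" the ST's 16-colour screen
# by redrawing it. Assumes ~4 CPU cycles per byte for an unrolled 68000
# movem.l copy loop, which is an approximation.
SCREEN_BYTES = 320 * 200 * 4 // 8   # 4 bitplanes at 1 bit each -> 32,000 bytes
CYCLES_PER_BYTE = 4                 # assumed near-optimal copy-loop cost
CPU_HZ = 8_000_000
FRAME_HZ = 50                       # PAL

copy_cycles = SCREEN_BYTES * CYCLES_PER_BYTE
frame_cycles = CPU_HZ // FRAME_HZ   # 160,000 cycles per PAL frame
print(copy_cycles / frame_cycles)   # -> 0.8 (most of a frame just redrawing)
```

So even with an ideal copy loop, roughly 80% of a 50 Hz frame goes on the redraw alone, before any game logic or sprite drawing.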

 

This is an interesting comment and hits on the crux of the problem

 

ST had.....

 

8 MHz state-of-the-art (for '85) 68000 16/32-bit CPU

No hardware horizontal scrolling

No hardware sprites

No sample based sound chip

 

Now, had they either put in scrolling OR a stereo DAC, it would have saved the CPU a lot of time on horizontally scrolling 16-colour screens or doing sample playback; doing both at the same time is not good for CPU bandwidth. Just one of those omissions being addressed in the initial design would have been enough, really. Gauntlet 1 is probably pushing it quite a lot, with 16 colours to shift, many software sprites to draw and samples to play back.


The Amiga 500 struggled badly in 1987-89ish. The problem was it was £499 + VAT, AND you had to buy a TV modulator on top for £25 or so. That made it nearly £200 more than the ST. Very few games were developed with the Amiga in mind, either; most were more like a recompile of the ST code. What happened was that in the EU Atari had lowered the price of the ST to £299 and Commodore the A500 to £499 (inc VAT sales tax), BUT when the Japanese factory burned down and RAM prices rose, Atari had to hike the ST back up to £399, and then Commodore went for the jugular and dropped the A500 to £399 as well. Around this time Shadow of the Beast came out, and well, that was that really. The ST went back down to £299 soon enough, but the damage was done and people invested the extra £100 in the Amiga.

 

I have an issue of ACE (Advanced Computer Entertainment) magazine where the £399 A500 price cut is in the news section but the ST Silica Shop adverts state that the ST is going up to £399 from £299 next month.

There were SCART cables available for the Amiga, right?

And composite video would have been an option had CBM not used luma-only composite out on the A500 (great for getting a crisp image via composite, but no colour; it would have been nice for it to be switchable, and such a switch would have been great for composite-monitor A1000 users too). I assume the luma-only thing was a gimmick to sell separate RF modulators or such (the crisp grayscale probably an unintentional bonus), and not to let people (in the US or EU/UK) use colour composite video via an A/V port or VCR-to-RF.

Granted, you had RGB via SCART as well, though that was much more common in mainland Europe (mandated in some cases) than in the UK from what I understand. The UK also used RCA AV ports and seems to have favoured RF-only TVs longer than the US did; I remember anecdotes about PlayStation users in the mid 90s not getting an RF adapter pack-in, and then there's the fact that the UK standard bundle of the N64 had an RF adapter while the US one had composite + stereo AV cables.

 

Some of the games that show off the Amiga best also make the ST look bad due to sheer poor porting. SotB is a prime example of that. (Albeit such cases probably would have been far less frequent with at least hardware scrolling, even with nothing else.)

 

 

 

 

I think we should all go out and buy a Dragon 32. It never really took off here in the UK despite being a decent system. :ponder:

 

Either that or a SAM Coupé, which was a car with a built-in Spectrum computer system for its onboard computer (fuel consumption, parking aid, etc.) :D

 

The Dragon was not very good compared to the A8/C64 or Amstrad. And it was loosely based on the TRS-80 Color Computer, BUT, naturally being a PAL machine, it didn't have the NTSC artifact screen mode, and this was the screen mode that Donkey King/The King used to be such a fantastic rip-off of the arcade machine ;) There was a poke that would overclock it slightly on early models (before it fried itself), I think.

The NTSC color artifacts aren't that important. The only really useful artifact mode was the black/white(buff) 256x192 mode, which could artifact to allow white/black/red/blue at an effective 128x192 (very few games used it AFAIK). The normal 128x192 4-color mode could select 4 colors from 2 banks, so you could have 4 selected from green, yellow, blue, red, or black, or selected from buff (white), orange, cyan, pink, or black.

Orange+cyan+buff+black was fairly commonly used and is particularly useful, as it directly corresponds to the Apple II's common 140x192 artifact color mode. (On the Apple you actually get 6 colors with purple+green+orange+cyan+white+black, but you're limited to purple+green OR orange+cyan on a 7-pixel basis, so all 6 colors are normally only used for splash screens or in the border, while the in-game window sticks to 4 colors with green+purple+white+black or, most commonly, orange+cyan+white+black. White is an artifact color created by both green+purple and blue+orange, so it's available in both cases, while black is always the off state.) The closest to purple+green+black+white for the CoCo would probably be red+green+yellow+black. (Actually, that matches one of the default CGA 4-color palettes ;))
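For illustration, the Apple II hi-res artifact rules sketched above can be modelled roughly like this. The parity-to-colour convention here is an assumption (it varies with how you count columns and which hardware revision you look at); the structure is the point:

```python
# Hedged sketch of Apple II hi-res artifact colouring for one pixel:
# adjacent lit pixels merge to white, unlit pixels are black, and an
# isolated lit pixel takes its colour from column parity + palette bit.
def artifact_color(bits, x, palette_bit):
    """bits: list of 0/1 pixels; x: column index; palette_bit: byte high bit."""
    if not bits[x]:
        return "black"
    left = bits[x - 1] if x > 0 else 0
    right = bits[x + 1] if x + 1 < len(bits) else 0
    if left or right:
        return "white"            # neighbouring lit pixels cancel the artifact
    # palette bit selects purple/green vs blue/orange pair (parity assumed)
    pairs = [("purple", "green"), ("blue", "orange")]
    even, odd = pairs[palette_bit]
    return even if x % 2 == 0 else odd
```

This is why a 140x192 "colour" image is really a 280-wide bit pattern: colour comes from where the bits land relative to the colour clock, not from a palette lookup.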

 

With a similar display (slightly lower res, more or less similar color, weaker splash screens), significantly more CPU resource, and a DAC rather than a 1-bit toggle, the CoCo was significantly better than the Apple II technically, and at a small fraction of the price (not to mention OS-9... though that wasn't standard until the CoCo 3). Obviously there were other mitigating factors, though. (I think it was also the first sub-$399 home computer in the US, with the $399 price of the 4k model in 1980, though it actually dropped to $299 for the 1980 holiday season, with the 16k model at $399 and the 32k model just under $550.)

 

 

What's interesting is that the GIME of the CoCo 3 gave a very substantial upgrade that put it ahead of the ST in some respects: it offered a 256x196 mode with 16 colors indexed from 6-bit RGB (so not as good as the ST there), but it had horizontal and vertical scroll registers (offloading a ton of CPU resource) and doubled the 6809 speed to 1.79 MHz. (The DAC was slightly upgraded with some added features to aid software-driven sample playback; a shame it didn't at least get a simple PSG like the tiny SN76489 or a clone, or a simple PSG sound circuit in the GIME ASIC... let alone an actual DMA circuit for the DAC, which would have done wonders, preferably also pushing beyond 6-bit resolution.)

You also still had the dot clock at exactly double the NTSC color clock, and thus got very consistent artifact colors, creating the so-called 128x192 256-color mode. (You could do more than that with raster interrupts; some demos show 1000s of colors on-screen.) That means not only more colors on-screen, but breaking the limits of 6-bit RGB by a considerable margin.

 

The use of the multitasking OS-9 with a GUI was certainly a nice feature as well (and multitasking was something the ST didn't offer, something only the Mac was just getting in '86 and the IIGS also got).

 

And of course the further upgraded CoCo 4 was never released. (further enhanced graphics, 3.58 MHz 6309, and maybe a proper sound upgrade)

 

At least the 6-bit DAC allowed far more (with less resource) than the 1-bit toggle of the Apple or Speccy (you could potentially use the GTIA keyclick line for that too), or the PC for that matter, though the PC at least had square-wave generation via its interval timers. (Too bad the CoCo didn't have interval timers to work with like that... it could have allowed some interesting stuff with the DAC, be it a single square-wave channel with 6-bit volume, or software-mixed square-wave channels with volume. For that matter, a bare resistor DAC on the PC could have allowed that too, i.e. a built-in counterpart to the Covox/Disney/DIY 8-bit DAC parallel-port sound boards for the PC.)
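The "square-wave channel with volume through a plain DAC" idea amounts to nothing more than writing ±volume at the right times; a toy sketch in plain Python (not CoCo code), with software mixing of two such channels as a CPU would do before each DAC write:

```python
# Sketch: timer-driven square waves through a bare DAC. Each output sample
# is just +vol or -vol depending on which half-cycle we're in.
def square_wave(freq_hz, vol, sample_rate, n_samples):
    half_period = sample_rate / (2 * freq_hz)     # samples per half cycle
    return [vol if int(i / half_period) % 2 == 0 else -vol
            for i in range(n_samples)]

def mix(*channels):
    # software mixing: sum the channels sample-by-sample before the DAC write
    return [sum(s) for s in zip(*channels)]

out = mix(square_wave(440, 20, 8000, 16), square_wave(880, 10, 8000, 16))
```

The per-channel volume comes for free (it's just the amplitude you write), which is exactly what a 1-bit beeper toggle can't give you without PWM tricks.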

Hmm, had they used a RIOT instead of a PIA for the keyboard I/O, they'd have got a programmable interval timer along with it... and 128 bytes of RAM (i.e. allowing the 2x speed mode in a scratchpad rather than only in ROM).

Hmm, that would have been interesting for the A8 too: a RIOT instead of the PIA (obviously slightly more expensive with the RAM and timer, but they WERE already stocking it for the VCS)... both a timer independent of those tied to POKEY's square-wave channels, and a 128-byte scratchpad (allowing full 1.79 MHz CPU speed... or using it for 128 CRAM entries had CTIA been designed a little differently). Hell, that timer could have allowed the CTIA/GTIA keyclick channel to be used for square waves (no volume control) as a 5th sound channel without undue CPU resource to drive the 1-bit toggle.

 

 

As far as I'm concerned, the only mistake or bad decision was not including scrolling in the ST design. (Having a packed-pixel mode rather than bitplanes, like the Apple IIGS, would have been nice as well.) No scrolling meant that the CPU had to waste a lot of time moving the background; in many cases a 50fps game on the Amiga would run at least twice as slow on an ST, as almost a whole frame of time was needed to repaint the background to simulate the scroll.

 

This is an interesting comment and hits on the crux of the problem

 

ST had.....

 

8 MHz state-of-the-art (for '85) 68000 16/32-bit CPU

No hardware horizontal scrolling

No hardware sprites

No sample based sound chip

 

Now, had they either put in scrolling OR a stereo DAC, it would have saved the CPU a lot of time on horizontally scrolling 16-colour screens or doing sample playback; doing both at the same time is not good for CPU bandwidth. Just one of those omissions being addressed in the initial design would have been enough, really. Gauntlet 1 is probably pushing it quite a lot, with 16 colours to shift, many software sprites to draw and samples to play back.

The graphics acceleration is almost certainly more important... and you have many more options for a sound upgrade before pushing for a DMA circuit (you wouldn't need stereo; a simple mono 8-bit DAC with DMA loading would have been great), but there are certainly other options. Short of actual DMA, you could offer a simpler IC with a built-in DAC and FIFO with a timer input (or internal timer) to help with CPU-driven PCM playback. (As it is, the best option with the 68k via the PSG, using it as a bare DAC, and the same for any similar software playback, is managing pseudo-DMA with the stack, facilitated by the 68k always running in supervisor mode.) On some other platforms, software-driven PCM can be efficiently managed with pure timed interrupts, especially on CPUs with extremely efficient interrupt handling (i.e. 650x), but interrupt overhead is relatively heavy on the 68k. (Hence why the PC Engine can manage 7 kHz interrupt-driven PCM playback for only ~5% CPU use, while the Genesis would have a tougher time managing even close to that with the pseudo-DMA trick, let alone plain interrupts. It had the Z80 handling PCM playback in most cases anyway, albeit with the added overhead of 32k serial bank switching for addressing multiple sample streams, and didn't have the option to use interrupts, as the YM2612's interval timers and even the hblank timer were left unconnected.)
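The ~5% PC Engine figure is easy to sanity-check with back-of-envelope arithmetic; the ~50 cycles-per-interrupt cost assumed here (interrupt entry + DAC write + exit) is a round illustrative figure, not a measured one:

```python
# Rough check of interrupt-driven PCM cost: fraction of CPU time consumed
# by taking one interrupt per output sample.
def pcm_cpu_share(sample_rate_hz, cycles_per_irq, cpu_hz):
    return sample_rate_hz * cycles_per_irq / cpu_hz

# PC Engine's 65C02-family core at ~7.16 MHz, assumed ~50 cycles/interrupt:
share = pcm_cpu_share(7000, 50, 7_160_000)
print(f"{share:.1%}")   # -> 4.9%
```

The same formula shows why the 68k fares worse: a heavier per-interrupt cost multiplies straight through, so the only ways out are batching (FIFO), pseudo-DMA, or real DMA.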

 

A bare DAC would be barely any use over the existing PSG, beyond allowing both to be used simultaneously and not having to deal with nonlinear PCM. (Albeit for some things the 4-bit log-based stuff works well, like voice, and can actually be used as a form of compression over linear PCM; in other cases 4-bit linear PCM at similar sample rates is preferable.) That, and I think some of the MOD/sample trackers actually used the 3 square-wave channels as separate DACs rather than software mixing, and I'd imagine the similar Spectrum demos use a similar method. (The A8 also had a good bit of potential for that with 4 separate DACs, 4-bit linear at that, with interval timers for 3 of them; the Lynx has 4 8-bit DACs to use in a similar manner, of course.) None with DMA like the Amiga or PC Sound Blaster cards, of course. (Or the Tandy DAC, or Sound Master, or STe.)
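A toy sketch of the linear-vs-log trade-off mentioned above. The log table here is a made-up ~2 dB/step curve standing in for a real PSG volume curve, not actual YM2149 data:

```python
# 4-bit linear vs 4-bit logarithmic quantisation of an unsigned sample.
# LOG_TABLE is a hypothetical ~2 dB/step curve, NOT real chip data.
LOG_TABLE = [0] + [round(127 * (0.79 ** (15 - n))) for n in range(1, 16)]

def quant_linear4(s):            # s in 0..127
    return (s >> 3) << 3         # keep the top 4 bits, uniform step size

def quant_log4(s):               # snap to nearest log-table level
    return min(LOG_TABLE, key=lambda v: abs(v - s))

sample = [3, 9, 30, 60, 120]
lin = [quant_linear4(s) for s in sample]
log = [quant_log4(s) for s in sample]
```

The log steps are dense near zero (good for quiet material like voice, hence the "compression" effect), while the linear steps are uniform (better for already-loud material) — exactly the trade-off described above.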

 

But PCM is only one side of things, and you had options for better sound chips in general. One nice option I already mentioned was the YM2203, which was fully compatible with the YM2149 but added 3 4-op FM synthesis channels (i.e. half of what the MD's YM2612 offers and 3/8 that of the arcade-standard YM2151), and it was available back in 1985 (introduced on PC-8801 models that year). Only a 40-pin DIP at that, though requiring an external DAC (a tiny 8-pin DIP). Given Atari was already buying from Yamaha, that would seem like a particularly good option. (There was also the super low-end YM2413, a cut-down derivative of the YM3812 used in Adlib/SB cards; it had 9 channels and only an 18-pin DIP, but was only 2-op FM and only allowed 1 user-programmable instrument at a time plus 15 presets, so rather limited. Plus it wasn't available until ~1987 and would need to be included in addition to the YM2149.)

FM has trade-offs versus PCM, but you get a crisp, clean sound (~50 kHz sample rate), no added use of RAM for samples, and a good deal of flexibility with 4-op FM.
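For the curious, the heart of what those Yamaha chips compute is just phase modulation: a modulator sine phase-modulates a carrier, y(t) = sin(2πfc·t + I·sin(2πfm·t)). A minimal 2-op sketch (real chips layer log-sine tables, envelopes and operator algorithms on top of this):

```python
import math

# Minimal 2-operator FM: modulation index `index` controls brightness;
# the fc:fm ratio controls which harmonics appear.
def fm_sample(t, fc, fm, index):
    return math.sin(2 * math.pi * fc * t
                    + index * math.sin(2 * math.pi * fm * t))

rate = 50_000                     # roughly the chips' native output rate
tone = [fm_sample(i / rate, 440, 880, 2.0) for i in range(64)]
```

Sweeping `index` over time with an envelope is what gives FM its characteristic bright attack decaying to a mellow sustain, all without a byte of sample RAM.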

It would have been really nice to have both, and eventually you'd want more PCM support for sure; maybe the FM chip plus a simpler PCM-aiding IC with a DAC (with a FIFO but not a proper DMA circuit), and then later adding DMA as well, or a very simple DMA set-up like the Mac used (a fixed place in RAM that gets DMA'd to the DAC each frame).

 

There was also the direct successor to the AY8910, the AY8930, but unlike the YM2203 it only added variable pulse width and a dedicated ADSR envelope per channel, along with more flexible noise generation and a higher-resolution frequency range (beyond the 12 bits of the AY8910/YM2149). So it was more like the SID without the filter or ring modulation and with no saw or triangle wave (though some other flexibility), but still only 3 channels, and generally less desirable (and later) than the 1985 YM2203 with its 3 normal SSG channels (the Yamaha term for square wave with ADSR) plus the 3 new 4-op FM channels. (The Sound Master used that chip along with a mono DAC with DMA loading, but it was released in '89, the same year as the Sound Blaster with Adlib compatibility and an 8-bit DAC + DMA as well, ADPCM decoding too IIRC. And there weren't any previous AY8910-based PC sound cards to cater to backwards compatibility with either, oddly enough, though that's exactly what the Apple II got with the Mockingboard. No sound cards until '87 is rather odd.)

 

And you'd want horizontal and vertical scrolling (not just for games, but useful for managing the GUI, scrolling text, etc), but packed-pixel graphics would save a good bit of overhead (and/or allow more flexibility) as well... and a 160x200 8bpp mode would have been nice. (Actually, more flexible resolution modes in general would have been nice, like a 160x200 16-colour mode: better than clipping to a small window for sure, and a trade-off versus a choppy framerate at higher res. That could have been a useful mode even with the scroll-less planar SHIFTER.)

I wonder if the SHIFTER would have taken any longer to develop had they opted for packed pixels instead.

 

In any case, it definitely seems like a more modest (and timely) upgrade to the SHIFTER with scroll registers by '86 would have been better than the much longer delay for the BLiTTER alone (especially with faster CPUs being offered too). And again, service-center upgrades for some things should have been possible even without using socketed chips or a good expansion port. (I.e. a clip-on SHIFTER upgrade, maybe with a couple of jumpers, sort of like what Cyrix did with 386SX upgrades; the same for the YM2203 upgrade, or adding the PCM chip for that matter, or even the blitter. Some things would probably directly piggyback on the 68k bus like some A500 upgrades did, albeit clip-on rather than socketed and maybe needing jumpers to be soldered.)

Edited by kool kitty89

A SCART cable + Sony 14" portable TV was superior to ALL monitors, be they Atari, Commodore or Philips. And they sold them in a fetching white colour to go with Amiga 1000s and STs too :) The cable cost £20 when they were eventually produced anyway, though, and many portable TVs in 1987 and '88 did not have RGB SCART, only composite input, or S-Video if you were very lucky, except top-end brand-new TVs like the Sony KV series in the mid-late 80s period. But most games looked better on composite or RF (remember, Sega and Nintendo games usually only had RF as an option too), as all the dithered pixels merged nicely. For hi-res interlace, SCART or a dedicated monitor was essential though, for sure.

 

All true, but by the late 80s non-sample-based sound hardware was deemed last-gen really, Sega's Megadrive being the odd one out with 1 scratchy sample channel. The Amiga's problem is they never upped the sound channel total; the quality was limited much more by available memory than hardware, and you can even play back two 14-bit channels, sort of. But memory was expensive in the 80s, let alone the 90s. A 4MB 386 was super expensive compared to 1MB.
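The "two 14-bit channels, sort of" trick works by pairing Paula channels at volumes 64 and 1, so one carries the top 8 bits and the other the low bits. A simplified sketch (real implementations need per-machine calibration of the volume steps, which this ignores):

```python
# "14-bit" playback from two 8-bit channels: play the top bits at volume 64
# and the low 6 bits at volume 1, so their outputs sum back to the sample.
def split_14bit(sample):           # sample: signed, -8192..8191
    hi = sample >> 6               # top 8 bits -> channel at volume 64
    lo = sample & 0x3F             # low 6 bits -> channel at volume 1
    return hi, lo

def recombine(hi, lo):             # what the summed analog outputs approximate
    return hi * 64 + lo

assert recombine(*split_14bit(5000)) == 5000
```

The cost is that each "14-bit" voice eats two of Paula's four channels, which is why it stayed a demo/player trick rather than a general game technique.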

 

Massive difference between the Dragon 32 and TRS-80 versions of the same game, Donkey King; it only looks crap on the Dragon 32 because one screen mode is missing... guess which one ;) But even the TRS-80 version is inferior to the Coleco/C64 (Ocean Software, NOT Atarisoft)/Amstrad versions. The King/Donkey King had various screen modes selectable on start-up IIRC; the Dragon just didn't have the artifacting screen option for colour, that's all. Other than that, it's the identical game from Microdeal, or in the USA.

 

http://www.youtube.com/watch?v=KFglKUn-ioI

 

http://www.youtube.com/watch?v=MKINyiqUdVc


A SCART cable + Sony 14" portable TV was superior to ALL monitors, be they Atari, Commodore or Philips. And they sold them in a fetching white colour to go with Amiga 1000s and STs too :) The cable cost £20 when they were eventually produced anyway, though, and many portable TVs in 1987 and '88 did not have RGB SCART, only composite input, or S-Video if you were very lucky, except top-end brand-new TVs like the Sony KV series in the mid-late 80s period. But most games looked better on composite or RF (remember, Sega and Nintendo games usually only had RF as an option too), as all the dithered pixels merged nicely. For hi-res interlace, SCART or a dedicated monitor was essential though, for sure.

I think that would have been different for other parts of Europe (especially with France mandating SCART in the early 80s, and I think a couple of others following). The UK was a bit separate from all of that from what I understand; they got RCA composite/audio and S-video ports, while mainland Europe was almost exclusively SCART once it went beyond RF.

 

All true, but by the late 80s non-sample-based sound hardware was deemed last-gen really, Sega's Megadrive being the odd one out with 1 scratchy sample channel. The Amiga's problem is they never upped the sound channel total; the quality was limited much more by available memory than hardware, and you can even play back two 14-bit channels, sort of. But memory was expensive in the 80s, let alone the 90s. A 4MB 386 was super expensive compared to 1MB.

I'm not going to get into another argument on this, but that's hugely subjective. The SNES was the only console of the generation to be purely sample-based (and made heavy use of FM instrument samples in many games anyway), while the PC Engine used simple programmable waveforms (technically samples, but only 32 words long and 5-bit resolution, so the possible waveforms were limited), and the Genesis's PCM playback was dependent on code quality, as it was software-managed like the ST but with a nice 8-bit DAC to work with rather than being stuck with the PSG only. And the SNES wasn't released until 1990 (or '91 in the US), when FM was still the standard in the arcades, with PCM/ADPCM used for supporting instruments or occasionally lead, hence the rather high-end X68000 prominently featuring a YM2151.

Yamaha's FM chips were cutting edge in the mid 80s, and in fact were brand new in 1985, with the YM2151 and YM2203 being introduced, followed by the lower-end OPL chips with 2-op FM slightly later. DMA-driven PCM was nice, but it also necessitated a lot of memory for decent sound quality, or compromising that with crappy low-res samples or very few samples. (A ton of Amiga games used 8 kHz stuff, and it would have been worse with consoles, as you had to cram all that into feasible ROM space, with 512kB generally being the maximum for the first couple of years of the MD's life, with a few exceptions.)

On top of that, you have the common use of AHX stuff on the Amiga, or sampled analog/FM synth stuff that is not impressive at all compared to what good FM synth could manage; but those cases did still make for nice chiptunes (for those who liked that) and avoided the memory-hogging samples.

 

The quality of sample playback on the MD was largely down to poor coding for the Z80 making for uneven playback, but also to the use of very low-quality samples (4 kHz was extremely common), and even worse with poor code AND crappy software mixing, let alone poorly interleaved V-DMA on top of that. In properly utilized cases, the MD is highly competitive with the Amiga, but most cases necessitated sample rates so low that even the Amiga's common 8 kHz stuff sounds impressive. Granted, the SNES was often no better, but it hides it a bit better, with interpolation and low-pass filtering masking the compression artifacting and aliasing, but also resulting in horribly muffled sound, not crisp or bright at all. (The forced low-pass filtering also hurts higher-sample-rate stuff.) And with the SNES, if you pay close attention, you can see heavy cuts made to save space in many cases: truncated samples, use of pitch-shifting rather than new sfx, etc. And while you have compression, you also need samples for music, with no option for hardware synthesis. (And the DSP either isn't useful for that, or the tools are too rigid to allow it.)

 

I fully agree that the MD should have had more sample playback support than it did, especially since the Z80 is a waste in the manner it's used and could have been omitted entirely had it not been for compatibility. (Albeit they could have made the PBC an active adapter, and it would have been no less convenient to users... the VDP probably could have been beefed up then too, without the parasitic SMS compatibility block eating up die space.) But that's no argument against using FM altogether, especially with the amount of ROM space samples would consume. (That would change as time went on, and hardware sample playback would avoid the pitfalls of sloppy programmers; the Famicom did have hardware 7-bit DPCM playback back in 1983, and the Mac had simple DMA to an 8-bit DAC back in '84... and even a single 8-bit DAC with DMA loading would allow a lot more options for software mixing and decompression. 10-bit or higher resolution would be more useful, allowing mixing of multiple 8-bit channels without loss.)

 

It really is a matter of taste, but there's a ton of awesome-sounding FM stuff, and it WAS the arcade standard into the mid 90s. You also had other forms of synthesis being popular before memory got cheap enough to facilitate common sample stuff, like the MT-32 with its use of additive and subtractive synthesis with envelope control and filtering. (Sort of a digital successor to older analog synthesizers from Roland, and a direct response to the growing popularity of Yamaha's FM synth.)

And there's the professional DX-7 keyboard, but that's 6-op FM... though extremely capable. (even for complex percussion and such)

 

But really, there's absolutely no reason to disregard FM for the ST in the mid 80s, especially as it's a more practical off-the-shelf addition alongside pushing for more capable PCM capabilities. It's not an either-or thing. ;)

 

Massive difference between the Dragon 32 and TRS-80 versions of the same game, Donkey King; it only looks crap on the Dragon 32 because one screen mode is missing... guess which one ;) But even the TRS-80 version is inferior to the Coleco/C64 (Ocean Software, NOT Atarisoft)/Amstrad versions. The King/Donkey King had various screen modes selectable on start-up IIRC; the Dragon just didn't have the artifacting screen option for colour, that's all. Other than that, it's the identical game from Microdeal, or in the USA.

Is the Dragon 32 missing both of the 4-color 128x192 modes, or just CSS1?

http://hcvgm.org/VDG_Colours.html

Even with only CSS0, they should have been able to map some decent colors from yellow, green, red, blue, or black (red, blue, black, and yellow probably, with red in place of orange, blue for cyan, yellow for white, and black the same).

If it's missing both of those... yeah, that ruins the machine for gaming in PAL regions, with not even artifact colors to at least allow red/white/black/blue, due to the different color carrier.

 

 

If it is only the CSS1 mode that's broken, that implies the game simply wasn't redone for the PAL conversion... and it implies that in any case, as there's no optimization of the dithering at all, just vertical bars, as it would be if a 2bpp framebuffer were interpreted as 1bpp (or like Apple II games on an RGB monitor, or CGA games in composite mode on an RGB monitor).


I'm not arguing, just stating my opinion about sound hardware types. DAC-based sound chips were as much of a revolution when the Amiga came out as hardware sprites and scrolling had been via TI/Atari/Commodore.

 

You can do anything if you have samples for it; the same is not true the other way round. I'm sure the Amiga's Paula chip was costing Commodore peanuts to manufacture in 1990, after 5 years of doing it themselves at MOS/CSG. POKEY sounds like POKEY, a YM chip like YM, SID like a SID chip or an analogue synth from 1981, PC-E sounds like PC-E, Megadrive sounds like Megadrive, SNES like a cheap shit MIDI tune card on a PC ( ;) ), and that's it really.

 

The Amiga sounds like whatever you want it to sound like, and the quality comes down to how much chip RAM you assign to your samples for the MOD. That's my reasoning; others will obviously have their own opinions, but dual Paula chips in the Amiga 1200 and 4000 was the way to go for costs... giving 12 channels with CPU assist, or 8 without.

 

Atari should have put an off-the-shelf DAC in the design, even one channel, to save CPU time. Simple enough to do, too, and I'm sure it would have been cost effective.

 

 

We had the SCART Euroconnector here in the UK just as fast as the silly garlic munchers :lol: The problem is portable TVs were not well supported for RGB SCART, and nobody bought a 28-inch TV (most of which did have RGB SCART onboard as part of the teletext adaptor, i.e. a BBC Micro on a card inside the TV) to use as a monitor, so it doesn't matter who introduced what when. I know from real-life experience that getting a SCART RGB TV in the mid 80s was very rare (i.e. Sony or Panasonic only, probably) and cost an absolute fortune of about £300, really. Composite was around from even 1981/2 on TVs, though, so one has nothing to do with the other.

 

Why do you think Megadrives and SNES machines Europe wide all only had an RF cable in the box ;)


 

We had the SCART Euroconnector here in the UK just as fast as the silly garlic munchers :lol: The problem is portable TVs were not well supported for RGB SCART, and nobody bought a 28-inch TV (most of which did have RGB SCART onboard as part of the teletext adaptor, i.e. a BBC Micro on a card inside the TV) to use as a monitor, so it doesn't matter who introduced what when. I know from real-life experience that getting a SCART RGB TV in the mid 80s was very rare (i.e. Sony or Panasonic only, probably) and cost an absolute fortune of about £300, really. Composite was around from even 1981/2 on TVs, though, so one has nothing to do with the other.

 

 

In May 1985 I bought a Panasonic 14" colour TV; I forget the model number now, but it had a similar CRT tube system name to the Sony's. It also had a long, narrow and slim remote that could be pushed into the TV set to store it, but doubled as the TV's main controls too. An extra door opened to reveal a BNC input for S-Video, or maybe just composite, I dunno; you had to pull out a small button dial to engage this video mode. The picture quality was really exceptional and FAR superior to any LCD portable around now. There was also a stereo version of this TV, but not NICAM in 1985 of course; I only had the mono version.


SID like a SID chip or analogue synth from 1981,

 

 

Well, I admit to most of your post, but not here. The SID sounds nowhere near an analogue synth from 1981 ;) Analogue synths existed in the 70s that sounded even far better, with a higher frequency range and real sine-wave-based modulations.


I'm not arguing, just stating my opinion about sound hardware types. When the Amiga came out, DAC-based sound chips were as much of a revolution as the hardware sprites and scrolling from TI/Atari/Commodore had been.

Yes, but you had the ability to use DACs (be it direct write or a "trick") back in the 70s consoles and computers... with the VCS you didn't have much CPU resource (at least not with the screen displaying anything) and no interrupts with TIA not offering direct write control either, but you still could do a demo for PCM played through TIA with software timed loops and approximating a 4-bit DAC. (TIA does use linear 4-bit DACs but from what I understand you don't have direct access like the A8, so you set the square wave output to a high playback rate and modulate volume)

 

With POKEY you get 4 direct write 4-bit linear DACs, 3 of which have interval timers tied to the channels as well, so you could do interrupt driven PCM playback with some freedom on 3 independent channels as long as CPU resource allowed. (and the 6502 does have very fast interrupt handling: three 4 kHz PCM channels should have been manageable with ~50% CPU use from comparisons I've seen, though you'd have added overhead for resampling to different pitches etc -resampling would generally be preferable to varying the playback rate, due to the unnecessary resource use of the latter for higher pitch playback -unless perhaps you were only going to play up to ~8 kHz)
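To make the two ideas above concrete, here's a minimal sketch (assuming nothing about POKEY's actual register interface): quantizing 8-bit PCM down to the 4-bit linear steps a POKEY volume register can express, and resampling once ahead of time so the interrupt handler can run at one fixed playback rate instead of varying the timer.

```python
def to_4bit_linear(samples_8bit):
    """Quantize unsigned 8-bit PCM (0-255) down to 4-bit linear (0-15),
    the depth a POKEY channel's volume register can hold."""
    return [s >> 4 for s in samples_8bit]

def pitch_shift(samples, factor):
    """Nearest-neighbour resample, done once up front, so the clip plays
    back 'factor' times higher in pitch at an unchanged output rate."""
    n = int(len(samples) / factor)
    return [samples[int(i * factor)] for i in range(n)]

# e.g. prepare a clip to play one octave up without touching the timer rate
clip = to_4bit_linear([0, 64, 128, 192, 255, 192, 128, 64])
octave_up = pitch_shift(clip, 2.0)
```

The point of doing the resample offline is exactly the trade-off described above: varying the playback rate at runtime costs interrupt bandwidth, while pre-resampled data plays at one rate.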

I'm quite surprised there aren't a good deal of sample tracker demos for the A8 like the Speccy, ST, or even CoCo have. (hell, you even have some sample tracker stuff using PWM via the Speccy beeper!) The PCE has direct write modes for its 5-bit DACs with all CPU driven playback still (again with interrupts, ~5% CPU resource for a 7 kHz channel -too bad there's no hardware support for using the internal 32x5-bit wave RAM as a FIFO buffer for PCM playback... that would have cut down overhead substantially, as you'd load 32 samples at a time rather than 1 per interrupt).

The ST and Speccy 128k (or SMS, etc) have to use logarithmic 4-bit samples due to the volume registers on those sound chips, but they're still effectively 3-channel DACs. (the log based volume is better for some things, worse for others -and that's aside from more complex algorithms that map higher depth -like 8-bit- PCM to play through all 3 channels at different volume steps; you could technically do that with POKEY too, but only up to 6-bit depth as they're linear DACs, while with the log based stuff you have trade-offs but can approximate better than 8-bit resolution)

The Genesis has the SMS's PSG (very few games used that for PCM) as well as access to an 8-bit linear DAC in the YM2612 (when disabling channel 6), and that's normally what's used for playback. (any software MOD players would be purely limited to demos or title screens and such, like with the ST -same for software MOD players on the PC through a parallel port DAC, though with the Sound Blaster you've got DMA and variable sample rates for 1 channel, so software mixing and in-game use got far more realistic -still, you didn't see in-game use of MOD music via the SB until the early 90s, and generally with 386 minimum requirements)
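The "logarithmic 4-bit samples" idea above can be sketched as a simple nearest-level lookup. The curve here is an idealized 2 dB/step ladder, purely an assumption for illustration; the real AY/YM volume DAC curve is similar in shape but not identical.

```python
# 16-step log volume table: step 0 is silence, step 15 is full scale,
# each nonzero step 2 dB below the next (an assumed, idealized curve).
LOG_TABLE = [0.0] + [10 ** (-(15 - i) * 2 / 20) for i in range(1, 16)]

def to_4bit_log(sample_8bit):
    """Pick the 4-bit log volume step whose output level is nearest
    the linear target (unsigned 8-bit sample scaled to 0.0-1.0)."""
    target = sample_8bit / 255.0
    return min(range(16), key=lambda i: abs(LOG_TABLE[i] - target))
```

Because the steps are denser near full scale, quiet detail is crushed but loud detail is kept, which is the "better for some things, worse for others" trade-off the post describes.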

 

But having hardware support is another issue and obviously some dedicated hardware is a good deal more cost effective than tossing in a CPU/MCU and a bare DAC as many arcade games did (and the CoCo, of course) or using another sound chip to do the same (or software PWM, but that's really wasteful). The Amiga uses 4 DMA channels to load PCM along with variable sample rates and 6-bit hardware volume per channel, so it's pretty flexible (I think the variable playback rate has enough range and fine enough steps to allow that to function as pitch control as well -ie avoiding CPU resource to resample notes- compared to the STe with only 4 playback rates supported in hardware -in the case of software mixing you'd have to do resampling in any case and the same for using high resolution samples on the Amiga -as you'd hit the ~28 kHz limit, for the common 8 kHz stuff you'd have a lot of flexibility, but for something like Psygnosis's 22 kHz samples you'd definitely need resampling to have full range -unless the sample itself was mainly intended to be played at only slightly higher pitches and much lower pitches).
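The "~28 kHz limit" mentioned above falls straight out of Paula's period register arithmetic; a back-of-envelope check (PAL clock value used here, NTSC differs slightly):

```python
# Paula derives each audio channel's sample rate by dividing its clock
# by a per-channel period register; smaller period = higher rate.
PAULA_CLOCK_PAL = 3_546_895   # Hz (PAL colour-clock-derived value)
MIN_PERIOD = 124              # lowest period usable for DMA audio

def paula_period(sample_rate_hz):
    """Period register value that comes closest to a desired rate."""
    return round(PAULA_CLOCK_PAL / sample_rate_hz)

max_rate = PAULA_CLOCK_PAL / MIN_PERIOD   # the ceiling, roughly 28.6 kHz
```

So common 8 kHz samples leave plenty of headroom for pitching up, while 22 kHz samples sit close enough to the ceiling that upward pitch range runs out, matching the Psygnosis example in the post.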

A mono DAC with DMA loading would be a lot by itself though, or even something simpler like the Mac's fixed buffer region in RAM getting DMA'd to the DAC at the v-sync rate, or (again) a simple FIFO buffer mechanism rather than just a bare DAC.

 

 

And it definitely wouldn't be mutually exclusive: you could add better synth/PSG sound chips AND add hardware PCM assistance. (and you could do both gradually: for the ST you could bump it up to a YM2203 and then add a simple IC with DAC, FIFO, and timer input for setting the playback rate, as well as interrupt generation -though with the option to simply poll the full/empty flags as well, depending on the circumstances) And later expand that to add proper DMA and stereo: what the STe had for PCM would have been OK up until the Falcon's sound in '92, much more so with the YM2203 boosting things alongside. (you'd have a progression of software support as well, and in the best cases a range of settings to take advantage of the added hardware or cater to lower-end/older machines, in the worst cases you'd have games predominantly catering to the lowest common denominator)

 

Having enough PCM support to manage good 1 or 2 channel sampled sfx and the occasional use of percussion is an important factor (probably some cases of lead instruments too, but less critical -it would be most useful for instruments that sound poor in FM like violin, certain wind instruments, piano, or orchestral hits). That was the biggest perceived weakness of the MD's sound, and again due in fair part to poor programming. (and Sega's non-foolproof implementation) You can manage good quality 2-channel 22 kHz PCM playback using the Z80 alone, but no developer managed that very well (albeit very few used 22 kHz samples anyway due to space concerns), and likewise software decompression was rarely if ever used at all (in spite of some low-intensity ADPCM derivatives around -Covox released some in the late 80s intended for software decompression on PCs, and that's just one example). Doing single channel on-the-fly decompression with the Z80 is very feasible, and while the 22 kHz playback achieved with 3 or 4:1 compression in current homebrew is arguably less realistic for developers back in the 90s, they could/should have managed at least something decent. (there is one game I know of which opted for a single sample synth channel used concurrently with sfx and percussion loops -without mixing- and that's Skitchin', with a single 11 kHz guitar sample used -the concurrent management is actually quite good and I was initially fooled into thinking there was software mixing)
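As a feel for how cheap "on-the-fly decompression" can be, here's a toy 2:1 scheme in the same spirit (NOT Covox's actual algorithm, just an illustrative delta codec): each 4-bit code indexes a small signed step table, so unpacking one output sample per tick is a table lookup, an add, and a clamp.

```python
# 16-entry step table: 8 non-negative deltas, 8 negative ones.
STEPS = [0, 1, 2, 4, 8, 16, 32, 64, -1, -2, -4, -8, -16, -32, -64, -128]

def decode(codes, start=128):
    """Unpack 4-bit delta codes into unsigned 8-bit PCM, clamped to
    0..255. Cheap enough per sample that even a Z80 could keep up."""
    out, acc = [], start
    for c in codes:
        acc = max(0, min(255, acc + STEPS[c]))
        out.append(acc)
    return out
```

Halving sample storage this way is exactly the "space concerns" trade the post says kept developers away from 22 kHz samples.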

 

You can do anything if you have samples for it, the same is not true the other way round. I'm sure Amiga's Paula chip was costing Commodore peanuts to manufacture in 1990 after 5 years of doing it themselves at MOS/CSG.

Well, all of those can sound like approximations of the instruments they're mimicking, with varying results. With the PCE you can manage some simpler instruments rather well, but it's limited by the 32-word waveforms; 4-op FM allows a lot more flexibility and you can recreate a number of sounds exceptionally well (especially chimes, organ, brass, some wind, and single string sounds, while most percussion and multi-string instruments are tougher to do well), and it of course depends on the programmer and sound engine to make use of that -lots of flexibility though, with very crisp output (so long as you have good analog circuitry). And of course you can manage some very synthy style stuff as well, including mimicking analog synth sounds.

 

There are limits of course, and examples of poor usage as well, but there's examples of that in any case. (poor usage of samples or poor trade-offs made for using samples: too high a sample rate with long samples using too much space and limiting variety -or competing with graphics- or too low quality, or simply poorly optimized samples with poor preprocessing, etc)

 

Amiga sounds like whatever you want it to sound like and the quality comes down to how much chip ram you assign to your samples for the MOD. That's my reasoning, others obviously will have their own opinion, but dual Paula chips in Amiga 1200 and 4000 was the way to go for costs....giving 12 channels with CPU assist or 8 without.

Well to be honest it sounds fairly close to anything it samples, but within the limits of the sample rate and bit depth used. Lower sample rate stuff will sound muffled and artifacted... and while the Amiga's aggressive low pass filtering masks the aliasing, it also ruins the clarity of higher resolution samples (the SNES has a similar problem), though the filter issue varies somewhat as well, with there being more than one filter and one of them varying by the model in question. Plenty of trade-offs, especially early on (with more limited RAM) and then the limits and cost/benefits of available hardware channels, etc, etc. (again, plenty of advantages to including FM AND sample capabilities)

 

I agree in general though, sample based stuff did become the standard by the mid 90s and has remained so ever since, and not just that, but for all the complex and feature-rich multi-channel PCM/ADPCM+DSP+MCU/CPU systems with hardware effects and dedicated resources, it's all come back to simple DMA sound for the most part. (modern PC onboard sound and the sound systems in consoles are just multi-channel DMA; the N64 was the first to fully go that route while Sega and Sony would put unnecessary resources toward sound for another generation still -Nintendo seemed to learn their lesson, with the SNES's system being used as a glorified MOD player with some added hardware effects -panning, reverb/echo, etc)

And the more RAM the Amiga got, the more useful it would be... they definitely should have upgraded it as you said, though dual PAULA chips would be rather wasteful; it would make more sense to duplicate just the DAC+DMA logic block within a single chip, retaining the floppy signaling/controlling hardware as normal... it would also be nice to add at least per-channel left/right/center panning, if not full pan stereo per channel. 8 DMA audio channels by the late 80s would have been great though, and then jumping again to 12.

It would get a bit wasteful to keep copying the DAC like that though and would make far more sense to have a hardware mixer to take multiple channels and output through a single, high-resolution DAC (ie 16-bit), and eventually adding 16-bit sample support would be nice. (8/12/16-bit would be really flexible too and nice to have... though back in '85 4-bit support would have been nice too for simple "compression" as such -though taking 4-bit samples and padding to 8-bit in software isn't very intensive at all, so not so bad)
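The "hardware mixer into one high-resolution DAC" idea above is easy to picture in software terms: sum several 8-bit signed channels, each scaled by a 6-bit volume (0-64, Amiga-style), into one 16-bit signed result with saturation. A sketch, not any particular chip's datapath:

```python
def mix(channels, volumes):
    """Mix one output sample. channels: per-channel signed 8-bit
    samples (-128..127); volumes: matching 0..64 volume levels.
    The sum is saturated to the signed 16-bit output range, which is
    what a hardware mixer stage would do instead of wrapping."""
    total = sum(s * v for s, v in zip(channels, volumes))
    return max(-32768, min(32767, total))
```

One 16-bit DAC fed by a mixer like this scales to 8 or 12 channels by widening the adder, rather than duplicating whole DAC+DMA blocks per channel.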

 

 

Pokey sounds like Pokey, YM chip like YM, SID like a SID chip or analogue synth from 1981, PC-E sounds like PC-E, Megadrive sounds like Megadrive, SNES like a cheap shit MIDI tune card on PC ( ;) ) and that's it really.

I must disagree on the SNES as that's hugely variable, and you have cases where it's considerably superior to what the Amiga could ever do and other cases where generic samples and mediocre compositions don't demonstrate it at all. Of course you have bland and generic sounding Amiga MODs as well. (or some AHX stuff as well -or all AHX stuff for those who don't like the chiptuny sound)

The MD also sounds like a cheap PC MIDI tune (OPL2 AdLib/SB) at times due to poor utilization, and in many cases a good deal poorer than the better AdLib/OPL2 stuff on PC, though none as bad as the worst AdLib stuff. (and you have the AdLib/SB stuff pulled down more by poor analog circuitry on many SB/compatibles, though you also have that problem on many MD2s and the final MD1s -hiss, distortion, muddy sound, straining, etc)

 

And you could say the same about the Amiga (garbage in garbage out -not just the sample set used, but the composition quality and having instruments that sound good at the resolutions used). The SNES has the limit of a 64k sample bank, but that's effectively 227kB due to 3.56:1 compression (equivalent to ~114 kB of uncompressed 8-bit PCM at similar sample rate) and you get 16-bit resolution output up to 32 kHz and added effects. (8-channel stereo, reverb, echo, interpolation, etc -the latter is a double edged sword but most prefer it to tinny aliasing -the analog low bass filtering is more of a problem though and varies somewhat on different models)
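The 3.56:1 figure above checks out from the BRR format itself: each 9-byte block (1 header byte plus sixteen 4-bit nibbles) decodes to sixteen 16-bit samples, i.e. 32 bytes of output.

```python
# BRR block arithmetic behind the SNES sample-bank figures quoted above.
BLOCK_IN = 9        # bytes per BRR block: 1 header + 8 data bytes
BLOCK_OUT = 16 * 2  # sixteen decoded 16-bit samples = 32 bytes

ratio = BLOCK_OUT / BLOCK_IN                 # ~3.56:1 compression
effective_kib = 64 * 1024 * ratio / 1024     # ~227.6 KiB from a 64 KiB bank
```

Halving that again gives the "~114 kB of uncompressed 8-bit PCM" equivalence in the post.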

And while there isn't DMA to ROM for the sound chip, or to CPU RAM, there is a good amount of bandwidth to update/stream added data on the fly. (you'd want to reserve that for longer and less used sounds of course)

There's more features to that and a lot of wasted resource with the SPC700 and DSP and wasted features due to the limited format (disabling the echo buffer could give you 32 channels rather than 8 and there are demos that do just that by writing directly to the echo buffer). In general Nintendo probably could have easily gone with Ricoh's rather nice (simpler) 8-channel PCM chip available by 1989 (used in FM Towns, Sega System 18/32, Sega CD, and others) with 8-bit samples scaled up to 32 kHz, 4-bit stereo panning, 8x oversampling with digital filtering, and 16-bit resolution output. (no compression, but bumping RAM up to 128 kB would have addressed that at far lower cost than the overkill SPC unit and Ricoh was Nintendo's primary chip vendor already)

That too would have been a good deal more capable than the Amiga though with 8 stereo channels, 32 kHz, oversampling and digital filtering. (the CD managed some good sounding stuff with only 64k too, though it also had FM+PSG to work with for even more flexibility -as with Silpheed or Sonic CD's past levels)

 

 

Atari should have put an off the shelf DAC in the design, even one channel, to save CPU time. Simple enough to do too and I'm sure it would be cost effective.

A DAC wouldn't have helped at all over what's already there beyond more easily allowing plain 8-bit linear PCM... using 4-bit log PCM on the AY chip is no more resource intensive than a plain resistor DAC (software PWM otoh is very intensive, so you could certainly argue for the PC, Speccy, Apple II, etc including a simple resistor DAC for that -the CoCo did just that of course).

You'd eat up CPU resource just as badly with a plain DAC as the ST already does with the PSG chip, and it's exactly what the CoCo and Genesis already use. (6-bit for the CoCo, 8-bit for the MD) With the Sound Blaster you got a nice DMA circuit as well (plus an MCU with ADPCM decoding support), so a nice single hardware PCM channel with minimal CPU overhead. (especially compared to PWM via the PC speaker, but still very dramatic compared to the parallel port DAC supported by many late 80s/early 90s PC games -an 8-bit DAC could sound as good as the SB but used a lot of CPU resource, just like the CoCo, ST, Genesis, PCE, etc -the Genesis had the Z80 to offload overhead and the PCE had a very interrupt friendly CPU -and rather fast at that)

 

The Amiga's PAULA has MUCH more than just some DACs: it's got 4 DMA channels dedicated to audio and hardware support to load 8-bit signed PCM via those DMA channels, along with programmable sample rate and 64 volume levels per channel. AFAIK there was nothing on the mass market anything like that at all. (some custom chips in the arcade, but that's about it) Maybe some off the shelf ICs that supported PCM, but I'm not sure, and I highly doubt they were particularly cost-effective.

That's why I mentioned some simpler alternatives to something as capable as PAULA: you had the set-up the Mac used, with simple DMA from a fixed point in RAM feeding into the DAC (up to the CPU/software to fill the RAM in vblank), or a simple custom IC with a DAC built in and a FIFO buffer with full/empty flags and timer input (or internal timer).

 

 

 

 

We had the SCART Euroconnector here in the UK just as fast as the silly garlic munchers :lol: The problem is portable TVs were not well supported for RGB SCART, and nobody bought a 28 inch TV (most of which did have RGB SCART onboard as part of the teletext adaptor, i.e. a BBC Micro on a card inside the TV) to use as a monitor, so it doesn't matter who introduced what when. I know from real life experience that getting a SCART RGB TV in the mid 80s was very rare (Sony or Panasonic only, probably) and cost an absolute fortune of about £300. Composite was around on TVs from even 1981/2 though, so one has nothing to do with the other.

That's what I meant: in France SCART was mandated on ALL TVs being manufactured from 1980 onward. The standard was introduced in '77 and I'm sure it spread readily, but wasn't made mandatory for ALL TVs in many countries. (hence why in France Sega removed the RF modulator from the SMS with the SMSII rather than removing the AV port -the composite video line is also disconnected, with only RGB+sync+5V and mono audio connected)

 

Why do you think Megadrives and SNES machines Europe wide all only had an RF cable in the box ;)

Standardization. ;) Sega sure didn't do that for France, skipping to SCART only for the SMSII, and that almost certainly simplified the issue of SECAM as well. (I wonder if any such was done with the MD, SMS, or N64 in France... especially if there were MD1s like the JP models with no RF modulator, and the model 2 systems obviously had no RF so could easily have been bundled with SCART rather than RF... the N64 supposedly didn't have any models including RGB standard, but I wonder if France was an exception or if Nintendo was just odd about that -they could have saved money by not including the RF modulator or composite video encoder, but it's quite odd that the N64 didn't connect the RGB lines in general, with JP and EU having easy access to RGB compatible sets... and even stranger that the NTSC models happen to have the buffered RGB lines present and easy to mod, but not the EU models)

 

In the US there was a mix of bundles, from versions with RF only to AV only (normally both), just as most if not all NES sets came with a pair of RCA cables in the box (save the RF-only NES2). And of course the N64 shipped with only AV cables, as did the SNES2. (not sure about the Genesis; I know the RF modulator was rather common to include with the model 2, but I'm not sure if that held for late models or the model 3)

Edited by kool kitty89

 

 

The Amiga's PAULA has MUCH more than just some DACs: it's got 4 DMA channels dedicated to audio and hardware support to load 8-bit signed PCM via those DMA channels, along with programmable sample rate and 64 volume levels per channel. AFAIK there was nothing on the mass market anything like that at all. (some custom chips in the arcade, but that's about it) Maybe some off the shelf ICs that supported PCM, but I'm not sure, and I highly doubt they were particularly cost-effective.

 

 

One thing disturbed my fun in listening to music on the Amiga: one instrument plays on the left and another on the right, and every pair is split that way. ;)

The Amiga wasn't a cheap machine, yet apparently it wasn't possible for Commodore to add some small circuitry for programming the L-R balance on each channel.


Yep, that was a sorely missing feature, not to mention them not bothering to improve it to 16-bit playback on later machines.

 

I doubt it was the first though... Apple 2 had addon ADC cards, probably fairly early on. I remember Atari 800 and Apple II being credited for much of the sound effects in Tron, but can't recall recognising many Atari sounds when I saw the movie.


 

 

The Amiga's PAULA has MUCH more than just some DACs: it's got 4 DMA channels dedicated to audio and hardware support to load 8-bit signed PCM via those DMA channels, along with programmable sample rate and 64 volume levels per channel. AFAIK there was nothing on the mass market anything like that at all. (some custom chips in the arcade, but that's about it) Maybe some off the shelf ICs that supported PCM, but I'm not sure, and I highly doubt they were particularly cost-effective.

 

 

One thing disturbed my fun in listening to music on the Amiga: one instrument plays on the left and another on the right, and every pair is split that way. ;)

The Amiga wasn't a cheap machine, yet apparently it wasn't possible for Commodore to add some small circuitry for programming the L-R balance on each channel.

Well 2 instruments on the left and 2 on the right... or 2 total with full pan stereo if used as such.

 

But yes, most treated it as 4 channel mono (more or less) and the lack of a simple L/C/R toggle for hard-pan "simple" stereo was a rather unfortunate omission... or even a plain mono mode so all 4 channels play on both speakers rather than only the hardwired stereo mode. (rather than 2 bits of control per channel, just 1 bit per channel pair to configure as 2 mono channels or 1 channel left and one right -so you could have 4 mono, 2 mono+2stereo, or 4 hardwired stereo, but simple L/C/R would be much preferable especially since most stereo effects could be reasonably achieved with hard panning and not full stereo volume control)
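The "1 bit per pair vs 2 bits per channel" trade-off above is easy to see as a routing function. Here's a sketch of the 2-bits-per-channel L/C/R scheme (purely hypothetical hardware, not anything Paula actually had): 00 routes a channel to both speakers, 01 to the left only, 10 to the right only.

```python
def route(sample, pan_bits):
    """Return the (left, right) contribution for one channel under a
    hypothetical 2-bit pan field: 00=centre, 01=left, 10=right,
    11 treated as muted in this sketch."""
    left = sample if pan_bits in (0b00, 0b01) else 0
    right = sample if pan_bits in (0b00, 0b10) else 0
    return left, right
```

With 4 channels that's only a single byte of pan state, which is why the post calls the omission unfortunate: most stereo effects only need hard panning, not full per-channel volume-based stereo.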

 

I assume the original intention was for the set-up to be treated as 2 stereo channels and not 4 independent channels.

 

Did the STe's DMA sound allow full panning for each of the 2 channels or just hard panning?

 

Yep, that was a sorely missing feature, not to mention them not bothering to improve it to 16-bit playback on later machines.

 

I doubt it was the first though... Apple 2 had addon ADC cards, probably fairly early on. I remember Atari 800 and Apple II being credited for much of the sound effects in Tron, but can't recall recognising many Atari sounds when I saw the movie.

16-bit AND more hardware channels (not necessarily more DACs as you could mix to a single DAC in hardware), and higher max sample rate without being tied only to the higher res video modes... (8 8-bit hardware channels with L/R/center panning and independent volume control mixed to stereo 16-bit output would be great though you could go beyond that too -full pan, oversampling, digital filtering, interpolation, more channels, higher max bit depth and variable depth, hardware decompression, etc).

I think the Falcon just had plain 16-bit 8-channel playback, not sure about variable depth or such. (I'd assume it at least supported 8-bit samples... otherwise you'd eat some CPU resource to pad 8-bit samples on the fly, or use 2x the RAM space for all samples)


SID like a SID chip or analogue synth from 1981,

 

 

Well, I agree with most of your post, but not here. The SID doesn't sound anywhere near an analogue synth from 1981 ;) Analogue synths existed in the 70s that sounded far better, with a higher frequency range and real sine-wave based modulation.

 

Depends which ones you are thinking of; I'm thinking of the build-your-own kits used by very early synth pop experimental groups around 1979-81. Go on YouTube and see if you can find the 9-part upload of the BBC TV (UK) show called Synth Britannia, if it hasn't been taken down. In it you will see and hear the real mono synths from 1979-81ish that sound like a SID; obviously they have far greater control of ADSR, but the sound is pretty much the same on some. SID sounds the way it does simply because Bob Yannes put a phase accumulator on a chip; it is not actually digital at all, it is pretty much an analogue synth -and the guy should know, he left C= a year later and started the keyboard company Ensoniq ;)

 

Even if you don't like the SID but do like that pioneering early synth hardware it's worth a watch anyway, and same goes to anyone else interested in early 80s technology or that lovely raw sound the early groups had before it all got a bit too commercial.


 

 


 

OK well, if anyone thinks 8-bit DACs are not sufficient even for professional use, then check out the songs by Inner City from the late 80s/early 90s. They all used the first Ensoniq sample-based keyboard (the Mirage, I believe it's called), with the stock samples as supplied by Ensoniq, and it was an 8-bit sample-based machine. And it sounded fantastic in both quality and effects.

 

As to the Amiga not having stereo panning bits to control the 4 channels, well let's see in 1985 when the design was committed to silicon after prototyping the competition was...

 

PC Speaker, AdLib, Mac sound, YM chips, SID, POKEY, TI/Coleco/MSX sounds. I think we can all agree that Paula was good enough for its release schedule: 4 channels and DMA sample playback, WIN. And remember portable TVs, hell, ALL TVs at the time in most countries were mono, and the chipset was designed for a disk based games console to be played on the TV. This means 4 channel gaming sounds of awesome versatility via the 4 channel DAC. The stereo jacks hard wiring channels 1,3 and 2,4 to the L and R phono sockets were a bonus for stereo sampler use. And remember the A500/2000 modulator mixed the left and right phono sockets' sound into 1 channel, via a Y cable between the Amiga's sound output and the modulator's sound input too.

 

Now the STE has fixed frequencies it can play samples back at and is only 2 channels, so two distinct sounds, but it does have panning I believe. The Archimedes had the same stereo panning controls and 8 distinct sounds maximum (going from 2 to 8 channels meant seriously less CPU time left though, so this was not DMA!), but both are 24 months later, around '87. And don't forget, had Hi-Toro been properly supported, the Amiga would probably have been out the same time as the Mac in '84, maybe Jan '85. The chipset changed very little from '84 to '85; it was a cost and production optimisation problem, not a technical design issue, that Jay and co were working on.

 

Commodore did nothing with Paula; guess after the cock-up they made with the 8xxx series SID they decided not to touch it. But there was no excuse not to put two Paula chips inside the Amiga 600/1200/3000/4000. They had dual SIDs in Commodore 65 working prototypes for 6 channel sound, so why not 8 channel dual Paula? Because Commodore, run by some thick bastid called Mehdi Ali, was being run into the ground.


Depends which ones you are thinking of; I'm thinking of the build-your-own kits used by very early synth pop experimental groups around 1979-81. Go on YouTube and see if you can find the 9-part upload of the BBC TV (UK) show called Synth Britannia, if it hasn't been taken down. In it you will see and hear the real mono synths from 1979-81ish that sound like a SID; obviously they have far greater control of ADSR, but the sound is pretty much the same on some. SID sounds the way it does simply because Bob Yannes put a phase accumulator on a chip; it is not actually digital at all, it is pretty much an analogue synth -and the guy should know, he left C= a year later and started the keyboard company Ensoniq ;)

Not just that, but the SID sounds the way it does because of the waveforms it can manage. Even without the ADSR or filtering (or ring modulation) the flexibility of saw/triangle/pulse (variable duty cycle) was very useful. In fact you could probably argue that 6 channels with those waveforms (plus noise) available without the ADSR/filter etc would be more useful in many cases than the 3 channels with the added effects. (lots of trade-offs, but more channels would be very useful for most games, especially given the sacrifice of sfx that most games had to make in favor of music on the C64)

Pulse wave alone is very flexible and one of the most useful capabilities. (it can approximate saw fairly well at some duty cycles, but not triangle -pulse wave is the one typically used for electric guitar sounds, among other things)

 

POKEY can manage much of that with a bit of tweaking and/or CPU resource (often vblank ints), but the resolution limits it for some things (without pairing channels) and it does use CPU time to do that... (more an issue with saw/triangle stuff, pulse is OK at 8-bit I believe)

OTOH you have the NES's somewhat more limited set-up with 2 pulse channels of 4 duty cycle settings, a triangle wave channel (fixed volume, but OK for most uses), a dedicated noise channel, and a 7-bit DPCM playback channel. (the pulse channels can also be used as 4-bit linear DACs and the DPCM channel can be used as a 7-bit DAC -all CPU driven of course; DPCM has DMA loading though, albeit with a limited sample bank size -expanded by bank switching, which wasn't much of a limiting factor, but ROM space for holding samples was a limit -several publishers routinely used percussion samples in music nonetheless, and in some cases voice/sfx as well)
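The NES DPCM channel mentioned above is 1-bit delta modulation onto a 7-bit counter: each bit in the stream nudges the output level by +/-2, clamped to 0..127, which is why samples are so ROM-cheap (one bit per output sample). A decode sketch:

```python
def dpcm_decode(bits, start=64):
    """Decode a 1-bit DPCM stream the way the NES channel does:
    bit 1 raises the 7-bit level by 2, bit 0 lowers it by 2, and
    out-of-range steps are simply skipped rather than wrapped."""
    out, level = [], start
    for b in bits:
        if b and level <= 125:
            level += 2
        elif not b and level >= 2:
            level -= 2
        out.append(level)
    return out
```

The skip-instead-of-wrap behaviour is what gives DPCM its characteristic slew limiting on sharp transients.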

 

I think most would agree that that set-up is superior for general video game use compared to the SID (mainly due to the SID's channel limitations), but the hypothetical 6-channel simplified incarnation above would be much more arguable. Even for pure music there are a good deal of trade-offs between the NES's embedded audio and the SID. (More simultaneous channels, hardware sample playback at fairly high resolution, etc.)

 

 

But this is all getting off topic ;) ... on Ensoniq though, it's rather interesting that they put a lot of focus on additive synthesis, especially wavetable-based stuff (not to be confused with sample synthesis), in some ways very similar to what Atari was doing with the AMY in '83/84. ;) (Actually Atari Corp continued it into 1985, pushing it to the XEM after it was realized it wouldn't be ready in time for the launch-model STs... not sure why they later sold it off rather than using it though; dropping the XEM made sense due to the general decline of the 8-bits, but why not add it to the ST?)

 

Looking into AMY more, it really is interesting. It's not wavetable synthesis generally speaking (it uses only sine wave oscillators, not small wave samples, so in that sense it's somewhat akin to Yamaha's FM synth stuff, but much more additive-oriented, without the complex FM algorithms of Yamaha's chips, though it was more than plain additive synth). Lots of flexibility and potential for some very nice speech (and other digitized sound) playback even at very low bitrates (below the point of being acceptable with even 4-bit PCM, or CVSD even), but not ideal for all types of sound playback. It was only a 40-pin DIP too, and even that was largely due to the original design using an external DAC and thus having 16 digital audio out lines... so a revision with an internal DAC could be cut to 24 pins, assuming they could sacrifice one of the otherwise unused "reserved" pins as well. (Otherwise definitely down to a 28-pin package, and a narrow DIP if they could shrink the die.)
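To illustrate what "sine oscillators only" buys you (nothing here is AMY-specific, just the additive idea in its purest form), a minimal sketch:

```python
import math

def additive(partials, sample_rate, n):
    """Additive synthesis in its purest form: sum independent sine waves.

    partials: list of (frequency_hz, amplitude) pairs. Timbre comes
    entirely from which partials are present and how loud each one is;
    no filters and no FM algorithms involved.
    """
    return [sum(a * math.sin(2 * math.pi * f * i / sample_rate)
                for f, a in partials)
            for i in range(n)]

# Odd harmonics at 1/k amplitude approximate a square wave (Fourier series)
square_ish = additive([(220 * k, 1.0 / k) for k in (1, 3, 5, 7, 9)], 44100, 512)
```

With per-partial amplitude envelopes on top of this you get the vowel-like spectra that make cheap speech playback plausible, which is the point made above.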

Given it was already engineered and pretty much ready for production, that would likely be very attractive compared to designing an all-new custom DMA audio circuit, especially anything near what PAULA was, given the size of that package. (I'm not sure how many of the 56 pins were used for the floppy control lines, but regardless it's far more than what AMY could be cut down to... unless you opted for a far simpler single mono DAC set-up, but that's still a new chip to design and implement when you had the very nice AMY chip pretty much ready for use. Again, I'm not sure of the total context, but I think there's been some additional information coming to light on this as well.)

Granted, AMY still may have been a more expensive option (even as a 24-pin chip) than an off-the-shelf sound chip with reasonably enhanced capabilities like the YM2203. Atari had already gone with the YM2149 initially, so they'd have had to add AMY on top of that to the board, with more traces added, etc., while the YM2203 would only add component cost over the low-end YM2149 plus a tiny 8-pin DIP serial DAC to accompany it (still very few traces needed over the plain YM2149/AY8910). They could probably also have worked on getting at least a mono DMA PCM circuit together, especially by embedding it in another custom chip (as the STe eventually did in the SHIFTER), and they could have done it in steps: 1 channel, mono or with panning, then more channels and eventually higher depth as well, or maybe a small dedicated IC before integrating it. If they used the YM2203 they'd also have a pair of high-res interval timers they could use for the sample rate, and the 2nd could set the tempo/beat timing as well.

 

 

OK, well, if anyone thinks 8-bit DACs are not sufficient even for professional use, then check out the songs by Inner City from the late 80s/early 90s. They all used the first Ensoniq sample-based keyboard (the Mirage, I believe it's called), with the stock samples as supplied by Ensoniq too, and it was an 8-bit sample-based machine. And it sounded fantastic in both quality and effects.

No one ever said that. :P The original criticism was of the 2/4 channels being hardwired to stereo without even hard panning for the mixed output, or that feature at least being added later. Plain 8-bit PCM was fine (and more or less standard) up to the early 1990s, and it was you yourself who commented on how CBM should have added more channels later on.

 

As to the Amiga not having stereo panning bits to control the 4 channels, well, let's see what the competition was in 1985, when the design was committed to silicon after prototyping...

Yes, but not even a simple mixer with all-left, all-right, or center selection per channel? Or even a mono setting to at least allow each channel to output to both speakers rather than stick to the hardwired stereo position? (And not fixing that later on, or adding more channels.)
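For what it's worth, the kind of minimal mixer being asked for here is trivial; a toy sketch (hypothetical, not modelling any real Amiga hardware):

```python
def mix_stereo(channels, pans):
    """Mix mono channel samples to L/R with a 3-position pan per channel.

    pans: 'L', 'C', or 'R' for each channel -- the minimal mixer the post
    is asking for, versus a hardwired left/right channel assignment.
    'C' sends the channel to both sides, which also gives you an all-mono
    mode for free: just set every channel to 'C'.
    """
    left = sum(s for s, p in zip(channels, pans) if p in ("L", "C"))
    right = sum(s for s, p in zip(channels, pans) if p in ("R", "C"))
    return left, right

# With hardwired stereo these four channels would be forced two per side;
# here each channel picks its own position:
l, r = mix_stereo([10, 20, 30, 40], ["L", "C", "R", "C"])
```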

 

But yes, some form of stereo was more useful than only mono... most of the time, and you could probably mix it to mono with external speakers if the left/right separation really annoyed you. (Even an all-mono mode or switch would have been nice though; listening to a lot of Amiga tunes and games, I know exactly what the complaints refer to.) And if used through the TV it did play as mono... but the Amiga 1000 was hardly going to be used with a TV often given its price range, and the A500 needed an external RF/composite adapter to even work on non-RGB TVs in color.

 

PC Speaker, AdLib, Mac sound, YM chips, SID, POKEY, TI/Coleco/MSX sound. I think we can all agree that Paula was good enough for its release schedule: 4 channels and DMA sample playback, WIN. And remember portable TVs; hell, ALL TVs at the time in most countries were mono, and the chipset was designed for a disk-based games console to be played on a TV. That means 4-channel gaming sound of awesome versatility via the 4-channel DAC. The stereo jacks hardwiring channels 1,3 and 2,4 to the L and R phono sockets were a bonus for stereo sampler use. And remember, the A500/2000 modulator mixed the left and right phono sound into 1 channel via a Y-cable between the Amiga's sound output and the modulator's sound input too.

I'd heavily dispute the Yamaha chips thing, as some of those FM chips were REALLY nice at the time, even looking only at the 1985 models (mainly the YM2203 and the arcade-standard YM2151; 8 4-op FM channels is a very nice set-up). You have trade-offs with samples of course, and there are some things you can't do with 4-op FM (much more than 2-op though), but 8 channels allows a lot, and low-quality digitized samples with heavy filtering have a lot of disadvantages if you compare skilled use in both cases, along with the RAM/ROM capacity limitations of the time. (We're talking about a time when many arcade games were just starting to use PCM/ADPCM due to space restrictions, with many still opting for voice synthesizers, albeit Williams and a few others had opted for the much-higher-compression-ratio CVSD format in the early 80s; all the 6809 boards used it, and some later games as well, though Sinistar and Smash TV are probably the most notable for use of speech.)

 

And not to mention the implications of Jay's original 128k RAM set-up. :P (For the MICKEY console with Atari you'd at least have ROM to expand on that.) In a general practical sense, for in-game use on the Amiga, you're heavily limited in your use of samples if you want to fit any reasonable amount of graphics data into the 512k. (Less than that due to framebuffer space as well, and that's for games intended to be self-booting when the OS/WB wasn't loaded.)

That's one thing that made AHX so attractive: very small file size due to the lack of samples. (From what I understand, the CPU generated the waveforms for the DACs to play... unless it is samples, just very short looped ones. The latter would make more sense, effectively making it a 4-channel software chip-synth set-up, sort of like the PCE has but with fewer channels, up to 8-bit resolution, and no hard limit on sample length aside from memory constraints; I think there's more overhead for looping very short samples though.)

Again, the forced filtering on the Amiga is a downside as well. It helps with heavily aliased low-sample-rate stuff, but it kills all the crispness (muffles the sound) in any case pushing higher rates. (One of the issues with the SNES as well.)

 

Now, the STE has fixed frequencies it can play samples back at, and only 2 channels, so two distinct sounds, but it does have panning I believe. The Archimedes had the same stereo panning controls, and 8 distinct sounds maximum (going from 2 to 8 channels meant seriously less CPU time though, so this was not DMA!)

I believe the STE had simple 2-channel 8-bit DMA sound with the 6.25/12.5/25/50 kHz rates (so less flexible than the Amiga, necessitating software pitch bending/resampling rather than the many cases where you could vary the playback rate for pitch control on the Amiga), with the stereo handled externally by a dedicated sound mixer IC. (In theory that could have been used to pan the 3 independent YM2149 channels as well, given the 3 output lines; that's one of the nice features of that chip actually lost on the YM2203, sensibly though, given almost all implementations wired the older chips as mono.)
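To show what the fixed rates cost in practice, here's a toy sketch of the software repitching the CPU has to do when it can't just change a period register (illustrative only, not actual STE code):

```python
def resample(sample, pitch_ratio):
    """Software pitch shift by nearest-neighbour resampling.

    On hardware with a variable playback rate you skip this entirely and
    just reprogram the channel's period register; with only a handful of
    fixed output rates (e.g. 6.25/12.5/25/50 kHz) the CPU has to build a
    new, repitched buffer like this for every note.
    """
    out, pos = [], 0.0
    while pos < len(sample):
        out.append(sample[int(pos)])  # nearest-neighbour: cheap but aliased
        pos += pitch_ratio            # ratio > 1.0 = higher pitch
    return out

octave_up = resample([0, 10, 20, 30, 40, 50, 60, 70], 2.0)  # -> [0, 20, 40, 60]
```

Multiply that by 4 channels plus mixing and you can see why variable-rate hardware channels save so much CPU time for MOD-style playback.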

And what makes you think the Archimedes isn't using DMA??? Why couldn't it simply be using 8 DMA channels with hardware volume and variable playback rate like the Amiga, but with added stereo mixing capabilities? (Unless you know for a fact that it could do hardware sample looping as well; that's pretty much the only thing the CPU had to do on the Amiga if you weren't doing software mixing, or resampling rather than using the playback rate to control pitch.)

 

Had PAULA originally been designed with 4 independent audio out lines, later models could easily have added a stereo mixer with panning control without even modifying PAULA. (Or does PAULA already do that, with the stereo wired externally rather than internally? I'd gotten the impression that Paula output plain stereo lines, not 4 general audio lines.)

 

Commodore did nothing with Paula; I guess after the cock-up they made with the 8xxx-series SID they decided to just leave it alone. But there was no excuse not to put two Paula chips inside the Amiga 600/1200/3000/4000. They had dual SIDs in Commodore 65 working prototypes for 6-channel sound, so why not 8-channel dual Paula? Because Commodore, run by some thick bastid called Mehdi Ali, were being run into the ground.

You mean the bug fix on the later 8xxx chips that removed the PCM exploit? (Which otherwise would require one of the main sound channels to be sacrificed, i.e. setting the square wave to idle high and modulating the 4-bit volume directly, just as you'd do with POKEY, and somewhat like with the SN/AY/YM PSGs, except those are nonlinear, so the PCM format would be different.)
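The exploit boils down to this (a hedged sketch; the register write is mocked with a callback, no real chip access):

```python
def play_pcm_via_volume(samples_8bit, write_volume_register):
    """Sketch of the volume-register PCM trick (register access is mocked).

    Park a channel's oscillator at a constant high level; the 4-bit
    volume register then acts as a crude DAC. Write the top nibble of
    each 8-bit sample into it at the desired sample rate.
    """
    for s in samples_8bit:
        write_volume_register(s >> 4)  # quantize the 8-bit sample to 4 bits

writes = []
play_pcm_via_volume([0, 64, 128, 255], writes.append)  # writes = [0, 4, 8, 15]
```

On real hardware each write would be paced by a timer or cycle-counted loop, which is where the CPU cost comes from.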

 

That's not a good reason at all, and adding more PAULA chips would be prohibitively expensive: mass production pushes silicon prices down, but the large package was more of a fixed cost.

If they were really worried about having some sort of compatibility issues (a stupid excuse, especially as it would prohibit consolidation of the chipset and general enhancements), they could at least add another, different chip rather than a second PAULA. The simplest would be a direct derivative of PAULA with the floppy I/O lines removed and only the audio lines retained; beyond that would be actually cutting out the FD control/IO circuitry to shrink the die, and adding stereo pan circuitry or at least 4 separate audio lines to facilitate more flexible mixing; and beyond that you could copy the audio block across the chip for an 8-channel Paula. (A die-shrunk, doubled Paula, with or without added stereo hardware and with the vestigial floppy hardware removed, would be great on top of the old PAULA for backwards compatibility.)

Really though, they should have shrunk the die of the main Paula and copied the audio block; that would be a cheaper and cleaner option. (Adding some stereo mixing hardware would be great too.)


I realize I was thinking somewhat heavily in the context of cartridge/ROM-based platforms rather than RAM/disk-based platforms when it comes to PCM sample usage, but the same trade-offs are there for RAM/disk, just to different extents.

With disks you have a fixed space limit too, plus the cost of using multiple disks (albeit still far less limiting than ROM costs), but then the limited space in RAM means you're limited in how much you can fit in per load for both graphics and sound (with direct trade-offs there, including the sample rate or bit depth of the audio), and you also have the potential concern of how often the game loads. (Whether you break it down per level, or go further at the expense of annoying the player by disrupting flow.)

 

But the addition of more flexible hardware synth capabilities on top of sample playback is significant nevertheless, and there's the issue of designing custom hardware for PCM support given the lack of common off-the-shelf support for it. (And again, there was potential for several gradual steps in advancing PCM playback support; it's not black and white. Using a timer interrupt to drive the sample rate would be nice in any case, and is what's normally done for full software playback; from there you have a progression to a simple FIFO buffer circuit using a timer, or multiple timers if you had more than 1 hardware channel, then to DMA, adding hardware volume control (which might come before DMA), then more hardware channels along with stereo panning support (which might come earlier), and finally higher-resolution DAC output.)

What the Falcon added was pretty nice for 1992, and in '89 the STe's sound could have been OK with a bit more synth hardware to back it up (and more CPU resources especially), more so with a fully variable sample rate, but they really could have used something more earlier on for audio, as well as graphics and CPU. (Hardware scrolling, then packed pixels, then a blitter, then higher color depth, and 12 or 16 MHz CPUs offered in the late 80s.)


I don't know why Atari didn't just do a proper variable-rate PCM system; the fixed frequencies were not a good idea. Like the 4096-colour palette with the still-16-colour restriction and identical screen resolutions, the stereo sound was too much of a last-minute tack-on rush job. Atari couldn't really get AMY to work so it was never really used. AMY in the STE could have been interesting, but it was the 80s, it was all na-na-na-nineteen samples everywhere in the music of the time, and we all wanted real engine sounds and real gunshots and real electric guitars and drums in our game soundtracks :)

 

My comments about sound chips though were comparing the home computers available; almost every home computer before the Amiga had greater limitations. There are no real effects on Paula though. And my comment about Paula was that creating dual-Paula motherboards for the Amiga 4000/1200/CD32 would have cost peanuts and made a huge difference.

 

As for the C64 SID revision: they broke a very powerful sample-playback routine. Just listen to things like Arkanoid or BMX Kidz etc.: 3 channels AND a sample channel, so really it becomes a 4-channel system. And pumped through the children's portable TV via RF it sounded good enough for a budget machine (compared to the ST and Amiga).

 

But to go back to the original question: in the 80s the ST in the EU was massive for gaming. There were plenty of game titles the Amiga never saw at all, or saw very much later, but which seemed to always make it to the ST. Had the ST not come out in 1985, Magnetic Scrolls probably wouldn't exist at all, because nobody was going to be buying too many copies of The Pawn if £1250 Amiga 1000s were their only target market. I have to say though, there were a lot of crap games too in the early years, worse than some polished C64 games; I always had both. But when I got my Amiga the C64 gathered dust pretty soon.

 

The ST made 16-bit gaming a viable proposition; it was one of the first machines to see a huge release of games for a 16-bit system, so in many ways it was the future... a couple of years early. I suppose the A8 was an early 8-bit gaming push too.

 

Sure, the X68000 and Amiga and Megadrive were superior, but there we are talking 3-4 years after the first ST. The Macintosh did nothing for games with that pukey monochrome screen the size of a peanut, and the Amiga was more 1988/89 onwards really. Sales of the A500 were quite slow in '87... a software/hardware chicken-and-egg situation really.

 

Anyway, I like my STM, and it doesn't matter what parts make up the machine; overall it is a nice machine to own, and it was even nicer to own in early 1986 with a copy of Neochrome and a huge desk for the low-DPI mouse :)


I don't know why Atari didn't just do a proper variable-rate PCM system; the fixed frequencies were not a good idea. Like the 4096-colour palette with the still-16-colour restriction and identical screen resolutions, the stereo sound was too much of a last-minute tack-on rush job. Atari couldn't really get AMY to work so it was never really used. AMY in the STE could have been interesting, but it was the 80s, it was all na-na-na-nineteen samples everywhere in the music of the time, and we all wanted real engine sounds and real gunshots and real electric guitars and drums in our game soundtracks :)

That's the thing: I've read that in some places about AMY, but mixed info in others. It seems definitive that it wasn't going to be ready in time to launch with the ST, hence getting moved over to the XEM with its more flexible release date, but the fact it was moved onto the 8-bit line at all is rather odd IMO; slating it as an upgrade for the ST would have made far more sense. If they were going to enhance the audio of the 8-bits, dual POKEYs would have been a far more realistic option... let alone actually coming out with a cut-down 24-pin POKEY with just the audio+timer functionality and the pot/key/SIO lines removed. Even if the die itself wasn't changed, the packaging is a big chunk of the cost (without a shrunken die you couldn't drop to a narrow DIP, but a normal 24-pin DIP would be very significant). It would matter even more in the context of the 7800, either making an onboard POKEY more feasible or at least making it cheaper to add to the cart.

 

I'm sure Curt and Marty have more info on it, but I'm not so sure that getting AMY to work was the primary problem; that may have been confused with simply not getting it to work in time for the ST's initial release.

They apparently had the LSI version of the chip completed and available in small quantities for prototyping, and it was successfully demonstrated in a 65XEM at the January 1985 CES. (There's also conflicting information on whether the original designers stayed on, or continued under consultation, to complete the chip, especially the chip designer, who would have been the last active member, tasked with turning the TTL design into an LSI chip.)

 

I could perhaps see problems with interfacing it with the ST if it had been designed for a different bus, but it had been intended for 68k-based machines in the first place. (If worst came to worst they could add more GLUE logic.)

And one interesting benefit in the case of using an external DAC would be potential direct access to a bare 16-bit linear audio DAC. No DMA of course, but much nicer than hacking through the YM chip... that could have become useful later on for things AMY wasn't best suited for (in the event of actual DMA audio not appearing in a timely manner). It would also mean some nice flexibility for mixing 8-bit channels to a higher-than-8-bit output resolution, as well as software volume control without clipping the samples: 4 8-bit channels would mix to 10 bits, with 6 bits left to manage volume. Having a few separate DACs would be a lot more useful though, in reducing overhead for multi-channel playback as well as facilitating simple MOD players (vary the playback rate rather than having to resample and mix), but that's far less important with AMY in the picture in any case.
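The headroom arithmetic there works out neatly; a sketch (not tied to any actual AMY register layout):

```python
def mix_to_16bit(channels, master_volume):
    """Mix four unsigned 8-bit channels into one 16-bit output word.

    Four 8-bit samples sum to at most 4 * 255 = 1020, which fits in 10
    bits, so a 16-bit DAC leaves room for a 6-bit master volume (0..63)
    that scales the mix without ever clipping it.
    """
    assert len(channels) == 4 and all(0 <= c <= 255 for c in channels)
    assert 0 <= master_volume <= 63       # the 6 bits of headroom
    return sum(channels) * master_volume  # max 1020 * 63 = 64260 < 65536

full_scale = mix_to_16bit([255, 255, 255, 255], 63)  # 64260: no clipping
```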

 

My comments about sound chips though were comparing the home computers available; almost every home computer before the Amiga had greater limitations. There are no real effects on Paula though. And my comment about Paula was that creating dual-Paula motherboards for the Amiga 4000/1200/CD32 would have cost peanuts and made a huge difference.

Yes, that would have been a quick and dirty option, but there's no reason they should have had to do that rather than doing a PAULA II with an on-chip upgrade, in the simplest (and most foolproof) sense just doubling the audio portion of the chip and shrinking the die. Short of that, they could keep the standard PAULA and use a copy of it (preferably die-shrunk and with the audio portion only, but the same die in the worst case) in a smaller package with only the audio lines.

Putting in an entire second PAULA is unnecessarily wasteful, especially in the timeline you suggest. (Maybe as an interim thing when first introduced on a higher-end model Amiga like the A3000... though you could argue they should have done it back with the A2000, though what that could have really used was a double-speed CPU and maybe an FPU socket.)

 

As for the C64 SID revision: they broke a very powerful sample-playback routine. Just listen to things like Arkanoid or BMX Kidz etc.: 3 channels AND a sample channel, so really it becomes a 4-channel system. And pumped through the children's portable TV via RF it sounded good enough for a budget machine (compared to the ST and Amiga).

Yes, they broke the bias bug that allowed ~4-bit PCM playback... not sure whether that was an intentional bug fix or integral to the redesign, but it was a notable oversight in general given how often that exploit was used. (It was neither the first nor the last time that would happen: newer revisions of hardware fixing bugs that programmers had taken advantage of.) And it didn't remove it entirely, but made it far quieter.

It's just 4-bit linear playback though, just like POKEY can do without a hack, and just like the SID could have done using one of the 3 square wave channels by modulating its volume (but yes, you lose a channel). That's much nicer than the PWM crap the Apple II or Speccy did (or the PC for that matter, via the PC speaker), and generally nicer than the less optimized routines used to play through the AY/YM/SN PSGs. Performance there varied a lot: worst case is a low playback rate with poorly converted samples using only 1 channel; best case is higher rates with all 3 channels modulated together. You can get a decent approximation of 8-bit PCM that way, or even a bit more, but the higher res you go the more granular it gets, and the ST normally used samples converted from 8-bit sources.
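A sketch of how the 3-channels-together trick works (the ~3 dB/step volume curve is an assumed approximation, not measured from a real PSG):

```python
def build_psg_table(levels=256):
    """Map 8-bit sample values to (ch_a, ch_b, ch_c) 4-bit PSG volumes.

    PSG volume steps are roughly logarithmic (assumed here: ~3 dB/step,
    i.e. the output halves every two steps -- a common approximation).
    Summing the three channels' nonlinear outputs gives far finer
    amplitude resolution than any single 4-bit channel could.
    """
    # Approximate linear output level for each 4-bit volume setting
    level = [0.0] + [2.0 ** ((v - 15) / 2.0) for v in range(1, 16)]
    combos = [(level[a] + level[b] + level[c], (a, b, c))
              for a in range(16) for b in range(16) for c in range(16)]
    max_out = max(s for s, _ in combos)
    table = []
    for s in range(levels):
        target = s / (levels - 1) * max_out
        # pick the volume triple whose summed output is nearest the target
        table.append(min(combos, key=lambda sc: abs(sc[0] - target))[1])
    return table
```

A replay routine then just looks each sample up in the table and writes three volume registers per tick, which is why higher rates cost so much CPU.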

 

POKEY had some real potential for doing nice PCM playback; a real shame that wasn't pushed more. (3 hardware timers could be tied to the sample rate of 3 of the channels, though software-timed loops could be more efficient in some cases.) For in-game stuff they probably wouldn't ever do more than a single channel (probably 4 kHz), maybe variable rate for some pitch control (be it for SFX or music), for probably ~17% CPU time (assuming video DMA is enabled, knocking the CPU to an effective ~1.2 MHz); of course the RAM limit would constrain that as well. Beyond in-game use you could have some nice title demos with mixed chip/sample sound to varying extents, like some C64 and much ST stuff. (Given you had timers for 3 channels, perhaps a primitive 3-channel MOD player with no volume, and the 4th channel using chip sounds.)
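The ~17% figure checks out with back-of-the-envelope arithmetic (the cycles-per-sample handler cost is an assumption for illustration):

```python
def pcm_cpu_share(effective_cpu_hz, sample_rate_hz, cycles_per_sample):
    """Fraction of CPU time eaten by software-driven PCM playback.

    Each output sample costs a roughly fixed handler overhead: fetch the
    byte, shift/mask to 4 bits, write the volume register, bookkeeping.
    The 50-cycle figure below is an assumption for illustration.
    """
    return sample_rate_hz * cycles_per_sample / effective_cpu_hz

# ~1.2 MHz effective 6502 (video DMA on), 4 kHz playback, ~50-cycle handler
share = pcm_cpu_share(1_200_000, 4_000, 50)  # ~0.167, i.e. about 17%
```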

 

The ST made 16bit gaming a viable proposition, it was one of the first machines that saw a huge release of games for a 16bit system so in many ways it was the future...a couple of years early. I suppose the A8 was also early 8bit gaming push too.

The A8 wasn't so much early as marketed a bit off (in different ways in Europe and the US), with some other related and external problems, but that's already been discussed enough. ;)

 

The ST wasn't so much early either... but it was a bit lacking in some key features that would have allowed far more longevity, or lacked timely upgrades to the architecture. (The specifics of which have been discussed thoroughly already: hardware scrolling, packed pixels, higher color depth, faster CPU, audio upgrades, etc.)

 

Sure X68000 and Amiga and Megadrive were superior but here we are talking 3-4 years after the first ST. Macintosh did nothing for games with that pukey monochrome screen the size of a peanut, and Amiga was more 1988/89 onwards really. Sales of A500 were quite slow in 87...software hardware chicken and egg situation really.

And the X68000 was higher-end than the Amiga and 2 years newer. (And I didn't think it had much of a presence outside of Japan, like the later FM Towns; both were fairly niche gaming machines, more so than the MSX, with the PC-8801 and PC-9801 dominating the home computer markets up to the early 90s.)

 

I think the Amiga vs ST vs PC popularity timeline varied a lot, and I've seen mixed accounts in general, both in terms of overall market share and in terms of specific sales/popularity for gaming. Neither was anything like as big in the US as in Europe though, and console gaming was bigger there, somewhat more like Japan relative to computer gaming at the time. And like Japan to some extent, you had machines dominating the market that were not particularly gaming oriented: aside from the ST's burst of popularity in the mid 80s, the PC was a massive seller, especially after clones arrived, while in Japan you had the PC-8801 and PC-9801, neither especially good in cost/performance for games (MSX was the best case for that in the early/mid 80s). The PC-8801 did have some nice audio hardware on some models and a high resolution, but bitplane graphics, too high a resolution, and only 8 RGB colors (later indexed, with some more flexibility), so it was generally fairly limited, with games tending to use tons of dithering to make the most of the 640x200 8-color display... lower-res modes almost certainly would have been more popular had they existed. It became a strong game platform because of market saturation in general, sort of like how the PC expanded (granted, that was more significant after cheaper EGA-based systems started appearing, more so with sound cards getting inexpensive, then VGA, faster/cheaper 3rd-party CPUs, and then off-the-shelf parts allowing cheap homebrew/custom-built systems, plus used parts stores). It took a fair bit longer in Europe, with the ST and Amiga taking a lot more share, and for a time there was some real potential for a perpetuated standard.

Edited by kool kitty89

The Sharp X68000 is a fantastic machine, but also quite restricted in what kinds of arcade games it can perfectly recreate. It is an interesting comparison, though, for both the original ST (not STE) and the Amiga.

 

The ST had nothing but an 8MHz CPU and a basic sound chip from 8-bit computers, inferior to the A8 and C64.

The Amiga had custom chip functions; it did OK but not as amazingly as arcade machines from the mid 80s.

X68000 was pretty much a direct copy of what was on 2D arcade game motherboards from people like Konami/Capcom.

 

What is interesting though is that Sharp also improved the CPU model and speed but not the custom hardware, while the Amiga rarely improved CPU speed/model (the A3000, and then in 92/93 with AGA); the CDTV/A1000/A500/A600/A1500/A2000 ALL had the same 7MHz 68000 design. Atari upgraded the CPU speed just once, in the top-end Mega STE, and the TT was totally different.

 

And just to add it into the mix, the PC got yearly CPU improvements from the mid 80s onwards (the time of the ST/Amiga), but action/arcade games with VGA 256-color graphics as standard didn't happen until just before the Amiga 1200/4000, so a 1991 sort of time frame.

 

Thing is, Atari were in the perfect position in 1986 to decide which route to go: improve the chipset and fracture the userbase into requiring ST- and STE-specific programmed versions of games, or leave the chipset alone and completely redesign the system bus to give a full 100% performance increase over the original ST (i.e. more than a 16MHz CPU GLUE'd to an 8MHz bus from the 1985 original, as is the case with ALL CPU accelerators for the ST).

 

As for the X68000, it's rubbish at games like Lotus Turbo Challenge II; just have a good old laugh at the graphics in Chase HQ ;) They forgot to put in a blitter, so all it has are 128 16x16-pixel sprites and nice full-colour parallax scrolling in hardware. So you were stuck just as the SNES was stuck, and could not produce a single game as silky smooth as Lotus II running on a 1985 Amiga 1000!! Of course, even an Amiga 1200 from 1992 cannot replicate SNES 2D games like SF2 or any platformer that looks less childish than Super Mario etc. (plenty of them, like Mickey Mouse: The Magical Quest with its exquisite 256-colour parallax levels).

 

Oh, and one last thing for people who oppose my 16MHz 68000 vs STE enhancements debate. IF Atari had gone down the 'forget the STE, go with a pure 16MHz 68000/68020 design' route, the resulting machine would have had benefits for ALL users. Running most old games would show improvements, BUT ALSO serious software used by businessmen would improve without a rewrite, and so one redesign could have given rise to two totally differently packaged machines (i.e. a Mega-style case with more memory etc., and a straight replacement of the STFM for just £100 more). Strange, but Atari NOT putting in extra stuff may have saved Atari! By '88 there were probably 500+ games released for the ST; most of these benefit in some way from being run at 16MHz, NONE of them benefit from being run on an STE ;)

 

The STE didn't sell because there were no STE games worth a mention, and exclusive STE games were never programmed because the STE wasn't selling. So there you go, chicken and egg. But a 16MHz ST = Starglider II/Gauntlet/Lotus Challenge runs better ;)


The ST had nothing but an 8MHz CPU and a basic sound chip from 8-bit computers, inferior to the A8 and C64.

The Amiga had custom chip functions; it did OK but not as amazingly as arcade machines from the mid 80s.

X68000 was pretty much a direct copy of what was on 2D arcade game motherboards from people like Konami/Capcom.

The X68000 had the same design philosophy as standard arcade hardware and home consoles from the 80s and early 90s, except it lacked the hardware sprite zooming common even to many lower-end late-80s boards (System 16 and the Neo Geo, for example; not high-end scaling like Sega's scaler boards, mind you). But with the RAM available you could mitigate that with animation far better than on the SNES or MD (less so once you had 2-4 MB carts). You did have packed pixels, a nice array of color depths, and a CPU powerful enough to manage some decent software rendering, better than the Amiga, STe, or MD. The vanilla ST is screwed by planar bitmaps; the STe and Amiga mitigate that somewhat with their blitters, but not for all things, so a plain chunky display could be superior, let alone with 8bpp modes. The MD is 4-bit chunky, but you have the overhead of converting to the tilemap format if you want to render to the BG, or of managing the same for sprites if you want the output to be bigger than 32x32 (basically you're generating animation on the fly); the MCD did all that in hardware with a powerful blitter capable of affine rendering: only scaled and rotated rectangles in hardware, but set a line at a time it would texture map in 3D as well (i.e. filling a line of a polygon texture rather than just a solid color).

 

I wouldn't say the YM2149 is universally inferior to POKEY: POKEY has 4 channels and some other nice features, but a limited frequency range without paired channels, while the YM has a 12-bit range, plus the hardware ADSR envelope, and can mix noise into any of the channels. You could also hack it to do better than the 4-6-bit PCM POKEY can manage, though results vary. (And you could certainly argue that 4 4-bit linear DACs allow a lot more flexibility with PCM, and especially with software MOD players: no mixing, pitch controlled by playback rate with no resampling, plus 3 timers directly tied to playback. Using the 3 YM channels separately tends to sound pretty nasty.) But aside from software tricks, the trade-offs are much more straightforward. (POKEY can't do anything but square waves and noise, albeit some interesting variable-pulse-related noise along with white/random noise, without more CPU intervention.)

 

What is interesting though is that Sharp also improved the CPU model and speed but not the custom hardware, while the Amiga rarely improved CPU speed/model (the A3000, and then in 92/93 with AGA); the CDTV/A1000/A500/A600/A1500/A2000 ALL had the same 7 MHz 68000 design. Atari upgraded the CPU speed just once, in the top-end Mega STE, and the TT was totally different.

If you look at the dates though, the 16 MHz version wasn't released until 1991 (ie STe) and the 68030 versions not until after the Falcon. (aside from high-end machines like the A3000/4000 and TT) But they were likely still selling the older models as well and there were probably upgrades possible.

The X68000 was a 1987 machine originally though so the upgrades technically came sooner as such (1987 to 1991 vs 1985 to 1990) and the base model always had a 10 MHz CPU vs 7.09/7.16/8 MHz.

 

And just to add it into the mix, the PC got yearly CPU improvements from the mid 80s onwards (the time of the ST/Amiga), but action/arcade games with VGA 256-color graphics as standard didn't happen until just before the Amiga 1200/4000, so a 1991 sort of time frame.

You had it starting ~1989 and becoming more and more mainstream into 1991. Do remember that VGA helped performance all around, so a slower machine could render faster than in EGA by a good margin, not only due to the packed pixels but (critically) due to the hardware V/H scrolling added, for scrolling games of course. So you could probably get notably better performance from an 8 MHz 8088 machine with VGA than an ST running a scrolling-heavy game with comparable optimization on both machines (perhaps even with a 7 or 4.77 MHz 8088 in some cases). You even had the odd case where VGA was being supported before any sound cards (like the DOS version of Stormlord with only PC Speaker sound).
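That hardware scrolling is VGA's CRTC display-start address plus pel panning. A rough sketch of just the arithmetic for an unchained ("Mode X"-style) 256-color screen, with the actual CRTC/attribute-controller register I/O omitted:

```c
/* In unchained 256-color VGA the CRTC start address counts in
   4-pixel units (one byte per plane group), so scrolling a virtual
   screen to (x, y) needs:
       start = y * pitch_bytes + x / 4
   with the remaining x & 3 pixels handled by the pel-panning
   register.  No pixel data is copied at all, which is the whole win
   over redrawing the screen. */
static unsigned vga_start_address(int x, int y, int pitch_bytes)
{
    return (unsigned)y * (unsigned)pitch_bytes + (unsigned)x / 4u;
}

static unsigned vga_pel_pan(int x)
{
    return (unsigned)x & 3u;  /* fine horizontal pan, 0..3 pixels */
}
```

The ST, with no start-address or fine-scroll registers, has to move every visible byte each frame instead.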

 

Thing is, Atari were in the perfect position in 1986 to decide which route to go: improve the chipset and fracture the userbase into ST- and STE-specific programmed versions of games being required, or do nothing and just completely redesign the system bus to give you a full 100% performance increase over the original ST (i.e. more than a 16 MHz CPU GLUE'd to an 8 MHz bus from the 1985 original, as is the case with ALL CPU accelerators for the ST).

They should have done both, and as soon as possible, to limit fracturing the userbase as much as possible. (EGA took a long time to become common/standard for PC games, not until ~1987, but VGA followed that very quickly with some games appearing almost immediately and many by '89, with only really low-end/shareware games being EGA-only by 1991; many, many games supported both though, and the better developers also supported Tandy/PCjr modes through the late 80s and in a few cases even into the early 1990s.) They needed hardware scrolling, not a full blitter, and should have pushed for it as soon as possible with a SHIFTER upgrade; faster CPUs, packed pixels, and 256-color modes would probably be more important than a full blitter as well. (I.e. follow the PC, not the Amiga ;) -a lot of US developers could favor the ST even more for PC ports, including exports to Europe.)

And increasing RAM or CPU speed would split the market as well with games that couldn't run on lower-end machines or (for CPUs) games assuming more CPU grunt and running poor on slow machines.

 

However, that's with upgrades via ISA bridging the gap, but I wouldn't write the ST off so quickly in that regard either. Even without a general purpose parallel expansion port (which is what the cart slot SHOULD have been, more like the PBI or the CoCo, C64, or VIC's cart slot, with an external expansion caddy available as well and such slots built into the MEGA models, which should have been out by '86), there was still the possibility of offering semi-hack upgrades via service centers: it would not be unreasonable at all to have a drop-in BLiTTER or SHIFTER replacement or even a sound chip upgrade. No, the chips weren't socketed, but that wouldn't necessarily have helped anyway. You had piggyback upgrades for the A500 using the CPU socket, but with a soldered CPU you could still do that in reverse with a clip-on interface sitting on top of the CPU (either plugged in or soldered like the RAM upgrades), which is exactly what Cyrix offered for surface-mounted 386SX upgrades. That's almost certainly how the BLiTTER would have been interfaced if they offered it as an upgrade, but again I don't even think the BLiTTER should have been that huge of a priority. You could do the same thing with the SHIFTER with a plug-in piggyback replacement maybe needing a couple of jumpers (still a pretty quick and clean upgrade for a service center to handle, far cleaner than piggybacking RAM, which I'm not even sure was offered officially), and the same for the sound chip, other than likely requiring a bit more remapping if it was to clip onto the YM2149 (the YM2203 would probably work well in that sense, embedded in a small PCB with the DAC onboard and a connector to latch onto the YM2149). In all cases it might require some other tweaks to avoid conflicts with the original hardware (cutting some pins perhaps, or soldering jumpers, but nothing huge).

Even an unrelated Sound chip (PCM based or otherwise) could be hacked in that way relatively reasonably. (in all cases ideally with install kit form factors with small riser boards and minimal or no soldering necessary)

 

Of course you'd need updates to TOS to make use of that in the OS. (self booting games bypassing the OS wouldn't matter -and you could have games offering enhanced and non enhanced modes as with PC games)

 

And once Atari realized their mistake of omitting expandability, they could at least fix that on later models (even the MEGA was weak in that regard), and the set-up used with the ECI on the XE line could have made sense. (Keep the original cart port but add a companion slot to expand it to a proper expansion interface with the full address range, provision for RAM, DMA, IRQ, etc, etc; an added 32 pins could probably manage that fine, and you'd still end up with a significantly smaller connector than 16-bit ISA.) Again, have multiple slots inside the MEGA models (arranged to facilitate fitting cards inside the compact low-profile case, or simply intending cards to keep to a low-profile form factor). The sooner they addressed that (and released the MEGA), the sooner that would simplify things all around. (Including RAM upgrades; RAM and coprocessors would be ideal using that system, but video and audio would be a bit iffy as they'd need external mixing lines, unless you added audio in/out lines on the expansion ports.)

 

As for the X68000, it's rubbish at games like Lotus Turbo Challenge II, just have a good old laugh at the graphics in Chase HQ ;) They forgot to put in a blitter, so all it has are 128 16x16 pixel sprites and nice full colour parallax scrolling in hardware. So you were stuck just as the SNES was stuck and could not produce a single game as silky smooth as Lotus II running on a 1985 Amiga 1000!! Of course even an Amiga 1200 from 1992 can not replicate SNES 2D games like SF2 or any platformer that looks less childish than Super Mario etc (plenty of them, like Mickey Mouse: The Magical Quest with exquisite 256 colour parallax levels).

I fail to see how a blitter would matter at all, if not be inferior due to the color limitations of a bitmap display (especially if you used 4 bits or less per plane) compared to the powerful sprite abilities of the X68000. There are enough sprites possible that any decent engine would handle such games with no flicker better than the SNES or MD ever could (with or without software scaling), and with all that RAM and mass storage you'd likely need to rely on soft scaling far less often anyway (a few games like the Road Rash series or Skitchin did that on the MD; Domark's Superbike as well, along with a few polys).

The amiga's blitter is only good for copying, filling and moving around rectangles, it doesn't help with scaling in a general sense though it helps a good deal over a CPU alone stuck with a planar display (packed pixel display is another story), and given the RAM based nature of the system you'd almost always rely on animation rather than scaling on the fly. (maybe some on the fly scaling buffering of animation as needed -more likely how much of the soft scaling on the MD is done, not on a frame by frame basis, but buffered depending on the objects needed on-screen -but much more flexible with a good bit of RAM to work with) A lot of different options for software scaling though.

And then there's the road/BG but that's generally all using line scrolling, or in some more advanced cases, doing line scaling as well. (1 dimensional horizontal scaling of each scanline to give more perspective -or for warping effects seen in a few 16-bit games like Axelay or the Sonic 3D Blast special stages on the MD -tactful use of that can even fake a mode-7 like texture layer without actually doing any form of texture rendering, like Panorama Cotton does for the ground in most cases and a few others do as well -along with tons of "scaling" animation for sprites)
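The per-line perspective trick described above boils down to sampling the flat road at a depth proportional to 1/(y − horizon) for each scanline, then scaling/shifting that line accordingly. A minimal sketch of the projection (the camera-height constant is invented for the example):

```c
/* Pseudo-3D road projection: for each screen line y below the
   horizon, the depth of the road slice shown on that line is
       z = cam_height / (y - horizon_y)
   so lines near the horizon show distant road (large z, heavily
   shrunk) and lines near the bottom show close road.  Per-line
   texture offset and horizontal scale both derive from z. */
static double road_depth(int y, int horizon_y, double cam_height)
{
    /* caller guarantees y > horizon_y */
    return cam_height / (double)(y - horizon_y);
}
```

Each doubling of the distance below the horizon halves the depth, which is why the road's texture bands bunch up toward the horizon.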

 

Oh and one last thing to people who oppose my 16mhz 68000 vs STE enhancements debate. IF Atari had gone down the 'forget STE go with a pure 16mhz 68000/68020 design' the resulting machine would have had benefits for ALL users. Running most old games would show improvements BUT ALSO serious software used by business men would improve without a rewrite and so one re-design could have given rise to two totally different packaged machines (ie Mega style case with more memory etc and straight replacement of STFM for just £100 more). Strange but Atari not putting in extra stuff may have saved Atari! By 88 there were probably 500+ games for the ST released, most of these benefit in some way being run at 16mhz, NONE of these benefit from being run on an STE ;)

Quite true, but one thing to also consider is that a CPU upgrade would be among the toughest hacks to manage, as the system speed would have to be changed as well (even if you had an upgrade board that included a 16 MHz oscillator or a simple 2x multiplier circuit, you'd have an asynchronous overclocked CPU not mated to the bus speed, potentially screwing up interfacing with the GLUE and general I/O operations, plus DMA management issues). So as a standard feature of new machines, fine, but not feasible as an upgrade the way other things could have been (had Atari actually wanted to allow users a smoother path to upgrade and not force them to buy new machines, which may or may not have been the case).

 

It didn't matter if you split the userbase as long as people bought the new systems and developers supported them, but upgrades would have a far more realistic chance of doing that. (RAM splits the userbase too, and just as bad as the STe if Atari didn't offer RAM upgrades -you could void the warranty and piggyback, but that's different unless Atari service centers offered it) But the sooner the upgrades were standardized (if not on base units at least on higher end machines), the better chance you had for proper progression in software support. (you had the 1040ST pretty early on and RAM upgrades for the Amiga, but even then you had too many catering to the lowest common denominator for too long -though I suppose it wasn't horrible compared to PC games catering to 512-640k or less even in 1991/92 in some cases -Wolf3D was 512k minimum).

Plus a faster CPU might have been more expensive than using custom chips. (almost certainly for component costs, but not R&D and not PCB costs in the case of the BLiTTER, but all around in the case of SHIFTER or sound enhancements as those would take up very little added board space or none at all)

Investing a bit of R&D would be very smart alongside measured use of more off the shelf parts. (again, SHiFTER upgrades with scrolling, packed pixel, higher depth, etc were more important and likely simpler -in some cases definitely- than doing a BLiTTER, a PCM chip would almost certainly be a custom in-house thing, but the YM2203 should have been a very real possibility)

 

STE didn't sell because there were no STE games worth a mention, exclusive STE games were never programmed because STE wasn't selling. So there you go, chicken and egg. But 16mhz ST = Star Glider II/Gauntlet/Lotus Challenge runs better ;)

And it took far too long... if the STe came out in '86 or '87 they'd have had far longer for developers to start shifting and doing cross-supported stuff (same for RAM, SHIFTER, audio, or Blitter alone). Plus a faster CPU would conflict with any timing-sensitive games and thus have to be switched to 8 MHz for all such cases anyway. (So you'd have to get lucky, have patches, or have developers take variable speed into account; you could also patch games for the Blitter, but that wouldn't be as simple for the most part, unless you had some super timing-sensitive tricks that would need to be rewritten entirely for a 16 MHz CPU to take advantage of them.)

 

But to an extent, a CPU upgrade is more foolproof, as it will at least help with some (perhaps most) existing software to some degree, while other upgrades won't.

 

But other than that it's a matter of getting developers to support it in general as well as people to buy it... but the latter would be solved by discontinuing the baseline models (be it making 1MB standard, or Blitter, DMA audio, YM2203, SHIFTER II, etc), but still highly dependent on timing (1989 was too late in any case). Adding V/H scroll to the SHIFTER and the YM2203 should have been very reasonable for 1986 and perhaps fully standardized on all 520 STs and higher by '87 (and you'd want to follow that with another SHIFTER upgrade to bump to a larger palette with a 8-bit packed pixel mode and probably expanded planar modes to match VGA). Faster CPUs are something they could/should have done from the start, offering high-end 12/16 MHz models (or other speeds, but 12 and 16 are easier to pair with 8 MHz from a common clock and integer multiplication/division -though a 48 MHz clock would divide nicely to 9.6 MHz as well as 8, 12, 16, and 24 MHz), and thus setting a precedent from the start and something for developers to cater to.
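The clock arithmetic above is easy to check: a single 48 MHz master crystal divides evenly into every CPU speed mentioned, which is what makes those pairings attractive.

```c
/* Does a master clock divide evenly into a target CPU clock?
   Values in kHz to keep the math in integers. */
static int divides_evenly(int master_khz, int target_khz)
{
    return master_khz % target_khz == 0;
}
```

48 MHz yields 8, 9.6, 12, 16, and 24 MHz by integer division (6, 5, 4, 3, and 2), so one crystal serves every proposed speed grade.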

As it was they probably could have introduced the BLiTTER on the low-end STs at least a year earlier than they did, if not push for release in '87 as soon as available in quantity, but not nearly as good as upgrading the SHIFTER in '86 and having it on all new STs by '87. (same for the Sound chip)

 

PCs had a gradually expanding standard to work with and some things fell out of favor (some 3rd party sound cards for example appearing just after Adlib and not lasting long), but the defacto standards almost always getting backwards compatibility, at least for a long time. (CGA and EGA emulation eventually ran into problems, some fairly early on with SVGA cards in the early 90s, sound blaster compatibility was common to include on most sound cards into the late 90s at least -with varying results and some exceptions)

 

 

But as to video, yes the industry standard (due to PCs) was modest hardware acceleration and CPU grunt, with hardware acceleration becoming common for GUIs (Windows accelerators) by the early 90s, but not really for games until the Win9x era, and 3D pushed acceleration into games more than 2D ever did, so it wasn't really a major issue until 1995/96. (The Falcon had some significant potential for 3D acceleration using the DSP though, let alone the potential to work the Jaguar chipset, or parts of it, into the ST line had it persisted; JERRY would be the main thing to include... actually, if they had decent VGA-like capabilities by the late 80s, something like the TT or slightly less, and definitely 320x200 or 320x240 modes in addition to 320x480, they could have done well with that alone until the Jaguar chipset.) Actually, if Atari had dropped the blitter entirely and just pushed CPU, audio, and SHIFTER upgrades, there were other possibilities: Atari had a partnership with Flare on the Jaguar (bringing Martin Brennan onboard for the Panther in '89), and given the relationship it wouldn't be unreasonable to exploit their Flare 1/Multisystem chipset in general (which Konix never bought the IP to), so you had a nice blitter, a fairly useful DSP-like chip, and a very fast multiplication coprocessor, but it needed a bit of tweaking to work at ST speeds (it was oriented around 6/12 MHz rather than 8/16 MHz, and there were perhaps some issues with interfacing with the 68k rather than the little-endian Z80/8088 it was intended for; modifications to the GLUE might have solved that). Probably drop the video generation block and design a new ASIC incorporating the Flare chips with the SHIFTER (which should have had hardware scrolling and 256-color packed pixel support by then), something done in 1989/90 rather than 1992 with the Falcon.
;) Then again, with packed pixels and the 8 MHz ST blitter able to work with them, the Flare hardware would be less attractive in general (licensing/royalty costs tied to it), though maybe they'd have still taken some interest in the DSP or ALU coprocessor. (I think with an 8 MHz ST type blitter working with packed pixels and allowed fast page access to RAM, that should outperform the multisystem blitter and more so the Amiga blitter, I think -unless it's superior in some other ways the Slipstream Blitter's main advantages would be using a packed pixel display and thus working with word wise writes and fast page accesses -both of the others lack features of the Amiga like masking and such)

 

And as it was, Atari screwed up by not pushing a low-end machine with the TT video hardware as soon as possible. (say something like the MEGA STE with the TT SHIFTER in 1990) Actually, that would have been far more useful than the TT itself since it was too high-end for the mainstream market and too weak for the workstation market. (perhaps even have a lower-end console form factor STe-like derivative using that and a MEGA machine adding the FPU -both should have been 16 MHz by that point though) Assuming the TT SHIFTER worked with packed pixels for 256 colors (not sure, planar like AGA would suck) that could potentially accelerate the BLiTTER as well as it wouldn't have to deal with a bit at a time but a word at a time. (assuming it wasn't totally fixed purpose for planar graphics only -I doubt that given it was useful for driving the Falcon graphics too) Let alone if they doubled the blitter's clock speed. (or more useful, buffering to allow 64-bit reads/writes given the TT SHIFTER worked at 64-bits, that would be huge even with no clock speed change)


You need a blitter to do sprite 'scaling' as in Lotus II on Amiga. Not sure how Outrun is done on Megadrive because that has no blitter either but far larger hardware sprites than the 16x16 in X68000. Play Chase HQ on x68000 it's worse than the Amstrad CPC version. Lotus II pisses all over any other sprite scaling game on any 68000 based machine end of story :)

 

As for STE vs ST, if you fraction your userbase into two machines that require massively different source code you are screwed, either STE in 85 or nothing. Look how crap all the ST ports like Robocop look on Amiga when it could have looked identical to the arcade, same thing would have happened with STE even if they tried to support it. UK software houses were in it for the money not the best possible conversions ;)

 

And 16mhz ST accelerator boards were around in 1987, I almost bought one at an Atari show! So the fact it took Atari 6 years to use one is nothing but incompetence on their part not Motorola's product catalogue really ;)

 

(Commodore also never used one either, more fool them too. A500 and A2000 needed 14.18mhz 68000 IMO to actually improve on 2 year old A1000 chipset which was identical in every way)

 

And the Falcon was locked into a 16bit bus = FAIL

 

To be honest they probably should have left the 520ST and 520ST+ as is, but held back on the £399 520STM and STFM until they had either more grunt for better games via 16mhz CPU or STE hardware. The ST was fine for business as it was, but for a home machine they could have waited and got it right so there was a better base standard. Maybe even just a 10mhz 68010 clocked to 12mhz would have been enough....but then TOS crashes on a 68010 right?

 

So to sum up, it took a long time for a true Amiga game to be developed and released; the first true Amiga game to make full use of all the chipset was probably Shadow of the Beast. Think how many STEs would need to be sold before a software house put that sort of effort into releasing a true STE game? The issue is always that software houses won't waste resources doing STE-specific games before the sales of the STE hit critical mass. And if you owned an STE you should have pirated EVERYTHING that didn't use the blitter and DMA sound properly rather than mildly breathed over ST ports like us Amiga users had to endure. I have written better game engines in BASIC on Amiga than most commercial releases between 1985-1990, including arcade conversions of the time!

Edited by oky2000

It wasn't that difficult to make an STe version of a game, but it also really wasn't worth it.

 

The ST should have had the scrolling ( forget sound, as that would have been a much bigger redesign - and the win wouldn't have been that much ) from day 1.

Then when the TT was announced, the palette should have gone to 6 bits per colour channel (262144 colours) and the STe should have been a 16 MHz 68k with TT video modes (and VGA monitor support).

That way there would be a high end machine and a 'new' base games/home platform.


You need a blitter to do sprite 'scaling' as in Lotus II on Amiga. Not sure how Outrun is done on Megadrive because that has no blitter either but far larger hardware sprites than the 16x16 in X68000. Play Chase HQ on x68000 it's worse than the Amstrad CPC version. Lotus II pisses all over any other sprite scaling game on any 68000 based machine end of story :)

Nope, the Amiga blitter isn't useful for scaling at all, you'd have to do it with CPU grunt just like a VGA PC... the blitter itself would help due to the planar graphics though (ST would be worse off... VGA PC wouldn't). It's not like the blitter in the Sega CD which does actual affine texture rendering. (only rasterizing to flat or rotated rectangles though, needing CPU resource to do actual warping by setting each line separately)

 

But none of those systems commonly used actual scaling other than in title demos or such (the PC started to in the late 80s and more so in the early 90s); the usual method was simply using animation to jump to the next size while also moving the sprites (or blitter objects) in intermediate frames, as with F-Zero, Mario Kart, Top Gear, Top Gear 2, etc. The same way Out Run was done on the SMS and MD (or C64), or Rad Racer on the NES. The only difference between the ST/Amiga/MD/SNES/PCE and the older consoles is that you had a lot more memory for animation (RAM or ROM), more and larger sprites (or blitter objects), and higher color. (Except the SMS, which used the same 4-bit pixel depth as the ST, NES, PCE, and MD, but with only 2 subpalettes, only one usable by sprites but both by the BG, and only 6-bit RGB for the master palette, while the MD had 4 palettes and 9-bit RGB with all 4 usable for both sprites and BG; the SNES had 8 for sprites and 8 for BG, and the PCE had a whopping 16+16 palettes, though the ST only had 1 single 16-color palette, and the Amiga using dual playfield mode only had 1 8-color BG, 1 7-color BG, and either sprites sharing 15 more colors if "attached" or using 4 sets of 3 colors iirc.)
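That animation-instead-of-scaling approach amounts to keeping a ladder of pre-drawn sizes and, each frame, picking the one closest to the projected size. A sketch (the frame widths are invented for the example):

```c
/* Pick the pre-drawn animation frame whose width is nearest to the
   projected on-screen width.  This is the "jump to the next size"
   technique: no pixels are resampled at runtime, the artist drew
   every size in advance. */
static int pick_frame(const int *frame_widths, int n_frames, int target_w)
{
    int best = 0;
    for (int i = 1; i < n_frames; i++) {
        int d  = frame_widths[i]    - target_w; if (d  < 0) d  = -d;
        int bd = frame_widths[best] - target_w; if (bd < 0) bd = -bd;
        if (d < bd)
            best = i;  /* strictly closer wins; ties keep the smaller frame */
    }
    return best;
}
```

The trade-off is pure memory for CPU time, which is exactly why the machines with more RAM/ROM for animation pulled it off better.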

 

The X68000 may have been limited to 16x16 sprites, but that's no limit at all as you can simply build them up into larger objects... the MD only went up to 32x32 but many games used multiple sprites to go beyond that. (you obviously wouldn't have to use multiple 32x32 sprites if you didn't need 64 wide or tall, but 32+24, 24+24, 32+16, 32+8, 16+24, etc -and ideally, optimize color based on that as well as each sprite could use a different palette of the 4 15 color indexes available) You had 128 sprites on screen (ie hardware multiplexed) with 32 per scanline (vs 16 or 20 max for the MD in 256/320 wide modes -or 256/320 pixels across, whichever limit you hit first), though I think the SNES was a bit closer to the X68000 in sprites/pixels per scanline and the same for total on-screen and unlike the X68000 you could use larger hardware sprites (same pixel per scanline limit though, but you'd hit the 128 sprite limit later) though you can only have 2 sizes per frame vs the MD and PCE which can use any selection of sizes on a per-sprite basis (the MD is by far the most flexible with any combination of 8, 16, 24, or 32 wide/tall vs 16/32/64 tall and 16/32 wide for the PCE or 16/32/64 in either direction on the SNES -but only 2 sizes per screen).
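Building big objects from fixed 16x16 cells is just tiling, so the sprite budget for a w-by-h object is a ceiling division in each axis:

```c
/* Hardware sprites needed to tile a w-by-h pixel object out of
   16x16 cells, as on the X68000 (which offers no larger sprite
   size).  (w + 15) / 16 is integer ceiling division by 16. */
static int sprites_needed(int w, int h)
{
    return ((w + 15) / 16) * ((h + 15) / 16);
}
```

Even a large 64x64 object costs only 16 of the 128 available sprites, so the 16x16 cell size is rarely the bottleneck; the 32-per-scanline limit is what you actually have to budget against.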

Regardless, the X68000 easily has enough grunt to push such games with its sprite engine given you'd never need 128 on-screen and generally not need even close to 32 per line.

 

Powerful sprite engines can handily compete with blitters, with a number of trade-offs. (You eventually hit the sprite limits and flicker or drop out, while a blitter times out and drops frames like software rendering; the Amiga has hardware sprites to offset that a little.) Sprite engines also generally use indexed colors on a per-sprite basis, meaning far more flexibility of color at a similar bit depth. (Most 2D arcade boards only use 15 colors per tile/sprite, including the Neo Geo and System 32, but many, many indexes; the Neo Geo uses 8 kB dedicated to its 256 palettes with 4,096 indexes. The Saturn and PSX could also both do that with textures of 4/8/16-bit depth with indexed palettes, with individual indexes per texture; in the Saturn's case you had 4 kB of palette RAM with 2,048 15-bit RGB colors or 1,024 24-bit RGB colors and an offset to select any consecutive set of 15 or 255 colors from within that range, much more efficient than fixed indexed palettes in palette RAM or tied to each texture, and usable for textures or for the BG tiles. BG tiles could be 4, 8, or 16-bit indexed, or 16 or 32-bit direct color using 15 or 24-bit RGB... though it was almost always 4-bit to maximize memory use, using indexed palettes offset in CRAM.)

 

As for STE vs ST, if you fraction your userbase into two machines that require massively different source code you are screwed, either STE in 85 or nothing. Look how crap all the ST ports like Robocop look on Amiga when it could have looked identical to the arcade, same thing would have happened with STE even if they tried to support it. UK software houses were in it for the money not the best possible conversions ;)

Not really... that's the nature of successive advances and is what happens with every console and computing platform, even with backwards compatibility (with the exception of very basic hardware upgrades like the GC to Wii -which was basically just overclocking and adding RAM).

 

You simply need to do it in a timely and gradual fashion... they'd have to rework games to take advantage of a 256 color mode or packed pixels when those were added too, just like with VGA, but given an upgrade path to old users facilitates that being smoother as well. (and the lack of expansion ports hurt that, but didn't prevent it outright)

 

You can't just stick with old, limited hardware and expect it to get better with CPU resource alone, let alone the many cases where it could get WORSE due to CPU timing issues screwing things up (like old PC games that ran too fast on later machines -even things like playing Wing Commander on a 50 MHz 486 -anything faster than ~33 MHz was pushing it).

 

Of course, as games got more and more into high-level programming and away from hardware level commands and assembly language, you got a lot more flexible on how you could make use of added hardware. (driver/OS/API level access rather than direct hardware, but that didn't become commonplace for games until the mid 1990s -namely with win9x and applying to both software renderers and hardware acceleration) API level programming became the norm from the mid 90s onward.

 

But, again, a full blitter wasn't even needed: scroll registers and packed pixels would be more important for a number of cases, alongside boosted CPU speed and RAM (and increased color depth). That's exactly what PCs did for the most part going forward with VGA; in fact VGA was probably the longest-held common standard for PC games ever used (CGA wasn't well liked but pretty much stuck, aside from TGA, until EGA got affordable ~1987, but VGA picked up very rapidly just after that; not surprising with 256 colors indexed from 18-bit RGB, packed pixels, and V/H hardware scrolling).

There's audio of course as well, but that's another thing that could have been upgraded over time. (and something a bit easier to program over a range for -ie support all the progressive enhancements but have lower-end modes as well -for example using DMA sound or allowing software driven PCM via the AY chip -or disabling that to save CPU resource)

OTOH, while sound and video upgrades (with integral hardware acceleration) became quickly accepted standards for games, 2D blitter type accelerators like the IBM 8514 (contemporary to VGA) didn't catch on for games but became quite popular for GUI acceleration and some other applications with embedded 2D acceleration getting more and more common in the early 90s and all 3 early 1995 3D accelerator chipsets had powerful 2D acceleration as well. (namely S3's ViRGE, Nvidia's NV-1, and ATI's RAGE -the latter probably the more balanced for 3D and also included MPEG-1 acceleration -the VooDoo add-on accelerator wasn't released until '96 and was a good deal more expensive and only 3D -using analog genlock mixing with daisy chained VGA to lock onto a 2D card, but by then you also had the Rage II -with the Rage Pro quickly following- with both being far more balanced with embedded 2D and a considerable jump in 3D performance -and added MPEG-2 support)

 

 

 

Splitting the market with new standards is inevitable, but how well accepted they are defines their success in transition. (STe in 1987 could have made more sense, but a simpler ST+scrolling+YM2203 in '86 would make even more sense -maybe a bare DAC or small array of DACs to do better than PSG playback, let alone something a bit better than that like a very simple DMA circuit and/or FIFO buffer -multiple hardware channels would really be nice for the latter, but single channel DMA with variable sample rate especially would be more useful in many respects, lots of trade-offs but lots of simpler options than engineering something like PAULA -again POKEY+6502 would be a good hack but not really cost effective at all and requiring more GLUE logic to mate the 6502 to the 68k bus)

 

Having packed pixels from the start probably wouldn't have compromised the release date (just a different design concept for the SHIFTER), but logic for hardware scrolling certainly could have. (Using planar graphics is useful to save some memory as a primitive compression scheme, but the performance hit for a purely CPU-driven system probably wasn't worth it... granted, in the 640x400 mono mode it would be no different, as 1-bit packed pixel is 1 bitplane ;) -thus the Mac didn't even have to make the choice.) Had the SHIFTER been more like the Apple IIGS's graphics chip, especially with the 12-bit RGB (cost notwithstanding), that could have been very significant... you could probably even argue dropping the 640x400/70 Hz mode would have been worth it in spite of the hit some applications would take (like certain business applications, CAD, and perhaps desktop publishing), and the IIGS's chip actually added scanline-based subpalettes (16 separate 16-color palettes selectable a line at a time, so sort of like what the Copper could do with palette reloading, without CPU intervention), but scroll registers would be more useful than such tricks (not that they'd necessarily be as easy to add). But the 12-bit palette wasn't extremely necessary as such, and packed pixels and scrolling were the main issues. (Dropping to 6-bit RGB would be bad though.)
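The conversion burden a planar SHIFTER imposes on software renderers is the classic chunky-to-planar ("C2P") step. A sketch for one group of 8 packed 4-bit pixels:

```c
#include <stdint.h>

/* Chunky-to-planar for one 8-pixel group at 4bpp: gather bit p of
   each packed pixel into the byte for plane p.  A renderer working
   in a chunky buffer must run this over the whole frame before the
   planar display hardware can show it. */
static void c2p_8px(const uint8_t px[8], uint8_t plane[4])
{
    for (int p = 0; p < 4; p++) {
        uint8_t b = 0;
        for (int i = 0; i < 8; i++)
            b |= (uint8_t)(((px[i] >> p) & 1) << (7 - i));  /* leftmost pixel = MSB */
        plane[p] = b;
    }
}
```

At 1bpp the loop degenerates to a straight copy, which is the "1-bit packed pixel is 1 bitplane" point above: the mono mode pays no conversion tax at all.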

 

And 16 MHz ST accelerator boards were around in 1987 - I almost bought one at an Atari show! So the fact that it took Atari six years to use one is nothing but incompetence on their part, not Motorola's product catalogue, really ;)

It obviously would have been a price issue, but in hindsight a much lesser one than RAM could be. (RAM prices always tended to fluctuate more than CPU prices, and the 68k had an array of second sources like Hitachi.)

 

They should have offered it in high-end units from 1985 onward though, perhaps 12 MHz as well as 16 MHz (I don't think 16 MHz was available in '85), and definitely on the MEGA, which should have been out in '86, not '87 (if not '85) -probably more important in the US for that PC-like professional look than in Europe, though it wouldn't matter if they didn't get the publicity to gain interest from the masses. ('85-'87 was the biggest period for digging out a niche before PCs took over even the lower-end mainstream market, and of course the more funds they managed to establish after they burned off the debt, the more they could afford to spend on publicity... though the machine had a much better chance in a dedicated niche than combating the PC for the mass market -power without the price could only hold until off-the-shelf PC components became too cheap to match due to economies of scale. Licensing the ST standard could have been an interesting option though; the thing was a hell of a lot easier to clone than the Amiga, let alone the OS being superior to the PC's by far -at least until OS/2- though that was in part due to the luck of TOS/GEM 68k not being castrated like GEM x86.)

 

 

(Commodore never used one either, more fool them too. The A500 and A2000 needed a 14.18 MHz 68000 IMO to actually improve on the two-year-old A1000, whose chipset was identical in every way.)

Yes, if nothing else the MEGA and A2000 should have bumped up to 16 MHz chips as standard and maybe even included FPU sockets. (The MEGA STe offered that far too late.)

 

And the Falcon was locked onto a 16-bit bus = FAIL

That was almost certainly due to cost issues as well as compatibility (though for compatibility it would make more sense to have a 16-bit mode as well as full 32-bit), and if dropping to a 16-bit bus was a better cost saving than dropping to a 16 MHz 68EC020 on a 32-bit bus, that's a worthwhile trade-off for the most part given the 030's performance boost over the 020 (this came up before). But they could have offered a 32-bit high-end version with 16-bit modes and a lower-end 16-bit-only version... honestly, though, they should have done that back in 1990 with the TT rather than making it high-end only (and even that was a bit too late), or done a 16 MHz 68000 version with the blitter, no FPU, but the TT SHIFTER's capabilities. (Again, had the STe added a 320x200 8-bit packed-pixel indexed color mode and a 16 MHz 68k -possibly optional- it would have been fine for 1989, especially assuming the blitter could at least read and write 8-bit chunks of data, if not buffer two 8-bit pixels for 16-bit reads/writes with fast-page-mode accesses; it probably could have kicked the Amiga's ass in many respects. Audio would still have trade-offs, less so if they swapped in a YM2203, but making the DMA sound have proper variable sample rates would go a long way -I think the Falcon fixed that with the 8-channel 16-bit DMA audio.)

 

For 1992 the Falcon was pretty decent all around, but the ST line had stagnated too much by then... in fact you could probably argue that a faster blitter, a plain 16 MHz 68k, and no DSP could have been preferable for a lower-end Falcon counterpart. (Not even necessarily a faster blitter -though 16 MHz would be nice- but one fully optimized for working with packed pixels and especially buffered for more efficient reads/writes -especially if video RAM were 32 or 64 bits wide; I know it is on the TT... though that would imply all memory is 64 bits wide if it's shared memory -and line buffers would be far more powerful than simple phrase buffers of course.)

Then again a few months later they had the Jaguar chipset to work with. ;) (had the ST still been alive and kicking they might have put more focus on parallel development of the Jag chipset for use in later ST machines, namely the TOM ASIC, especially if they addressed the bugs, but on a system with a more powerful CPU it didn't matter as much, much more so if it actually had dedicated video RAM to work in -though the ST line tended to have shared memory like the Jaguar, unlike the Amiga's FastRAM+ChipRAM)

 

 

To be honest they probably should have left the 520ST and 520ST+ as is, but held back on the £399 520STM and STFM until they had either more grunt for better games via a 16 MHz CPU or STE hardware. The ST was fine for business as it was, but for a home machine they could have waited and got it right so there was a better base standard. Maybe even just a 10 MHz 68010 clocked to 12 MHz would have been enough... but then TOS crashes on a 68010, right?

They didn't need full STe-level hardware either, just add scroll registers to the SHIFTER and swap out the YM2149 for a 2203 (preferably add some simple IC that aided with PCM -probably more than a bare DAC or array of DACs -better than the YM though- but less than Paula; maybe a FIFO buffer with multiple channels even, or perhaps a basic DMA circuit like the Mac's), then later add packed pixels (perhaps chained bitplanes a la VGA) as well as 8-bit color modes with a larger (at least 12-bit) palette. A 320x200 (or 320x240) 256-color mode would have been fine into the mid 1990s. (Going higher would be fine as long as there were the resources to support it -if the TT SHIFTER were fixed to 320x480 for 256 colors, that would be limiting for games, as lower res would be preferable, especially with square pixels.)

 

But you're right, the ST was better as a business machine early on at its price point (at least in Europe, with the 8-bits still dominating), though to that extent, a MEGA form-factor machine would have been great from the launch with a 12 or 16 MHz CPU. (A faster CPU from the start also means programmers taking that into account and not using timing-sensitive routines.) And changes like the YM2203, scrolling, and maybe a basic PCM chip of sorts would be very reasonable to get out within a year of the ST's release. (Packed pixels and/or faster CPUs would be great too -even 4-bit packed would be great, or a 160x200 8-bit packed mode even before 12-bit RGB or 320x200/320x240 8bpp.)

 

Overclocking a 10 MHz CPU to 12 MHz wouldn't make sense (there's a reason they grade them as such, even if the vast majority can run faster -unless Atari was willing to take the risk), but they had 12.5 MHz versions for that anyway... and there's no reason to go with a 68010 other than its virtual-memory support (restartable bus errors for use with an external MMU), as the speed advantage is almost nil (it would make more sense to use a more flexible master oscillator in the ST to better match the full rated speeds of Motorola's 10, 12.5, or 16.67 MHz chips). Plus, unlike with the original 68k, Motorola had started backing out of widespread second sourcing (something that would cost them dearly against x86 and would not be repeated with PPC).

The slight changes to the ISA did cause some compatibility problems, but not ones that weren't usually easy to patch for (same for the 020/030, etc., for the most part). Better to jump straight to the 020 and/or 030 for high-end stuff. (And later the EC020 and EC030 for some lower-end stuff -maybe with 16-bit buses depending on system design/cost trade-offs- other than backwards-compatibility modes.)

 

 

 

So to sum up, it took a long time for a true Amiga game to be developed and released; the first true Amiga game to make full use of the whole chipset was probably Shadow of the Beast. Think how many STEs would need to be sold before a software house put that sort of effort into releasing a true STE game? The issue is always that software houses won't waste resources doing STE-specific games before sales of the STE hit critical mass. And if you owned an STE you should have pirated EVERYTHING that didn't use the blitter and DMA sound properly, rather than paying for mildly breathed-over ST ports like us Amiga users had to endure. I have written better game engines in BASIC on Amiga than most commercial releases between 1985 and 1990, including arcade conversions of the time!

It really depends on the popularity of the machine, and I think some much earlier games pushed the Amiga more, for sure... but at the very least it was using scrolling and either hardware sprites or blitter objects for a lot of stuff.

But there was also a significant difference between US and EU development trends. In either case it wasn't until after the A500 that it really took off, but in the US you had PCs overshadowing it rather quickly, especially as VGA and Sound Blaster standardized. (More so with the SB 2.0 -44 kHz 8-bit PCM- then the Pro -22 kHz stereo and dual OPL2s- then the Pro 2.0 with the OPL3, and the SB-16 with 16-bit 44 kHz stereo. For plain SFX rather than pitch-shifted stuff, the SB 2.0 or higher all allowed a sample rate high enough to easily use simple interleaving for multi-channel audio -i.e. 4 11 kHz channels, 6 ~7 kHz channels, etc.- while catering to the 1.0/1.5 would be a bit more limited with 23 kHz sampling. With interleaving you don't have to worry about overflow or loss of resolution from adding 8-bit samples at 8-bit resolution, and interleaving is also the most common method for SFX mixing on the Amiga. It would thus be quite useful even for a fixed-frequency DMA circuit in the ST, or with the STe audio for sfx/drums/etc. at fixed frequencies -including prescaled instrument samples, depending on how much RAM you allotted to PCM, or scaling on the fly to the desired sample rate and avoiding the overhead of adding -I wonder if any AY-based ST or STe MOD players actually do that. At 50 kHz you could do 6 ~8 kHz channels interleaved per hardware channel on the STe, but you'd only have stereo on a hardware-channel basis, so you'd select which instrument to play on which DAC and when to pan the output -unless you simply fixed them to L/R and treated it as 6 ~8 kHz stereo channels.)

 

Fixed sample rates aren't great if you want to do simple Amiga-like pitch control, but if you're doing software mixing anyway, fixed sample rates (or limited steps) are fine, especially with high max playback rates facilitating interleaving. (Remember the ST normally mixed to a 4 kHz output for its software MOD stuff.)
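The interleaving trick can be sketched like this: instead of summing channels (and risking overflow or losing resolution per add), you emit each channel's samples round-robin at N times the per-channel rate, so e.g. four ~11 kHz voices become one 44 kHz stream. A minimal C illustration (my own function name, not anything from a real ST or Sound Blaster driver):

```c
#include <stdint.h>

/* Interleave nchan channels of 8-bit samples into one output stream at
   nchan times the per-channel rate: no adds, so no clipping and no lost
   resolution -the DAC's own output rate does the "mixing". */
void interleave_mix(const uint8_t *chans[], int nchan,
                    int samples_per_chan, uint8_t *out)
{
    for (int i = 0; i < samples_per_chan; i++)
        for (int c = 0; c < nchan; c++)
            out[i * nchan + c] = chans[c][i];   /* round-robin samples */
}
```

The trade-off is that each voice's effective sample rate is the output rate divided by the channel count (44 kHz / 4 = 11 kHz per voice), so more voices means lower per-voice fidelity rather than more CPU time spent adding and clamping.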


At least in the UK, I think a large problem for Commodore, Atari & maybe even Acorn was that they were focusing their attention on their perceived competition, i.e. each other, when the real threat was the rising dominance of PCs for serious stuff and consoles for gaming. They didn't help themselves by splitting scarce resources to compete in the PC market when what they needed to do was focus on their own strengths and weaknesses. In some respects I think Commodore UK towards the end knew what needed to be done, and it was unfortunate they weren't successful in their buyout proposal.

 

 

Barnie

