Atari Panther and the controversy of the bits of your processor


Recommended Posts

3 hours ago, Lostdragon said:

Going off a very hazy memory, didn't Sam and Leonard, in separate interviews, offer up polar views on the launch of the ST?

 

Sam portraying it as a massive success at the time, Leonard feeling it hadn't made any real impact next to the PC?

I don't think those views are necessarily incompatible. When the ST launched, it made quite a splash in its first year. At the time, PC clones were expensive and often not even 100% compatible. But by '86 the clone market heated up, prices were dropping below $1,000, and compatibility issues were disappearing. The clones quickly started eating the ST's lunch.

Link to comment
Share on other sites

35 minutes ago, zzip said:

I don't think those views are necessarily incompatible. When the ST launched, it made quite a splash in its first year. At the time, PC clones were expensive and often not even 100% compatible. But by '86 the clone market heated up, prices were dropping below $1,000, and compatibility issues were disappearing. The clones quickly started eating the ST's lunch.

From what I've researched, the ST / Atari was able to carve out a very minor computer market share between '85 and '88. It then stagnated before the bottom fell out in 1991. The Amiga / Commodore followed suit just a few years later.


4 minutes ago, Hwlngmad said:

From what I've researched, the ST / Atari was able to carve out a very minor computer market share between '85 and '88. It then stagnated before the bottom fell out in 1991. The Amiga / Commodore followed suit just a few years later.

Sounds about right.   People didn't suddenly stop buying STs when clones took off, but over time it became harder and harder to justify an ST purchase vs PC.  Especially once "Power without the Price" stopped favoring the ST and instead favored the clones.

  • Like 2

Looking at one of his more recent interviews, Leonard said this about the ST:
 
 
"I think the ST was an important machine in the history of home computing. It was the first machine that had enough resolution to do serious tasks like word processing and used a graphical user interface that had a price point that was within reach of many people. The combination was truly “Power without the price”.
 
A console version was discussed but the cost was too high."
 
Of interest to me here was his explanation of why the ST console was canned. 
 
His recollection seems quite sharp, yet by contrast  Rob Zdybel's hasn't fared so well. 
 
He was supposedly the individual who pitched Project Robin (put the ST hardware in a console-style case, bundle it with old arcade conversions that had appeared on the ST, and sell both the console and games at a budget price).
 
He had absolutely no recollection of any such console, in the last interview I listened to with him. 
 
 
On the subject of the Panther, in the same interview, Leonard said:
 
 
 
"I did a lot of work on the Panther. I wrote a graphical development environment that allowed you work on an ST to place objects on the screen and animate them. The system would produce commented code and download it to the Panther. The Panther was started before the Jaguar but ran into design problems that slowed down development. When the Jaguar was proposed it was seen as technically risky so the Panther kept going as a backup. When the first Panther chips came back and didn’t work right the project was dropped."
 
 
 
It's a shame he didn't go into more detail regarding these design problems that slowed down development. 
 
His statement that faulty Panther chips killed the Panther project runs somewhat at odds with the statements coming out of Atari UK and other areas, but it does run alongside Sam's "technical issues" line, which put it down to Jaguar development eclipsing that of the Panther: the Panther was scrapped, and had it made it to retail, the Jaguar would have been ready to launch some 9-12 months later. 
 
Atari didn't have the resources (money or development teams) to support the launch of two flagship consoles simultaneously. The Panther was supposedly intended to launch alongside the Lynx, something that might loosely connect with what the Tiertex sources told Frank Gasking of GTW: Panther Strider 2 had been given the go-ahead once the Lynx version was finished. 
 
 
I really do wish I had been able to reach a lot more sources at Atari UK, and those who worked on, say, Shadow Of The Beast, to get an even greater understanding of the hardware and its time in R&D, but you can only work with the resources available to you at the time.
  • Like 2

5 hours ago, Lostdragon said:

Regarding the Panther, Jeff Minter stated he worked with a Panther development kit and even produced an (old?) picture. 

 

However, getting down to brass tax, it seems that the Panther just wasn't as viable as what could be done with the Jaguar, in one way or another. If the Panther had been able to come out in 1988 or so, perhaps things would have been different. But, again, that is all 'candies and nuts'. It simply wasn't powerful and/or viable enough to keep the project going, whereas the Jaguar was good enough to see through.

 

It is all very curious.  One wonders what Atari could've / should've done once Tramiel and co. took over.  We can only speculate and wonder, and perhaps that is good enough.  Memories are meant to fade, after all.


18 hours ago, Hwlngmad said:

brass tax

Brass tacks.

 

Pretty interesting stuff.  Console generations definitely shouldn't be so quick.  Though many would agree that the system after the 2600 should have been released sooner, though some may argue if that system should have just been the 7800. 

  • Like 2

2 hours ago, leech said:

Brass tacks.

 

Pretty interesting stuff.  Console generations definitely shouldn't be so quick.  Though many would agree that the system after the 2600 should have been released sooner, though some may argue if that system should have just been the 7800. 

Even if they had released the 7800 far sooner, for me you'd still be left asking the same questions: 

 

1) Would it have gained far greater third-party support, especially over here in the UK? 

 

2) Would the Tramiels have allowed developers to use bigger cartridges? 

 

Given how they didn't on the Lynx, I think the 7800 would have been just as throttled as it was, even if it had been released sooner. 


On 7/15/2023 at 10:42 AM, leech said:

Pretty interesting stuff.  Console generations definitely shouldn't be so quick.  Though many would agree that the system after the 2600 should have been released sooner, though some may argue if that system should have just been the 7800. 

Oh, one could argue that Atari should have done a lot of things differently. Maybe they shouldn't have gotten involved in home computers at all and instead released (what would have been an earlier iteration of) the 5200 in 1979, to assure themselves graphical and technological dominance in home video game consoles. Another is that they should have followed through on their deal to distribute (what would have been a different iteration of) the Famicom in the U.S. If that had taken place, there is no way Nintendo would be as big as they are today, as Atari could have, more or less, helped kill off that product in favor of the 7800 they had pretty much developed by that point. Again, a lot of could've, should've, would'ves out there. It is interesting to think how history could have played out but didn't, for one reason or another, especially in the video game space as it relates to Atari.


On 7/15/2023 at 2:00 PM, Lostdragon said:

Even if they had released the 7800 far sooner, for me you'd still be left asking the same questions: 

 

1) Would it have gained far greater third-party support, especially over here in the UK? 

 

2) Would the Tramiels have allowed developers to use bigger cartridges? 

 

Given how they didn't on the Lynx, I think the 7800 would have been just as throttled as it was, even if it had been released sooner. 

I don't think the Tramiels could have handled it differently; they were computer guys who, at first, viewed the videogame market as nothing more than a place to dump the old inventory they inherited. It took them too long to try to be competitive with Nintendo and Sega.

 

The best chance of a better outcome for the 7800 would have been if the company hadn't been sold, or had maybe been sold to a different owner.

  • Like 2

16 minutes ago, Hwlngmad said:

Oh, one could argue that Atari should have done a lot of things differently. Maybe they shouldn't have gotten involved in home computers at all and instead released (what would have been an earlier iteration of) the 5200 in 1979, to assure themselves graphical and technological dominance in home video game consoles. Another is that they should have followed through on their deal to distribute (what would have been a different iteration of) the Famicom in the U.S. If that had taken place, there is no way Nintendo would be as big as they are today, as Atari could have, more or less, helped kill off that product in favor of the 7800 they had pretty much developed by that point. Again, a lot of could've, should've, would'ves out there. It is interesting to think how history could have played out but didn't, for one reason or another, especially in the video game space as it relates to Atari.

I can offer my perspective after plunging into the history of video games and Atari.
I think it's easy to judge the situation today, almost 40 years later.

But consider the following.

These guys were creating this market. They had no history of other companies doing the same thing to learn from. So I understand the mistakes that were made, though certain administrative decisions still intrigue me.

For example, having the 7800 console practically ready and choosing to launch the 5200 to try to win one more round.

That kind of decision really stands out today.

The problem is that Atari lost its innovative streak with Bushnell's departure, and everything got even worse with Tramiel's purchase; he was a guy who only looked at costs and ended up missing the train of history when Nintendo and Sega passed him by.

Nintendo knew how to exploit the gaps that were left and got it right where Atari missed. Yamauchi-san was a much better sage than Tramiel, after all.

  • Like 1

1 minute ago, Ricardo Cividanes da Silva said:

I can offer my perspective after plunging into the history of video games and Atari.
I think it's easy to judge the situation today, almost 40 years later.

But consider the following.

These guys were creating this market. They had no history of other companies doing the same thing to learn from. So I understand the mistakes that were made, though certain administrative decisions still intrigue me.

For example, having the 7800 console practically ready and choosing to launch the 5200 to try to win one more round.

That kind of decision really stands out today.

The problem is that Atari lost its innovative streak with Bushnell's departure, and everything got even worse with Tramiel's purchase; he was a guy who only looked at costs and ended up missing the train of history when Nintendo and Sega passed him by.

Nintendo knew how to exploit the gaps that were left and got it right where Atari missed. Yamauchi-san was a much better sage than Tramiel, after all.

Understandable mistakes are still mistakes, any way you slice it. The 5200 was a quick and dirty way to try to get on par with what the ColecoVision was putting out. However, Atari could have released an earlier version of the 5200 in 1979, but they decided to get into computers instead, with the Atari 400 being the pseudo upgraded games machine. 

 

Also, regarding Jack Tramiel and Mr. Yamauchi, lest we forget that Jack Tramiel led Commodore to become one of the first personal computer companies to reach $1 billion in revenue. So, while I would say Yamauchi was a better sage than any of the Tramiels, don't forget that Nintendo was lucky to have deals fall through with Commodore and Atari before striking out on their own with their games and the Famicom in the U.S. BTW, for me, luck = when preparation meets opportunity. Yamauchi and Nintendo had done a lot of preparation for entering the U.S. market, so when the right opportunity presented itself they could swoop in and absolutely dominate.


4 minutes ago, Hwlngmad said:

Understandable mistakes are still mistakes, any way you slice it. The 5200 was a quick and dirty way to try to get on par with what the ColecoVision was putting out. However, Atari could have released an earlier version of the 5200 in 1979, but they decided to get into computers instead, with the Atari 400 being the pseudo upgraded games machine. 

 

Also, regarding Jack Tramiel and Mr. Yamauchi, lest we forget that Jack Tramiel led Commodore to become one of the first personal computer companies to reach $1 billion in revenue. So, while I would say Yamauchi was a better sage than any of the Tramiels, don't forget that Nintendo was lucky to have deals fall through with Commodore and Atari before striking out on their own with their games and the Famicom in the U.S. BTW, for me, luck = when preparation meets opportunity. Yamauchi and Nintendo had done a lot of preparation for entering the U.S. market, so when the right opportunity presented itself they could swoop in and absolutely dominate.

It's true.

 

I agree. Nintendo was lucky where the others were not.


@Ricardo Cividanes da Silva

On the topic of collecting information on the Panther itself, here's the archived page from the Atarimuseum website:
http://web.archive.org/web/20200831194927/http://atarimuseum.com/videogames/consoles/jaguar/Panther/index.htm

 

The software development documents I have are in the folders "Netlists, PLA's and PAL's" (which downloads panther.zip) and "Panther HW Documents Flare II", downloadable at the bottom of that page. There are also folders with technical schematics in AutoCAD format (.DWG files), but I didn't have any luck extracting those with the programs I tried on Linux.

 

Info on recovered Atari ASIC designs and schematics for the Panther, in PDF format:

http://www.chzsoft.de/asic-web/

http://www.chzsoft.de/asic-web/console.pdf

 

The "GAME SHIFTER" mentioned in that article should be the Panther's line buffer and palette chip, referred to as the SHIFTER in development documents. It doesn't have much in common with the ST or STe SHIFTER, other than being designed by Shiraz Shivji (i.e. lead designer of the ST and head of Atari Corp engineering), who also worked on the design of the Panther Object Processor ASIC but left the company and moved back to Texas (I believe) before it was finished. It was then, in late 1989, that Martin Brennan of Flare Technology was brought in as a contractor to finish the chip design. (This should have been shortly after the production version of the Slipstream chip with an 8086 was completed.)

 

I suppose another thing in common with the ST and STe SHIFTER was the use of external resistor arrays for the video DAC: digital 18-bit RGB output on 18 pins from that GAME SHIFTER chip, rather than an internal DAC and analog RGB outputs as on most consoles and some computer graphics chips of the time. (I believe the Amiga and some VGA implementations already had integrated DACs, while others used separate RAMDAC chips with the palette and DAC built into that chip.)

 

Incidentally, the VGA chipset used in the Atari PCs used an INMOS RAMDAC chip, sometimes called a CLUT chip (Color Look-Up Table). This is what I suggested the Panther could've used to save on chip space in the SHIFTER or line buffer chip and reduce its pin count, while still allowing a full 256 colors and the same 18-bit RGB colorspace. (You'd only need 8 video lines instead of 18 to feed the RAMDAC, and the palette RAM would be inside the RAMDAC as well, meaning you just need the line buffers on a custom chip, or moved inside the Panther ASIC itself like the old MARIA chip in the 7800 did.) VGA RAMDACs were common, commodity off-the-shelf parts at the time, so until Atari got to really high production numbers, it probably would've been a cheaper solution overall.
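For anyone not familiar with how a RAMDAC/CLUT works, here's a minimal C sketch of the indexed-colour path described above (purely illustrative; the type and function names are mine, not from any Panther or INMOS documentation): the line buffer outputs an 8-bit palette index, and the palette RAM inside the RAMDAC expands it to 18-bit RGB (6 bits per channel), which is why only 8 digital lines are needed between the chips instead of 18.

```c
#include <stdint.h>

/* Illustrative model of a 256-entry RAMDAC colour look-up table (CLUT).
 * 6 significant bits per channel gives the 18-bit RGB colourspace. */
typedef struct {
    uint8_t r, g, b;          /* each 0-63 (6 bits) */
} Rgb18;

static Rgb18 clut[256];       /* palette RAM lives inside the RAMDAC */

/* Program one palette entry, masking each channel to 6 bits. */
void clut_set(uint8_t index, uint8_t r, uint8_t g, uint8_t b)
{
    clut[index].r = r & 0x3F;
    clut[index].g = g & 0x3F;
    clut[index].b = b & 0x3F;
}

/* Per-pixel lookup: 8-bit index in, 18-bit colour out. */
Rgb18 clut_lookup(uint8_t index)
{
    return clut[index];
}
```

The point of the sketch is the pin-count saving: only the 8-bit index crosses the chip boundary per pixel, while the 18 bits of colour (and the DACs) stay inside the RAMDAC.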



Also, looking at the Panther schematics in that link, it looks to me like only the upper 8 bits of the OTIS bus are connected to a single 8k x 8-bit SRAM. There are additional connections directly to the upper 16 bits of the Panther bus (the same portion the cartridge slot and 68000 use), but I'm not sure if that was just for copying data to the 8kB SRAM, or if there were any situations where the chip could access RAM or ROM directly for streaming 16-bit PCM data. Only 15 address lines are connected, which would limit it to 64kB (32k words x 16 bits) of address space on the 68k/Panther bus. Seeing as it only uses the lower 15 bits of the 68k address bus, that should map it directly into the first 64kB of the memory map, which would be the 32kB of SRAM (or the space reserved for up to 64kB of 32-bit SRAM).
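The 15-address-line limit above is easy to sanity-check with a little arithmetic. A hypothetical sketch (the macro and function names are mine, not from the schematics):

```c
#include <stdint.h>

/* 15 address lines on a 16-bit (word-wide) bus give 2^15 = 32k word
 * addresses, i.e. 64 kB, so the chip's view of the bus folds into the
 * bottom 64 kB of the 68000 memory map. */
#define OTIS_ADDR_LINES 15u
#define BYTES_PER_WORD  2u
#define OTIS_WINDOW     ((1u << OTIS_ADDR_LINES) * BYTES_PER_WORD)  /* 65536 */

/* Fold a full 68000 byte address into the window the chip can see. */
static uint32_t otis_visible_addr(uint32_t cpu_addr)
{
    return cpu_addr & (OTIS_WINDOW - 1u);
}
```

With only the low address lines wired, anything above the 64kB window simply aliases back into it, which is consistent with the chip sitting over the low SRAM region of the map.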

The use of 8-bit wide SRAM would mean any sound data would be limited to 8-bit and not 16-bit, which makes the use of the OTIS over the older DOC (or DOC II) sound chip even stranger. Granted, had they revised the design based on the events of 1990 and 1991, with the DRAM shortage ending and an SRAM-only (or even PSRAM) console becoming less appealing, it shouldn't have been difficult to fit the OTIS with the 16-bit DRAM it was designed to interface with directly.

A single 64k x 16-bit DRAM chip (128kB) would probably be the minimum useful amount, taking minimal space on the board, and a 256k x 16-bit chip would be the next step up (512kB, or 4Mbits), with 1991 bulk pricing being around $17.20 for the latter and maybe $4.40 for the former, though possibly a bit more, as the 64k x 16-bit chips were somewhat exotic and lower-demand, while 256k x 16-bit ones were common, often used in pairs for 1MB SIMM modules of the 72-pin, 32-bit-wide type, or at least would soon be used as such (those SIMM types were introduced by IBM in the early 90s, but I forget exactly when).

The DRAM could potentially be shared as CPU work RAM, but I don't think the OTIS's built-in DRAM control logic natively supports Atari ST or Amiga style interleaved bus sharing, though given the bandwidth requirements for the OTIS at 10 MHz, it should easily interleave with a 68000 at 10 MHz, with the 68k (probably faster) reading through a 16-bit latch. I think the chip already makes use of spare DRAM cycles for streaming sample data into DRAM during playback, but not for sharing with a CPU directly on its bus, so some sort of external DRAM controller or MMU would be needed for that.
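For what it's worth, those quoted bulk prices can be reduced to dollars per kilobyte to compare the two options (the figures are the thread's rough 1991 numbers, not authoritative data):

```c
/* Rough 1991 bulk DRAM prices quoted above (approximate figures). */
#define PRICE_64Kx16_USD   4.40   /* 64k x 16 bits  = 128 kB */
#define PRICE_256Kx16_USD 17.20   /* 256k x 16 bits = 512 kB */

/* Dollars per kilobyte for a given chip price and capacity. */
static double price_per_kb(double price_usd, unsigned kbytes)
{
    return price_usd / (double)kbytes;
}
```

Working it out, both chips land around 3.4 cents per kB (the 512kB part fractionally cheaper), so price scaled almost linearly with capacity and the real deciding factors would have been board space and how much sample RAM was actually useful.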

While technically less powerful and much less flexible than the "DSP" (a 32-bit RISC microprocessor copied mostly from the GPU in TOM) in the Jaguar, the OTIS chip with a decent amount of RAM should've easily been capable of more impressive sound than most Jaguar games actually ended up with, due to it being an off-the-shelf part with better technical support and more straightforward use by programmers and musicians, especially those already familiar with Ensoniq keyboards or MIDI synth boxes, or the Gravis Ultrasound card with its relatively similar Ensoniq OTTO-derived chip. They just would've needed to put the joystick I/O handling, serial port, clock synthesizer and such inside TOM instead of JERRY, or into a much smaller and simpler custom IC.

The 68000 could've been used as the sound controller, like the 68EC000 on the Sega Saturn, which might have meant it worked on the sound bus and stayed off the main bus most of the time. That would've also had the side effect of making it a much better CPU than it was in the Jaguar for the many games that couldn't offload all their processing onto the GPU. (That is, it would've made heavy use of the 68000 for more than just sound and I/O handling much less painful on performance than it was on the Jaguar.) Motorola also had 68000s and 68EC000s available up to 20 MHz by 1993, for what that's worth. (They certainly could go faster than that, at least the CMOS versions, but Motorola and its Japanese licensees never pushed past that commercially, probably due to licensing restrictions, or at least hypothetical legal concerns, and, in Motorola's case, to avoid competing with their own 68020 and higher processors.) You had Atari ST accelerator boards using 68000s factory-overclocked to at least 36 MHz. I wouldn't be surprised if the "16 MHz" 68000 in every single Atari Jaguar runs perfectly fine at 26.6 MHz, nowhere near warm enough to need a heatsink; it should run much cooler than a 286 or 386SX at the same clock speed, and probably much cooler than NMOS 286s or even 68000s at any clock speed. (Note that NMOS parts consume as much power and run just as hot at any clock speed, unlike CMOS parts, which use more power the faster they go but still use much less power than NMOS equivalents.)
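That NMOS-vs-CMOS point can be captured in a toy first-order model: CMOS power is dominated by dynamic switching (P ≈ C·V²·f, so it grows with clock), while an NMOS part draws roughly constant static power regardless of clock. The constants used below are illustrative assumptions only, not measured values for any real chip:

```c
/* Toy first-order power model for the NMOS vs CMOS comparison above. */

/* CMOS: dynamic switching power, P = C * V^2 * f (grows with clock). */
static double cmos_power_w(double cap_f, double volts, double freq_hz)
{
    return cap_f * volts * volts * freq_hz;
}

/* NMOS: static current dominates, so power is clock-independent. */
static double nmos_power_w(double static_w, double freq_hz)
{
    (void)freq_hz;   /* clock speed doesn't change the dissipation */
    return static_w;
}
```

Under this model, doubling a CMOS part's clock doubles its dissipation, while the NMOS part runs just as hot whether it's clocked at 8 MHz or 16 MHz, which is the behaviour described in the paragraph above.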

See average mass market DRAM prices here:
https://phe.rockefeller.edu/LogletLab/DRAM/dram.htm

 

 

I'll also upload the developer documents I've previously saved in .odt format, in case that's easier to use if you have OpenOffice. (The original files are in WordPerfect and MS Word .doc format.)

 

PDS.odt PANTHER.odt PANHW.odt

  • Like 1
  • Thanks 2

4 hours ago, kool kitty89 said:

@Ricardo Cividanes da Silva

On the topic of collecting information on the Panther itself, here's the archived page from the Atarimuseum website:
http://web.archive.org/web/20200831194927/http://atarimuseum.com/videogames/consoles/jaguar/Panther/index.htm

 

The software development documents I have are in the folders "Netlists, PLA's and PAL's" (which downloads panther.zip ) and "Panther HW Documents Flare II"  downloadable at the bottom of that page. There's also folders with technical schematics in autocad format ( .DWG files) but I didn't have luck extracting those with the programs I tried in linux.

 

Info on recovered Atari ASIC designs and schematics for the Panther in PDF format.

http://www.chzsoft.de/asic-web/

http://www.chzsoft.de/asic-web/console.pdf

 

The "GAME SHIFTER" mentioned in that article should be the Panther's line buffer and palette chip, referred to as the SHIFTER in development documents. It doesn't have much in common with the ST or STe SHIFTER other than being designed by Shiraz Shivji (ie lead designer of the ST and head of Atari Corp engineering) who also worked on the design of the Panther Object Processor ASIC, but left the company and moved back to Texas (I believe) before it was finished. It was then in late 1989 that Martin Brennan of Flare Technology was brought in as a contractor to finish the chip design. (this should've been shortly after the production version of the Slipstream chip with a 8086 was completed)

 

I suppose another thing in common with the ST and STe SHIFTER was use of external resistor arrays for the video DAC, having digital 18-bit RGB output on 18 pins from that GAME SHIFTER chip rather than internal DAC and analog RGB outputs as with most consoles and some computer graphics chips of the time. (I believe the Amiga and some VGA implementations already had integrated DACs while others used separate RAMDAC chips with palette and DAC built into that chip)

 

Incidentally, the VGA chipset used in the Atari PCs used an INMOS RAMDAC chip, sometimes called a CLUT chip (Color Look Up Table). This is what I suggested the Panther could've used to save on chip space in the SHIFTER or line buffer chip and reduce its pin count while allowing full 256 colors and same 18-bit RGB colorspace. (you'd only need 8 video lines instead of 18 to feed the RAMDAC and the palette RAM would be inside the RAMDAC as well, meaning you just need the line buffers on a custom chip or moved inside the Panther ASIC itself like the old MARIA chip in the 7800 used) VGA RAMDACs were common, commodity off the shelf parts at the time, so until Atari got to really high production numbers, it probably would've been a cheaper solution overall.



Also, looking at the panther schematics in that link, it looks to me that only the upper 8 bits of the OTIS bus are connected to a single 8kx8-bit SRAM. There's additional connections directly to the upper 16 bits of the Panther bus (the same portion the cartridge slot and 68000 use) but I'm not sure if that was just for copying data to the 8kB SRAM or if there were any situations where the chip could access RAM or ROM directly for streaming 16-bit PCM data. Only 15 address lines are connected, which would limit it to 64kB (32k words x16 bits) of address space on the 68k/panther bus. Seeing as it only uses the lower 15 bits of the 68k address bus, that should map it directly into the first 64kB of the memory map, which would be the 32kB of SRAM (or space reserved for up to 64kB of 32-bit SRAM).

The use of 8-bit wide SRAM would mean any sound data would be limited to 8-bit and not 16-bit, which makes the use of the OTIS over the older DOC (or DOC II) sound chip even more strange. Granted, had they revised the design based on events of 1990 and 1991 with the DRAM shortage ending and an SRAM-only (or even PSRAM) console being less appealing, it shouldn't have been difficult to fit the OTIS with the 16-bit DRAM it was designed to directly interface to. A single 64kx16-bit DRAM chip (128kB) would probably be the minimum useful amount and minimal space on the board and a 256kx16 bit chip would be the next step up (512kB or 4Mbits) with 1991 bulk pricing being around $17.2 for the latter and maybe $4.4 for the former, but possibly a bit more as the 64kx16-bit chips were a bit more exotic or lower demand while 256kx16-bit ones were common, often used in pairs for 1MB SIMM modules of the 72 pin 32-bit wide, or at least would soon be used as such (those SIMM types were introduced in the early 90s by IBM, but I forget exactly when). The DRAM could potentially be shared as CPU work RAM, but I don't think the OTIS's built in DRAM control logic natively supports Atari ST or Amiga style interleaved bus sharing, though given the bandwidth requirements for OTIS at 10 MHz, it should easily interleave with a 68000 at 10 MHz with the 68k (probably faster) reading through a 16-bit latch. I think the chip already makes used of extra available DRAM cycles for streaming sample data into DRAM during playback, but not for sharing with a CPU directly on its bus, so some sort of external DRAM controller or MMU would be needed for that.

While technically less powerful and much less flexible than the "DSP" (32-bit RISC microprocessor copied mostly from the GPU in TOM) in the Jaguar, the OTIS chip with a decent amount of RAM should've easily been capable of more impressive sound than most Jaguar games actually ended up with due to it being an off she shelf part with better technical support and more straightforward use by programmers and musicians, especially those already familiar with Enqoniq keyboards or MIDI synth boxes, or the Gravis Ultrasound card with the relatively similar Ensoniq OTTO derived chip. They just would've needed to put the joystick I/O handling, serial port, clock synthesizer and such inside TOM instead of JERRY or a much smaller and simpler custom IC. The 68000 could've been used as the sound controller like the 68EC000 on the Sega Saturn, which might have meant it worked on the sound bus and off the main bus most of the time, which would've also had the side effect of making it a much better CPU than it was in the Jaguar, for the many games that couldn't offload all the processing onto the GPU. (ie it would've made heavy use of the 68000 for more than just sound and I/O handling much less painful on performance than it was on the Jaguar) Motorola also had 68000s and 68EC000s available up to 20 MHz by 1993 for what that's worth. (they also certainly could go faster than that, at least the CMOS versions do, but Motorola and its Japanese licensees never pushed it past that commercially, probably due to licensing restrictions or at least hypothetical legal concerns and, in Motorola's case, to avoid competing with their own 68020 and higher processors) You had Atari ST accelerator boards using 68000s factory overclocked to at least 36 MHz. (I wouldn't be surprised if every single Atari Jaguar has a "16 MHz"  68000 in Jaguar consoles runs perfectly fine at 26.6 MHz, and nowhere near warm enough to need a heatsink ... 
it should run much cooler than a 286 or 386SX at the same clock speed, and probably much cooler than NMOS 286s or even 68000s at any clock speed: note NMOS parts consume as much power and run just as hot at any clock speed, unlike CMOS parts that use more power the faster they go, but still use much less power than NMOS equivalents)

See average mass market DRAM prices here:
https://phe.rockefeller.edu/LogletLab/DRAM/dram.htm

 

 

I'll also upload the developer documents I've previously saved in ODT format, which may be easier to use if you have OpenOffice. (The original files are in WordPerfect and MS Word DOC format.)

 

Attached: PDS.odt · PANTHER.odt · PANHW.odt

@kool kitty89 thank you!!!

 

Dude, what amazing material.
I was impressed by how advanced the Panther project was.

This reinforces the theory that the cancellation occurred due to problems with the graphics chip.

Link to comment
Share on other sites

On 7/14/2023 at 4:14 AM, Lostdragon said:

Going off a very hazy memory, didn't Sam and Leonard, in separate interviews, offer up polar views on the launch of the ST?

 

Sam portraying it as a massive success at the time, Leonard feeling it hadn't made any real impact next to the PC? 🤔

 

 


I'm not sure about Sam on the ST, but I generally treat him as an unreliable narrator on both the business and technical sides of things. As for Leonard, in a recent interview on YouTube he said pretty straight up that the ST was slower to get started than they'd expected, and followed up by very clearly stating the IBM PC was the main reason: the ST launched just as the PC started getting really big and becoming more of a commodity product. (He didn't say so explicitly, but there was also the issue of PC clones becoming common, and home/boutique-built or upgraded machines appearing from wholesale Turbo XT clone motherboards at lower cost and better performance than anything IBM or pre-built third parties offered, with the possible exception of the Tandy 1000.) That said, he didn't make out that the ST failed to make much of an impact, just that its massive potential for success was curtailed heavily by the success of the IBM-compatible PC market of the time.

He went on to mention the ST took off more quickly in Europe, which is pretty evident and obvious. That's also the market where it competed much more directly with the Amiga, or rather with the Amiga 500 (and probably the 2000 in a lot of business/professional applications, especially given that Commodore Germany pushed that model hard and the ST was doing especially well for business use in Germany). From the records I've come across for both the US and European markets, the ST did significantly better market-share-wise than the Amiga prior to the A500's release in 1987, or even through 1987, and the ST's market presence and software base almost certainly laid the groundwork for the Amiga's success in a way that wouldn't have happened had only the Amiga 1000 been on the market before then. 1988 was when the ST fell behind, likely in part due to supply and pricing issues partially related to the DRAM shortage. I know Atari had to raise prices on the 520ST that year while Commodore was dropping the Amiga's, so they both hit around $400 (or 400 pounds in the UK, where it had been around 300 for the 520STF vs 500 for the A500 at the beginning of the year). They might have been able to set the 1040STF as the baseline standard if not for the DRAM shortage and supply issues, with yield problems on the new 1Mbit DRAMs while production of the older 256kbit DRAMs had been scaled back in anticipation of them, so Atari's use of 256kbit chips in 520/1040 ST models and 1Mbit chips in MEGA ST models were both hurt for different reasons.

The ST had the bonus of being more IBM-compatible-alike than the Amiga or Macintosh, in hardware, OS, and disk format (with CP/M- or DOS-like syntax and a compatible file system), with a very good feature-set compromise between a PC AT or XT and a 512k Mac in most respects, except without a Tandy 1000 or Amiga 2000 style big-box expandable option (let alone standard ISA slots) or cheap MFM/RLL hard drive options without a SCSI bridge adapter. Shame the ST's cartridge slot wasn't like the 130XE's ECI port, with at least basic expansion potential and a direct connection to the 68k bus (also like the Amiga 500's side expansion connector). SCSI drives were nice, but not cheap. OTOH, external 5.25" floppy drives did become available, making portability of files from ST to PC even more flexible at a time when 3.5" drives were less universal on the PC.

From what I've seen and read, Atari was never super keen on promoting the ST as PC-like, or on expanding further in that direction in software support or hardware features, but that really seemed like its strongest point: a flexible utility machine with overlapping software compatibility and portability to both PC and Mac, and with vastly better raw performance and features than anything else in its price range or well above it (with the exception of PC or Apple II style expandability).

 

see the interview:

 

 

Quote:

I've always viewed the Panther as Atari trying to re-enter the console market, with a machine that technically could match or better what Sega were putting out with the MD and Mega CD (so more colours) and Nintendo with the SNES (so better sprite handling abilities), which themselves have been mocked by the likes of Rob Nicholson and Jeff Minter, who actually worked on the development hardware, on projects Leonard Tramiel himself seemed unaware of.
From the point of view of Atari Corp, they'd re-entered the console market (and even the video game publishing market) in 1986, when they formally released the 7800 and 2600 Jr. and actually started up new production rather than just selling off surplus Atari Inc inventory.

It's better to think of Atari Corp as a renamed Tramel Technology Ltd. more than anything else: a separate company that absorbed Atari Inc's liquidated assets. Had Atari Inc actually been sold off wholesale, things would've been drastically different. (Granted, had James Morgan and Atari Inc staff even been aware that liquidation was pending, rather than just a potential sale of the company, they'd certainly have handled things better and much more smoothly than what happened ... the liquidation of Atari Inc was a complete mess on Warner's part, and that confusion did a great deal to keep the new Atari Corp from making the most of potential new hires of former Atari Inc staff or of Atari Inc projects.) Then you had the bigger mess of Atari Inc not owning the 7800: it was a Warner contract with the GCC guys, and that led to more legal issues and negotiations until Tramiel finally just purchased the rights to the hardware (and paid off GCC's contract) outright. Albeit, from what Curt and Marty mentioned years ago, with the pace things were moving at Atari Corp and the resources they had, a full rollout of the 7800 wasn't going to be ready in 1984 or even 1985 as it was, and it took the efforts of Mike Katz throughout 1985 to actually set up the new Atari Corp entertainment division. Katz also noted the very strong sales in the 1985 Christmas season, to the extent that they sold off all existing inventory and would've sold more had they ramped up production capacity.

The 7800 launched at the same time as the NES and was quickly outpaced by its 1986 success, though it maintained a strong second-place market share for much of the 1986-1988 period (I'll have to dig up figures at some point). But with the limited investment Jack Tramiel was willing to put into the entertainment division, it would've been up to third parties to come up with anything really big on the 7800 and make the sorts of investments in enhancement chips (mappers and sound expansion) that the Famicom saw in Japan or the NES in the US (albeit without the sound, due to the change in cartridge pinout), and Nintendo's market lead combined with predatory licensing agreements prevented that. The only reason the Master System had such decent software was Sega's massive internal software development effort, even though they did relatively weakly in the Japanese market and far behind Atari's figures in the US. They did OK in the UK as far as consoles went, but that's inflated by excluding home computers (the ZX Spectrum and C64 being the real standard 8-bit game machines in that market prior to the 16-bit consoles and home computers, where I believe the Mega Drive eventually ended up outselling the ST and Amiga in that segment of the UK market).

Atari's very late entry into the UK market and Nintendo's predatory (eventually ruled illegal) licensing restrictions in the US, for both domestic and foreign publishers trying to release games there, screwed the 7800's potential for growth. It sold well enough that there easily could've been better third-party games with heavier use of bank switching, sound chips (even just super cheap off-the-shelf ones like the SN76489, or slightly less cheap low pin count versions of the AY8910), and expanded RAM. I'd expect Jack Tramiel especially would've preferred they stay out of the video game software development and publishing business entirely and make money from third-party licensing royalties instead, but Atari ended up having to put more money into getting more games out on the 7800 since no one else would, or rely on computer game publishers who weren't locked into agreements with Nintendo.

Even then, the 7800's hardware sales were drying up by 1990 and it was pretty much a legacy platform by 1991 (i.e. an installed user base, but few to no new buyers). It hadn't been on the market in the UK for nearly as long, but it also didn't sell nearly as well there, probably in part from having to compete with the ZX Spectrum (even cheaper, with much cheaper software) and the C64 (not as cheap, but still with much cheaper games, comparable graphics, and better sound). Maybe if the 7800 had launched with a RAM-plus-tape-drive expansion module it would've taken off, but even then it would've needed to be in stock sooner. Wikipedia says it made it to PAL regions in 1987, but I recall a lot of anecdotes and some print articles indicating wider availability in the UK didn't really happen until about 1989 or 1990.
Actually, a cartridge with 32kB of SRAM and a POKEY chip, with an SIO port added for standard Atari 8-bit tape drives, would work nicely ... though a simple bit-banged, software-driven tape interface would be cheaper and could use generic tape drives, the Atari drives were still reasonably inexpensive ... and better for Atari's profits, and had the potential for doing cool things like tape soundtracks synchronized with game code and/or loading. (Not ideal for in-game music, but it would've been really cool for intro/title screens or cutscenes organized linearly on tape.)

In any case, they'd made a big enough market presence in the US during the years Katz was there (1985-1989) that the Panther would've been a continuation of that had it made it to market on time, or had they come out with ANYTHING remotely marketable in 1989-1991 ... even 1992, if it was something better than the Panther as cancelled and cheaper than the Jaguar, while also being a reasonable value at that. (The Jag was neither cheap nor compelling in its 1993 test market period, and it also entered just in time to hit a recession/slump in the market, which in fact was the main driver behind Sega's release of the 32X, to try and fight the slump ... though that's another story in itself.)


They also tried with the XEGS, which I believe Leonard Tramiel said was their attempt to "do the 5200 right," except, from the print ads I've seen, it ended up being priced higher than the 65XE, with $200 figures in some cases when the 65XE was just $95-99 and the 7800 was under $80. That was probably including the light gun and keyboard, but even so, a 65XE had a keyboard built in, and the light gun was a gimmick at best. I'd argue a simple RAM+POKEY expansion cart for the 7800 would've been better (and could've been built into upgraded models for still under $100) ... hell, just use 28 of the 32kB in the expansion unit and re-map 4kB of the SRAM chip to replace the two 2kB SRAMs in the 7800, saving board space and cost, probably fitting onto the same motherboard.

If you couldn't quite fit POKEY, you could save a lot of space by removing the RF modulator and using an Atari monitor port instead, with an optional external modulator/switch-box combo like Sega did with the SMS and MD in Japan and with the MD2 in the US and UK. (Commodore also did this with the VIC-20, though I think that RF modulator was both bulkier and a bit crap.) By the late 1980s composite video was becoming common on tons of TVs, with high-end ones adding S-Video in 1987, and even if your TV lacked it, VCRs of the time included it and could provide the RF modulation to an older TV. So Atari could've offered higher-quality composite video cables and omitted RF modulators in some bundles, saving costs so long as they marketed it clearly enough, and probably kept the RF modulator bundled in regions most likely to be RF-only. Not doing this with the Jaguar in 1994 was far more bizarre: including an RF modulator inside the system was a strange move for that time, wasting board space and manufacturing cost versus a cheaper A/V port and optional external modulator ... hell, I'd argue it was wasteful/overkill to stick an RF modulator on the STe motherboard either, rather than just including composite video plus composite sync for wider compatibility with RGB monitors and SCART-capable TVs in the UK/Europe ... though S-Video would've been nice and relatively cheap to implement with common RGB video encoders of the time; you could also omit a dedicated composite video line in favor of merging chroma and luma in the cable.


Actually, if they could have gotten it out the door by 1990 (better, 1989 in parallel with the STE's release), a console based on the STe hardware, stripped of as much as possible and not ST-compatible (unlike the XEGS, but maybe with a general-purpose expansion port allowing computer expandability), would've been a lot better than nothing, and might have had the significant side effect of getting more developers to release STe-specific games, and with those, more STe-specific graphics/art/music programs. Leonard Tramiel mentioned an ST console would've been too expensive, and an STe one too, and/or not good enough, but I'm not sure that holds if the concept is: take the STE ... remove everything but the GST SHIFTER, GST MCU, 68000, and blitter (which was later integrated into the MCU chip anyway), plus minimal I/O handling logic for the two enhanced joystick ports. (I think those have dedicated hex decoders or something and don't use the keyboard controller at all.)
They might have still needed to include the Motorola MFP chip, though even the Panther dev units had that on board. (You could also repurpose its timers and I/O lines no longer needed for computer stuff, but that would complicate expansion into a full ST and add things for programmers to exploit that couldn't be brought over to the STe.)

The LMC1992 stereo mixing chip could be included if it was cheap enough, but it also doesn't get you much as configured in the STe. You'd really want the two DMA channels mapped to separate left and right stereo channels, using four of the chip's inputs to allow smooth panning of each channel, or even setting both to center/mono at different volume levels ... and even doing 16-bit mono sound by setting one to 1/256 the volume of the other, which that chip does actually support. (Its attenuation is in dB steps and not linear in the datasheet, but at one point I checked, and it's either exactly 256 or close enough to still approximate 16-bit linear sound better than either the Amiga or the Sound Blaster 16 can: the latter was only rated for the upper 14 bits to be valid, at least in early models, i.e. not Vibra 16s and such.)
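The two-DAC trick described above (one 8-bit channel at full volume, the other attenuated to 1/256) is essentially the same one later Amiga software used for 14-bit output. A minimal sketch of how a 16-bit stream would be split, in plain Python with hypothetical helper names:

```python
def split_16bit(samples):
    """Split signed 16-bit samples into 'coarse' and 'fine' 8-bit streams.

    The coarse DAC plays the top 8 bits at full volume; the fine DAC plays
    the bottom 8 bits with its analog output attenuated to 1/256 of the
    coarse channel, so the two signals sum back to ~16-bit resolution.
    """
    coarse, fine = [], []
    for s in samples:
        u = s + 32768          # signed 16-bit -> unsigned 0..65535
        coarse.append(u >> 8)  # top 8 bits -> full-volume DAC
        fine.append(u & 0xFF)  # bottom 8 bits -> DAC attenuated by 1/256
    return coarse, fine

# Reconstruction check: coarse*256 + fine recovers the original value.
c, f = split_16bit([-32768, 0, 1234, 32767])
assert [hi * 256 + lo - 32768 for hi, lo in zip(c, f)] == [-32768, 0, 1234, 32767]
```

The split itself is trivial for the CPU; the precision of the result depends entirely on how exactly the mixer's attenuation step lands on 1/256, as noted above.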

Then you just have to get developers to do software-mixed music and sound effects on those sound channels, or use the simplest solution: multiplex channels by interleaving samples. Two 50 kHz channels become four 25 kHz, six 16.6 kHz, or eight 12.5 kHz channels, and the latter case is closer to what the 8 MHz 68000 could handle inside a game if you're doing a lot of note scaling on the fly and not pre-scaling everything in RAM. Even then, with 512kB you can afford to do a lot of pre-scaling and also keep some samples directly in ROM, i.e. anything at a fixed pitch like percussion. Without scaling, it's just a matter of copying over and interleaving the samples with relatively little CPU overhead, or potentially even using the blitter to help with that; plus there are cases where you can store short looping sections of music in RAM and just mix in sound effects, or stitch short segments of music together into longer, more complex tracks without needing one long stream.
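The interleaving scheme above is simple time-division multiplexing: a rough sketch (plain Python, illustrative only; on real hardware the DAC's reconstruction filter does the effective averaging of the interleaved streams):

```python
def interleave(channels):
    """Multiplex N equal-length sample streams into one output stream.

    Played back at hardware rate R, each virtual channel is effectively
    heard at R/N: e.g. one 50 kHz DMA channel carrying 4 interleaved
    streams yields 12.5 kHz per stream, so two such DMA channels give
    the 8 x 12.5 kHz case described above.
    """
    out = []
    for frame in zip(*channels):   # one sample from each channel per frame
        out.extend(frame)
    return out

mixed = interleave([[1, 2, 3], [10, 20, 30], [100, 200, 300], [7, 8, 9]])
assert mixed == [1, 10, 100, 7, 2, 20, 200, 8, 3, 30, 300, 9]
```

The appeal is that this is a pure copy loop (or blitter job) with no per-sample arithmetic, unlike true software mixing, which needs an add and clamp per sample.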

Then add a boot ROM that includes the copy-protection signature check for cartridge-based software (i.e. the only software on an unexpanded unit) and there you go: a comparatively tiny motherboard with little more than a CPU and Atari's custom chips (less the DMA chip), not using any features the STe didn't also have, but FORCING all of those features to be used.

 

The final question is 512kB or less? With ROM you can make do with less and not lose much, since with hardware scrolling and the blitter you don't need to rely on pre-shifted data as much. 128kB would be the practical minimum (via four 64kx4-bit DRAMs) and would leave one bank unpopulated for an easier upgrade. But if they used SIMMs or SIPPs (same pinout and board mounting positions), then 512kB would be the minimum anyway, via a pair of 256kx8-bit modules. (The upgrade would then be to 1040STE standard, i.e. a "full computer.")

Hell, Atari could've avoided releasing the 520STe at all in favor of a 512kB game console variant that was MUCH cheaper than the Amiga 500 (though considerably more expensive to expand to 1040ST standard, even with a cheap third-party RAM expansion). 256kx8-bit SIMMs became dirt cheap quickly, and got even cheaper when everything else was getting more expensive in 1993/1994 due to a new DRAM shortage; they got so cheap they nearly matched bulk DRAM prices, but at the consumer level. That was because there was a surplus of new and used 256k modules and high demand for the 1MB modules needed for 4MB or more in typical PC motherboards of the time ... 8-SIMM-slot boards could get you 2MB very cheaply, but unless you used SIMM-saver style multi-SIMM adapters, you couldn't get more than that. (The latter were also good for adapting 30-pin 8-bit SIMMs to 72-pin 32-bit SIMMs; my dad actually used those in the first home/family PC he built for me back around 1993 or 1994 ... also a used, paper-white industrial monochrome VGA monitor, so 3D glasses effects in games didn't work, but we eventually upgraded that ... after installing a CD-ROM drive, yay for the black-and-white multimedia PC. Come to think of it, the SIMM savers were probably used in a later upgrade and not the initial 1993/1994 build. I still have that baby AT case with the K6-2/550 build from around 2000, but not that original monitor ... I think I still have the color SVGA monitor that replaced it, though; the B&W one would've been fun to try out with an ST in monochrome mode.)


And yes, the STe was underpowered compared to what Atari should've come out with by that time (and for a computer, not a games machine, a faster CPU should've been the higher priority ... i.e. a vanilla 1040STF with a 10/12/16 MHz 68000 + blitter would've been more appealing, and DMA sound could've been added without a new SHIFTER ... a cheap IDE interface or general-purpose ISA-compatible expansion port would've been way more useful too, or just a bare 68000 bus expansion port to implement the same externally). But given whatever chain of events led to the STe being as it was, they'd brought it to market, and a game console based on similar specs (sans all the computery bits), while obviously not better than an Amiga, would still have been notably more capable at games than the ST, without slaving most of its RAM to pre-shifted graphics and workarounds for software scrolling and faster sprite rendering. (You could still do a lot of the latter, but make much better use of that RAM in doing so.)

512kB ROMs would be common on the lower end even from the start, with 256kB limited to really cheap budget titles and probably ST shovelware ports that originally fit on single 360kB floppies (except they'd need new sound/music design, could fairly easily implement hardware scrolling if not new sprite-blitting routines, and would have to use the enhanced joystick ports and remap any keys to the keypads on the controllers: I'm assuming the Panther/Falcon/Jaguar gamepads here). But 512kB ROM games with 512kB RAM could do some things you'd need a 1040STe to do, albeit without the flexibility of multi-disk floppy games or games that used a full 720kB floppy, especially with compression. (You could also compress into ROM, but you lose the ability to use it as RAM-speed data for the CPU/blitter; albeit streaming compressed data from ROM could be faster than streaming from floppy, and compression on the latter was sometimes used to improve data rate as much as to save space.)

In any case, since the contemporary Amiga 500 wasn't touting 1MB of RAM and was indeed limited to 512kB of Chip RAM, where the ST blitter could easily work in the full 24-bit address space (4MB limited by the MMU alone, but even then much larger than an OCS or ECS Amiga), you'd have cartridge-based games capable of animation and/or higher-quality sound effects, or other features the Amiga couldn't manage, plus RAM to sacrifice for faking dual playfields via animation while maintaining full 16 colors per scanline, vs 7+8 colors in dual playfield mode plus lots of CPU and blitter slowdown on the Amiga. (You could still opt to drop to fewer bitplanes in portions of the background that don't need as much color, to speed things up and use less RAM, and do that along 16-pixel, word-aligned columns to maximize speed.) It's the same method used by PC Engine games that fake added parallax layers, and some Mega Drive games like Sonic 3 do it as well. (Some NES games do it too, and obviously a number of Atari ST games and demos.)

Now, one area where ROM could net you more performance over ST RAM: you could set it up to be on the 68k bus only, so it acts like Fast RAM on the Amiga, except the ST blitter works on the 68k bus already, so it gets the boost as well. Thus any code and data fetched from ROM avoids the 4-cycle alignment limitation, and cheap 250 ns ROM (standard by that time) would get you easy 0ws access with an 8 MHz 68000. You could hypothetically even interleave 68k and blitter cycles in ROM, but this would need more hardware (basically a ROM-specific MMU sort of chip with a bus latch) and isn't in the "as dirt cheap as possible" spirit of repurposing the STe hardware as-is from 1989. Additionally, ROM could even help with 3D games, for the above reasons (heavy use of multi-bit shift operations in accelerated multiply/divide routines doesn't align with the 4-cycle memory period the MMU provides, so true 0ws operation would speed those routines up considerably).
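A quick back-of-the-envelope check of that 0ws claim (the timing figures here are my approximations, not from the post): a 68000 bus cycle is a minimum of 4 clocks, with the address valid for roughly 3 of them before data is latched, so at 8 MHz a 250 ns ROM fits comfortably:

```python
# Rough 68000/ROM timing check (approximate figures, assumed for illustration).
CPU_MHZ = 8.0
clock_ns = 1000.0 / CPU_MHZ          # 125 ns per clock at 8 MHz
bus_cycle_ns = 4 * clock_ns          # minimum 68000 bus cycle: 4 clocks = 500 ns
# Address/AS* are asserted for roughly 3 of those 4 clocks before data is latched:
access_window_ns = 3 * clock_ns      # ~375 ns for the memory to respond
rom_ns = 250                         # cheap mask ROM access time, ca. 1989

assert rom_ns <= access_window_ns    # 250 ns ROM meets 0-wait-state timing
print(f"window {access_window_ns:.0f} ns vs ROM {rom_ns} ns -> 0ws OK")
```

The same arithmetic shows why the MMU-arbitrated DRAM can't do better than 4-cycle-aligned access: the video SHIFTER owns the alternating slots.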

It also breaks the "no DRAM in consoles" decree from Jack Tramiel, but then it has totally different market potential, being expandable to full STe standard and a tool to enhance software support for the STe's feature set. Perhaps especially so in the US market, where the ST itself didn't have the sort of success as a dual-purpose game console and home computer that it did in the UK and Europe, though especially the UK, as far as games went. (I.e. it wasn't as much of a business machine as some of its success in mainland Europe made it out to be ... granted, Jack Tramiel's vision was less of a business machine and more a general-purpose personal/home/business/educational machine.)
 

Albeit if they really wanted to push that no-DRAM angle, you could technically use PSRAM in place of DRAM in the ST, but there would be little point, as it still wouldn't be as cheap unless you could do away with the MMU entirely in favor of a much simpler bus controller chip. (It'd be nice if that also added more programmable screen size/overscan for the SHIFTER ... you could omit refresh logic entirely that way, too, since you could extend the vertical display as far as needed to ensure refresh via pure video cycles alone.) You could thus leave the MMU to an expansion module with all the other hardware ... except I think DMA sound partially relies on the MMU, so you'd need that in the replacement chip too. (If it was significantly cheaper, available at the same time, and/or could include the blitter on-chip from the get-go, that would make sense, but otherwise it really doesn't.)

OTOH, including PSRAM on the ST itself as a 0ws CPU scratchpad "Fast RAM," optionally accessible to the blitter, would be something else, especially if the CPU switched to a faster clock while in that RAM. (Not a cache: a cache is much more complex, needing caching logic and much faster SRAM to actually fill/update and read-through or write-back the cache data, depending on the scheme used; i.e. what the MEGA STe has took a lot more engineering work, where adding PSRAM/SRAM Fast RAM to the ST would be as simple as, or simpler than, DIY RAM upgrades already were ... which makes me wonder why no one did it back then. I mean, the Atari 8-bit got early, proprietary RAM expansions that got software support, so ... actually, you'd just have to write a TOS routine for it and any TOS-based software could be accelerated automatically; games would be another matter, though.)

Actually, you could probably just have the CPU clock speed "turbo mode" gate toggled without using bank switching, by simply mapping the Fast RAM outside the 4MB range the MMU can support (and outside the chipset I/O area). You can't simply key off the upper two address lines going active, though: I just checked the ST memory map, and there's a bunch of I/O and register stuff way up at the top of the 24-bit address space. What you'd want is a specific range above the 4MB MMU limit but within the lower 10MB range already reserved for RAM expansion, and below the MEGA ST VME bus range (though Falcon RAM expansions map into that range too, up to roughly 14MB). So it would be a little more complicated, but it still could be done automatically and fairly simply, without the CPU having to write to a memory-mapped I/O port for bank switching. You also probably wouldn't need to be able to disable Fast RAM, as it would only ever be touched by software making accesses to that specific address range.
ST memory map is here:
http://cd.textfiles.com/ataricompendium/BOOK/PDF/APPENDB.PDF
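A minimal sketch of the address-decode idea above (the window placement here is my assumption for illustration, not a verified ST design; only the "below 4MB = MMU DRAM" part reflects the real map):

```python
# Hypothetical "turbo Fast RAM" decode for an ST-style 24-bit bus.
# Assumed layout: MMU-managed DRAM below 4 MB, a Fast RAM window placed in
# the RAM-expansion area above it and below the MEGA ST VME range
# (the window bounds are illustrative, not from real ST documentation).
MMU_LIMIT  = 0x400000            # 4 MB: top of MMU-controlled DRAM
FASTRAM_LO = 0x400000            # hypothetical Fast RAM window start
FASTRAM_HI = 0x600000            # hypothetical 2 MB Fast RAM window end

def bus_select(addr):
    """Return which device responds, and which CPU clock gets gated in."""
    if addr < MMU_LIMIT:
        return ("mmu_dram", "8MHz")    # normal 4-cycle-aligned MMU access
    if FASTRAM_LO <= addr < FASTRAM_HI:
        return ("fast_ram", "16MHz")   # 0ws scratchpad, turbo clock gated in
    return ("io_or_rom", "8MHz")       # everything else stays on the slow clock

assert bus_select(0x000800) == ("mmu_dram", "8MHz")
assert bus_select(0x480000) == ("fast_ram", "16MHz")
assert bus_select(0xFF8200) == ("io_or_rom", "8MHz")
```

The point of decoding purely on address range is exactly what the post describes: no memory-mapped bank-switch register is needed, so the speed-up is transparent to software that simply allocates into the window.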

 


Even without a faster CPU clock you'd still get a modest speed boost for certain types of code, just much more so when combined with a clock speed increase. (As far as DIY upgrades went ... without replacing the CPU or TOS ROMs, you could probably get away with 10 MHz, but you'd need to synthesize it from the system clock to remain synchronous with the MMU, aligning on every 5 CPU clock ticks, so that circuitry would've been needed in the conversion kit. At some point it would probably become cheaper to have an upgrade that just used 120 ns SRAM or PSRAM, required a 16 MHz 68000 driven off the MMU clock, and required software to bank-switch the CPU into Fast RAM while simultaneously gating over from the 8 MHz clock to the 16 MHz one.)
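The "every 5 CPU clock ticks" figure checks out arithmetically (my derivation, not from the post): the MMU's 4-clock bus slot at 8 MHz is 500 ns, which is exactly 5 ticks of a 10 MHz clock:

```python
# Why a 10 MHz 68000 can stay synchronous with the 8 MHz MMU (illustrative math).
mmu_slot_ns = 4 * (1000 / 8)    # MMU grants the bus in 4-clock slots at 8 MHz = 500 ns
cpu10_tick_ns = 1000 / 10       # one tick of a 10 MHz CPU clock = 100 ns

ticks_per_slot = mmu_slot_ns / cpu10_tick_ns
assert ticks_per_slot == 5.0    # bus slots align on every 5th CPU tick

# A 16 MHz part needs 8 ticks per slot -- also an integer, which is why
# "driven off the MMU clock" works by simple doubling:
assert mmu_slot_ns / (1000 / 16) == 8.0
```

Any CPU frequency where the 500 ns slot is a whole number of CPU ticks keeps the two clock domains lined up without resynchronization hardware.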

Hmm, Atari could've done that in an STe-based console: provide 64kB of 120 ns PSRAM (or SRAM) and a simple mechanism that switched the CPU from 8 to 16 MHz while accessing it. You could then have the blitter in HOG mode working in ROM and RAM while the CPU still does fast routines in its private RAM. Plus you still get the marketing label of 16 MHz, even if the machine ends up realistically weaker than competing 16-bit game consoles in most respects, though probably better at 3D, especially simple filled/shaded polygons. (I wonder if the blitter's halftone mode would allow dithered polygon fills for shading.) For texture-mapped 3D and scaled 2D objects you'd have more trade-offs compared to the Mega Drive's chunky pixels, not that it actually got many Wolfenstein-alike games. (Plus Catacomb 3D and its kin, working in EGA graphics, had to deal with bitplanes anyway AND work within 512 or 640kB of RAM, generally 512kB minimum free, so 640kB with DOS loaded or a boot disk required on a 512kB system; though I think Wolf3D itself had added features with a full 640k of RAM and more with EMS memory available.)

The 16 MHz CPU + small scratchpad local RAM (64kB is a comfortable minimum using 32kx8-bit SRAMs or PSRAMs) could've been introduced as an afterthought on the console, then retroactively added to an STe+ of sorts, or technically ... a 1110STe+, going by Atari's naming scheme of truncated totals of memory in decimal byte counts (1024 + 64 kB = 1088 kB = 1,114,112 bytes). Or you could even drop back to a 512kB DRAM base, especially with the shortage dragging on into 1989 (starting to fall away in 1990, then completely in 1991), but add SRAM or PSRAM and a faster 68000 at relatively low cost, using components that were not in any sort of shortage, for a 580 or 590 ST+ or STe+ or something. (Honestly, during 1988, with the DRAM shortage forcing 520STF prices up to match the Amiga 500's, a fast 68000 + SRAM scratchpad and possibly fast-ROM TOS should've been a quick and simple fix that could appear before the STe itself did in 1989 ... that, and actually populating STF/STFM motherboards with blitters, which should've been even faster/easier than messing with CPU speeds and RAM modifications.)

This also would've been a very inexpensive way to add a 12 or 16 MHz (i.e. bottom-end) 68020 to the ST with a much more meaningful performance gain than just sticking it on the slow MMU bus, where it would see only small gains from working in cache and faster internal execution times. (In this case use 128kB as a minimum for full 32-bit width ... albeit keeping the 68020 in 16-bit bus mode at all times would be simpler ... and would technically still make it more of an ST than a TT, just more T and less S than a 68000 machine ;) )

But for an STe console, a 68020 would be too expensive and would lack the flexibility of the many vendors of 68000s (both CMOS and NMOS), most of which offered 16 MHz versions, which by 1989 were falling into the dirt-cheap embedded-systems market segment where slower 68000 models had already landed.

Quote:

Maybe because they were UK based projects??
Had Atari not just hired one of the lead engineers of that Slipstream project, that might make sense, but the fact that Martin Brennan was brought on to actually implement the chip design of the Panther (the Object Processor portion) should have immediately opened the doors there, all the more so when John Mathieson was brought in too for the Flare II Jaguar project. Mind you, in parallel with that, they were still working on further developments of the base Slipstream hardware for follow-on plans with Konix's Wyn Holloway. The intermediate developments are still lost, but by 1993 there was a CD-ROM based version of the Slipstream using 32-bit-wide DRAM (or optionally 16-bit with reduced video modes), with 16bpp rendering added (not CRY, but 565 RGB pixels), Gouraud shading and texture mapping functions in the blitter (similar to the Jaguar's scaling and rotation function), and a planned 25 MHz clock speed with TV- or VGA-compatible resolution modes. It was still much simpler (and probably cheaper) than the Jaguar's chipset and all inside one ASIC, with a 386SX planned as the CPU running at half the ASIC speed (i.e. 12.5 MHz). No object processor, and a less feature-rich blitter working on a 16-bit bus (albeit texture mapping wouldn't be any slower, as the Jaguar blitter only fetches one texel and writes one pixel at a time, and I think maxes out at 1 pixel per 5 clock cycles even when reading from GPU SRAM, at least based on some tests KSKUNK did years ago; it's also limited to 11 cycles per pixel when reading and writing entirely in DRAM, vs 4 16-bit pixels per 2 clock cycles for Gouraud-shaded fill operations).
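Turning those quoted cycle counts into rough fill rates at the Jaguar's ~26.6 MHz system clock (my arithmetic, using only the per-pixel figures mentioned above):

```python
# Rough Jaguar blitter throughput from the cycle counts quoted above.
CLK = 26_600_000                 # ~26.6 MHz system clock

textured_best = CLK / 5          # 1 texel per 5 cycles (source in GPU SRAM)
textured_dram = CLK / 11         # 1 pixel per 11 cycles, read+write in DRAM
shaded_fill   = CLK * 4 / 2      # 4 16-bit pixels per 2 cycles, Gouraud fill

assert round(textured_best) == 5_320_000      # ~5.3 Mpixels/s
assert round(textured_dram) == 2_418_182      # ~2.4 Mpixels/s
assert round(shaded_fill)   == 53_200_000     # ~53.2 Mpixels/s

# Shaded fills come out ~10-22x faster than texturing, which is why flat
# and Gouraud-shaded polygons were the Jaguar's strong suit.
```

These are peak figures with no bus contention; in practice the object processor and CPU steal cycles, so real throughput is lower still.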

See:
http://www.konixmultisystem.co.uk/index.php?id=downloads

"Slipstream Rev4 Reference Guide v3.3

5th October 1993"

Now it's possible development of the new Slipstream derivative was delayed until after the Jaguar's production silicon came back, but it would also make sense if they were working on that in between contract work with Atari, especially while waiting for test silicon to come back in mid/late 1991 into early 1992 (at which point the first version of TOM came back with bugs, and they reworked it into the version that was released in 1993 with still-prominent, but not as bad, bugs).

So aside from the 1989 production version of the Slipstream, Atari and Flare had plenty of opportunity to consider a more foolproof and simpler alternative (or predecessor) to the Jaguar without using any part of the Panther at all, or the potential to re-use much of the existing Panther Object Processor along with improved versions of the Slipstream DSP and blitter and a 32-bit wide DRAM controller fast enough for the Panther to work with (or at least fast enough to use for max-bandwidth object data, with SRAM for the list), and/or to add page-mode bursts at least for fetching object lists if not data. And if not a DRAM controller, they at least could've used 32-bit wide PSRAM and just the basic refresh logic already in the Slipstream (and even that should only be needed during vblank, so long as video is enabled and the object lists and/or data are organized to cross all pages/rows of PSRAM ... in which case you'd organize it like in the Atari ST, with each consecutive word on a new row/page, so for 256 rows you just need to scan through a line of 256 words every 4 ms, typical of 32kx8-bit PSRAMs and 32kx16-bit ones too; with a linear framebuffer like the Slipstream uses this is simple to achieve, but with a more flexible object processor like the Panther you'd need to pay a bit more attention to how you organized list data and object data to make sure that all works without refresh ... granted it becomes super simple as soon as you use at least one full background framebuffer as one of the objects).
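To make the "refresh by video fetch" organization concrete, here's a tiny sketch (my own illustration; the exact address-to-row mapping is an assumption, not anything from an Atari or Flare document):

```python
# Hypothetical ST-style interleave: the low 8 bits of the word address select
# the PSRAM/DRAM row, so consecutive words land on consecutive rows.
ROWS = 256

def row_of(word_addr):
    # assumed mapping, for illustration only
    return word_addr & (ROWS - 1)

# Any linear fetch of 256 consecutive words touches every row exactly once,
# which is all a standard 4 ms refresh window needs:
touched = {row_of(a) for a in range(0x1234, 0x1234 + ROWS)}
print(len(touched))   # 256
```

With that layout, the video/object scan itself becomes the refresh scan, which is why the poster's scheme works without a dedicated refresh controller.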

The above scheme of refresh via video access and that weird word organization along DRAM rows works fine as long as you don't need to use page mode (or you want static column mode instead, which some PSRAMs supported and some DRAMs did back then, though page mode was much more common: you stay in one row and work through different column addresses in the DRAM array). So they could've implemented a system in PSRAM while fully intending to switch to DRAM when possible (and when production volumes merited such a shift). Had anyone involved looked at the specs for the few examples of available 64kx16-bit DRAMs (i.e. all the ones I can find documentation or catalog listings for, with 1991 being the earliest) and also known the DRAM timing used by the ST and STe MMU, they'd have realized the timing parameters were just about ideal for 80 ns DRAM when using a 32 MHz MMU clock rate to achieve 125 ns read/write cycle times.
Shiraz had left Atari by that point, but he was hardly the only one who knew the ST DRAM timing, even if the Flare guys never looked into it. Or for that matter, the timing used in the Amiga, which is identical in terms of the critical RAS (Row Address Strobe) and RP (precharge) times, except that the Agnus chip in the Amiga used a 28.636/28.375 MHz clock for ~280 ns cycle times while the ST MMU uses a 2-phase 16 MHz clock (working on both the high and low cycles for an effective 32 MHz inside the MMU) for ~250 ns cycle times. (The Amiga might actually do something fancier, since it also has a second clock phase shifted 90 degrees off the main one and might achieve ~104.8 ns RP and 157.1 ns RAS, the latter being effectively 4.5 clock ticks, by using one clock phase to start the pulse and the 90-degree-shifted one to end it ... doing it this way would optionally get you extended precharge time and slightly more flexible address and data latching time between the various DMA channels and the CPU, even if the actual access slots are all effectively 280/560 ns, and as far as the CPU sees things, bus cycles are rounded to multiples of 4 CPU clock ticks just like on the ST.)
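For what it's worth, the cycle times quoted above fall straight out of the clock arithmetic; a quick sanity check (my own back-of-envelope numbers, `cycle_ns` is just a made-up helper name):

```python
def cycle_ns(clock_mhz, ticks):
    # duration of `ticks` clock ticks at the given frequency, in nanoseconds
    return ticks / clock_mhz * 1000.0

print(round(cycle_ns(32.0, 4), 1))        # ST MMU: 4 ticks at 32 MHz   -> 125.0 ns
print(round(cycle_ns(16.0, 4), 1))        # ST bus: 4 ticks at 16 MHz   -> 250.0 ns
print(round(cycle_ns(28.63636, 8), 1))    # Amiga: 8 ticks at ~28.6 MHz -> 279.4 ns
print(round(cycle_ns(28.63636, 4.5), 1))  # the ~157 ns RAS figure      -> 157.1 ns
```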

I'm not actually sure what DRAM timing was used on the Slipstream, but it might've been something funky making use of the 33% duty cycle clock native to the 8086 (and why they used 1/3 of the clock crystal rate like XT motherboards do; in this case 17.897725 MHz NTSC or 17.7345 MHz PAL). DRAM cycle times are 3 DSP/blitter clocks (or 251~254 ns). A full 11.9318 MHz clock pulse is 83.81 ns, which could be used as a precharge period directly with 167.6 ns RAS, but that's quite a mismatch for available DRAM specs and would cut precharge short for most things slower than 100 ns (and even many 100 ns DRAMs, especially NMOS ones). OTOH if they were using a 33/67% duty cycle split on the clock and used both phases for DRAM control, they could effectively use 1.333 ticks for RP and 1.667 for RAS, which is 111.75 ns RP and 139.68 ns RAS, both very nice, conservative figures for bog standard 120 ns DRAM that would probably also work fine for most 150 ns DRAM out there. (It might even be more compatible than the ST's timing, since RP is extended where the ST cuts it to ~93 ns and instead extends RAS to ~156 ns; cutting RAS short is often safer, but wasn't an option on the ST with the clock rate used for the MMU ... though a hypothetical 16 MHz 33% duty cycle 2-phase clock actually would've allowed a nicer 145.8 ns RAS and 104.2 ns RP.)

Aside from that, there's the more common DRAM timing the Jaguar used (with equal RAS and precharge times rather than the Amiga's 5:3 ratio), which for the needs of the Panther would also work fine with the much more widely available and typical 100 and 80 ns DRAMs, with RP at 55~70 ns for 80 ns parts and RP at 80~90 ns for 100 ns parts (usually 90 ns for NMOS). For typical cases you can just round that to 80+80 ns RAS+RP and 100+100 ns, or close to it, and be fine (say using a 40 MHz DRAM controller clock rate and 75+75 ns for 80 ns rated DRAM with 150 ns cycle time, which is also typical of 80 ns DRAM, and which would probably be typical of 1ws, or at least 2-tick access/3-tick cycle time, 386SX-20 motherboards, or 0ws for a 68020 since those use 2-tick access and 3-tick bus cycle times vs 2-tick access/cycle for the 286, 386, and 68030; the 68000/010 is 2-tick access, 4-tick cycle, and the 8088/8086/186/V20/V30 are all 3-tick access, 4-tick cycle).

However, 70 ns parts, with their 50 ns precharge requirement, 70 ns RAS, and 130 ns rated cycle times, come very close to the ~125 ns needed for the 32 MHz Panther, though you'd be further out of spec if you just used 60 ns RAS and 60 ns RP. You could use ST/Amiga style timing at 32 MHz for ~78 ns RAS and ~47 ns RP (which people have successfully done for 200% overclock mods on STs, though not STes).
Or you could get nice figures if you used a DRAM controller at 48 MHz with a 2-phase clock, or at 24 MHz with a phase shifted 45 degrees rather than 90. I.e. at 24 MHz you could use 1.75 clock ticks for RAS and 1.25 ticks for RP, for effectively ~72.9 ns RAS and ~52.1 ns RP. This doesn't help with ideal page-mode cycle times, but for a direct conversion of the Panther or the 1989 Slipstream (with simple SRAM or PSRAM cycle times and no burst reads) that's all you want.
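The tick-fraction splits above reduce to simple arithmetic; a helper like this (my own sketch, `split_ns` is a made-up name) reproduces the figures:

```python
def split_ns(clock_mhz, ras_ticks, rp_ticks):
    # convert fractional clock ticks into (RAS, RP) times in nanoseconds,
    # assuming a 2-phase or phase-shifted clock makes the fractions possible
    t = 1000.0 / clock_mhz
    return round(ras_ticks * t, 1), round(rp_ticks * t, 1)

print(split_ns(24.0, 1.75, 1.25))  # 45-degree phase at 24 MHz -> (72.9, 52.1)
print(split_ns(48.0, 3.5, 2.5))    # same timings via a 48 MHz 2-phase clock
```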

[quote]Sam's memo saying, just say technical issues, is typical Tramiel playing the press, they loved to be in the spotlight, annoucing new hardware, that'd never arrive or arrive late and didn't deliver half of what was promised, they'd always have a throw away excuse ready to explain why it wasn't now appearing.[/quote]
Which is why I'm not sure what to make of that without further context, like asking Leonard Tramiel himself about it. An internal memo is still different from public statements, but at the same time that wording really makes it unclear how to take it, or whether the "anyone" who might ask refers to developers, Atari staff (including Atari UK/Europe), the general public, or all of the above. It does at least fit Leonard's story about the hardware not actually working, but it also obviously worked well enough that at least a handful of dev units went out.

[quote]With games known to be in development when Panther was canned or at the very least, approved for conversion, more colorful versions of Shadow Of The Beast, Tiertex Strider 2,an unknown RPG, Jeff Minter's take on Star Raiders.. 


That's not going to cut any mustard next to the libraries and more importantly, third party support, Sega and Nintendo could call apon for their platforms. 

[/quote]

Yes, the initial lineup wasn't that impressive, but relative to the lackluster (and few) early Jaguar releases, it's far less disappointing. Bone stock Amiga Shadow of the Beast would already be a good example of what average use of the Panther's hardware and color capabilities could look like; just throw in some Sega System 16 arcade Altered Beast style zooming sprites into the foreground when enemies are killed (instead of just falling off screen as on the Amiga, or poofing into smoke in some ports) and you've also shown off at least the basics of the sprite scaling function.

Without broader 3rd party interest, it probably would've had a niche as a budget console with some unique games and a lot of cheap shovelware, but probably a larger library than the 7800 ended up with. It at least should've had a better chance at 3rd party support than the 7800 did, due to the antitrust cases against Nintendo establishing a precedent against certain types of predatory, anti-competitive licensing. (I.e. if companies wanted exclusive developers, they basically had to buy them up outright or continually pay for exclusive rights to given games and publish them themselves ... and even then they'd have to think that's in their best interest for profits, where sometimes a multi-platform business model is best even if you're a console maker: the profits are from software sales, and if you can sell on the competition's platform and make more money, then by all means do so ... or for that matter make games for multiple of your own platforms, like Microsoft should have but idiotically failed to do with Windows game publishing after the Xbox released, even delaying the PC release of Halo among a few other games just to bolster Xbox interest: having a console should've expanded their publication base, not diminished it ... they were doing amazingly well for both PC and Mac OS applications at the time and into the early 2000s, then they ended up screwing that up too by doubling down on the Windows OS market when they could've been moving towards making Windows freeware, albeit not open source, with just a pay-to-play IT support market, and focusing on where their real profits should've been: application software. I mean, they obviously did well enough in spite of that, but they probably could've done as well or better without being as ... unpleasant in their business practices as they continued to be; or, you know, still slimy, but at least more consistently competent in ways that don't simultaneously hurt end users, the industry at large, and their own bottom line; it's generally better when you can competently and fairly out-compete and dominate the market.)

Now, something like the first stage (and all the on-rails stages) in Soul Star on the Sega CD would be much more like what you'd get from a more seriously optimized Panther game, except likely with slightly nicer colors. (The Mega Drive gives you four 15-color palettes to work with, but for drawing large scaled bitmaps they all have to be on the same tilemap or sprite layer and use one set of 15 colors to avoid artifacting; Soul Star puts extra work in to draw some things onto clusters of sprites and some to a background layer to get more colors, but it's limited there, and the more you use different layers, the more VDP DMA time you eat up re-loading VRAM with the rendered animation: copying to VRAM is usually the bottleneck for those sorts of games on the Mega CD, and you also have to halt rendering during the copy process, which slows things down further.)

So take that, but do it at a solid 60 or maybe 30 FPS instead of 20 FPS if you're lucky on the MCD for the actual scaling effects. And even with the tiny 32 kB SRAM limit, with heavy use of RLE formatted data and of 3 color objects (or a 4 color background) where useful, you'd save a lot of space in ROM, just not as much as if you could compress it even further and then load it to SRAM. Realtime decompression in software and streaming to small regions of SRAM would also work, but it eats up CPU time that would already be limited for a game maxing out the Panther during active screen time. (A game that manages to use close to 100% of bandwidth in a 200 line display leaves less than 24%, or under 4 MHz worth, of that 16 MHz 68000's time available ... though granted, for games that avoid sprite flicker and tearing, you'll only hit max usage on a few lines with the max number of sprites, so something more like 8 MHz effective speed wouldn't be out of the question.)
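The CPU-time estimate above can be sketched as a one-liner (my own simplification, assuming the 68000 and the object processor share one bus and that bandwidth used by object fetches is bandwidth lost to the CPU):

```python
CPU_MHZ = 16.0  # Panther's 68000 clock

def effective_cpu_mhz(op_bus_share):
    # op_bus_share: fraction of bus bandwidth eaten by object/list fetches
    return CPU_MHZ * (1.0 - op_bus_share)

print(round(effective_cpu_mhz(0.76), 2))  # ~76% bus usage -> 3.84 MHz left
print(effective_cpu_mhz(0.50))            # lighter scenes -> 8.0 MHz effective
```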

Also, I'd suggest the 64kB that was considered (vs the 32kB used) would help a lot, but really, the next realistic step up would be 128kB. 64kB would use 8 SRAM chips rather than 4, but even aside from the price of the chips alone, there's 2x the board space used, more traces to run, etc. In terms of manufacturing cost for the SRAM suppliers, consider this: packaging costs are a big chunk of the total, and pin count is a big part of that. 32kx8 and 8kx8 chips both use the exact same 28 pin DIP (or skinny-DIP) packaging, as do 8kx8 and 32kx8 PSRAMs, and on top of that you had 32kB SRAM densities falling into the best-value, highest-volume class somewhere in the late 1980s, probably by 1988 (when it was cheaper for Epyx/Atari to use a 32kB SRAM chip in 7800 Summer Games and Winter Games, where they just needed 16kB, than to use two 8kB chips that would've required a special longer PCB and cartridge size).
Plus as a "last minute" change to the Panther, they would have only needed to add a couple bodge wires or make a minor motherboard modification since 32kx8-bit SRAMs use compatible pinouts with 8kx8-bit ones.

Hell, for that matter, they could have switched to 32kx8-bit PSRAMs and handled the required refresh periods entirely in software via selective code and data location; in vblank you could just have a 1kB sound mixing buffer that crosses all 256 SRAM pages and gets read/copied over to sound RAM at prescribed intervals. Or, if vblank is no more than 63 scanlines long, you wouldn't need any refresh there at all provided all 256 pages get refreshed every active line, so a 200 line NTSC screen is fine, but PAL would need to be extended to 250 lines minimum, even if that just means having blank lines doing dummy all-black/border-color object list reads for minimum refresh, i.e. still leaving most time for the CPU.
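The "63 scanlines" figure comes from the refresh spec vs the NTSC line rate; roughly (my own arithmetic, assuming a standard 4 ms refresh window):

```python
NTSC_LINE_US = 63.556     # one NTSC scanline in microseconds
REFRESH_LIMIT_MS = 4.0    # typical PSRAM/DRAM spec: touch every row within 4 ms

# longest stretch of scanlines a row can go untouched and still meet spec:
max_idle_lines = REFRESH_LIMIT_MS * 1000 / NTSC_LINE_US
print(int(max_idle_lines))  # 62 -- hence vblank must stay under ~63 lines
```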

Also, in Soul Star, the free-roaming sections with rotating floor textures would be more difficult to do on the Panther and not supported in hardware. But the games made by Core Design and Clockwork Tortoise on the Mega CD are generally good examples of the sort of color usage the Panther could do without heavy use of Amiga style palette swaps. (just again, you'd have nicer 18-bit RGB to work with)




 


23 hours ago, Ricardo Cividanes da Silva said:

For example, have the 7800 console practically ready and choose to launch the 5200 to try to win one more.

This kind of decision really impresses today.

The 7800 was developed outside of Atari by GCC, and I don't believe Atari commissioned it. From what I understand, GCC took the 7800 to Atari after they had already launched the 5200, so I'm not sure Atari even knew of the 7800 when they launched the 5200.

Edited by zzip

I'll try to go back and reply to some other posts I missed earlier, but I just thought of something else regarding Atari Corp working with UK contractors.



Atari Corp was outsourcing to the UK from before the ST itself was completed, while they were still working on the OS, launch software, and built-in OS utilities. Specifically, they outsourced to (and likely collaborated with) Metacomco, based in Bristol. In particular, Metacomco provided the initial version of ST BASIC used on the ST. The same company designed AmigaDOS, derived from the TRIPOS system they'd acquired in 1984, and they were very much involved with Sinclair Research on various projects for the ZX-81, Spectrum, and QL. (Wikipedia only mentions the QL, but my dad, Alan Hamilton, personally worked on software for the 8-bit Sinclair computers at Metacomco back then.) The Flare team probably bumped shoulders with Metacomco staff at some point, but I can't comment on anecdotes with them. I do know the Tramiels personally visited them on more than one occasion, and even used some Metacomco staff (including Dad) as extras sitting/working at ST systems in one of the early UK TV promotions (not one I've been able to find a recording of). He remembered meeting Jack, Leonard, and Gary, but couldn't remember anything about Sam, or at least not by name.

Metacomco had an office in Pacific Grove, CA (near Monterey, where Dad lived at the time); he got a job with them at some point in 1984, and ended up moving to Bristol to work with them full time by early 1985.

I don't think Dad was especially involved with ST software while there, in terms of actual programming, but had some overlapping involvement in PR stuff and was kind of a utility player as far as overlapping software engineering and technician skills (he was a software engineer by trade, but ended up doing lots of technician work and hardware tweaking, at least at high level stuff: ie actual assembly or configuration of chips on the board or boards assembled into a system case, both at home and unofficial stuff at work ... catching mistakes some of the hardware engineers made: I remember one particular story where pre-production hardware had severe overheating problems, and he went to look at the prototype systems and realized the board layout and ventilation had the airflow going in the opposite direction over the board vs the production model ... which on top of not working in the production configuration also implied the margins for cooling were rather poor)

I think Dad was mostly involved with the Z80 based stuff early on, as far as actual coding he did, and ended up doing some work with their LISP compiler or something related later on; I'm not sure on the specifics. He was involved with their work on the Amiga, not directly in programming the OS, but definitely with testing, quality assurance, and tech support, and he represented Metacomco at several conventions where Commodore was demonstrating the Amiga. (He had one story where there was an Amiga demo set up and Dad suggested installing various programs to a hard drive for better performance, and the Commodore guys got upset ... or if it wasn't Commodore, it was some software publishers, apparently upset over giving people ideas on how to pirate software or something, or maybe even just an issue over promoting hard drive support, but it was rather silly overall in any case, given this was all standard procedure on XT class PCs of the time.) He was probably involved in some of their IBM PC compatible stuff as well.

Again, I'm not sure on the specifics, and Dad passed away back in 2019 following a bunch of health complications, so I'm going on what we talked about in previous years. He was at Metacomco into the late 1980s, but I'm not sure exactly when he left. I know he visited Bristol again in 1990 along with Mom, but I think that was a vacation where he met up with friends, and he may not have been with Metacomco anymore by then. (He was at Mizar/Integrated Solutions for most of the early 90s, doing a lot of stuff with VME bus minicomputer sized rack servers or mainframes or something like that, 68020 and 030 based machines, one of which we had set up in our garage at one point ... we had some of the old VME bus cards for such a system until they went to ewaste in the early 2000s, sadly ... though I still have one of the 4MB RAM boards, and he did consulting work with Pencom after that, at least I think in that order.) At some point among all that he was working with plenty of x86 based systems, especially 32-bit stuff. (He also commented that 32-bit code on a fast 386 based system ran about the same as it did on the 486 workstations they had at work, and people at the company were surprised when his home-built PC was running their code as fast as the vastly more expensive workstations; the performance difference was much more obvious for 486 systems running legacy 16-bit code or IA-32 code that made heavy use of 16-bit operations.)

I know he went from a TRS-80 Model I, to a Model 16 (or a Model II upgraded to Model 16 configuration with 6 MHz 68000 running Xenix) that he used for home business stuff for years (I've still got copies of Turbo Tax on 8 inch floppies ... still have that Model 16 as well), and then home-built PC compatibles (outside of company-provided proprietary machines he had at home for a while). We never had Atari or Commodore computers at home when I was a kid. So I actually grew up with Nintendo and PC games mostly, though Mom had a VCS boxed up in a closet (missing power supply and RF adapter) that I got hooked up around 1999 or 2000, but that's another story.

Dad definitely remembered the Jaguar, though I don't remember seeing ads for it (I was certainly old enough to remember in '93-95, just never saw it on TV or on display at Fry's or such when Dad used to take me there). He mentioned something about that way back when I first got into retro atari stuff in the early 2000s, I think something about it being neat/interesting but unfortunate, or a mess, or something like that as far as it turned out on the market. We never owned one, but I think he tried one out at some point, or had a friend who had one. He was also working in Sunnyvale somewhere near Atari HQ around the time it was being developed (not sure if that was Mizar or something to do with Pencom, probably Mizar). Weirdstuff Warehouse was a few streets over from Atari HQ, too, and we used to visit there. (I know we visited some other wholesale/used electronics warehouses in the area more frequently than Weirdstuff, but don't remember the names ... other than a guy named Jay operating the one we went to most often)

On 7/5/2023 at 11:48 AM, Ricardo Cividanes da Silva said:

I found Panther's development, very strange. Why create a new graphic chipset when they could have used existing PCs or even created by Flare?
As far as I understand in the research, Atari decides to create something new and, possibly cheaper and, as regards Flare, decided to send everything to the Jaguar project, leaving the Panther in the background.

In addition to what Crazyace already said about cost, there's more specifics to consider with PCs.
By 1989, there were a number of highly integrated PC compatible chipsets, including some very low cost ones specific to XT-compatibles, single-chip ones with minimal features in most cases (though there's one oddball Citygate branded chipset that appears to actually use a 4-bit wide DRAM bus to feed an 8-bit latch for a 10 MHz 8088, probably using page mode to do 2 quick 4-bit accesses before the CPU needs the data, with the added side effect of more flexible memory size options, as DRAMs can be installed as 4 sets of 1-bit wide chips or single 4-bit wide chips up to 640kB ... or I think 640k, as I don't think it has UMB support).

The more relevant chipsets would be the single-chip 286/AT class ASICs also getting common by then and continuing into the early 90s (some transitioning use as 386SX chipsets, though many of those are 386SX specific and actually lack some features of the 286 ones, like hardware EMS bank-switching). AMD even made a single-chip 286 core inside an AT compatible ASIC that was available around 1990 (I'll have to check the catalogs to be sure).

But the problem here is, even if you can get a good deal on those chips as they're falling out of mainstream demand for desktop PCs (AMD 386s rapidly drove down prices/costs of 286 systems into the low end after 1991 and Cyrix's 486SLC and DLC in 1992), there's no graphics chipset on those, and on top of that, a lot of extra chip space going to 100% PC compatibility with unnecessary cost going to that.

However, the other option would be looking to turn a PC-oriented mass-market graphics chipset into a game console. You have various generic VGA chipsets for that, as well as some blitter-equipped hardware accelerators of the period, the most common and cheapest available I can think of being ATi's Mach 8 series.
See: https://en.wikipedia.org/wiki/ATI_Mach_series#Mach_8

Except that's still going to be too expensive or too late to be relevant here. And if Atari could get a good deal on them, it would've been much better introduced into their computer line. It's an IBM 8514 clone with 640x480 256 color support (ie beyond VGA and into the SVGA umbrella of features).
Granted, for TV resolutions, you only need a fraction of the clock rate for the video output and video DACs, so there could've been some potential for something lower cost earlier on using failed yield parts at much reduced clock speeds that still work perfectly fine for 320x200 or 320x240 in 256 colors at TV sync rates. (but then you need Atari to work towards that sort of partnership and decide to go that way rather than in-house designs; the latter always having an advantage provided you can build enough to have economies of scale take over)

More basic VGA chips are also overkill resolution-wise and could have similar underclocking potential for TV-res-only functionality, and there were a decent handful of companies with single-chip VGA chipsets around 1990, but VGA also has a very limited feature set for acceleration (namely hardware scrolling, some copy and line-fill functions, I think, and mask registers to help with blits) and a weird non-linear pixel organization in its 8-bit chunky pixel mode (256 colors with VGA features ... not the simple 320x200 mode 13h that works like MCGA: a linear framebuffer with no hardware acceleration, double buffering in RAM, scrolling, or anything).


And besides that, any sort of partnering with PC graphics chip manufacturers of the time would more likely have led to developing a game-console-specific chip, even if it was derived from one of their existing parts. But Atari doesn't seem to have had interest in partnering in that manner, or may have just had their hands full with the partnerships they were already attempting. (They had some sort of deal with Ensoniq for sound chips and some sort of collaboration with Inmos, both over their Transputer and over the Inmos RAMDAC used with the VGA chipsets in the Atari PC line. They were really excited about the potential of the Transputer at one point; their chief engineer Shiraz Shivji said as much in an informal interview at a trade show in 1987 or '88. But the Transputer itself ended up largely failing on the market and failing to manage the yields and low costs that Inmos or Atari had hoped for, albeit I'm not sure Atari ever considered the bottom-end 16-bit models of the Transputer for more embedded applications, as a coprocessor on the ST, or as part of a game console, compared to what they ended up doing with the Transputer Workstation they did release in small numbers, later than planned, and at a price I suspect was much higher than they'd originally envisioned.)

As for the VGA chip in the Atari PC 4, see below:

http://www.ataripc.net/pc4-286/
It's a Paradise PVGA1A-JK
http://www.vgamuseum.info/index.php/cpu/item/479-paradise-systems-pvga1a-jk
Released in 1988, it's a single-chip implementation of VGA (or 2-chip if you include the RAMDAC), DRAM based, with 256 kB to 1 MB supported; it probably has some extended VGA modes to make use of the extra RAM, but at the very least it's fully VGA compatible. Now, this most likely was just an off-the-shelf part Atari got a good/low bid on in quantity, and they probably didn't have any extended or special relationship with Paradise. If they had gone the latter route, they would've been wise to plan to use related hardware in both upgraded STs and game consoles rather than continuing to focus entirely on in-house custom hardware for the STe, TT, Falcon, Panther, Jaguar, etc. (albeit I'd argue the Jaguar chipset could've easily been partnered with a VGA chipset, similar to the way 3DFX Voodoo 3D accelerators were later used as companions to a standard VGA-compatible 2D accelerator).

In hindsight, Atari probably would've done better had they just used that same VGA chipset in an upgraded ST (or Mega ST) model in 1988 as it's better than what the TT video is capable of and came 2 years earlier (though that too was delayed), and much better than what the STe video is capable of. And VGA is programmable, so aside from VGA monitor resolutions, it could easily be programmed to use TV sync rates for low resolution stuff. You'd also still need the ST SHIFTER for backwards compatibility, but that's a fairly simple and low cost part that also could've been integrated into one of the other ST chips later on. (the original ST shifter, not the upgraded STe SHIFTER)

All that said, if Atari had really focused on engineering better in-house graphics chipsets earlier on, that should've still been much more competitive than outsourcing. And if or when they worked on a new game console graphics chip, it should have been something equally usable in their computers, whether it was a backwards compatible upgrade of the ST SHIFTER or a separate chip intended to be used in addition to the SHIFTER. For whatever reason, that didn't work out, but I will say the general approach they took with the Panther wasn't wrong, just late and overcomplex. (Compared to looking at the MARIA chip in the 7800 and realizing the object list + line buffer combination could work very well for a greatly enhanced replacement for the SHIFTER, using the ST bitplane format but supporting more than just the 1, 2, and 4 bitplane modes of the ST.)

Also, given the Game SHIFTER of the Panther (the line buffer and palette chip) is dated 1989 and was designed well before the object processor was complete, it's possible they even had alternate design intents for that chip as well.
While I haven't seen anything indicating that GAME SHIFTER chip was intended for use in the ST family, it's possible the Panther itself had originally been intended to be more like the ST or Amiga and use bitplanes rather than packed pixels.
The 5-bit wide line buffer and 32 color limit would make plenty of sense if bitplanes were used, and you could then use 1, 2, 3, 4, or 5 bits per pixel for 1, 3, 7, 15, or 31 color sprites (+ one color reserved for transparency).
This is actually much more flexible than what the Panther actually does and would have been fairly easy to design to also be backwards compatible with ST SHIFTER modes (more work for the STe and TT modes, though, but STe compatibility would also include DMA sound support and a decent cheap sound option compared to the Ensoniq sound chip).

The Master System, NES, PC Engine, and SNES (except for Mode 7) all also used bitplanes. The Mega Drive and Lynx use packed pixels.
The Atari 8-bit computers are also packed pixel format, as is the 7800 (mostly, though some pixel modes are weird, none are true bitplanes and at least some are standard linear packed pixels). CGA uses packed pixels, Tandy graphics modes use packed pixels, EGA uses bitplanes, VGA uses bitplanes for 16 color modes and chunky pixels for 256 color modes.

The Panther itself used packed pixels, AKA chunky pixels, but the 32 color mode uses 8 bits per pixel with 3 bits wasted. Likewise, the run-length object mode uses 2 bytes, with only 5 of the first 8 bits used for 32 colors and 8 bits for a 1 to 256 pixel run length. (There was also no compact run-length object using just 1 byte, i.e. 4 bits for color and 4 bits for run length, giving 15 colors and 1 to 16 pixel runs, which probably would've been very useful; a 5-bit color + 3-bit run split for 1 to 8 pixel runs would be interesting but logically more complex than just cutting a byte in half.)
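To make the hypothetical 1-byte run-length format concrete, here's a minimal sketch of the 4+4 split described above. This is purely an illustration of the proposed format, not anything the actual Panther hardware implemented:

```python
# Hypothetical compact run-length object: high nibble = color index
# (1-15, with 0 reserved for transparency), low nibble = run length - 1
# (so 1 to 16 pixels). Illustrative only, not real Panther hardware.

def encode_run(color, length):
    """Pack one run into a single byte."""
    assert 0 <= color <= 15 and 1 <= length <= 16
    return (color << 4) | (length - 1)

def decode_run(byte):
    """Unpack one run byte back into (color, length)."""
    return byte >> 4, (byte & 0x0F) + 1

# A 16-pixel run of color 5 fits in one byte instead of two:
b = encode_run(5, 16)
assert b == 0x5F
assert decode_run(b) == (5, 16)
```

Half the bytes of the real 2-byte format for short runs, at the cost of fewer colors and shorter maximum runs.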

Packed pixels are much easier to work with in general and usually easier to design display hardware around, but bitplanes make more efficient use of memory at variable bit depths and allow fast manipulation of one bitplane at a time. So if you want to draw monochrome 1-bit characters and use the same drawing routine at all color depths, bitplanes work great. (I assume this is why they went with them on the ST: the ST SHIFTER doesn't allow the really flexible and efficient memory usage the Amiga gets with display lists, but the ST can use the same character set and 1-bit monochrome graphics in all of its resolution modes, equally fast in each.)
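The "same monochrome routine at every depth" property is easy to show in a few lines. This is a generic planar-graphics sketch (names and the 320-pixel-wide layout are my own, not ST code):

```python
# With planar graphics, a 1-bit glyph is drawn by OR-ing its row bytes
# into one selected plane; the routine is identical whether the screen
# has 1 plane (monochrome) or 4 planes (16 colors).

def draw_glyph_planar(planes, plane_index, x_byte, y, glyph_rows):
    """planes: list of bytearrays, one per bitplane, 40 bytes (320 px) wide.
    Draws an 8-pixel-wide glyph at byte column x_byte, row y."""
    width_bytes = 40
    for row, bits in enumerate(glyph_rows):
        planes[plane_index][(y + row) * width_bytes + x_byte] |= bits

# The exact same call works on a 1-plane or a 4-plane screen:
mono = [bytearray(40 * 200)]
color16 = [bytearray(40 * 200) for _ in range(4)]
glyph = [0x18, 0x3C, 0x66, 0x66, 0x7E, 0x66, 0x66, 0x00]  # an 'A' shape
draw_glyph_planar(mono, 0, 5, 10, glyph)
draw_glyph_planar(color16, 0, 5, 10, glyph)
```

With packed pixels, by contrast, the character routine has to expand each glyph bit to 1, 2, 4, or 8 stored bits depending on the mode, so you need a different inner loop per depth.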

Hardware scaling in particular is a pain to do with bitplanes (as is any sort of texture-mapped or smooth-shaded 3D; less bad for flat shading, but still slower), so at the point they decided scaling was a key feature in the Panther, they likely abandoned bitplane graphics rather than trying to use both formats (like the SNES does ... its Mode 7 feature is more or less a separate graphics processor entirely). This makes sense, but at the same time it's unfortunate in terms of ST development potential, as a simpler 16-bit extension of the MARIA display list instructions, using the ST bitplane format but with 1 to 5 bitplanes, would've been better than the Amiga in most respects, and better still if they did things more like MARIA and let the line buffers be reconfigured to different bit depths natively. (MARIA uses 320x3-bit or 160x5-bit line buffers, probably 160x6-bit internally for simplicity, but only 25 colors are supported per scanline, so only 5 bits are actually needed ... this also means the Panther's line buffers are not even 2x as big as MARIA's from 1984, which is a bit sad, whereas the Jaguar's are absolutely massive by comparison at 720x16 or 360x32 bits.)
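The buffer-size comparisons above are just bits-per-line arithmetic; a quick sanity check of the configurations mentioned:

```python
# Per-line-buffer sizes, in bits, for the chips compared above.
maria   = 320 * 3     # 960 bits; same storage reconfigurable as 160 x 6-bit
panther = 320 * 5     # 1600 bits per GAME SHIFTER buffer
jaguar  = 720 * 16    # 11520 bits; same storage as 360 x 32-bit

assert maria == 160 * 6           # one buffer, two configurations
assert panther < 2 * maria        # "not even 2x as big as MARIA's"
assert jaguar == 360 * 32         # "absolutely massive by comparison"
```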

They really should have built MARIA-like features into the STe SHIFTER, or used a 2-chip solution like the Panther, with that line buffer "GAME SHIFTER" chip being fed by a display list processor based on MARIA, at least conceptually. (MARIA is already more flexible and powerful than the Amiga's own graphics, at least in concept, but lacks the bandwidth and resolution support to actually do what the Amiga graphics can ... incidentally, it has some things in common with Atari Inc's cancelled 16-bit computer projects, the Sierra/Rainbow and their Silver and Gold chips, even though it was developed entirely independently of Atari by the GCC guys.) The Tramiels did decide to keep the 7800 and bought the rights to MARIA, so it would've made perfect sense to study and consider its potential beyond the 7800 itself, both as the MARIA chip proper and as a design concept for something new.

 


As for GCC and the Sierra projects see:
https://en.wikipedia.org/wiki/General_Computer_Corporation
https://en.wikipedia.org/wiki/Atari_Sierra#Description


This is opposed to the advanced (and likely expensive) workstation-class 68000-based Sierra project. (The transition was also a horrible mess, helped none by Warner, so it's questionable whether Tramiel's engineers even got a proper look at most of that stuff before documentation and staff started disappearing. Then there was yet more work, especially on advanced derivatives of the 8-bit computers, some of it done not at Atari's Sunnyvale branch in CA but in their New York division.) Given the Amiga chipset was seen as the sensible, low-cost alternative to those in-house Atari projects, it's somewhat understandable they were outside the scope of what Tramiel wanted with the ST concept, since even the Amiga chipset was probably more costly than ideal. Plus the Amiga lacked the hires monitor support of the ST that gave it more potential on the "better than a Mac or PC" business/productivity side (or at least better than baseline standard PC options like MDA or Hercules graphics, and vastly cheaper than the new EGA graphics; there are even cases where 640x400 monochrome is more useful than 640x350 in 16 colors). And the ST's high-refresh-rate, flicker-free monitor was nicer than the 50 Hz super-high-persistence MD monitors with their horrible motion blur ... actually for cartoon or comic line drawing, or even monochrome animation cels, the monochrome monitor was pretty good, plus pixel shape was closer to square in monochrome mode than in color modes (excluding monitors manually calibrated to display square pixels in 320x200 16-color mode).

Anyway, a 16-bit enhancement of MARIA angled at more ST/Amiga-beating features, for both computer and game console use, should've been in development back in '86 or '87 as an alternative to the STe SHIFTER design. Granted, the STe design is confusing: Shiraz Shivji in 1987 or '88 described STe video modes more like what ended up in the TT later on (except 640x240 in 256 colors rather than 320x480, but the same 640x480 16-color mode was quoted), which was either in error, or the TT SHIFTER is what was originally going to be in the STe but proved too expensive or delayed. (More likely the latter, as the TT SHIFTER needs 64-bit wide DRAM, or at least 2 banks of 32 bits, in either case requiring more or wider DRAM chips and more traces on the board.) Even a cut-down 32-bit (or 2-bank 16-bit) version would've allowed 320x240 in 256 colors and 640x240 in 16 colors (or 640x480 in 4 colors), but maybe they were too ambitious, expected RAM and chip costs to drop more than they did before 1990, and only had that more expensive TT SHIFTER design without a cheaper alternative.

 

I do wonder how cheap a TT-chipset-based machine could've been with a 16 MHz 68020 in place of the 68030, or a 16 MHz 68000 version for the bottom end of that line. It was on the market before the AGA Amigas and could have been a cheaper Amiga 3000 or mid-range color Macintosh competitor had the price been more like one third to one half of those, rather than the TT's roughly $3000 at launch. You'd need the 68030 for an out-of-the-box Unix-ready MMU (a 16 MHz 68030 would do), and the ST MMU itself had a hacky workaround feature letting even a 68000 implement protected memory (at least in revisions used in the MEGA ST) for UNIX-compatible multitasking, albeit slowly. You could also have had 68010 or '020 CPUs with an optional external MMU socket (and 68000 machines upgradable to a 68010 plus MMU if desired). But really, the base model wouldn't have been a Unix workstation, just a fast single-task-oriented TOS machine, like most PCs still were at the time. (Even if you ran Windows 3.x you mostly used DOS software, and while you could do some multitasking, any serious work or any games needed little to nothing running in the background for both RAM and performance reasons. Plenty of users, probably the majority, still ran straight DOS into the early 90s, especially with the graphical DOS shell included in DOS 4.x in 1988 and in the more stable/compatible/popular 5.x.)

And on that note: the TT SHIFTER would not have been a good basis for a game machine either, probably worse than trying to hack a VGA chip into a console as far as cost vs performance goes. The STe SHIFTER itself was questionable in that role, but the TT SHIFTER's added cost (or the cost of implementing its 64-bit bus) was worse. The 256 color mode still used bitplanes as well, using more chip space to implement and slower for software rendering (at least anything that used all the bitplanes) and any sort of 3D or pseudo-3D.

 

Edited by kool kitty89

For the Panther, cost would have been everything: in 1991 the Genesis was $149 with Sonic, and if Atari were pushing the Panther as a 7800 successor it would have had to be aggressively priced. In the end, though, the Jaguar just outclassed it on every level, and it would have struggled to compete against the Sega and Nintendo 16-bit machines.

From the netlist for the GAME SHIFTER (Panther chip), it would set a complete run at once (using the begin address and end address to set a 320-bit wide load mask for the single 5-bit input), so long runs would be quick. It would be interesting to write some tests to figure out what kind of polygon fill performance might have been possible using the run-length objects.
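One consequence of that whole-run load mask is that a flat polygon fill would cost roughly one run-length object per scanline span, regardless of span width. A rough cost model (my own illustrative arithmetic, not measured Panther timing):

```python
# Rough cost model for flat-shaded polygon fill with run-length objects:
# each horizontal span is one object, and since the shifter loads a whole
# run at once (per the netlist note), cost scales with span COUNT, not
# span width. Illustrative only, not real Panther performance data.

def fill_cost_in_objects(spans):
    """spans: list of (x_start, x_end) pairs, one per scanline."""
    return len(spans)

# A 100-scanline-tall polygon costs 100 objects whether its spans are
# 10 pixels wide or 300 pixels wide:
narrow = [(150, 160)] * 100
wide   = [(10, 310)] * 100
assert fill_cost_in_objects(narrow) == fill_cost_in_objects(wide) == 100
```

So per-frame fill cost would have been dominated by polygon height and object list overhead rather than filled area, which is an unusual (and potentially favorable) profile for pseudo-3D.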

 

On your other points: a 16 MHz 68020 TT (no FPU) with 512K memory (64K by 64 bits would be possible using 64Kx4-bit DRAM) would have been the best machine to release instead of the 520STe in '89. Just tweaking the modes to be TV and VGA compatible (so 320x240 256-colour and 640x240 16-colour for TV as well as VGA) would give the graphics boost needed, along with the better CPU.

'89 might have been too early to support memory upgrades though; two 72-pin SIMMs for expansion would be great, but I don't think they appeared till 1990.

 

 


@Crazyace


Prior to SIMMs, there were at least SIPPs, which are pin-compatible and use identical headers on motherboards. The idea isn't just expandability; in fact I don't think Atari added them to the STe for expandability, but instead to reduce cost and to deal with the DRAM shortage and import restrictions. Also critical for assembly/sub-assembly and user (or service center) installation: SIPPs have symmetrical power and ground lines, so plugging one in backwards is harmless, vs instantly destroying most bare DRAM chips (I've done this myself when working on some 286 builds).

RAM expansion boards had no import restrictions from Japan, so Atari could have assembly done in the US as well as overseas without the legal or supply issues that using bare DRAM chips would entail. I think opening up and changing the RAM on an STe voided the warranty (though it could be done cheaply at service centers), just like with older models. And if Atari actually wanted to go the flexible, open-box expandability route, they could've just made an ST motherboard in AT or Baby AT standard PC case form factor with an AT-standard power connector plus expansion slots in standard AT locations (with I/O headers + brackets filling the first few slots). I'd argue they should've used standard AT bus slots too, with simple byte swapping of the data lines for big/little endian conversion (or of the address lines for 8-bit cards). You could've had dedicated ST-compatible BIOS versions of some boards, but also the potential for PC BIOS emulation/initialization of standard cards (plus AT bus = IDE bus = cheap hard drives). Or at the very least use the common PC clone form factor with expansion slots + onboard I/O like the Atari PC 4 (mostly) used, Compaq style or Tandy style (and low-cost oriented like Tandy, unlike the Amiga 2000).

Anyway, from 1987 to at least 1991 the US had set price floors for importation of Japanese DRAM chips into the United States, and on top of that you had poor yields of 1Mbit DRAM chips for many manufacturers combined with production tooling already having shifted away from 256K DRAMs (or at least not expanded, due to investment in 1Mbit chips instead), so you had a combined shortage and an even more acute price increase for domestic US DRAM. That meant Atari had to abandon any plans to move or expand final assembly (or upgrades via socketed DRAM, potentially at their Federated locations) and had to deal with higher DRAM prices across the board, even if not quite as bad for overseas production. (CBM didn't have any such problems in Canada or at any overseas facilities outside of the US.)

Atari also got in trouble for importing and reselling 256K DRAMs around '88/89 (apparently mostly limited to within Silicon Valley ... including several hundred thousand dollars' worth to a vendor in Morgan Hill, a semi-rural town just south of San Jose that was more rural back then). They claimed it was surplus stockpile, but were brought up on smuggling charges that dragged on, got dropped, got re-examined, and I think no legal consequences came of it. That's strange overall, given they had trouble keeping ST prices low due to the RAM shortage, and any sort of surplus wouldn't make sense unless it was specifically 150 ns chips that failed at ST timing specs. (Plus Atari used a lot of Korean DRAMs, so only the Japanese ones would have been problematic, many of which were purchased before the restrictions began.) They were supposedly 256K chips, so not the 64K chips used only in 8-bit models, and also not the 64Kx4-bit chips used in the later model XEs, since those were all newer chips that shouldn't have failed in STs (though I don't think the ST ever used 64Kx4 DRAMs anyway, just the XEs and Lynx).

 

Still weird, given Tramiel had previously been known to find new product niches for surplus components, and there's all sorts of things you could do with 150 ns DRAMs that only run at official specs and not slightly faster. (A 256kB 260XE would be one possibility, or an ST at 7 or 7.5 MHz with all the same chips except GLUE, providing different dividers for video sync timing, then selling those models at a discount. A 7/14 MHz pixel clock is still fine for 320/640 pixels on TVs, and 28 MHz would probably still show 640 pixels in hires, or might need different monitor calibration. In fact, an unmodified GLUE would output VGA-monitor-compatible timing if clocked at 7 MHz, or technically 7.047 MHz, since /224 = 31.46 kHz and /500 = 62.92 Hz, so VGA h-sync and within the 60~70 Hz, 449~525 line specs, with a pixel clock slightly below 720-pixel-wide VGA and well above the 25.175 MHz of 640-wide.) So the quick, dirty, and cheap option would've been a monochrome-only ST at 7.05 MHz with very conservative DRAM timing for 150 ns chips, possibly using cheaper generic monochrome EGA/VGA-compatible TTL or analog grayscale monitors tolerant of that sync rate (not standard EGA, but a lot of Super Hercules or Super EGA modes used 31~32 kHz, and many were also ST-monochrome 35.8 kHz compatible; for that matter, a cheap 28.63636 MHz NTSC clock crystal like the Amiga used could provide a 7.15909 MHz CPU + GLUE clock with 31.96 kHz / 63.92 Hz sync, and work with a ton of different monochrome monitors of the 1987/88 period). But I could be wrong; maybe they didn't have a parts surplus and genuinely were trying to smuggle DRAM to make money on the side, but that seems pretty stupid, especially while Jack was still CEO, or even just after Sam took over. (And a hires monochrome-only ST at a discounted price probably still would've sold ... no good for games, but most of the serious computing and music/MIDI-oriented stuff was best in, or required, hires anyway ... then again, the print ads and TV ads I've seen from back then did fail to really emphasize the strengths of 640x400 monochrome.)
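The sync arithmetic above is easy to verify; here's the division chain for both crystal options, exactly as quoted:

```python
# Checking the sync math quoted above: a GLUE clocked at 7.047 MHz
# with 224 clocks per line and 500 lines per frame.
clk = 7.047e6
hsync = clk / 224          # horizontal sync rate, Hz
vsync = hsync / 500        # vertical refresh, Hz
assert round(hsync) == 31460        # ~31.46 kHz, VGA-compatible h-sync
assert round(vsync, 2) == 62.92     # ~62.92 Hz, within the 60~70 Hz window

# Same math for the NTSC-crystal variant (28.63636 MHz / 4):
clk2 = 28.63636e6 / 4               # 7.15909 MHz CPU + GLUE clock
assert round(clk2 / 224) == 31960   # ~31.96 kHz h-sync
```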



Also, I just stumbled on the Super XEGS documentation from 1988, which I'd missed back in 2019 when it was posted on the Atarimuseum. It sheds more light on what Atari Corp was thinking and just how much effort they were putting into trying to produce a new game console and an extended 8-bit family system. At first I thought it might have been more like the Panther and even used the same GAME SHIFTER line buffer chip, but it looks like it only needed a single 320x5-bit buffer, not 2, and had a much more Sega/Nintendo/Hudson Soft (PC Engine) style of tile + sprite architecture, but with expanded bitmap framebuffer modes more on par with the ST or Apple IIGS. It's not very MARIA-like, whereas the Panther definitely seems a MARIA-inspired architecture with a complex/flexible object/display list, but cutting out the character mode entirely. (The Super XE docs even have "Display List Architecture" crossed off at one point, so it seems they considered expanding on the A8 display list but decided against it, too: the opposite direction from the Panther.) It also has a full-screen background rotation mode, like SNES Mode 7, though it was probably going to use a linear framebuffer (and 4bpp 16 colors, not 8bpp 256 colors) rather than the SNES's tiled Mode 7. They also went the opposite direction from the Panther there, which used scaled/zoomed sprites rather than a single scaled/rotated background. Atari was working with Ricoh at the time on a planned XE-on-a-chip ASIC with a 4/8 MHz 65816 (8 MHz with 64 bytes of on-chip RAM, it looks like) and an external DRAM controller ("New FREDDIE"). It makes me wonder how much common engineering went into the SNES and this Atari project, or if at least the Mode 7 feature was related to this.

That said, I'd criticize the lack of unified development of a cheap, multi-purpose chip usable for both the ST SHIFTER role and a game console, or at least limiting things to 2 progressive designs in parallel that replace the ST SHIFTER, ANTIC+GTIA, and TIA+MARIA. I.e. not compatible with all 3, but paring their line-up down to just 2 platforms and 2 graphics standards. (So: an A8+ST compatible upgraded video chip plus an enhanced 7800-on-a-chip with better graphics and a faster CPU core; since it's 650x, there were lots of options among manufacturers with existing licenses, fewer if they needed an '816 license, but VLSI had both, and Atari used them for the Lynx. Or: an upgraded ST architecture plus A8+7800 combined on a chip; the latter could approach some of the Panther's features but use a fast embedded 65816 core instead of an external 68000. In fact, both could use different ASICs with embedded 65816 cores, which could also serve as a dedicated sound+I/O coprocessor on an enhanced ST or TT.) Plus the Super XE targeted a 5-bit line buffer, which could've been employed for new 5-bitplane modes where A8 extended packed pixels couldn't go (albeit I think the line buffer only needs to be used for sprite+character modes, hence why 640x2-bit screen modes can be supported, likely without using the line buffer, unlike MARIA, where the line buffer can be configured as 320x3 or 160x6 ... though only 25 colors are usable, so it's no better than 160x5).

I used to think more in terms of just discontinuing the 8-bit hardware and focusing any new console graphics on something also useful for an upgraded ST (fewer chips to produce), trying to think in terms of the Tramiel-era business model, but they clearly had a lot more interest in extending the 8-bit line than I thought, which changes things. One idea, though, was taking the SHIFTER and adding MARIA-style DLLs and display lists for a flexible framebuffer + object system, plus double line buffers, to freely allow all the extra SHIFTER DMA cycles to be used (all the wasted H-border bandwidth, or 256 bytes per line using STe MCU timing or PAL ST MMU/GLUE timing), then also enable bus saturation modes for up to 512 bytes per line, plus flexible use of the line buffer at higher color depths and lower resolutions. (MARIA already has 960-bit line buffers, so 960x1, 480x2, 320x3, 240x4, 192x5, 160x6, 120x8, and you'd only need 1280 bits minimum to support all existing ST modes, or less if you restricted some modes to single-line-buffer operation with a single framebuffer only, possibly for bitplane modes only; in that case MARIA already has 1920 bits total = 960x2, 640x3, 480x4, 384x5, 320x6, 240x8. And if they wanted to be really cheap-but-better-than-Amiga: use just 32x9 bits of CLUT and max 6 bitplanes for 64 halfbright colors, mapping 9-bit CLUT entries into 12-bit space, as good as the Amiga but using less CRAM, plus optional re-mapping as 16x18-bit for VGA-quality 16 colors, an 8-bit ROM palette mode simulating the 256-color MARIA palette in chunky pixels, 8-bit RGB, or 32 9-bit CLUT colors x 8 shades in 18-bit colorspace. You could do other CLUT re-mappings, but I think 3-bit vs 6-bit channels would be simpler to design.)
Also, using MARIA's existing DLL format, if you extended that to 4 bytes (also good for a 16-bit bus) you could extend the zone selection from 2 bits to 5, for 8 to 256 lines, so you can do 1 big display list per TV-res screen or 2 lists per 512-line screen (plus 3 more control bits for other things, like per-line or per-display-list colorspace select: i.e. 9-bit, 18-bit, 8-bit ROM, 8-bit RGB, 8-bit CLUT+shade, maybe others). ANTIC+GTIA are older, relatively low transistor count, simpler chips, so maybe it wouldn't have cost much to add those in as well (or been almost free transistor-wise, but requiring more engineering effort to implement seamlessly vs just chunking in an extra logic block).
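The 9-bit-CLUT-into-12-bit-space halfbright idea above can be sketched in a few lines. The bit-replication step for expanding 3-bit channels to 4-bit is my assumption (the post only says "mapping 9-bit CLUT entries into 12-bit space"):

```python
# Sketch of the proposed halfbright mapping: a 9-bit CLUT entry (3 bits
# per channel) expands into 12-bit RGB; setting the 6th bitplane halves
# the brightness, Amiga EHB style. Bit replication for the 3->4 bit
# expansion is an assumption, not documented hardware behavior.

def expand_9_to_12(entry, halfbright=False):
    r = (entry >> 6) & 7
    g = (entry >> 3) & 7
    b = entry & 7
    # 3-bit -> 4-bit by bit replication (0b111 -> 0b1111)
    r4, g4, b4 = ((c << 1) | (c >> 2) for c in (r, g, b))
    if halfbright:
        r4, g4, b4 = r4 >> 1, g4 >> 1, b4 >> 1
    return (r4 << 8) | (g4 << 4) | b4

# White (all channels 0b111) expands to 0xFFF; halfbright gives 0x777:
assert expand_9_to_12(0b111111111) == 0xFFF
assert expand_9_to_12(0b111111111, halfbright=True) == 0x777
```

So 32 CLUT entries yield 64 distinct on-screen colors, matching the Amiga's extra-halfbright trick with half the CRAM.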

 



OTOH, dropping both of those ideas in favor of a blitter capable of hardware scaling and rotation (which would double as texture-mapped spans for polygons or 3D plane effects) would've done all of the above and been a lot more flexible, at the expense of being slower for a given graphics bus bandwidth (i.e. slower if just emulating the fixed features of scaled sprites OR one rotated background, but way more flexible and more efficient in the many situations where you don't need that full fixed-function effect). The absolute minimum addition to the Slipstream would've been hardware rotation, and you'd have something impressive for 2D/pseudo-3D even if arcade and console style 2D games would run slower than the competition (but OK by 1991 PC VGA standards, or a bit later even; I'm not sure, but the Slipstream seems like it could pull off Jazz Jackrabbit, ignoring RAM constraints, since that game uses relatively few and relatively large sprites and a 2D background with no parallax and limited foreground priority layers, the latter reduced at low detail settings). Early-90s PC style space/air combat sims and fantasy games, ray-casting FPSs, and graphic adventure games could've been their niche while having some overlap with common 3rd party console/arcade games. (An ST mouse peripheral would make a lot of those games easier to play ... a keyboard would be nice, but not as necessary; FPSs and point-and-click adventures would benefit a lot, and sim-style shooters would be better with a keyboard + analog joystick.) IMO, offering PC style interface peripherals would've been more useful than the "unconventional" game console interface angle Konix was going for, plus it wouldn't have been far off the plot Atari was heading towards with the Super XEGS.

Sort of like what the Mega CD supported, but without the need to render into tile order: just use 4-bit or 8-bit linear packed pixels. Basically, add a scaling/rotation effect to the Slipstream chipset and you'd have a powerful and unique feature set able to do the things Atari was already interested in (at least in 1988 and 1989), before Martin Brennan even came on board and before the SNES specs were public (or possibly before it even had that feature), but taking the simpler approach the Flare team did of "do graphics one way, the best way" rather than the background + sprite hardware + blitter of the Amiga or other game consoles (sans the blitter). Obviously, if they were doing a re-spin of the Slipstream hardware itself, they could fix or improve other things too, like a faster DRAM controller (2-tick DRAM cycle times) or a faster chipset as a whole (stick with 3-tick DRAM cycles but run the ASIC at ~20 MHz with a more variable pixel clock, or 21.48 MHz with a 5.37 MHz pixel clock just for the common 256-pixel-wide resolution), and/or adding page-mode support for blitter operations, or at least for a fast fill function (with no line buffer, or at least no FIFO of some sort, you'd still have to do interleaved video data fetches, but screen clearing during hblank or vblank would be possible, and intermittent faster fill operations if not synchronized with screen timing). The sprite drawing function already works only 1 pixel at a time, as does the vector line drawing function, so keeping things just as simple for a texture mapping feature would make sense (that's all the Jaguar's is, anyway). Since the blitter mask register already supports 4-bit masking for 4-bit sprite data and 4 bits of CLUT offset (16 sets of 16 colors), you could easily extend that feature to do 16 shades/light levels for flat-shaded texture-mapped 3D as well (or drop to 8 light levels with 2 16-color texture palettes to choose from).
Granted, rotation is more computationally intensive than just scaling, so it might make sense if the scaling mode could be done "free" (as fast as un-scaled blitter sprites) but rotation might be slower. Also they might have implemented the scaling/rotation routines by duplicating the DSP logic block and adding an embedded ROM to run on, but still allowing that second DSP to be programmable when scaling/rotation mode isn't enabled (sort of like the SNES's mode 7 multiplier unit being available when Mode 7 is off, except a lot more useful than a bare multiplier).
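The one-pixel-at-a-time texture mapping mentioned above (which is how the Jaguar's blitter works too) boils down to stepping fixed-point texture coordinates per output pixel. A minimal sketch, with 16.16 fixed-point stepping as my assumed representation:

```python
# One-pixel-at-a-time affine span walking: pure scaling steps only u,
# while rotation steps both u and v per output pixel, which is why
# rotation costs more per pixel. 16.16 fixed point; illustrative only.

def affine_span(texture, tw, u0, v0, du, dv, length):
    """Walk `length` output pixels through a tw-wide texture."""
    out = []
    u, v = u0, v0
    for _ in range(length):
        out.append(texture[(v >> 16) * tw + (u >> 16)])
        u += du
        v += dv
    return out

tex = list(range(16))  # 4x4 texture where each texel equals its index

# Pure 2x horizontal scaling: du = 0.5 in 16.16, dv = 0
scaled = affine_span(tex, 4, 0, 0, 0x8000, 0, 8)
assert scaled == [0, 0, 1, 1, 2, 2, 3, 3]

# Rotation steps v as well, here sampling a diagonal of the texture
rotated = affine_span(tex, 4, 0, 0, 0x10000, 0x10000, 4)
assert rotated == [0, 5, 10, 15]
```

A dedicated scaling path could skip the v math and the row-address multiply entirely, which is why a "free" scaling mode alongside a slower rotation mode is plausible.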


The Jaguar itself was too risky and ambitious to be the ONLY design. If they ever thought it could be production-ready by 1992, that was severely over-optimistic, and while Standard Cell logic was faster and denser than the gate array logic the Panther and Slipstream used, it meant longer delays between tape-out and silicon, and between silicon revisions, and restricted which manufacturers they could source chips from. Plus, putting so much into TOM meant compounded bugs were more likely, and a bigger, more complex chip revises more slowly. If it had still been a 2-chip design with the same motherboard footprint and pin count (similar chip packaging cost), but with the GPU RISC removed from TOM and implemented on the second chip (putting the GPU onto the sound/IO chip, intended to do double duty on sound + 3D math and game logic when possible), initially using gate array logic, possibly at a more conservative clock target, and designed to let both chips run at different speeds if necessary, it all would've been a lot more foolproof. Then just plan a single-chip Standard Cell implementation as either a cost-reducing measure or a full successor with more features or just a higher clock rate. They could've hedged their bets even further with an alternate sound+I/O ASIC using just the simpler Slipstream DSP plus a separate DMA sound circuit able to read from DSP RAM or main RAM (so DSP-math-heavy 3D games could at least use simple DMA sound for sound effects), with that chip acting as a bus gateway between the 64-bit bus and the host processor, possibly with a 16-bit wide DRAM controller suitable for a 68000 and a narrower/cheaper 16-bit connection to cart ROM. (Since you now lack the GPU to act as CPU, local RAM + ROM access separate from GPU RAM would be better than trying to do a true single-bus system.)

Except even then, I'd argue the Object List processor was overkill. If the Panther had been released, then having an enhanced and backwards compatible OLP inside a successor would make sense, but after the Panther's cancellation, they could've cut that out and beefed up the blitter to work with a fast, but much simpler display controller (with short line buffers or FIFOs sufficient for good page-mode performance) and then put more chip space towards dedicated blitter texture RAM (for fast sprite/tile rendering and texture mapping) and fast blitter scaling. (either dedicated scaling at full bus speed, or enhancing of the texture mapping rotation feature to full bus speed, at least when reading from internal texture RAM)
A dedicated blitter CLUT for color expansion (and fitting more into internal texture RAM) would've been more useful as well, or just use one 256-color CLUT but give it to the blitter when rendering 16-bit direct color and to the VDC when doing paletted framebuffer modes. The object processor's sprite system really didn't make sense for any 3D or pseudo-3D game where the blitter rendered any sort of foreground, terrain, or scenery, since you can't composite OP objects on a per-pixel basis with a blitter scene (there's no per-pixel priority in the framebuffer to support it, and the OLP can't use the Z buffer for object priority either), so the fast scaling sprite function is only any good in an all-sprite game, and the much slower blitter rotation mode has to be used for scaled sprites in any other situation. (Even in a game like Doom, where the full scene is texture mapped, the enemy sprites and projectiles really, really slow things down, which also means the higher difficulty settings run noticeably slower than the easy ones: I think minimum difficulty hits 30 FPS fairly often, but it's usually below 20 at high difficulty settings ... granted, Doom would be a good example of a game that'd run much faster if only a dedicated un-rotated scaling function were added to the blitter, since columns and sprites use straight scaling; only the floor/ceiling spans use affine-mapping-style rotation.) Well, that or use an embedded 65816 inside the sound/IO chip along with the DSP (or J-RISC) as the host processor working in its own fast scratchpad RAM (and thanks to 650x access timing, you could allow the DSP or RISC "free" interleaved access to 8-bit SRAM at half speed, or DMA to/from that scratchpad without slowing the '816), and have a second 65816 core inside the blitter + display controller ASIC for blitter handling, so it can "babysit" the blitter full-time without the massive waste that using the J-RISC for that would be.
(have a blitter command list in 8-bit scratchpad RAM and the 65816 poll status or receive interrupts for each blitter operation).

They could've even done embedded 68HC000 cores, but that's more transistors, arguably no better performance, and a more limited (though still decent) selection of manufacturers with 68k licenses. (Hitachi, Toshiba, and Motorola come to mind, and with the 80 ns Standard Cell ASIC process Atari did use for the Jaguar, Toshiba and Motorola were both relevant, but a 65816 could be had really cheap.) Plus an 8-bit local external sound/CPU bus + cheap 8-bit wide ROM used only for streaming sound data and bulk storage would also cut ROM costs (sort of like the high-speed serial ROMs the N64 used) and get you a 5 MB/s DMA transfer rate with then-common 200 ns ROMs. (Atari used a lot of Macronix ROMs for the Jaguar, and the slowest grade available in 1993 in their databooks was 200 ns ... Atari was also using PROMs and not mask ROMs, which reflects their low production numbers vs cheaper high-volume mask ROMs.) And I know 6502/816 cores suck at running C code, and some compilers were especially awful, but having something like a 10-16 MHz core clock in internal SRAM should've helped a lot and been better than a 13 MHz 68000 without any local RAM at all (and still better than the 68k in the Jag if the cart ROM bus were separate from the main DRAM bus, and better still with some external local sound-bus DRAM, PSRAM, or SRAM, even just a single 32Kx8-bit chip).

Actually, deleting the external CPU of the Slipstream and using an embedded 65816 instead, sourcing chips from VLSI (as they did with the Lynx), would probably have been cheaper and faster than the existing 8086 system. Or with just a 6 MHz 65816 there's no "probably": it would definitely have been quite substantially faster. It probably could've been 12 MHz internal, 6 MHz external in 100 ns PSRAM, and 3 MHz external in 250 ns ROM or DRAM; plus, the 16-bit CPU data bus latch could run at 12 MHz and be used to accelerate sequential byte accesses for 16-bit CPU operations, so every other byte fetch is full 12 MHz, with the other bytes at 6 or 3 MHz depending on data location. (If they'd released the initial system around 1990 as-is, with all the internal modifications related to adding the 65816 + scratchpad RAM, Flare/Atari could've stuck with using PSRAM and planned to release a DRAM-based version later on.)
Plus the 65816 is very fast at simple I/O handling and almost anything conventional 2D video games need, with very fast response times for driving the blitter and very fast interrupt handling. And the SNES used it, just vastly slower than what I'm suggesting. (More like the SA-1 chip some late-gen SNES games used.)


A single-chip ASIC like that + 128kB of PSRAM should've been cheaper than the Panther with 32kB of SRAM + 68000. (32kx8 PSRAMs were probably a little more expensive than 8kx8 SRAMs, but the cost of the external 68k would probably have more than made up the difference.) OTOH, using just 2 128kx8-bit PSRAMs for 256kB and less board space might be better. If they'd spent the time to also add the scaling+rotation blitter feature, it probably would've been 1991 already anyway, and 256kB would make even more sense. OTOH, if you added an external sound chip, or a simple DMA sound circuit instead (to drive the PWM DACs and free up the DSP), you might get enough performance from the DSP alone doing scaling/rotation effects in software, or with an embedded ROM specific to that function, to keep DSP RAM free for other things. Also, the CLUT RAM is already 16 bits wide, and if they used external resistor arrays for the video DACs, it would've been trivial to expand the 12-bit RGB space to 16-bit, or a bit less simple (but really neat) to implement a custom 16-bit RGBI or RGBY colorspace with 12-bit RGB and 4-bit intensity/luminance mapped into 24-bit colorspace. (That way you'd get a true 16 shades of any color from the 12-bit RGB range and could add a 128x16-bit direct-color mode allowing flat-shaded lighting effects by simply setting the 4-bit intensity element; you could also give all 512 bytes of the CLUT RAM to the DSP in this mode.)
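As a sketch of what that 16-bit RGBI decode could look like (the exact mapping here is my own guess at "16 shades of any 12-bit color", not anything from Atari documentation):

```python
def rgbi16_to_rgb24(word):
    """Hypothetical 16-bit RGBI word: 4 bits each of R, G, B, and
    intensity, mapped to 24-bit RGB by scaling each channel by i/15."""
    r4 = (word >> 12) & 0xF
    g4 = (word >> 8) & 0xF
    b4 = (word >> 4) & 0xF
    i4 = word & 0xF

    def chan(c4):
        c8 = (c4 << 4) | c4      # expand a 4-bit channel to 8 bits
        return (c8 * i4) // 15   # one of 16 intensity steps

    return chan(r4), chan(g4), chan(b4)

print(rgbi16_to_rgb24(0xFFFF))   # (255, 255, 255): full white
print(rgbi16_to_rgb24(0xF007))   # pure red at just under half intensity
```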

 



Given the specs and the intention of it being a game system that's also usable as a home computer, the Super XE would have beaten the STe's game capabilities and even had some raw processing-power advantages (for reasonably optimized 65816 code it should've been much faster at a number of things). It could've filled the low-end computer niche with (roughly) Amiga-class graphics + sound and let them focus on cost-reducing the TT architecture as the baseline standard ST successor moving towards the Falcon. Plus, the TT SHIFTER was fed externally, with at least some of the address and DMA logic shared with the MMU (at least I think it's more like the original ST in that sense), and revisions on the MMU side could've allowed cheaper 32-bit or 16-bit wide DRAM to still support some or all of the TT modes. (Plus, unlike the Amiga, the ST/TT used word-interleaved bitplanes, so burst-mode DRAM access could be used without much complication: with FIFOs filling 64-bit wide SHIFTER data latches, or however it's configured internally for 8 16-bit plane words in the TT SHIFTER, you'd have a narrower DRAM bus "appearing" 64 bits wide to the TT SHIFTER.) The ST uses a weird memory address arrangement with each word in a different DRAM column (thus refreshed automatically during video DMA, leading to massively excessive refresh counts in the ST), so keeping that scheme makes page mode useless; but even keeping that refresh scheme, you could change rows every 64 or 128 bits and still refresh automatically while allowing page mode or static-column mode to be used effectively (with or without bank interleave as well). OTOH, they could drop that scheme entirely and just have constant minimal refresh handled by the MMU, or done via the DMA sound circuit (stack sound words through DRAM rows in a looping buffer 512 words long to cover up to 512 or 1024 DRAM pages, doing dummy read cycles when sound is off/muted).
Whatever the case, if you have the MMU do only minimum refresh during vblank and no refresh during H-border region, you'd also gain the ability to give extra cycles to the CPU or blitter, and use a 32-bit latch for CPU access if using a 16-bit bus. (without the latch you'd need the faster bus cycles of a 68030 to "soak up" the available bandwidth in 16-bit bus mode, and that probably worked for the Falcon, but I'd think the cost disparity between an '020 and '030 in 1989 or 1990 would've been more significant)
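The word-interleaved layout point is easy to illustrate: on the ST, the plane words for one 16-pixel group sit at consecutive addresses, which is exactly what a burst fetch wants. (Offsets below are for the 4-plane low-res layout; the TT's 8-plane mode extends the same idea.)

```python
def st_plane_word_offset(pixel_group, plane, nplanes=4):
    """Byte offset of the 16-bit word for `plane` within the given
    16-pixel group, in the ST's word-interleaved bitplane layout."""
    return 2 * (pixel_group * nplanes + plane)

# All 4 plane words for pixel group 0 are contiguous...
print([st_plane_word_offset(0, p) for p in range(4)])  # [0, 2, 4, 6]
# ...so one 64-bit burst covers a whole pixel group, unlike an
# Amiga-style layout where each plane lives in a separate buffer.
```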

Also, they could/should have just had faster ST versions prior to that. Do that either via a faster CPU+MMU clock with a fixed 32 MHz SHIFTER clock, or just a faster CPU + local-bus DRAM controller allowing 0ws 68000 or 68020 bus timing using slower/cheaper DRAM with less aggressive timing than needed for interleave, plus potential page-mode support: in the latter case, all 120 ns DRAM could be used for 0ws 16 MHz 68k/020/030 bus cycles with 125 ns access and cycle times; or, short of that, put 64kB of SRAM or PSRAM on as local CPU RAM (128kB for a 32-bit bus). You'd also need to boot the CPU at 8 MHz for slower ROM, or use fast ROM for a faster CPU. (Performance-wise, FastTOS ROMs would be good ... and the MMU already handles wait states for DRAM access, so I don't think that needs to be modified.) I think the SHIFTER can still get sufficient DMA cycle access times to work at 32 MHz (i.e. normal speed) with the MMU at 20, 24, or 32 MHz, but the only setting I'm 100% sure of is 32 MHz. If the slower settings worked, there's really no reason Atari couldn't at least have come out with a 10 MHz ST using a mix of 120 and 100 ns DRAM that fit within 20 MHz MMU tolerances. And even so, if they'd never released the STe, they at the very least had the potential to continue using the older ST chipset with a 16 MHz 68000 as a lower-end option to the TT once 70 ns (and in certain cases 80 ns) DRAM became cheap enough. (Though I'd argue a 16 MHz 512kB ST with 4 256kx4-bit 70 ns chips could've hit an appealing price point by 1990, if not 1989.) I'm not sure if using the STe's external video sync mode would allow MMU overclocking like that; I haven't seen any such attempts.
(Even then, adding a CPU fastRAM-style bus would work around that. Or just add bank interleave to the STe and use fast enough DRAM timing to allow 0ws 16 MHz 68k operation in one bank while keeping the normal slow 500 ns access slots when working in the other, still with up to 2 MB in either bank, though just 512kB in the video bank would probably be the popular option. The blitter, the DMA chips, and everything on the 68k bus side could also be doubled; the blitter would just get 68k-style wait states when accessing the SHIFTER bank.)
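For the zero-wait-state timing claims, the basic budget works out like this. (A rough check: the 68000's 4-clock bus cycle is standard Motorola behavior, but treating the whole cycle as the DRAM timing budget is a simplification; real designs also need setup/hold margins.)

```python
def bus_cycle_ns(cpu_mhz, clocks_per_cycle=4):
    """Length of a standard 68000 bus cycle (4 CPU clocks, no waits)."""
    return clocks_per_cycle * 1000.0 / cpu_mhz

print(bus_cycle_ns(8))       # 500.0 ns: the stock ST's access slot
print(bus_cycle_ns(16))      # 250.0 ns: room for 120 ns-access DRAM
print(bus_cycle_ns(16) / 2)  # 125.0 ns: per-bank slot with 2-way interleave
```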

Albeit instead of developing the STe features, just integrating the custom ST chips into fewer chips would help ... except you'd also lose the ability to run the MMU, SHIFTER, and GLUE at different clock rates (required for the 16 MHz mod). At least expanding the SHIFTER's address range to allow overscan would be nice (or modifying the MMU to allow multiple 32k pages per screen, even if it requires a CPU interrupt to set the next SHIFTER base address during h-blank). More bitplanes would be nicer, but would definitely use more chip space, though not so much if adding packed-pixel modes, or at least much cheaper gate-wise than expanding the SHIFTER bitplane buffers beyond the 2 sets of 4 words it uses; the simplest would probably be 4 MHz 8-bit chunky pixels bypassing the palette and using 8 of the 9 digital RGB outputs. The 1-bit monochrome mode already works with linear chunky pixel addresses, so you'd just need the 1-bit shifted output accumulated into 8-bit words and latched every 8 32 MHz cycles.
(In fact, it might be possible to mod/hack this externally by hooking the monochrome pixel output line up to an 8-bit shift register, or a chained pair of 4-bit shift registers, feeding the 8-bit latched output to 8 bits of the RGB resistor array, then putting the SHIFTER in monochrome mode while setting the GLUE to NTSC or PAL color mode. You'd get a 4 MHz pixel clock with 256-color 8-bit RGB at 160x200 ... or 320x200 with a 16 MHz overclock and 64 MHz SHIFTER, and at least some ST SHIFTERs can already run fine at 64 MHz.) You'd need the software overscan hack to extend the video to the equivalent of 640x200x4bpp, though (not much CPU overhead needed for that); otherwise video ends at 320x100 lines with garbage for the remaining 100 lines.
https://blog.troed.se/projects/atari-st-new-video-modes/
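The shift-register accumulation is simple enough to model (a toy simulation of the hack described above, assuming MSB-first shifting; the real bit order would depend on how the register is wired to the resistor ladder):

```python
def mono_stream_to_chunky(bits):
    """Clock a 1-bit monochrome pixel stream into an 8-bit shift
    register, latching one chunky byte every 8 dot clocks."""
    assert len(bits) % 8 == 0
    out, reg = [], 0
    for i, b in enumerate(bits):
        reg = ((reg << 1) | (b & 1)) & 0xFF  # shift in MSB-first
        if i % 8 == 7:                       # latch every 8 clocks
            out.append(reg)
            reg = 0
    return out

print(mono_stream_to_chunky([1,0,1,0,1,0,1,0, 1,1,1,1,0,0,0,0]))
# [170, 240], i.e. 0xAA and 0xF0: 16 mono dots become 2 chunky pixels
```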
 
Actually supporting a 256 color entry CLUT would add a lot more to chip space used and more changes to internal SHIFTER logic to allow it, since the pixels need to be in packed format before passing through the CLUT where monochrome mode does linear pixel order by bypassing the CLUT entirely. Albeit they could've started with a cheap direct 8-bit RGB and later switched to feeding the 8-bit output to a VGA RAMDAC for 18-bit RGB CLUT. Use of the RAMDAC would be more relevant if they'd done a 16 MHz system clock version recycling the old SHIFTER, but allowing a linear 320x200 256 color chunky pixel mode identical to MCGA/VGA Mode 13h. (dumb framebuffer, no hardware scrolling ... but better than VGA or MCGA in that it could multi-buffer and page-flip at least) TT SHIFTER has the bandwidth for 320x200x16bpp highcolor at TV resolution, like the Falcon. (had they only included an ST-compatible CLUT inside the TT SHIFTER and hooked it up to an external RAMDAC, 8bpp could've been initially supported with optional 16-bit highcolor RAMDAC ... except that won't work at VGA resolutions unless you have a line buffer or enable bus saturation during active display)

 


3 hours ago, kool kitty89 said:

@Crazyace


Prior to SIMMs, there were at least SIPPs, which are pin-compatible and use identical headers on motherboards. The idea isn't just expandability; in fact, I don't think Atari added them to the STe for expandability, but rather to reduce cost and to deal with the DRAM shortage and import restrictions. Also critical for both factory assembly/sub-assembly and user (or service-center) installation: SIPPs have symmetrical power and ground lines, so plugging one in backwards is harmless, vs. instantly destroying most DRAM chips (I've done this myself when working on some 286 builds).
 

I think SIPPs were still only 30-pin, and the TT was 64-bit. Atari could have just had extra sockets (like the 1040ST having 32 RAM chips for 2 banks) to allow expansion.


3 hours ago, kool kitty89 said:

@Crazyace

Also, I just stumbled on the Super XEGS documentation from 1988, which I'd missed back in 2019 when it was posted on the Atarimuseum. It sheds more light on what Atari Corp was thinking and just how much effort they were putting into trying to produce a new game console and an extended 8-bit family system. At first I thought it might have been more like the Panther and even used the same GAME SHIFTER line-buffer chip, but it looks like it only needed a single 320x5-bit buffer, not 2, and had a much more Sega/Nintendo/Hudson Soft (PC Engine) style of tile + sprite architecture, but with expanded bitmap framebuffer modes more on par with the ST or Apple IIGS. It's not very MARIA-like, whereas the Panther definitely seems like a MARIA-inspired architecture with a complex/flexible object/display list, but cutting out the character mode entirely. (The Super XE docs even have "Display List Architecture" crossed off at one point, so it seems like they considered expanding on the A8 display list but decided against it too, the opposite direction from the one the Panther took.) It also has a full-screen background rotation mode, like SNES Mode 7, though it was probably going to be a linear framebuffer (and is 4bpp/16 colors, not 8bpp/256 colors) rather than the SNES's tiled Mode 7. They also went the opposite direction there with the Panther, using scaled/zoomed sprites rather than a single scaled/rotated background. Atari was working with Ricoh at the time on a planned XE-on-a-chip ASIC with a 4/8 MHz 65816 (8 MHz with 64 bytes of on-chip RAM, it looks like) and an external DRAM controller ("New FREDDIE"). It makes me wonder how much common engineering went into the SNES and this Atari project, or if at least the Mode 7 feature was related to this.
 

The Super XE doc is interesting. It isn't a cheap console (the 512x512 4-bit rotation buffer needs 128k of RAM just for itself), and just getting rotation working with the 250ns RAM needs both banks fetching separate 4-bit nibbles per cycle (hence the 100% bus usage). By having a line scroll (and column scroll) it's much more like the SNES, and the faster SNES RAM (along with the 256-pixel display) allows the 2 banks to fetch character and pixel, making Mode 7 practical.
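The buffer-size figure checks out exactly:

```python
# 512x512 pixels at 4 bits per pixel, per the Super XE docs cited above
BUF_W, BUF_H, BPP = 512, 512, 4
size_bytes = BUF_W * BUF_H * BPP // 8
print(size_bytes // 1024)   # 128 (KB), just for the rotation buffer
```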


  • 3 weeks later...

Wow. That was a novella... Could be published as a standalone work.

My first PC, a "boutique" 386SX16, had SIPPs in it. It was the only PC I ever saw with them, but then it also had a 287 co-processor slot too. Very early machine. I probably paid $2K for that in 1990, with 2MB of RAM, a 40MB HDD, VGA, the monitor, and a Sound Blaster. I went all-in on my first PC. This was when the flagship from IBM was a 10MHz 286 PS/2 with 640K and a 20MB drive.

 

  • The VIC 20 was my Atari 2600 replacement in 1980?. I paid $159 for it. I bought a 1541 a year later, and a C64 a year after that.
  • I never even heard of the 7800 until I saw and bought one at a yard sale in 1993 with a handful of carts. Galaga and Asteroids blew me away.
  • The 5200 shouldn't ever have been built. The 400 should have just been vaderized (black) and sold as a game system.
  • The 2600 should have been sunsetted in 1981-82.
  • The O2 had a membrane keyboard, so why not an Atari console? In '79 the 400 was $350-$399, and it could have been $300 in 1980 and $250 in 1982 when the 5200 came out for $250-$279.
  • If they had figured out the "sell the printer at cost and make money on the ink" concept, things could have been much different.

26 minutes ago, Zonie said:
  • The 5200 shouldn't ever have been built. The 400 should have just been vaderized (black) and sold as a game system.
  • The 2600 should have been sunsetted in 1981-82.
  • The O2 had a membrane keyboard, so why not an Atari console? In '79 the 400 was $350-$399, and it could have been $300 in 1980 and $250 in 1982 when the 5200 came out for $250-$279.

Odyssey 2 wasn't exactly a runaway hit, and it's not clear to me that consumers actually wanted a console that doubled as a computer.

 

The people who wanted computers were buying actual computers; people who stuck with consoles wanted something uncomplicated. The console makers were all showing off "keyboard add-ons" to turn a console into a computer, and all of them failed.

 

