
Super XE Game Machine


Philsan


I remember when I needed to move beyond 16-bit computing and the Amiga, I really had no idea what to get. There were hundreds of clones on the market. Then I quickly realized they all had a BIOS, ISA slots, x86 processor. And they all said they were compatible. And cards, peripherals and other add-ons all seemed to be compatible. So my fears of buying the wrong machine dissipated. A little bit of reading and help from a knowledgeable sales force set me on the right path. Those important things weren't there from Commodore or Atari. Not any more in 1991-1992.

 

I recall a similar experience years earlier as a younger kid when I was getting my Apple II. Both ecospheres had loads of support. And both made you feel good about your purchase. Made all the difference in the world being able to get questions answered in a way that made sense in the real world. Intangibles like that often outweigh any lacking technical superiority.


21 minutes ago, Keatah said:

I remember when I needed to move beyond 16-bit computing and the Amiga, I really had no idea what to get. There were hundreds of clones on the market. Then I quickly realized they all had a BIOS, ISA slots, x86 processor. And they all said they were compatible. And cards, peripherals and other add-ons all seemed to be compatible. So my fears of buying the wrong machine dissipated. A little bit of reading and help from a knowledgeable sales force set me on the right path. Those important things weren't there from Commodore or Atari. Not any more in 1991-1992.

 

I recall a similar experience years earlier as a younger kid when I was getting my Apple II. Both ecospheres had loads of support. And both made you feel good about your purchase. Made all the difference in the world being able to get questions answered in a way that made sense in the real world. Intangibles like that often outweigh any lacking technical superiority.

Yup.  It's amusing how many people blame Doom, or the 'people had IBMs at work, so got them for home too.'  Sure some of it was that.  But basically Atari and Commodore lost because of Atari and Commodore.  Throughout most of the 80s and 90s, it was Atari vs Commodore, ignoring everything else (with some small dabbling into the IBM compatible market).  They kept trying to out price each other, to the point that their profits stank and they couldn't even / or didn't bother, putting R&D into their computer line. 

 

The Amiga, when it first popped up was a few years ahead of anything else on the home computer market... but they sat on their asses for far too long, and AGA was more or less something they sharted out because they didn't get the funds to release what they were planning on.  Atari and Commodore both have the stories of rising to huge popularity, only to start bleeding cash until their ultimate downfall... but let's be honest, it was their own damn fault for being such bitter rivals.


On 1/29/2022 at 11:55 PM, oky2000 said:

I like GEM/TOS in hi-res (med res icons needed to be done separately to account for 2:1 ratio of screen area and lo-res icons are wayyyy too big full stop)

Ugh, I never liked GEM in medium res.   It just looks hideous.    I tended to use high-res for real work too,  unless I was coding something that ran in color

On 1/30/2022 at 2:51 AM, leech said:

Just think, if the reverse engineering of the IBM BIOS never happened, the computer industry today would be very very different.  No clue if it'd be any better, but it would definitely be different.

 

Now mind you... this is just discussing HOME computer usage.  What about all the other big iron competitors that all basically became consolidated into the x86 (and now more and more ARM)?  I'm talking about SGI, Sun, DEC, etc.  These systems were amazing for the time!  The end of them?  Well the infighting of the various Unix flavors.  Funny enough, now we basically have a few flavors of BSD, and the many distributions of Linux to replace those.  But the real reason we don't see them anymore is because of Linux being a free Unix-like operating system for x86 (commodity hardware), so people no longer had to pay 50k-200k to SGI for a nice workstation...

If the IBM BIOS reverse engineering never happened and everything remained proprietary, I think we'd have seen a movement for hardware similar to what Linux did for OSes: the internet might have come up with an open hardware standard to run something like Linux, and just like Linux snowballed until it basically runs on everything, this open hardware might have slowly gained popularity until it became the dominant hardware platform.

 

Just like Linux quietly snuck into corporations as web servers and Samba servers in its early days until companies realized the cost savings that could be had and officially embraced it.  The same could have happened with an open computing platform.


On 1/30/2022 at 4:12 AM, leech said:

Yup.  It's amusing how many people blame Doom, or the 'people had IBMs at work, so got them for home too.'  Sure some of it was that.  But basically Atari and Commodore lost because of Atari and Commodore.  Throughout most of the 80s and 90s, it was Atari vs Commodore, ignoring everything else (with some small dabbling into the IBM compatible market).  They kept trying to out price each other, to the point that their profits stank and they couldn't even / or didn't bother, putting R&D into their computer line. 

 

The Amiga, when it first popped up was a few years ahead of anything else on the home computer market... but they sat on their asses for far too long, and AGA was more or less something they sharted out because they didn't get the funds to release what they were planning on.  Atari and Commodore both have the stories of rising to huge popularity, only to start bleeding cash until their ultimate downfall... but let's be honest, it was their own damn fault for being such bitter rivals.

The amount of investment in the PC market and the economies of scale it produced was not something Atari and Commodore could compete with.

 

I remember in the late 80s, every month I'd open up Compute magazine and it seemed like the latest clones had ever-increasing clock speeds: 8MHz, 10MHz, 12, 16, 25, etc.   They got high-density floppies, "cheap" IDE hard drives, VGA from 1987 on.

 

In the ST world, there was no official increase from 8MHz to 16MHz until the Mega STe in 1991.   Same with 1.44MB floppies.   Sure, there were aftermarket kits that you could install that promised these things, but they were expensive and risky to install.   No IDE until the Falcon in '92; you had to live with a much more expensive external SCSI hard drive with an expensive host adaptor.   Nothing to compete with VGA until the Falcon in '92 either (unless you count the TT, but that wasn't really a home machine).

 

Atari delivered "Power without the price" in 85, but by the early 90s that shifted to the PC world and everything was more expensive in the Atari and Amiga world.


1 hour ago, zzip said:

I remember in the late 80s, every month I'd open up Compute magazine and it seemed like the latest clones had ever-increasing clock speeds: 8MHz, 10MHz, 12, 16, 25, etc.   They got high-density floppies, "cheap" IDE hard drives, VGA from 1987 on.

Cheap IDE drives were the in-my-face slap. The last straw. I was just sick and tired trying to find stuff for the Amiga. Finished!! FINISHED!!

 

For a long time I had been following the MHz progression from 4MHz to 33MHz. And I always somehow imagined I would eventually possibly somehow someway get my Amiga up to those speeds. All I had to do was wait for prices to drop. There were at least six accelerator add-ons I knew of straight away.

 

But hard drives remained (in my area at least) stubbornly elusive. I'd have to spend another $500-$600 to get something going. And that may have been for something like 40MB or 60MB. I don't recall the exact capacity points I was perusing. But I wasn't excited because I had just gotten a Sider 10MB drive for my Apple II two years earlier. Make no mistake, however, the Sider was a groundbreaking peripheral for the II series. It was eminently affordable in comparison to the 5MB, 6MB, or 11MB Corvus drives costing in the thousands. I got my use out of it and have no regrets whatsoever. I still have it today, and it worked fine last year when I last powered it up.

 

So, seeing a 200MB HDD standard on PC's around 1992-1993 was remarkable and exciting. Suddenly I didn't want to fart around trying to expand up the Amiga anymore. Purchasing peripherals for 16-bit machines was just not a fun experience. Impossible to find. Little software. No support. No commonality. And certainly nothing like the happy campy circusy grocery-store-like environment of CompUSA.

 

In short, yes, it was seeing the constant progress of PC hardware, the increasing MHz and HDD capacity, that made me not bother with the Amiga platform anymore. Not graphics, not sound, not multi-tasking. Any new advancements just weren't interesting anymore. Meanwhile the PC platform had proven itself over the past 10 years (from 1981-1991) that it was compatible and would continue evolving.


24 minutes ago, Keatah said:

But hard drives remained (in my area at least) stubbornly elusive. I'd have to spend another $500-$600 to get something going. And that may have been for something like 40MB or 60MB.

I ended up buying a second-hand hard drive setup for my ST because a brand-new one was just too expensive.    I paid less than $200 and received a tank of a case with a 5.25" Full Height Seagate hard drive and ICD Host Adaptor as well as a SCSI<->MFM adapter.   So a lot of extra electronics and cables to make the thing work, but at least it was all tidily enclosed in the case.   I knew other people didn't mind having a mess of exposed boards and ribbon cables running from their ST, but I didn't want that kind of mess on my desk, so I was happy about that.

 

But boy was that thing loud!   5.25" drives apparently need a fair amount of spin up time before they are usable, so I had to turn the hard drive on for 20 seconds or so before booting the ST!   However it did make the ST so much more usable.   No more disk swapping,  could load up a whole bunch of TSRs and desk accessories at all times and disk access was so much faster!

 

But it couldn't run Doom, and that was my breaking point for jumping to PC. 


10 minutes ago, zzip said:

But boy was that thing loud!   5.25" drives apparently need a fair amount of spin up time before they are usable, so I had to turn the hard drive on for 20 seconds or so before booting the ST!

Yes. Same thing with the Sider. And I believe it only spun at 1200 - 3000 RPM. Since it used a band + stepper motor for head positioning, you had to manually park the heads with a tiny utility. And it was L O U D ! Seemed to resonate at just the right frequencies to irritate me. I tried muffling the fucker with towels and cardboard boxes, mindful of heat buildup. Eventually I strung the data cable through the wall and into the closet. Blissful silence.

 

11 minutes ago, zzip said:

However it did make the ST so much more usable.

I only gained practical productive real-world use of the HDD toward the end of my Apple II gig - which was just short of getting a PC. I mostly used the drive for storing single-file "BRUN" games, accessing tons of hi-res pictures without floppy swappage, fast loads and saves of stupid-ass recreational Applesoft BASIC programs that did nothing but make me feel like I was super smart. Then there was the BBS. A real hit because I could have hundreds of text files (a big thing back then) online. And it seemingly eliminated all lag as a user navigated the message boards.

 

Toward the very very end, when I was packing away my II stuff, it took on the duty of storing something like 50 or 100 floppies, pre-compressed with Dalton's Disk Disintegrator, DDD - the vintage equivalent of a zip file.

 

I also had AppleWorks and ProTERM going, "big stuff, heavy hitters" in the Apple II world at the time. Even did multi-boot between CP/M, DOS 3.3, and ProDOS.

 

I was surprised at how much I got out of the II platform. Something like 10 to 14 years give or take. I would have stayed with it longer if Sculley hadn't killed it off. He specifically said he did that because he wanted to create the image that Apple could participate in "big business". Many II series enthusiasts are still sour on that decision.

 

I never got into the 68K Mac; like I say, it was too expensive in 1984/1985 for a kid. And by the time I was entering the big leagues I was pissed that Apple dropped the II. And then later killed off the IIgs too. I was personally upset and all. I didn't want to deal with a company that drops platforms like hot potatoes. I wouldn't buy another Apple product till the iPod in 2004/2005, and then never again. Not out of hate or anything. Simply that they didn't fit my needs. And gone was the elegance and finesse of the first Macs. I got the iPhone as a gift. But I don't think I'd have gone outta my way for it.

 

I couldn't stand those plexi-glass G3 and G4 machines. So tacky. Butt-ass ugly. Confining. Restrictive. And that lamp thing? WTF? No. I'm happy to let Apple's dotcom era designs rest in peace thankyouverymuch.

 

By comparison the Atari ST's styling still looks nice today. As do the XL/XE 8-bitters. Atari's concept stuff would be at home in a Syd Mead retro-futurism painting. As would the 5200. http://blog.iso50.com/19111/atari-computer-concepts/

 

11 minutes ago, zzip said:

But it couldn't run Doom, and that was my breaking point for jumping to PC.

I suppose I could be thankful that Doom came out after I got my PC. I was just settling down learning the ins and outs of Windows 3.1, how it sat atop DOS 5, how CONFIG.SYS and AUTOEXEC.BAT worked, how they determined what games I could play. Had it going so that one configuration covered everything minus Comanche. That was the one game I had to use a boot disk for. And perhaps rightfully so, because I tended to load a lot of TSRs and drivers. No big deal.

 

Retrospeakingly I was thankful because I remember loading it and being blown away straight off the bat. It had a warm fuzzy feeling from the look of the menu and vibrant colors. Even if it was a dark environment. It was a vivid experience. And instantly contrasted against something like FRACTINT in hi-res or any productivity thing in Windows. The same hardware doing that was playing a 1st rate game. Unbelievable! It felt like "Atari" all over again. But so much more.

 

12 minutes ago, zzip said:

So a lot of extra electronics and cables to make the thing work, but at least it was all tidily enclosed in the case.

Yes. The Sider had a mess of stuff inside too. The "huge" 5.25-inch mechanism, a SASI interface, the drive electronics itself. A small-size PC power supply, and quarter-inch diameter frame/caging to organize it all. It was a nice looking drive and had some styling cues to fit in with Apple stuff. Barely though. The white textured paint, the red LED. I often imagine gutting one of those and packing it with 4x 18TB for NAS.


15 hours ago, Keatah said:

Yes. Same thing with the Sider. And I believe it only spun at 1200 - 3000 RPM. Since it used a band + stepper motor for head positioning, you had to manually park the heads with a tiny utility. And it was L O U D ! Seemed to resonate at just the right frequencies to irritate me. I tried muffling the fucker with towels and cardboard boxes, mindful of heat buildup. Eventually I strung the data cable through the wall and into the closet. Blissful silence.

I don't remember what the RPM was.   I do remember the seek time was something like 27ms, when every PC drive I owned was 10 or less.    So not particularly fast, but since the ST was only transferring maybe 10's of KB or 100's of KB at a time, it was fast enough for that purpose that I don't remember having to wait for disk I/O very often.

 

15 hours ago, Keatah said:

I wouldn't buy another Apple product till the iPod in 2004/2005, and then never again. Not out of hate or anything. Simply that they didn't fit my needs. And gone was the elegance and finesse of the first Macs. I got the iPhone as a gift. But I don't think I'd have gone outta my way for it.

I don't like how "walled garden" Apple has become.   iOS devices are very locked down compared to Android devices: you can't easily browse the filesystem, it's harder to sideload apps on them, and it's harder to connect common USB devices to them.   Apple is more prone to reject your app from their app store than Google is (in my experience anyway).   We couldn't even create a Mac utility and let other Mac users in the same company use our own software without getting a Mac distribution certificate from Apple!

 

16 hours ago, Keatah said:

By comparison the Atari ST's styling still looks nice today. As do the XL/XE 8-bitters. Atari's concept stuff would be at home in a Syd Mead retro-futurism painting. As would the 5200. http://blog.iso50.com/19111/atari-computer-concepts/

Yeah, most of the Atari computer line still looks decent.   The only one I think didn't age well is the Atari 400, but lots of people think that looks cool and retro.  I think it's hideous.   Oh, and the XEGS with its pastel "easter egg" buttons, but I thought that looked bad back in '87 too, so no change there...

 

16 hours ago, Keatah said:

I was surprised at how much I got out of the II platform. Something like 10 to 14 years give or take. I would have stayed with it longer if Sculley hadn't killed it off. He specifically said he did that because he wanted to create the image that Apple could participate in "big business". Many II series enthusiasts are still sour on that decision.

So the Apple II being the "education" computer wasn't serious enough for them?    Kind of like how Atari wasn't taken seriously as a business computer because of their game image.   Ironic how Microsoft is now one of the biggest gaming companies and still a serious business company.

 

16 hours ago, Keatah said:

Retrospeakingly I was thankful because I remember loading it and being blown away straight off the bat. It had a warm fuzzy feeling from the look of the menu and vibrant colors. Even if it was a dark environment. It was a vivid experience. And instantly contrasted against something like FRACTINT in hi-res or any productivity thing in Windows. The same hardware doing that was playing a 1st rate game. Unbelievable! It felt like "Atari" all over again. But so much more.

I remember what blew me away most about Doom was the fact you could go up and down stairs.  I had never seen that in a 3D game prior.   Seemed like complete freedom!   Well, not really, because once I started to design my own levels, I learned what the restrictions were.   You could not actually build a building with multiple floors; there is really only a single level to the entire map, but different parts have different heights, so you could create the impression of multiple floors. It was not a true 3D map.

 

 

 


  • 1 year later...

I managed to miss the reveal of this hardware, but it's super interesting. It appears to be the early stages of what became the Atari Panther, and given the 5-bit (32 color) line buffer and palette limitations, the "GAME SHIFTER" was probably designed for the Super XE first and re-used for the Panther. Interestingly, it says 320x8 in the document with the 8 crossed out and 5 written, so they might have been planning on a full 256 color palette before cutting it down to 5 bits (32 colors), or the 8 was a typo.

It also mentions using an 8 MHz 65816 (when working in fast SRAM), and 4 MHz in slow memory. Both of those are blazing fast, and at 8 MHz it's faster than the 16 MHz 68000 in the Panther, at least for assembly language programming and for most uses in games (ie not making heavy use of 32-bit arrays or multiply/divide instructions ... except at 8 MHz, software mul/div functions would probably be faster too with some small look-up table optimization). This is really, really fast in general, not just for a 650x. The PC Engine's 7.16 MHz 65C02 derivative is already faster at a number of things than the Mega Drive's 68000, and the 65816 is significantly faster for 16-bit operations and would remove some of the disadvantages. (The 68000's ability to use much slower RAM and ROM without wait states would be a big factor, and I'm not sure how Atari would work around that unless they used slower ROM and copied it into RAM for the CPU to work in there.)
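
On the look-up-table mul/div point: purely as an illustration, here's a minimal Python sketch of the classic quarter-square trick 650x coders use for fast multiplies (the table size and 8-bit operand range are my own choices, nothing from the document):

```python
# Quarter-square multiplication: a*b = q(a+b) - q(|a-b|), where q(x) = x*x // 4.
# On a 650x you'd keep q() as a ~512-entry table in RAM/ROM and do two look-ups
# plus an add/subtract instead of a bit-by-bit shift-and-add multiply loop.
QUARTER_SQUARES = [(x * x) // 4 for x in range(511)]

def mul8(a: int, b: int) -> int:
    """Multiply two unsigned 8-bit values using only table look-ups and add/sub."""
    assert 0 <= a <= 255 and 0 <= b <= 255
    return QUARTER_SQUARES[a + b] - QUARTER_SQUARES[abs(a - b)]

# Self-check against real multiplication.
assert all(mul8(a, b) == a * b for a in range(256) for b in range(256))
```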

Given they were partnering with Ricoh for production, the 8-channel PCM chip would almost certainly be the same one used in the FM Towns, Mega CD, and several arcade boards (including Sega's System 32). It's vastly more cost effective than the Sony DSP based SPC module used in the SNES, yet for many purposes it wouldn't sound much worse and in many cases would sound similar or better, especially later games that could afford to use higher quality samples in ROM. (it uses 8-bit linear PCM, uncompressed, but you could use ADPCM in ROM and decompress it into sound RAM, but best case is good quality 8-bit PCM in ROM for games that can afford it: in that case they will be cleaner and higher quality sounding than the compressed, interpolated, and heavily filtered SPC ADPCM format samples ... honestly Nintendo should've used it, saved money there, and used that for faster DRAM and a faster CPU clock rate).

Note this is also a full 3x as fast as the SNES CPU. The SNES CPU when working in RAM or normal (slow) ROM is limited to 2.685 MHz (actually NTSC chroma clock x3/4 = 2.68466 MHz), and assuming they used the same 32.215905 MHz system clock as the STe and Panther, then "8 MHz" would be 8.053976 MHz, or NTSC chroma x9/4. So it's literally exactly 3x the clock rate of the SNES.
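
Just to show the arithmetic behind those figures (a quick sketch; the 32.215905 MHz master clock is the assumption stated above, not something the document confirms):

```python
from fractions import Fraction

NTSC_CHROMA = Fraction(3579545, 1000000)   # NTSC colorburst frequency, MHz
MASTER = Fraction(32215905, 1000000)       # STe/Panther master clock, MHz (assumed above)

snes_slow = NTSC_CHROMA * 3 / 4   # SNES CPU clock in slow ROM/RAM
super_xe = MASTER / 4             # the "8 MHz" 65816 clock

print(float(snes_slow))            # 2.68465875
print(float(super_xe))             # 8.05397625
print(float(NTSC_CHROMA * 9 / 4))  # 8.05397625, i.e. chroma x9/4
print(super_xe / snes_slow)        # 3 exactly
```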

Note the SNES's CPU can run at 3.579545 MHz (1x NTSC Chroma) while accessing PPU registers and in fast ROM, but for some reason they opted to use 128 kB of DRAM too slow to run even at that speed, or the DRAM control logic in the system ASIC just couldn't manage to be fast enough. This also makes it impossible to copy data and code from ROM into RAM to make it run faster.
Also note that NEC had already used DRAM instead of SRAM for the PC Engine CPU at 7.15909 MHz in the CD-ROM expansion unit (it uses 64 kB of 70 ns DRAM, cycled at 7.15909 MHz and with access time fast enough to work with the 650x timing), and that was in 1988, not 1990/91 like the SNES. NEC later switched to SRAM for the Super CD upgrade, but that was likely only cheaper to do because of vertical integration and their own SRAM production (there are few other situations where 256 kB of 100 ns SRAM would be cheaper than 256 kB of 70 ns DRAM). Even if the DRAM controller was the problem, they could've used a smaller amount of SRAM and been better off, or more likely PSRAM, as that's cheaper and was widely available by 1989 (the SNES uses it for sound RAM in most cases, the Mega Drive uses it for CPU RAM). PSRAM = Pseudostatic RAM = DRAM cells + embedded DRAM control and refresh logic put inside an SRAM compatible package, and mostly a direct replacement at around 1/4 to 1/2 the cost compared to SRAM. (In terms of silicon chip space used, it's closer to 1/4, but you still have the higher pin count of SRAM, so it's not going to be as cheap as the smaller, multiplexed-address true DRAM chips. The DRAM shortage of 1988-1990 made PSRAM and SRAM more attractive for a while on top of that, since SRAM and PSRAM production facilities didn't directly compete with normal DRAM. There was also other stuff going on at the time, like a DRAM-specific price floor set by the US on Japanese imports, but that only applied to bare chips, not assembled hardware using the chips.)

650x type CPUs need memory cycled at their clock rate, with access times somewhere between 1/2 and 3/4 of a cycle. Datasheets usually show times much shorter than 3/4 of a cycle for all speed grades over 2 MHz, but actual hardware products using the CPUs seem to fare better than this, like NEC's case using 100 ns SRAM without issue for a 7.16 MHz 65C02 (139.7 ns cycle times, so 100 ns is .716 cycles). Most 65C02 and 65816 datasheets I've seen show 70 ns access time requirements for 8 MHz, which would imply 80 ns for 7.16 MHz. Atari and Ricoh likely knew the more realistic access time limitations and were working around that, but even so it's extremely tight timing compared to x86 or 68k type chips, or the Z80 for that matter.

In the broadest sense, 650x type CPUs need memory running 4x as fast as a 68000 at the same clock speed; with the Atari ST's memory timing, the best you could get is a 2 MHz 6502. A 68000 needs memory access within 2 clock cycles vs 1/2 of 1 cycle on the 650x (worst case). The 68000 uses 4 clock tick machine cycles and bus cycles, so memory needs to respond in 2 cycles, but can take 4 CPU cycles between accesses (or as in the ST and Amiga, memory can cycle at 2 CPU clock cycles with DMA access periods interleaved in the second half of that 4 CPU clock cycle period; some 650x machines do the same thing; in fact, the BBC Micro uses almost identical DRAM cycle and DMA cycle timing as the Atari ST, but with a 2 MHz 6502 and 8-bit bus, so half the bandwidth of the 16-bit bus).

The 8088 and 8086 (and 188/186 and V20/30) also use 4 clock tick memory cycles like the 68000, but don't need access until the end of the 3rd clock tick, or 3/4 of a full bus cycle, more like the best-case scenarios for a 650x or the standard scenario for 1 or 2 MHz parts based on data sheets. (I believe the Atari 8-bit DRAM timing is based around this, with data latched 3/4 into a 1.79 MHz cycle and DRAM cycling at 1.79 MHz; the Apple II and C64 cycle DRAM at 2 MHz in order to interleave video DMA with CPU cycles at 1 MHz, ie 1/2 the speed of the BBC Micro or Atari ST.)
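
To make that comparison concrete, here's a little sketch re-deriving the access-time windows described above (worst case half a cycle for a 650x, 2 of 4 bus ticks for a 68000, 3 of 4 for an 8088; the specific clock speeds are just illustrative examples):

```python
def ns_per_cycle(mhz: float) -> float:
    return 1000.0 / mhz

def access_window_650x(mhz: float, fraction: float = 0.5) -> float:
    # 650x worst case: data must be valid within ~1/2 of a clock cycle
    # (real hardware often gets away with closer to 3/4, per NEC's example above).
    return ns_per_cycle(mhz) * fraction

def access_window_68000(mhz: float) -> float:
    # 68000: 4-tick bus cycle, data needed within 2 ticks.
    return ns_per_cycle(mhz) * 2

def access_window_8088(mhz: float) -> float:
    # 8088/8086: 4-tick bus cycle, data needed by the end of the 3rd tick.
    return ns_per_cycle(mhz) * 3

print(access_window_650x(7.16, 0.716))  # ~100 ns: the PC Engine / 100 ns SRAM case
print(access_window_650x(8.0))          #  62.5 ns: worst-case 8 MHz 65816
print(access_window_68000(8.0))         # 250 ns: 8 MHz 68000, ST-style timing
print(access_window_8088(4.77))         # ~629 ns: stock 4.77 MHz PC/XT
```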

The 65816 unlike the 6502/C02 has a multiplexed address and data bus for the upper 8 bits of the 24-bit address space, so this also requires an external address latch which can potentially add additional memory delay and mean faster RAM or ROM than a non-multiplexed equivalent. A fast enough latch should allow effectively similar timing to a non-multiplexed bus, though, especially for DRAM or PSRAM where the delay can be hidden during precharge.

Also note that for CPUs like this where access time must be faster than cycle time, SRAM and ROM isn't ideal as SRAM and ROM both cycle as fast as its access time. This is great if you do a fancier memory system with interleaved sharing of DMA cycles with CPU cycles and  use RAM exactly 2x as fast as the CPU, but means a simpler non-interleaved system has to use faster SRAM than the minimum cycle time. This is not the case for 286 and later x86 processors or 68030 and 040 (those all do 2 clock tick access times and cycle times).
In any case, that makes DRAM even more appealing than it otherwise would be since DRAM can be accessed significantly faster than its minimum cycle time, and the same is true for PSRAM (since it's DRAM based). For example, 150 ns PSRAM can be accessed in 150 ns, but cycles at around 235-250 ns (depends on manufacturer), or 120 ns access and 210 ns cycle, 100 ns access and 160 ns cycle, 85 ns access and 135 ns cycle, 80 ns access and 130 ns cycle. Applications that need PSRAM to behave like SRAM need to use chips based on cycle times and not access times, but for cases where you actually need a faster access time, that makes DRAM and PSRAM even more efficient compared to SRAM.


Now, note how this Super XEGS intended to use DRAM, just like the 8-bit computers and dual banks of DRAM (I don't see it in the document, but I suspect 2 banks of 64kB, or 128 kB total), and if operating at 4 MHz you'd need an access time of ~150 ns (datasheets for the 65816 say 130 ns, but that's probably excessive in practice and even 150 ns is probably conservative where best case it might be around 180 ns). In any case this means 150 ns or faster PSRAM and 120 ns or faster DRAM, 100 ns if they didn't use an especially fast DRAM controller and address latch. Note: late model XE systems and all XEGS units I've seen motherboard pictures of already tend to use 120 or 100 ns DRAM, so this should be a non-issue cost wise. The docs say 250 ns DRAM cycle time, so they're probably using 120 ns DRAM like the Lynx and a lot of STs used. (early STs used almost entirely 150 ns DRAM and pushed it slightly out of spec cycling 260 ns rated RAM to 250 ns, but it worked without issue in practice; the ST MMU doesn't latch data early as you'd need for a 4 MHz 650x or 16 MHz 68000, it only needs to latch data at 250 ns for the 8 MHz 68000 and SHIFTER DMA cycles don't need early access either; from the investigation I've seen, the ST MMU simply does 250 ns cycle and access times)
120 ns DRAM can be cycled faster than 250 ns, but that's not important here as 250 ns is the CPU cycle time and all you need is fast enough access time. (if it used a 16 MHz DRAM controller with 2-phase clock like the ST used, for effectively 32 MHz pulses or 31.25 ns, you could do 125 ns RAS + 31.25 ns CPU to MMU to DRAM address delay time or 156.25 ns which is probably good enough for 4 MHz) They probably just use similar DRAM timing to the Atari Lynx, probably not supporting page mode cycles like the Lynx unless maybe for the Object processor.
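
As a sanity check on the RAM grades, a small sketch (the ~150 ns access need at 4 MHz and the 250 ns bus cycle come from the estimates above; the per-grade cycle times are typical catalog figures I'm assuming, not from the document):

```python
CPU_MHZ = 4.0
BUS_CYCLE_NS = 1000.0 / CPU_MHZ   # 250 ns, matching the document's DRAM cycle time
ACCESS_NEEDED_NS = 150.0          # estimated practical access requirement above

# (access time ns, full cycle time ns) -- typical figures for period parts
ram_grades = {
    "150 ns DRAM":  (150, 260),
    "120 ns DRAM":  (120, 220),
    "100 ns DRAM":  (100, 190),
    "150 ns PSRAM": (150, 250),
    "120 ns PSRAM": (120, 210),
}

for name, (t_access, t_cycle) in ram_grades.items():
    fits = t_access <= ACCESS_NEEDED_NS and t_cycle <= BUS_CYCLE_NS
    print(f"{name}: {'fits' if fits else 'marginal/too slow'} on a {CPU_MHZ:.0f} MHz 65816 bus")
```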



However, due to the DRAM shortage in 1988, at some point Jack Tramiel ordered that any game console must be based around SRAM, or at least not use DRAM (this should have made the then-new PSRAM a consideration, but the Panther didn't use that). This likely led to the Super XEGS being cancelled and the SRAM based Panther becoming the follow-on project. The Lynx being DRAM based was probably bad enough at the time. They also obviously abandoned backwards compatibility to simplify design and keep costs down. They used 120 ns SRAM for the Panther on a 32-bit wide bus, but just 32kB, and in 1989 that's not too bad; by 1991 when the Panther was cancelled, though, that's really skimpy from a cost and price-to-performance standpoint, with 32kx8-bit SRAMs at 120 ns being on the low end of things and a better value than the 8kx8 chips in the Panther (you need 4x 8-bit wide chips to make 32kB, and 16-bit wide SRAMs weren't available at the time, so 64KB would've used 8 8kx8-bit chips, even less cost effective and using 2x the board space).

Then again, they should have just switched from using DRAM to PSRAM instead. They should've known it existed and been looking into it as an SRAM alternative from 1986/87 when promotional material and publications were coming out, then more so as a DRAM alternative in 1988 with new trade restrictions and the DRAM shortage due to poorer than expected 1Mbit chip yields among other things. (PSRAM is also called Virtually Static RAM early on in 1986/87 articles and still in some datasheets later on, also called XRAM by NEC)


Given Atari ended up using VLSI for Lynx chipset production and not Ricoh (both of them had licenses for 6502, 65C02, and 65C816 cores), there must have been a falling out between Atari and Ricoh by 1989 when Lynx production started, and/or VLSI just offered better deals than Ricoh. Atari also made a deal with Ensoniq at some point in 1988 as well, which might have complicated the Ricoh sound chip situation (and might also be why Atari never used a Yamaha FM synth chip in the ST or console designs), or that might be wrong too and Atari might have kept all their options open and made sure any partnerships or deals made weren't mutually exclusive, especially before any product actually went to market. (ie keep multiple options open until something actually goes into mass production ... negotiations with multiple chip vendors while also looking for an in-house chip vendor to use, same for sound chip manufacturers, etc: though I honestly doubt anything would've been as cheap yet still "good enough" as Yamaha's OPLL or YM2413 ... a cut-down version of what Adlib and Sound Blaster cards used, the same one as in the Japanese Master System, and it probably would've easily beaten most PC Adlib/SB music if used by the more talented UK/European chip composers ... or Japan, but Atari would've had more of an uphill battle getting Japanese developer support, I think; then again, after the Nintendo anti-trust lawsuit they might have had more room to sign on Japanese 3rd parties ... but even so, without a platform actually having a Japanese market presence. Then again, maybe they could've partnered with a Japanese company to market the new console.)



OTOH it's a shame they didn't just re-use MARIA as-is with a fast enough DRAM controller to support it (more likely use DRAM in place of ROM for graphics data at MARIA's 3 clock cycle = 419 ns requirement and use SRAM for display lists and high speed CPU access) or just a single 32kB SRAM chip like a few 7800 games already used on cart (Summer Games and Winter Games both do, only using 16kB of it, as 16kB chips weren't made and 2 8kB chips couldn't fit without using a larger/custom cartridge size, apparently more expensive than 32kB of SRAM in 1987/1988). You'd want more than 32kB for a MARIA based computer, though, so DRAM would be needed there, plus probably a single 8kB SRAM chip.

Also, I believe MARIA only needs SRAM to cycle at ~279 ns (2 MARIA clock ticks per access) given lists take 2 clock cycles per byte to process, but it may still need access time in 1 clock tick, this still means you could add an external memory controller that allows 50/50 bus sharing with MARIA SRAM without any wait states (7800 has to halt the CPU during all MARIA bus activity time in RAM or ROM, so this would be much faster even with the same 1.79 MHz 6502).
Additionally that 50/50 SRAM sharing would play to the strengths (or weaknesses) of 650x access timing, and you'd have 279 ns synchronous memory cycle slots with 120 ns access times, which means 3.58 MHz could easily be obtained with a 65816 (ie faster than the SNES or Apple IIGS). Outside of MARIA bus time, the CPU could be clocked faster: 4.09 MHz conservatively (7.16 MHz MARIA clock x4/7), but likely 4.77 MHz would be fine (7.16x2/3) and possibly even 5.73 MHz (7.16x4/5).
Additionally, MARIA doesn't access SRAM while doing graphics reads (ROM in the 7800, and ROM or DRAM here) so the CPU could potentially access SRAM in high speed mode during MARIA ROM/DRAM reads. You could also interleave DRAM or ROM access, but for that it would need to be cycled at 209.5 ns (MARIA graphics data reads are in 3 clock ticks or 419 ns) which means 200 ns ROM would be needed rather than the cheaper 250 ns used by the Mega Drive at the time. (You could reserve that as an optional FastROM setting) Still that means dropping the CPU to 2.3866 MHz for interleave, but vastly better than being halted and faster than the 7800's or XL/XE CPU. You'd also need 100 ns DRAM to achieve those cycle times without running faster than spec (with the exception of some CMOS 120 ns DRAM cycling faster than that, so Atari could've used a mix of 120 and 100 ns DRAM that fit the required timing). Or if you don't want to deal with interleaving, just use 120 ns DRAM with 139.7 ns access and 279.4 ns cycle times (or 120 ns PSRAM) with potential to run faster cycles when MARIA is idle/off the bus, possibly 209 ns for 4.77 MHz CPU speed. (using PSRAM for this would be easier and not depend on a fast and efficient DRAM controller)
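
Re-deriving the clock options in those last few paragraphs (a sketch; the candidate CPU clock fractions are the ones proposed in this post, not anything from an Atari document):

```python
MARIA_CLK_MHZ = 7.15909              # MARIA clock (2x NTSC chroma)
TICK_NS = 1000.0 / MARIA_CLK_MHZ     # ~139.7 ns per MARIA tick

print(TICK_NS * 2)   # ~279 ns: list-fetch cycle, also the 50/50 shared SRAM slot
print(TICK_NS * 3)   # ~419 ns: MARIA graphics-data (ROM/DRAM) read cycle

# Candidate 65816 clocks as simple fractions of the MARIA clock
for num, den in [(1, 2), (4, 7), (2, 3), (4, 5)]:
    print(f"{MARIA_CLK_MHZ * num / den:.2f} MHz  ({num}/{den} of the MARIA clock)")
# -> 3.58, 4.09, 4.77, 5.73 MHz
```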

Though if you don't need 650x backwards compatibility, a 3.58 MHz Hitachi 6309 would be interesting and a Motorola 68008 at 9.545 MHz would be worth considering (209.5 ns access 419.5 ns cycle time, so it could interleave fully with MARIA in DRAM or ROM cycled at 209.5 ns) plus be just as fast as a 68000 for 8-bit operations (lots of MARIA list operations would be modifying byte-wise data) and work on cheaper 8-bit wide ROM but half the speed on most 16-bit and 32-bit operations (multiply and divide would still be faster than an 8 MHz 68000 due to how slow those instructions are). That or if they wanted to have more software compatibility overlap with the ST architecture and a cut-down version of TOS ... sort of an EST (8/16/32 except EST would be terribly confusing with the STe name). Granted the 6309 is also technically a hybrid 8/16/32-bit architecture. 
Note: to even make use of interleaved SRAM speed along 279 ns bus cycles, you'd actually need a 14.318 MHz 68008 (139.7 ns access and 239.4 ns cycle) and then you could run at up to 16.7  MHz when not interleaved (120 ns SRAM).

Now such a system couldn't be cost effective if it had to be both 7800 and XE backwards compatible, and it would be cheaper still (and/or better performing) with neither of those, just re-using existing Atari custom chips (and off the shelf MOS compatible I/O chips) where advantageous. The design philosophy here is quite divisive. Albeit, given they'd already done a 2600-on-a-chip, and given the simpler hardware of the 2600, doing the same thing on a new chip that replaces the 6502 with a 65816 and adds as much I/O and sound hardware as you could manage would be the cheaper option, and cheap enough to not discard all backwards compatibility entirely.

Now THAT said, it's also worth noting they could have done what Commodore did with the PET to VIC to C64 (and C16/Plus4) and still have software compatibility at a high level for all programs using BASIC or running through system OS calls. So a MARIA based computer that used compatible Atari DOS and Atari BASIC and could run tape and disk programs that ran in BASIC or in Atari DOS. And still use the Atari SIO port for standard 8-bit peripherals, or some mix of SIO and Atari ST/PC compatible interfaces. (More likely SIO + Enhanced Cartridge Interface, with the latter allowing further expansion via modules if Atari bothered ... not actually using Atari 8-bit ROM carts, but the same expansion port and pinout, or just a PBI style edge connector.)
And even then they'd have had fewer confusing, overlapping products than Commodore.


Aside from that, they could have designed a new system in parallel with a 7800 expansion unit designed to provide similar features, but this is messier and difficult to do efficiently. Still it probably could've been done OK if they just had a bunch of redundancy on the expansion module (65816 + added RAM + added I/O hardware) and forced software to not use portions of the 7800 that would be eliminated as redundant while still being at least slightly cheaper to produce than an all new system. (that might include duplicating TIA+RIOT onboard the expansion unit in order to access all of those I/O registers at full speed rather than slowing down as the 7800 has to) And even then you have the down side of the 7800's muddy video output (due to the way TIA+MARIA video are coupled) and RF only output. At some point it's just going to be cheaper to not offer an expansion module and just release an all new system, but that really depends on resources needed to support 2 separate hardware releases vs market appeal of an upgrade vs new system, and Atari had a very solid install base of 7800 owners, selling many more of those than 8-bit computers or STs at the time (at least in North America) with over 3 million sold in North America by the end of 1988.

 





Also it's a shame they didn't use the MARIA style display list and object list architecture in souping up or replacing the ST SHIFTER as part of the STe, potentially using the same 5-bit (32 color) line buffer as the Super XEGS and Panther used. For that matter, they could've used the same chip for an 8-bit bus console and on the 16-bit ST. Either you have an 8-bit data bus on the graphics chip and have the memory controller feed it 2x as fast reading from a slower 16-bit bus, or you use a 16-bit bus on the graphics chip and have an 8-bit memory controller in the console feeding it 2 bytes in series through a FIFO. (Just make all lists and data sizes multiples of 16 bits or 2 bytes and you're good ... hell, even with MARIA all you'd need to do is extend DLLs from 3 to 4 bytes and 5-byte DL headers to 6 bytes, keep the 4-byte DLs unchanged, then have graphics data all point to multiples of 2 bytes rather than 1 byte and you're good to go.)
Now it would either need to natively support ST style bitplanes and MARIA style packed pixels (and weird MARIA pixel formats in some 2bpp and 4bpp modes) or not be backwards compatible with both. Given both formats (aside from MARIA's funky nonlinear modes) are useful and advantageous for different situations in video games and in computers, they should have just supported both. That way 5 bitplanes could be used for 32 color objects (or 31 colors + transparent) with maximum RAM/ROM efficiency (at least when uncompressed), and 3-bit objects could also be supported rather than just the 1, 2, 4, or 8 bits per pixel that a packed format would limit you to.
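
To illustrate the difference between the two formats being discussed, here's a purely illustrative Python sketch of planar-to-chunky conversion (the 5-plane, 8-pixel example values are made up):

```python
def planar_to_chunky(plane_rows, width):
    """Convert one row of bitplane data (one int per plane, MSB = leftmost pixel)
    into a list of per-pixel color indices (packed/"chunky" form)."""
    pixels = []
    for x in range(width):
        shift = width - 1 - x
        index = 0
        for plane_number, row in enumerate(plane_rows):
            index |= ((row >> shift) & 1) << plane_number
        pixels.append(index)
    return pixels

# 8 pixels across 5 bitplanes -> each pixel becomes a 5-bit (0..31) color index
row = [0b10101010, 0b11001100, 0b11110000, 0b00000000, 0b11111111]
print(planar_to_chunky(row, 8))   # [23, 22, 21, 20, 19, 18, 17, 16]
```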

Focusing on hardware that could be equally valuable and useful across the 8-bit computer, ST, and game console sides of things seems like a big win in terms of optimal use of R&D. Then the best cases is the same hardware gets used in multiple platforms, and if not that then at least you don't spread yourself as thin with parallel projects (STe, TT, and Super XE all in 1988, Panther and Falcon in 1990, Falcon and Jaguar in 1991/92, all of which used different graphics chips and memory controllers ... which comprises most of the custom chips in the ST family).

Plus a 16-bit souped up MARIA working at ST or better resolutions would be a genuine Amiga beater (even with the 32 color limit it would beat OCS Amiga graphics for games, art/graphics, and productivity applications, especially if the line buffers were taken advantage of to do line doubling of low-res modes at VGA monitor res, so you wouldn't need multi-sync monitors to run the hi and low res modes, just a standard VGA monitor).


There's a variety of other ways MARIA could've been upgraded less dramatically but to very serious effect, like just making parts of it faster or some data fetches faster (like 2 clock ticks for character map and graphics reads instead of 3 ticks), or expanding CRAM to allow for 4 15-color palettes rather than 2 12-color ones (line RAM is already 320x3-bits in hires so should be 160x6-bits in low res, but it only has enough CRAM for 25 entries total, versus the 61 colors that would need in 4bpp mode; 8 logical palettes are supported, but only some modes can use all 8 palette selections due to line buffer and CRAM limitations: in 320 width mode you're limited to 8 colors, that's 8 1-color palettes, or 2 3-color palettes). Using 8 15-color palettes would be possible, but you'd need 7-bit wide CRAM instead of the existing 6.
A 240 pixel wide mode with 16 colors would be possible with the existing line buffer and palette space, but would need new modes using a 5.37 MHz pixel clock (like the SMS or NES or Colecovision, etc) instead of 7.16/3.58 MHz. This might also complicate color generation as the chroma/luma system might depend on integer multiples or fractions of the NTSC chroma clock.
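
For what it's worth, the 240-wide idea works out because the active line time stays the same once you drop the dot clock to 5.37 MHz; a quick arithmetic sketch (the clock values are the NTSC-derived 2x, 1.5x, and 1x chroma rates):

```python
# Active display time is constant across these width / dot-clock pairs,
# which is why a 240-pixel-wide mode implies a 5.37 MHz pixel clock here.
modes = [(320, 7.15909), (240, 5.36932), (160, 3.57954)]
for width, dot_clock_mhz in modes:
    print(f"{width:3d} px @ {dot_clock_mhz:.2f} MHz -> {width / dot_clock_mhz:.1f} us active line")
# all three work out to ~44.7 us
```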

Or, mostly relevant to computer uses, add 640 pixel wide 1bpp (2 color or B/W) modes. Given NTSC artifacting, you'd want to be able to disable colorburst for clear high res graphics/text like CGA allows. (MARIA uses separate chroma and luma outputs already, so you'd just need to terminate the chroma output, like plugging in the luma line from an Atari 8-bit or C64 to a composite monitor or TV.) Then you can do everything CGA graphics modes can do and a lot more ... but you can't do 16 color 40 or 80 column text modes like CGA can (or 16 shade grayscale text in luma-only composite mode). But then the Atari ST also can't do 16 color 80 column text modes like CGA or Tandy video can. You can do 40 column 4 color text on MARIA, or 16 color text if you use 4x8 character cells in 4bpp mode.

Technically, you can do 80 column 4-color text using MARIA via a 320x200x2bpp framebuffer (not enough DMA time to do 80 columns in character mode), but you'd be using 4x8 characters there as well. OTOH you at least can render them quickly as they're all 1 byte wide (4 pixels at 2 bits per pixel), where you have to do bit manipulation for the same sort of 80 column mode on GTIA's 320x200 1bpp mode. (Plus you get 4 colors, or 4 colors per scanline, for more colorful text or more shades at least in grayscale mode, potentially useful for anti-aliasing of the low-res characters: 4 shades of gray is pretty decent for doing some basic antialiasing.)



If nothing else, a MARIA based 8-bit (or 8/16-bit) computer could easily manage to beat the C64's graphics (for games or otherwise) as well as the Atari 8-bit computer, and has some advantages over the Atari ST itself, especially when you start considering RAM and ROM limitations and tricks/compromises used on some ST games for software scrolling effects or hardware transparency by sacrificing colors. (using pre-shifted sprites and background data uses up lots of RAM ... try doing that even with 128 kB of RAM, let alone 64kB and with sane ROM sizes, especially on the lower cost end Atari Corp tended towards) For that matter, with small-RAM limited situations, MARIA probably beats out the Amiga chipset as well, even once you have enough for some framebuffers, now MARIA can do framebuffers as well at 320 wide up to 4 colors or 160 wide up to 13 colors (plus more colors for sprites) vs Amiga not allowing 160 pixels (or not at 3.58 MHz pixel clock) so would be stuck with using fewer bitplanes at 320 width or at least 256 width to avoid using too small of a screen (or maybe a bit less if you use a 1-bitplane side-bar style scoreboard/status bar or something) Or technically you could still do 320x200x4bpp (or slightly less visible) if you single buffered with a looping scroll in RAM, but it'd be tight and require 60 Hz blitter/screen updates to avoid artifacts. Drop that down to just 32kB and maybe you could manage 3 bitplanes single buffered with less than 8kB of RAM left for the CPU to use. (though realistically, the Amiga was never going to use less than 128kB ... and if you did do 64 kB it'd have to be using 16kx4-bit DRAMs, though technically you could use 4 16kx4-bit chips to do 32kB 16-bits wide ... but had Atari Inc been planning on releasing an Amiga based game console in 1985, it probably would've had at least 64kB)

Though, to the Amiga's credit, the flexible display list nature of its framebuffer, plus the 8 hardware sprites does make it surprisingly viable as a game system all the way down to 64 kB. And given you'd be working in ROM, you could use almost all the RAM for a framebuffer ... so even with just 32kB you could technically do a 256x192x5-bitplane title screen with limited animation ... or maybe a little more or less than that, I'm not sure how much you'd use up with Copper lists and bare minimum CPU work RAM. (infinitely more than you could do with an ST with just 32kB of RAM) Of course, none of that considers the cost of the chipset vs RAM and whether so little RAM was ever worth it relative to total system cost even in a bare minimum game console implementation. (ie DRAM prices dropped far too fast in 1985 for this to be a real consideration)

Then again, Atari Inc had actually been considering releasing an Amiga based system (console or computer) in 1984 had the chips arrived on time ... instead, the Amiga team claimed the silicon was bad, then refunded Atari's license money while in negotiations with Commodore (and Atari Inc was such a mess that someone cashed that check without reporting it to upper management, let alone James Morgan, all within a month of Warner liquidating the company outright). Of course, Amiga ended up reneging on all other investors/licensees of the chipset in order to sell to CBM.
 

 

 

 

On 1/25/2019 at 12:23 PM, davidcalgary29 said:

This would have been very nice to see...in 1985. Exactly when did they think that this would be ready for market and to whom did they expect to sell it? Still, this is the first that I've seen that the XEGS (or whatever this is) wasn't just a one-off to repackage and utilize existing A8 overstock. And I've always wondered if Atari was considering a "POKEY II" or something like it in the face of the AMY fiasco.

I realize this is an old post, but it's something I see come up quite often.

The XEGS and XE family as a whole wasn't designed to use up existing stockpiles of anything. The XE may initially have used some old stocks of RAM and Atari chips in inventory, but a very large number of them were built with newly manufactured components, and by 1987/88 (around the time of the XEGS's release) they were using all new components, based on the date stamps and manufacturer names on the custom chips and off the shelf parts. (I'm not 100% sure on POKEY chips; they may have had a bigger stockpile of those than other 8-bit computer parts for some reason ... possibly due to the heavy use in arcades and plans to use them in 7800 carts.)

They had just introduced new revisions of the XE motherboards at about the same time as the XEGS, switching to two 64kx4-bit DRAMs instead of 8 64kx1-bit DRAM chips (or 4 vs 16 in the 130XE), and the XEGS was based on the new DRAM type. The Lynx also ended up using the same DRAM density and, often, the same mix of parts; most if not all the XE computers of that era used 120 ns or faster DRAM (almost 3x as fast as the chipset actually needs).
Actually, if they'd updated the chipset internally, those XEs could've been running comfortably at 2x their existing speed based on the DRAM used (a lot like early 90s Amiga 500s and 600s using 70 ns DRAM on the motherboard and cycling at less than 1/2 the rated speed). Side-note, but at least in the case of the ST, the existing MMU (especially of later production dates) can already run at 2x speed and support a 16 MHz 68000 and, potentially, a 64 MHz SHIFTER with double res video modes (ie 640x200 16 colors). I'm not sure if the 8-bit chipset or the Amiga could be modified in that way. The STe can't use the same trick, as it requires the GLUE chip timing to be separate from the MMU and SHIFTER. 70 ns DRAM works, and some types of 80 ns do as well. (I haven't seen any STF/STFM models using faster than 100 ns chips though; I think production shifted to the STe too early for that to happen.)
Though the STe might allow the same overclock if you could provide external video sync and blanking times using the GENLOCK mode features. (in this respect it should actually be easier to do than on the STF, but can't be done by just overclocking existing chips on the board: all you need for the old ST/STF is to synthesize and possibly buffer appropriate clock signals from a common, synchronous source)

See here for a successful implementation of doubled resolution modes on an overclocked ST SHIFTER:
https://blog.troed.se/projects/atari-st-new-video-modes/


Anyway, for that same reason, the DRAM chips being used in the late model XE computers and XEGS were already fast enough to be used as MARIA ROM and probably as MARIA RAM as well (I'd have to ask 7800 programmers about this, particularly if DLLs are read faster than DL headers, in which case you'd at least need SRAM specifically for DLLs but not DLs).






 


OK, so I read through the document more completely.

The feature set is nothing like the 7800 or Panther. It also doesn't look to need double line buffers like MARIA or the Panther (or Jaguar) used, but just a single 320x5-bit line buffer for loading background and sprite data into.
It's much more orthodox or standard for the late 80s and early 90s game consoles, except it includes bitmap framebuffer modes as well, like GTIA does, but higher res and color depth, more like the ST. Except it also has a full-screen background rotation mode like SNES Mode 7 (but in 16 rather than 64 colors).

You've got bitmap framebuffer or tilemap backgrounds, 12-bit RGB colorspace like STe or Amiga.

64 hardware sprites per frame (without multiplexing)
Sprites are 16 colors (presumably 15+ transparent)
16 sprites per scanline, 2-bit size select for 8 or 16 pixels wide and 8 or 16 pixels tall. All sprites can be their max 16x16 per line, so 256 sprite pixels per line.
1-bit palette select from 2 16 color palettes (32 colors on screen total, shared with background)
2-bit sprite scaling for both H and V (1x, 2x, and 4x settings for width and height), somewhat like the VCS does
(could help with scaling/zooming animation with animation frames filling the in between steps or just BIG chunky coarse pixel sprites)
8-bit V and 9-bit H position (512x256 screen/off-screen space)


Tilemap background (character mode) using a set of 256 8x8 character cells per frame
40x24 character map (320x192 pixel screen)
2-bit per pixel (4 color) 8x8 character cells,
8 4-color palettes can be selected per character.
1-bit transparency flag allows 3 colors + transparent instead of 4 colors. (selected per tile)
2-bits for horizontal and vertical flipping/mirroring of tiles
per-scanline scrolling effects
per-column scrolling effects (usually used for screen-tilting as seen in Mega Drive games)

I worked out the math and it seems like sprites are loaded during H-blank and active screen space is in a similar area to the ST or C64, ie larger horizontal border space than the A8 or Amiga, and using an 8 MHz pixel clock like the 320 wide modes on ST and C64 vs 7.16 MHz on the A8 and Amiga.
With 512 8 MHz cycles per scanline, and 320 for the screen, this leaves 192 for border area, and 192 bytes is exactly what's needed for 16 sprites using 4-byte sprite list entries and up to 8 byte (16 pixel) width. This also means peak bus bandwidth is 8 MB/s, possibly using bank interleaving or page mode (there's 2 8-bit DRAM banks, so bank-interleave would work). CPU only accesses DRAM at 4 MHz, though, so the chipset can soak up more bandwidth than the CPU.
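
The back-of-envelope version of that, as a sketch using the numbers from this paragraph:

```python
PIXEL_CLK_MHZ = 8.0
CYCLES_PER_LINE = 512            # 8 MHz cycles per scanline, per the estimate above
ACTIVE_PIXELS = 320

blank_cycles = CYCLES_PER_LINE - ACTIVE_PIXELS      # 192 cycles of border/blanking
sprite_fetch_bytes = 16 * (4 + 8)                   # 16 sprites: 4-byte entry + up to 8 data bytes
print(blank_cycles, sprite_fetch_bytes)             # 192 192 -> sprite fetch fits blanking exactly

peak_bandwidth_mb_s = PIXEL_CLK_MHZ                 # one byte per 8 MHz cycle = 8 MB/s peak
print(peak_bandwidth_mb_s, "MB/s peak")
```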

Bitmap framebuffer screen modes:
320x192x4-bit (16 colors) stealing 50% of CPU cycles
640x192x2-bit (4 colors) stealing 50% of CPU cycles
Special 320x192 16 color mode with full screen rotation (possibly scaling/zooming too, given the same hardware needed for rotation normally does scaling as part of the same math)
This mode uses all bus time (steals all CPU cycles) when displaying the screen. (CPU could work in vblank, or on screen lines where that mode isn't used)


The CPU runs at 8 MHz internally and 4 MHz when accessing DRAM. Internal RAM is listed as 512 bits, and either that's a typo, or it's 64 bytes of on-chip RAM. It could very well be the latter. The 8 MHz speed switch is done automatically when accessing that on-chip memory area. (also probably all the chipset register space as well, graphics registers, I/O ports, etc)
That'd give a tiny scratchpad to use for bits of code and data, most likely mapped into zero page (or potentially able to be split for 32 bytes in page 0 and 32 for stack use or something like that). As zero page pseudo-registers that could probably make decent use of the 8 MHz speed and allow for compiled code in C to work much better than usual on 650x processors, especially sloppy C compilers that use zero page as 68k or x86 style register space. (ie lots of loading into zero page rather than executing outside of zero page; but with that scratchpad area running 2x as fast, you'd at very least lose less performance for that sort of code, and in some cases it might even run faster than not copying into fast RAM)


The sound system also doesn't appear to be the Ricoh sound chip I was thinking of, it's just a simpler set of 8 sound channels with 4 fixed sample rates (so no note-scaling via sample rate like the Amiga does, you need samples pre-scaled in RAM). You set the sample length and loop them, so it's still better than software mixing on the STe. It's closer to the DMA sound on the Falcon, but 8-bit and 40 kHz max.
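
Since the channels play at fixed rates, pitching a sample up or down means pre-scaling it in memory, roughly like this (a toy Python sketch of the general idea, not anything from the document):

```python
def prescale_sample(samples, pitch_ratio):
    """Nearest-neighbour resample of a stored waveform so it plays at a different
    pitch through a fixed-sample-rate DMA channel (ratio > 1 raises the pitch)."""
    out_length = int(len(samples) / pitch_ratio)
    return [samples[int(i * pitch_ratio)] for i in range(out_length)]

original = list(range(0, 256, 8))             # 32-sample dummy waveform
fifth_up = prescale_sample(original, 1.5)     # ~21 samples: plays back a fifth higher
octave_down = prescale_sample(original, 0.5)  # 64 samples: plays back an octave lower
print(len(original), len(fifth_up), len(octave_down))
```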


The 16 color 320x192 mode would probably be useful for doing some software sprites and/or more complex backgrounds than the tiled mode, especially if packed pixels are used and not bitplanes. The A8 hardware uses packed pixels, so that would make sense, and that rotation mode would need to use packed pixels to work efficiently. (then again, the SNES uses bitplanes for sprites and tilemap graphics except for mode 7, which uses 8-bit chunky pixels)

There's no mention of a hi-res monochrome mode like the ST has, even though it would use the same bitrate as the other 2 bitmap modes (4 MB/s for those, the same as all Atari ST screen modes), and 640x400 monochrome at 71.5 Hz would be a really nice feature to see in a low-cost 8-bit computer. Similarly, they could have supported VGA analog RGB modes in 2 colors, or 320x400 in 4 colors (or 320x200 line-doubled), and 320x200 in 16 colors line-doubled too if they used the full 8 MB/s.
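The "same bitrate" claim is just dot clock times bits per pixel. The 32 MHz figure for a hypothetical 640x400 monochrome mode mirrors the ST's mono timing and is my assumption, not anything from the spec:

def mb_per_s(dot_clock_mhz, bpp):
    return dot_clock_mhz * bpp / 8

print(mb_per_s(8, 4))    # 4.0 -> 320-wide 16-color bitmap
print(mb_per_s(16, 2))   # 4.0 -> 640-wide 4-color bitmap
print(mb_per_s(32, 1))   # 4.0 -> a 640x400 monochrome mode would have matched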

There's also no 32-color bitmap mode. It would have to use bitplanes, but it would fit into the 320x5-bit line buffer, just with a bit more CPU cycle stealing, giving you an Amiga-like 32-color bitmap mode for graphics/art and some games. Better still would be a 6.4 MHz pixel clock (8 MHz x 4/5), keeping the same 50% CPU cycle stealing for 256x192 in 32 colors with sprite DMA, or 320 pixels overscanned (about 300 visible on most TVs).
A 6.4 MHz pixel clock is also close to square pixels on NTSC 60 Hz TVs and monitors, so even better for art. (The numbers are worked through below.)
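The numbers behind the 6.4 MHz idea, under the same one-byte-per-cycle assumption as before:

dot_mhz     = 8.0 * 4 / 5               # 6.4 MHz pixel clock
active_us   = 256 / dot_mhz             # 40 us of active display for a 256-pixel line
bus_cycles  = active_us * 8             # 320 bus cycles at 8 MHz in that window
line_bytes  = 256 * 5 / 8               # 160 bytes per line at 5 bpp
print(dot_mhz, line_bytes / bus_cycles) # 6.4 0.5 -> same 50% steal as the 16-color mode

So a 256-wide 32-color screen would cost the bus no more than the existing 320-wide 16-color mode, at least in this simple model.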

There's also no 256-color mode, which it lacks the palette RAM to do, but since it must have a ROM look-up table for emulating the 256-color A8 palette, they could have just used that for a 160x192 256-color mode (also good for 3D or pseudo-3D with chunky pixels, or for a 256-color version of the screen rotation mode). That would also be easier to add to a chunky-pixel system than to any bitplane mode, where you need extra on-chip buffers and bit-shifting logic to convert bitplane data into screen pixels.
Technically, the 320x5-bit line buffer could instead hold 200 8-bit pixels, giving a 200x192 256-color screen (you'd want a 5 MHz pixel clock to leave room for sprite DMA, or just use 6.4 MHz for square pixels and a larger border). But then the palette data is no longer available for sprites, so they'd just be 8-pixel-wide 256-color sprites, and you couldn't place sprites outside that 200-pixel area.
On the other hand, a 160-pixel-wide mode could use a 10-bit line buffer, allowing a 256-color framebuffer plus normal 16-color sprites for 256 + 32 = 288 colors on screen at the standard C64 160x192 pixel and screen size. Those 16 sprites could then more than fill the screen width, which would give some nice options for sprite-based parallax in the foreground. (The line-buffer arithmetic is below.)
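The line-buffer arithmetic for those 256-color ideas, taking the 320x5-bit buffer as given:

BUFFER_BITS = 320 * 5          # the 320x5-bit line buffer
print(BUFFER_BITS)             # 1600
print(BUFFER_BITS // 8)        # 200 -> 200 8-bit pixels fit (the 200x192 256-color idea)
print(BUFFER_BITS // 10)       # 160 -> 160 10-bit entries (256-color pixels + 16-color sprites)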

They might have thought adding a higher-color low-res mode was silly, but realistically it would have been useful in plenty of situations, and on top of that it would be great for 3D games. It's certainly no more silly and no less useful than the low-res 16-color (or 9-color) GTIA modes. Mirroring the actual GTIA modes would have meant a 16-bit-per-pixel mode at 80x192 with the full 4096 colors, or, better than that, 160x192 with 4096 colors using maximum CPU cycle stealing, except that you only have 1600 bits of line buffer space, enough for at most 133 pixels at 12 bits.
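And the same buffer at those higher depths:

BUFFER_BITS = 1600
print(80 * 16)                 # 1280 -> 80 16-bit (4096-color) pixels fit comfortably
print(160 * 12)                # 1920 -> 160 12-bit pixels don't fit
print(BUFFER_BITS // 12)       # 133  -> hence the 133-pixel ceiling mentioned above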

In terms of chip space and expense, special low-res high-color modes would have been much cheaper than the rotation feature, though having SNES-style full-screen rotation or texture-mapped ground planes would certainly have been impressive at the time, even more so than it was in 1991. Technically you could also do some of that in software, especially in a low-res 8-bit chunky-pixel mode (easier if you gave the CPU a hardware multiplier like the Lynx's... or the SNES's, but much faster than that).




Also, I'm not convinced the DRAM shortage alone killed this project, since it would have been super easy to just use two 32Kx8-bit SRAMs to populate the 2 RAM banks and not have to worry about DRAM interfacing at all. Keeping PSRAM open as an option would have expanded this further while using the same pinout and packaging as SRAM, so the same motherboards could take either (you just need some refresh logic to let PSRAM run at full speed, though scanning out the screen data could auto-refresh it, and DMA sound cycling could probably handle it during vblank).
64 kB of SRAM would still be enough to keep it backwards compatible with the XEGS and 65XE, and PSRAM would have kept the price point closer to that of DRAM (or possibly cheaper than DRAM at the height of the shortage). DRAM is inherently cheaper to make, but it's also the only type of RAM that plugs into all those PCs and clones of 1988/89, where SRAM and PSRAM do not (aside from 8-bit or 16-bit ISA cards populated with SRAM, or proprietary RAM upgrade slots on some boards, though almost all of those were populated with DRAM too... I think the IBM Convertible used SRAM upgrade cards).

And hell, using SRAM as main RAM in home computers was an old Tramiel thing when it was the cheaper option for one reason or another (usually surplus stock with nothing better to use it for). That's what the PET and VIC-20 used.

But since this was a computer, not just a game console, and it would have used the same DRAM chips as the 65XE, 130XE, and Lynx at the time, with 128 kB minimum (two 64 kB banks), the shortage shouldn't have been a deal breaker.

  • Sad 1
Link to comment
Share on other sites

Kool Kitty89, wow, that's got to be some kind of record. That is the longest single post I've ever seen. How is it that you know so much about memory timings, I/O and processors? By the way, I'm talking about the previous post. I'll read the one above when I have time...

Edited by mutterminder
Link to comment
Share on other sites

  • 4 months later...
On 1/30/2022 at 7:51 AM, leech said:

What it comes down to is that they both were kind of crap in comparison to some of the other platforms. The problem is the other platforms were run by companies that were floundering, or just outright controlled by corrupt individuals. Apple barely survived the 90s. IBM compatibles survived because 'No one ever got fired for buying IBM' was a thing at one point, and that propagated to home use. The one thing neither of these (PC compatibles, Mac) was more than mediocre at was gaming. As soon as VGA and sound cards came about... well, the rest is history. Atari/Commodore/others just couldn't keep up with the technical improvements that came from the IBM open architecture. Just think, if the reverse engineering of the IBM BIOS never happened, the computer industry today would be very, very different. No clue if it'd be any better, but it would definitely be different.

 

Now mind you... this is just discussing HOME computer usage. What about all the other big-iron competitors that basically became consolidated into x86 (and now, more and more, ARM)? I'm talking about SGI, Sun, DEC, etc. These systems were amazing for the time! The end of them? Well, the infighting of the various Unix flavors. Funnily enough, we now basically have a few flavors of BSD and the many distributions of Linux to replace them. But the real reason we don't see them anymore is that Linux was a free Unix-like operating system for x86 (commodity hardware), so people no longer had to pay $50k-$200k to SGI for a nice workstation...

Well, what killed off the DEC Alpha platform had nothing to do with Linux or BSD. It was the Intel Itanium, which promised a lot in theory but, after years of trying, failed to deliver. Somehow DEC were convinced by Intel that Itanium was the future, and that decision killed DEC.

MIPS, Sun SPARC, etc. were killed by Intel and AMD advancing x86 (and AMD's x64) faster than the other companies could develop their own architectures, whilst keeping prices low enough to compete. Motorola and IBM suffered the same fate with PPC in the desktop arena, mainly because their only desktop customer was Apple, which was a tiny market; it was too costly to try to compete.

 

The general conversation about Atari, CBM, Apple and IBM compatibles is missing something, though. A lot of places went from proprietary UNIX boxes to cheaper commodity hardware running Windows NT, not Linux. The IBM compatibles eventually dominated, but IBM themselves floundered and lost the market. Lots of companies came along, all making the same basic product, but more cheaply than IBM. Intel were happy to help, and MS went to great lengths to help them too. The 'PC' didn't succeed because IBM had made a better platform, but because they lost control of one that was easily copied, allowing everyone to throw development at it and see what would stick. IBM didn't see it coming, and neither did anybody else.

 

There were many companies designing nothing but graphics cards with advanced features. The likes of ATI, 3DFX and Nvidia produced custom chips that went into common expansion slots and eventually settled on standardised APIs. The same goes for sound cards etc. Companies like Atari and Commodore were bound to struggle because they couldn't possibly match the combined resources of the developing PC market, even though that market had no clear plan. But today, most of those really big PC manufacturers have disappeared too, along with many successful peripheral manufacturers. Unfortunately, Tramiel was right... business is war.

 

 

  • Like 1
Link to comment
Share on other sites

On 12/12/2023 at 4:39 PM, Gromit337 said:

The general conversation about Atari, CBM, Apple and IBM compatibles is missing something, though. A lot of places went from proprietary UNIX boxes to cheaper commodity hardware running Windows NT, not Linux.

It really depends on which industry you're talking about. Industrial Light and Magic, for example, definitely did not go to NT-based systems; they switched from SGIs to Linux machines.

 

It also depends on when they did the switch. I know there were many, many companies that went from Solaris to Linux. I don't think Windows NT ever really had that much widespread adoption outside of a few places. Most places that were Unix shops remained *nix shops.

Link to comment
Share on other sites
