
It's 1993, you're in charge of the Jag, what do you do?


A_Gorilla

Recommended Posts

Weren't some ideas carried over into Nuon? Not Jag II necessarily, but the Jag generally?

Was the Flare team involved with Nuon? I didn't know about that. It seems like Flare had a lot of unfortunate applications of their designs; I don't know much about more recent stuff, but of those I know of, the Jaguar might have been the most successful. (Flare 1/Konix Multisystem went unreleased -and seemed like a pretty bad idea in the case of the Multisystem- then the Jaguar, and possibly Nuon)

 

 

It makes you wonder not only what might have happened at Atari with different management (among other issues), but what might have happened had Flare partnered with another company, a better funded, game-oriented one. I doubt Sega would have, though, given how the Japanese management turned down the Silicon Graphics chipset (unless that really did have to do with the chipset itself and not a stigma against foreign development), Sony had their own stuff, and NEC didn't seem particularly interested in western designers either, though I'm not sure about that. Nintendo did in fact go with Silicon Graphics, so that at least shows they were willing to work internationally. If Sega really did decline the SGI chipset purely on technical grounds (it did indeed seem to take a good while for Nintendo to get things worked out with the hardware), the Jaguar chipset really is along the lines of what Sega had been aiming at. If NEC had been willing to look to western developers (they were quite willing to outsource, with the PCE being designed by Hudson), the Jaguar chipset would probably have fit well with them too.

 

In any such case, Flare would have had the benefit of much more funding and a company in a much better position to support the product and manage the hardware cost. (in NEC's case, they could have manufactured the chips themselves, and I think they had their own CD drives as well) In all these cases, the hardware price could have been subsidized by the companies, though probably not to the extent Sony did. Given the design, I'd imagine any of those companies would have opted for a MIPS R3000 CPU as well, given the selection Flare provided. (unless Flare had been directed to use an architecture other than the 68k/x86/MIPS options the Jaguar design supported, like one of NEC's own RISC architectures, or Hitachi's SuperH in Sega's case)

Edited by kool kitty89

Weren't some ideas carried over into Nuon? Not Jag II necessarily, but the Jag generally?

Was the Flare team involved with Nuon? I didn't know about that. It seems like Flare had a lot of unfortunate applications of their designs; I don't know much about more recent stuff, but of those I know of, the Jaguar might have been the most successful. (Flare 1/Konix Multisystem went unreleased -and seemed like a pretty bad idea in the case of the Multisystem- then the Jaguar, and possibly Nuon)

Yes, Miller & Mathieson designed the chip, and Jeff the Yak Minter was involved from the beginning as well. Nuon was an interesting beast - multi-core 128-bit VLIW chip. To get anything useful out of it required hand crafted (you even had to manually pack your instructions together since up to 5 - maybe it was 7 could be ran at once on a single clock cycle) assembly. There was a C compiler with some OpenGL like libraries, but performance was absolute crap. It was a non-optimized GCC compiler hastily adapted for the Nuon chipset. Sound familiar? It's amazing Minter got Tempest 3000 working at all. 100% assembly that was, including the shaders (long before shaders existed on the PC).

 

Pretty cool Nuon page (sadly not updated in ages) is here. My Nuon still sits next to the Jag - 2 Tempests and 2 VLMs :)


I don't see how (the Jag II) could achieve the same raw performance as the MIPS in the Playstation 1.

Are you talking about the existing Jag II prototype or the specs for the final design?

Jag II specs. There was no prototype that included the "C-friendly" JagRISC. That chip design was unfinished. The only chip that made it to early prototype was the graphics chip.

 

The Jag II had a lot of raw power -- but so did the Jag. Like the Jag, it wasn't very accessible to mortal programmers.

 

Certain consoles have been very accessible, including the Playstation 1, Xbox 360, etc. If you're the underdog, ease of programming sure helps your chances. The Jag II took some steps in the right direction, but the 3D performance locked inside was very hard to tap compared to contemporary consoles. And the C performance, while nowhere near as bad as the Jag I's, wasn't as good as the PS1's or N64's.

 

The Nuon was also very Jag-like in its inaccessibility. Stephen summed it up well.

 

The Nuon reminds me of the Cell processor in the PS3 -- the Nuon "software" DVD decoder has a very similar architecture to the PS3's Blu-Ray decoder.

 

Like the Cell, the Nuon has one "main" processor with a cache and a number of stripped-down slave processors that must live inside scratchpad memory. In both cases, a flexible DMA engine ties it all together. Also like the Cell, the Nuon is strictly in-order, with ugly scheduling requirements exposed to the compiler.
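To make the scratchpad-plus-DMA style concrete, here's a rough C sketch of the double-buffering pattern it forces on you. All the names here are hypothetical -- this is the general pattern, not the real Nuon or Cell API:

/* Hypothetical scratchpad/DMA double-buffering sketch -- made-up
   function names, not a real Nuon or Cell API. */
#define CHUNK 1024

extern void dma_start(void *dst, const void *src, unsigned bytes);
extern void dma_wait(void);             /* waits for the last dma_start */

static float bufA[CHUNK], bufB[CHUNK];  /* these live in scratchpad RAM */

/* Scale a stream of n floats (n assumed a multiple of CHUNK). */
void process_stream(const float *main_mem, unsigned n, float gain)
{
    float *cur = bufA, *next = bufB;
    dma_start(cur, main_mem, CHUNK * sizeof(float));
    for (unsigned i = 0; i < n; i += CHUNK) {
        dma_wait();                     /* current chunk has arrived   */
        if (i + CHUNK < n)              /* kick off the next fetch now */
            dma_start(next, main_mem + i + CHUNK, CHUNK * sizeof(float));
        for (unsigned j = 0; j < CHUNK; j++)
            cur[j] *= gain;             /* compute overlaps the DMA    */
        /* (a real engine would also DMA results back out -- omitted) */
        float *t = cur; cur = next; next = t;
    }
}

None of that bookkeeping exists on a cache-based CPU -- the hardware does it for you.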

 

On the plus side, this type of multicore design gets you the most raw power for your dollar. On the down side, you need amazing tools and very motivated developers to push it to its limit.

 

I don't know much about why the Nuon failed, but I know it shipped much later than planned and was not cost effective as a DVD player chipset when it did ship. Maybe if it had shipped sooner and cheaper, more device makers would have used it, which might have led to more games. It's hard to imagine game developers getting excited about it without a big install base.

 

- KS

Edited by kskunk

I don't see how (the Jag II) could achieve the same raw performance as the MIPS in the Playstation 1.

Are you talking about the existing Jag II prototype or the specs for the final design?

Jag II specs. There was no prototype that included the "C-friendly" JagRISC. That chip design was unfinished. The only chip that made it to early prototype was the graphics chip.

 

The Jag II had a lot of raw power -- but so did the Jag. Like the Jag, it wasn't very accessible to mortal programmers.

Looking a bit at the Jag II chipset, it doesn't seem like they shifted towards the mainstream mass market as much as they could have in terms of graphics; they seem to have focused on a proportional succession of the Jaguar's strengths instead. (ignoring the C-friendly CPU bit for the moment) My understanding is a bit limited, but from what I can get out of the technical description here: http://www.atarimuseum.com/videogames/consoles/jaguar/jag2specs.html For one thing, the addition of Phong shading seems a bit unnecessary, though I'm not sure about the design as a whole. It seems like they're still relying on the custom RISC MPUs to perform the 3D math and rasterize triangles. (unless I'm mistaken, the blitter can't draw geometric primitives on its own)

Lacking a lot of those features makes sense on the Jaguar; when they started the design, it was far from clear what methods would become dominant. But for the Jaguar II, I'd have thought they'd follow more along the lines of what was becoming industry standard. (3D objects rendered using triangular primitives with an emphasis on texture mapping, followed by Gouraud shading, filtering, antialiasing, etc)

I mean, Sega did a complete 180 with the Dreamcast compared to the Saturn, and while the Jaguar II, as an evolution of (and compatible with) the Jaguar chipset, couldn't be quite as radically different, it could still have shifted emphasis to what were clearly (by the mid 1990s) becoming the standards for "real 3D."

 

It may be a shame that video game hardware companies and PC graphics accelerator architectures didn't put more effort into flexibility and support for things like voxel rendering and advanced 2D, but it doesn't really make sense to go against all that when you're the minority party. Perhaps they could have found a niche market catering to those who preferred good shading, 2D, etc over the mainstream, but I'm not sure they could have pulled that off. Even with the focus shifting towards industry standard, with the hardware still derived from the original Jaguar, some capabilities were going to stay biased towards the original concept, so it would undoubtedly have retained some advantages in particular aspects contemporary platforms neglected. (though likely relatively few developers would take advantage of those aspects)

 

Being tied down to the 68k architecture (for compatibility) probably wouldn't have helped too much either, at least not with the 68EC020 planned for the Jaguar II; it probably wouldn't have been all that useful to the overall system, or if useful, still rather limited. Perhaps another 68k CPU could have been used, but I doubt anything above a 68EC040 would be practical, and even that would still probably be a bit expensive. (an 060 would be out of the question)

 

Certain consoles have been very accessible, including the Playstation 1, Xbox 360, etc. If you're the underdog, ease of programming sure helps your chances. The Jag II took some steps in the right direction, but the 3D performance locked inside was very hard to tap compared to contemporary consoles. And the C performance, while nowhere near as bad as the Jag I's, wasn't as good as the PS1's or N64's.

The Dreamcast and Xbox (especially the former) would fit that kind of developer-friendly category as well, wouldn't they?

 

Though, in the N64's case, there's always the problem of costly cartridges, something the Jaguar II shouldn't have had to deal with and the PSX obviously didn't. Still, using the custom architecture does kind of screw things up on that front. (had it been used only for the graphics portion it might have made a bit more sense -except they'd still have had to find a commercial CPU with a suitable cost/performance ratio, and being tied to the 68k for compatibility made that a bit late)

 

 

The Nuon was also very Jag-like in its inaccessibility.

Did Flare (or any of the members of the Flare team) have anything to do with Nuon?

 

I don't know much about why the Nuon failed, but I know it shipped much later than planned and was not cost effective as a DVD player chipset when it did ship. Maybe if it had shipped sooner and cheaper, more device makers would have used it, which might have led to more games. It's hard to imagine game developers getting excited about it without a big install base.
I'd imagine the PS2 had something to do with it: a video game console with out-of-the-box DVD video support, at a price competitive with contemporary standalone DVD players, and from Sony, who already had a ton of hype built up around the console and a large established user base to tie into.

 

Hmm, that does make me wonder a bit though: why didn't any movies include interactive content specifically for use with the PS2, in a manner similar to Nuon?

 

 

On the PS3 and Cell, I'd gotten the impression from comments on it that the PS3 is actually a bit friendlier to work with than the PS2. (not sure if that refers to the base hardware, the tools, or both)


I don't know much about why the Nuon failed, but I know it shipped much later than planned and was not cost effective as a DVD player chipset when it did ship. Maybe if it had shipped sooner and cheaper, more device makers would have used it, which might have led to more games. It's hard to imagine game developers getting excited about it without a big install base.

 

I'd imagine the PS2 had something to do with it: a video game console with out-of-the-box DVD video support, at a price competitive with contemporary standalone DVD players, and from Sony, who already had a ton of hype built up around the console and a large established user base to tie into.

 

Nah, different market. NUON was promoted as a 3DO-like licensing platform, and specifically as an advanced-feature DVD player that also played games, not as a game console in and of itself. Its intended competitors were other DVD players, not the high-end game consoles of the time. I was actually at their private show area at the Winter CES back in 2001, probably the only outside person I know of to have taken pictures of everything for documentation. What killed it was lack of DVD industry adoption (very few NUON-specific titles were released) and the cost of its first-generation chips. That raised the cost for anybody manufacturing their NUON-licensed unit, and that, combined with the lack of a large library of NUON-enhanced DVD movies (and an even smaller game library), didn't give the consumer much of a reason to spend the extra money to buy the units. VM Labs was working on a cost-reduced version of the NUON chips, but went bankrupt before they could be put out. They were bought out by Genesis Microchip, who in turn is now owned by STMicroelectronics. My partner at atarihq.com, Keita Iida, was director of product marketing at VM Labs - that included game licensing, etc. Most of the people involved over there were actually ex-Atari Corporation people.


I don't see how (the Jag II) could achieve the same raw performance as the MIPS in the Playstation 1.

Are you talking about the existing Jag II prototype or the specs for the final design?

Jag II specs. There was no prototype that included the "C-friendly" JagRISC. That chip design was unfinished. The only chip that made it to early prototype was the graphics chip.

 

The Jag II had a lot of raw power -- but so did the Jag. Like the Jag, it wasn't very accessible to mortal programmers.

Looking a bit at the Jag II chipset, it doesn't seem like they shifted towards the mainstream mass market as much as they could have in terms of graphics; they seem to have focused on a proportional succession of the Jaguar's strengths instead. (ignoring the C-friendly CPU bit for the moment) My understanding is a bit limited, but from what I can get out of the technical description here: http://www.atarimuseum.com/videogames/consoles/jaguar/jag2specs.html For one thing, the addition of Phong shading seems a bit unnecessary, though I'm not sure about the design as a whole. It seems like they're still relying on the custom RISC MPUs to perform the 3D math and rasterize triangles. (unless I'm mistaken, the blitter can't draw geometric primitives on its own)

 

 

Those specs seem wildly inflated compared to what I was told at the time, and also the info in the Midsummer tech guide. There's definitely no Phong shading in hardware :)


Weren't some ideas carried over into Nuon? Not Jag II necessarily, but the Jag generally?

Was the Flare team involved with Nuon? I didn't know about that. It seems like Flare had a lot of unfortunate applications of their designs; I don't know much about more recent stuff, but of those I know of, the Jaguar might have been the most successful. (Flare 1/Konix Multisystem went unreleased -and seemed like a pretty bad idea in the case of the Multisystem- then the Jaguar, and possibly Nuon)

Yes, Miller & Mathieson designed the chip, and Jeff the Yak Minter was involved from the beginning as well. Nuon was an interesting beast - multi-core 128-bit VLIW chip. To get anything useful out of it required hand crafted (you even had to manually pack your instructions together since up to 5 - maybe it was 7 could be ran at once on a single clock cycle) assembly. There was a C compiler with some OpenGL like libraries, but performance was absolute crap. It was a non-optimized GCC compiler hastily adapted for the Nuon chipset. Sound familiar? It's amazing Minter got Tempest 3000 working at all. 100% assembly that was, including the shaders (long before shaders existed on the PC).

 

Pretty cool Nuon page (sadly not updated in ages) is here. My Nuon still sits next to the Jag - 2 Tempests and 2 VLMs :)

Same here. People often ask why I have 2 DVD players hooked up; then I show them the Samsung/Nuon with Tempest, and they get it! :D


My understanding is a bit limited, but from what I can get out of the technical description here: http://www.atarimuseum.com/videogames/consoles/jaguar/jag2specs.html

Like a lot of "console specs" on the internet, that is mainly just fanboys and/or confused magazine editors making up crazy stuff. ;)

 

The real Jag II used a 68000. It had a C-friendly RISC, with cache, which was both faster and more bus efficient than a 68020. The 68000 was there for backward compatibility and to boot the custom chips. (Sounds familiar!)

 

The emphasis was on fast texture mapping. In pure texture mapping benchmarks, it was almost double the speed of a Playstation 1. It had polygon rasterizer hardware in the blitter. It was also capable of bilinear filtering like the N64, but at a much lower speed -- possibly too low to be used in fast moving games. (Then again, the N64 was hardly a framerate king...)
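For reference, bilinear filtering is just a weighted average of the four texels surrounding the sample point. A textbook version looks like this (generic math in C, not the Jag II's actual datapath):

/* Textbook bilinear sample of a w x h, one-byte-per-texel texture,
   with u and v in [0,1) and w, h >= 2. Generic illustration only. */
unsigned char sample_bilinear(const unsigned char *tex, int w, int h,
                              float u, float v)
{
    float x = u * (w - 1), y = v * (h - 1);
    int x0 = (int)x, y0 = (int)y;            /* top-left texel     */
    int x1 = x0 + 1, y1 = y0 + 1;            /* and its neighbors  */
    float fx = x - x0, fy = y - y0;          /* fractional weights */

    float top = tex[y0 * w + x0] * (1 - fx) + tex[y0 * w + x1] * fx;
    float bot = tex[y1 * w + x0] * (1 - fx) + tex[y1 * w + x1] * fx;
    return (unsigned char)(top * (1 - fy) + bot * fy + 0.5f);
}

The cost is visible right in the code: four texel reads and three blends per output pixel instead of a single read, which is why it ran so much slower.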

 

The other smart move was a push away from CRY to RGB datapaths. Most features work equally well in CRY and RGB. This made its lighting effects 100% comparable with contemporary consoles.
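For anyone wondering why CRY existed in the first place: a CRY pixel keeps its chroma in the high byte and an 8-bit intensity in the low byte, so Gouraud-style lighting only has to adjust one field per pixel, while packed RGB has to unpack, adjust, and saturate three. Roughly like this (illustrative C, not the actual blitter logic):

/* Illustrative only -- not the actual Jaguar blitter logic.
   CRY: high byte = chroma, low byte = 8-bit intensity. */
unsigned short cry_light(unsigned short p, int di)
{
    int y = (p & 0xFF) + di;                  /* one field to adjust */
    y = y < 0 ? 0 : y > 255 ? 255 : y;        /* saturate            */
    return (unsigned short)((p & 0xFF00) | y);
}

/* RGB565 has to do the same thing three times, on three packed fields. */
unsigned short rgb565_light(unsigned short p, int dr, int dg, int db)
{
    int r = ((p >> 11) & 31) + dr;
    int g = ((p >> 5)  & 63) + dg;
    int b = ( p        & 31) + db;
    r = r < 0 ? 0 : r > 31 ? 31 : r;
    g = g < 0 ? 0 : g > 63 ? 63 : g;
    b = b < 0 ? 0 : b > 31 ? 31 : b;
    return (unsigned short)((r << 11) | (g << 5) | b);
}

More work per pixel in RGB, but the output finally looked like what every other console produced.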

 

Despite its benchmark performance, I still have questions about its real-world performance. The bus does not seem very balanced. It uses the same size and speed of memory as a Jaguar 1, which was notoriously starved for bandwidth. There is a second bus for audio, but it's too slow for the new RISC CPU. So the graphics bus would be loaded with at least 4 masters: the blitter texturing, the GPU doing transform and lighting, the object processor showing the frame buffer, and the main RISC running game logic.

 

Also, the texture cache was software managed, whereas the PS1's was automatic. This would have meant a big framerate hit in ordinary scenes, which we know because the N64 has an almost identical texture cache design. Another downside is that color look-up occurred in the object processor, not in the blitter (as in other consoles), which wasted memory and memory bandwidth -- two resources the Jag II did not improve on!
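Concretely, "software managed" means the engine itself has to notice when a polygon's texture isn't resident and copy it into on-chip RAM before drawing. Roughly this pattern -- all the names are hypothetical, not real Jag II calls:

/* Sketch of the burden a software-managed texture cache puts on the
   engine. Hypothetical names -- the pattern, not the Jag II API. */
typedef struct { const void *pixels; unsigned bytes; } Texture;

extern void dma_to_texram(const void *src, unsigned bytes);
extern void rasterize_poly(const void *poly);

static const Texture *resident = 0;   /* what's in on-chip RAM right now */

void draw_poly(const void *poly, const Texture *tex)
{
    if (tex != resident) {            /* miss: the engine must handle it */
        dma_to_texram(tex->pixels, tex->bytes);   /* burns bus time      */
        resident = tex;
    }
    rasterize_poly(poly);             /* with an automatic cache, the
                                         check and copy above vanish     */
}

Sorting polygons by texture cuts down the copies, but you can't always sort -- hence the framerate hit in ordinary scenes.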

 

I'm not aware of any real game engines that run on the Jaguar II. My own opinion is that it could reach Playstation 1 performance and maybe have a few new graphical effects. It seems to have a lot of potential along with familiar pitfalls.

 

Anyway, the only real source of info is the "Midsummer" PDF that was released, but it's heavily technical...

 

- KS

Edited by kskunk

My understanding is a bit limited, but from what I can get out of the technical description here: http://www.atarimuseum.com/videogames/consoles/jaguar/jag2specs.html

Like a lot of "console specs" on the internet, that is mainly just fanboys and/or confused magazine editors making up crazy stuff. ;)

 

I hope you're not implying Curt (whose site that link points to) is a fanboy making stuff up. While those specs were provided by Markus Kirschbaum, Curt always goes by actual internal Atari documents, emails, etc. for the type of material we research and release.


I hope you're not implying Curt (whose site that link points to) is a fanboy making stuff up.

Of course not. Curt knows his stuff! I appreciate Curt's love of archiving. I don't expect him to fact check all the documents he does archive.

 

As for the specs themselves, I don't know their origin. Those rumors have been floating around the net for many years. I first saw some of those specs over a decade ago, and they really made me scratch my head. Markus probably compiled them from another source, which was based on other sources, which in turn were based on still other sources, rumor mills, and so on.

 

This is a typical level of quality for "leaked" console specifications. The M2 has similar distortions in its "leaked specs".

 

- KS

Edited by kskunk

I'm not sure, but I vaguely recall Curt saying something like those earlier documents were later found to be incorrect.

 

And Atari didn't mind releasing grossly inflated technical specs. They did it with the Panther and the Jaguar (remember all that "850 million pixels per second" talk :D).


I don't know much about why the Nuon failed, but I know it shipped much later than planned and was not cost effective as a DVD player chipset when it did ship. Maybe if it had shipped sooner and cheaper, more device makers would have used it, which might have led to more games. It's hard to imagine game developers getting excited about it without a big install base.

I'd imagine the PS2 had something to do with it: a video game console with out-of-the-box DVD video support at a competitive price with contemporary standalone DVD players and from Sony, who already had a ton of hype built up around the console and tied into their large established user base.

Nah, different market. NUON was promoted as a 3DO like licensing platform, and specifically as an advanced feature DVD player that also played games. Not as a game console in and of itself. It's aimed competitors were other DVD players, not high end game consoles of the time. I was actually at their private show area at the Winter CES back in 2001, probably the only outside person I know of to have taken pictures of everything for documentation. What killed it was lack of DVD industry adoption (very few titles NUON specific were released), and the cost of it's first generation chips. That raised the cost of anybody manufacturing their NUON licensed unit, and that combined with lack of a large library of NUON enhanced DVD movies (and even smaller game library) didn't give the consumer much of a reason to spend the extra to buy the units. VM Labs was working on a cost reduced version of the NUON chips, but went bankrupt before they could be put out. They were bought out by Genesis Microchip, who in turn is now owned by STMicroelectronics. My partner with atarihq.com, Keita Iida, was director of product marketing at VM Labs - that include game licensing, etc. Most of the people involved over there were actually ex Atari Corporation people.

 

 

 

I only remember the Buckaroo Banzai DVD actually making use of the Nuon for its menus.

 

I finally saw a Nuon this weekend at the Davis Atari Show. It was interesting.


Here is the full movie list

Only four DVD releases utilized NUON technology:

 

Planet of the Apes

The Adventures of Buckaroo Banzai Across the Eighth Dimension

Bedazzled

Dr. Dolittle 2

Here is a list of games, movies, etc., released and unreleased

 

Eight games were released for the NUON:

 

Tempest 3000

Freefall 3050 A.D.

Merlin Racing

Space Invaders X.L.

Iron Soldier 3 (later recalled due to incompatibility with some players)

Ballistic (only available with Samsung players)

The Next Tetris (only available with Toshiba players)

Crayon Shin Chan 3 (Korean-only release)

Unreleased games

aMaze

Atari's Greatest Hits

Battleship: Surface Thunder

Boggle

Breakout

Bugdom

Bust-A-Move 4

Dragon's Lair

Dragon's Lair II: Time Warp

The Game of Life

Hoyle Card Games

Jeopardy

Knockout Kings

Madden NFL Football

Monopoly

Myst

Native II

Need For Speed II

New Scrabble

Pitfall: The Mayan Adventure

Pong

RC de Go!

Risk 2

Riven

Shanghai: Mahjongg Essentials

Sorry!

Space Ace

Speedball 2100

Spider-Man

Star Trek: Invasion

Tiger Woods PGA Golf

Wheel of Fortune

Yahtzee

zCards

Collections and samplers

Interactive Sampler (three different versions)

NUON Games + Demos (collection from NUON-Dome)

NUON-Dome PhillyClassic 5 Demo Disc (giveaway collection)

Cheese LAND 2

 

Homebrew development

During late 2001, VM Labs released a homebrew SDK which allowed people to program apps/games for their NUON system. Only the Samsung DVD-N501/DVDN504/DVDN505 and RCA DRC300N/DRC480N can load homebrew games. The Samsung DVDN-2000 and the Toshiba cannot. The RCA DRC300N and RCA DRC480N cannot play commercial NUON games.

 

Homebrew releases

Several homebrew titles have been created for or ported to NUON. They are not commercially available and require the user to burn the material to a NUON-compatible CD-R.

 

Ambient Monsters

Atari / C64 Video Game Music Player

Atari 800 Emulator

Atari 2600 PacMan (hacked version of VLM's Chomp)

BOMB

Breakout

Chomp (sample game included with the second NUON SDK)

Decaying Orbit (port of the Yaroze game)

Doom (port of the shareware edition)

PacMan - Tournament Edition (hacked version of VLM's Chomp)

SameGame - Colors

SameGame - Shapes

Sheshell's Sea Adventure

Snake

Synth Demo

Yaroze Classics (features Katapila, Invs & BreakDown)

External links

http://www.nuon-dome.com/

http://www.vmlabs.de/ NUON Alumni Page

http://www.nuon-eternal.8m.com/

 

It is quite fun!


And Atari didn't mind releasing grossly inflated technical specs. They did it with the Panther and the Jaguar (remember all that "850 million pixels per second" talk :D).

Bwahaha, I always loved that spec. Although, it's only misleading, not totally incorrect. If you turn off all the other processors and disable DRAM refresh, the Jaguar's blitter is capable of "rendering" (ha) a checkerboard pattern into a 1bpp framebuffer at a rate of 850 million pixels per second. You could even do slightly more interesting monochrome patterns.
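For the curious, the arithmetic behind the claim, using the same bus figures that come up later in this thread:

64 bits per blitter write x 13.3 million page-mode writes/s = ~851 Mbit/s (~106.4 MB/s)
851 Mbit/s at 1 bit per pixel = ~851 million "pixels" per second

In other words, it's the peak bus bandwidth relabeled as pixels.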

 

Sadly, there would not be enough bandwidth left to actually display the pattern... Ah I love marketing!

 

But the linked Jaguar II specs are an utter fabrication. My theory is that some Jaguar fans dutifully compiled specs from the rumor mill. I don't blame the fans, I blame the magazine editors and rumor mongers on Usenet who post made-up stuff to get attention.

 

They didn't even get the chip codenames right, or the number of transistors, or number of pins, or... actually, I don't see a single correct spec in the entire list... It's such a weird document!

 

I mean seriously, what the hell was the Sinus-enabled JMPEG COMBI chip? Where does this stuff come from?

 

- KS

Edited by kskunk

The real Jag II used a 68000. It had a C-friendly RISC, with cache, which was both faster and more bus efficient than a 68020. The 68000 was there for backward compatibility and to boot the custom chips. (Sounds familiar!)

Is it the same RISC core as the Jaguar's, and did they fix the problem of Jerry being tied to the 16-bit data bus due to the 68k acting as host?

 

The other smart move was a push away from CRY to RGB datapaths. Most features work equally well in CRY and RGB. This made its lighting effects 100% comparable with contemporary consoles.

So it supported Gouraud shading in 16-bit RGB? (the Jaguar could already do 15/16-bit RGB rendering for flat shading, 2D, and textures, right?) Did they fix the issue requiring 32-bpp textures when using 24-bit RGB mode? (due to there being no color look-up in 24-bit RGB mode)

 

It uses the same size and speed of memory as a Jaguar 1, which was notoriously starved for bandwidth. There is a second bus for audio, but it's too slow for the new RISC CPU. So the graphics bus would be loaded with at least 4 masters: the blitter texturing, the GPU doing transform and lighting, the object processor showing the frame buffer, and the main RISC running game logic.
So they clocked the RISCs faster but left the DRAM the same speed as the Jag 1? I'd have thought by ~1996/97 EDO DRAM would at least have become a viable option (at least double, if not triple, the DRAM speed of the Jag I). Contemporary systems had moved on to SDRAM, or in some cases RDRAM (the N64, and later the PS2). Given that DRAM prices started falling again in the late 90s, that certainly should have facilitated things. They did go up to 8 MB of DRAM, right?

It doesn't make sense that they'd be using the same 75 ns FPM DRAM of 1993 for the Jag II though, especially with the single-bus design. (it might have made more sense if they'd adopted multiple buses, with at least one using faster, more expensive RAM -VRAM or SDRAM)

 

Even with 25 ns EDO DRAM they'd be falling short of the N64's shared bus's peak bandwidth (562.5 MB/s), though the N64's case is a bit weird, as the main RAM is on a 9-bit bus and, from what I understand, an MMU controls all accesses to it from the 32-bit buses the RSP and CPU are connected through (given their individual speeds, neither would be able to saturate the bus alone). On top of that, there seems to be a lot of latency in that set-up as well, and I don't have anywhere near the understanding to make a proper comparison.
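For reference, the usual arithmetic behind that 562.5 MB/s figure, assuming the commonly published N64 numbers (9-bit RDRAM bus at an effective 500MHz):

9 bits x 500 million transfers/s = 4500 Mbit/s = 562.5 MB/s

That's a peak figure, before any of the latency issues mentioned above.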

Edited by kool kitty89

Everything is documented pretty well in the tech reference - Gorf was kind enough to upload it here.

 

http://www.atariage.com/forums/topic/15557-jaguar-ii-tech-ref/page__view__findpost__p__1456621

 

All of the system parts have full 64-bit access to memory though ... but it's less important for the sound DSP, as it has its own RAM to fetch samples from ( 8-bit or even 4-bit to save costs )

The memory is faster - 33MHz rather than 26 - but it's still cheap DRAM.


Is it the same RISC core as the Jaguar's, and did they fix the problem of Jerry being tied to the 16-bit data bus due to the 68k acting as host?

Yes, same core as before. And yes, Puck (the new Jerry) had a 64-bit bus with bus mastering. In plain English: it is as fast as Tom at bus accesses.

 

However, we don't have a netlist for Puck. It's not clear if it was really well into implementation or still just a design on paper. We do know it was never prototyped. Anything about the "new RISC" or Puck is based on the design document linked by Crazyace. It's hard to know what compromises would have been required during implementation.

 

So they clocked the RISCs faster but left the DRAM the same speed as the Jag 1? I'd have thought by ~1996/97 EDO DRAM would at least have become a viable option

The RISCs were clocked at exactly the same speed as the DRAM.

 

As Crazyace noted, they had hoped to push the DRAM (and thus RISC clocks) to 33MHz (a 25% boost). But it was the same type of DRAM, and same amount, 2MB. Somewhere, I picked up the idea the prototypes only worked at 26.6MHz, but that doesn't mean they wouldn't get to 33MHz.

 

They could have moved to 4MB given a hypothetical 1997 release, but they weren't thinking that way when the project was killed. DRAM was still expensive in early 1996 -- I don't think anybody had a crystal ball as good as our 20/20 hindsight. ;) Sony's PS1 in particular was still bleeding money, because of its 3.5MB DRAM. Sony had bet on DRAM prices dropping and got screwed. A mistake like that would kill a company like Atari, but Sony could suck it up.

 

You're correct that new DRAM types were appearing on the scene, but they were still a good premium over standard DRAM. More importantly, using a different kind of DRAM could have impacted backward compatibility. Using the exact same DRAM as the Jaguar I meant that when running at 26.6MHz, the machine may as well have been a Jaguar I. In fact, all indications are that Jaguar I compatibility is quite good on the surviving prototypes.

 

- KS

Edited by kskunk

The prototype is described in the appendix of the tech guide - they mention running at 26.6MHz and the various Oberon bugs.

 

In my opinion, JagII wouldn't match either the PS1 or Saturn - in places it would be close, but really it's more what the Jaguar should have been.


The RISCs were clocked at exactly the same speed as the DRAM.

Hmm, maybe I'm missing something again. The Jag used 75 ns FPM DRAM, right? So that would be rated for ~13.33 MHz, right? Hence the 106.4 MB/s peak bandwidth figure for the 64-bit bus.

 

They could have moved to 4MB given a hypothetical 1997 release, but they weren't thinking that way when the project was killed. DRAM was still expensive in early 1996 -- I don't think anybody had a crystal ball as good as our 20/20 hindsight. ;) Sony's PS1 in particular was still bleeding money, because of its 3.5MB DRAM. Sony had bet on DRAM prices dropping and got screwed. A mistake like that would kill a company like Atari, but Sony could suck it up.

Going to 4 MB wouldn't be as efficient though; it's not until 8 MB that you'd be able to use 2 MB x16 DRAM chips. I see what you mean about prices though, but couldn't they have kept that portion of the design open in case the market changed prior to release? (or was the Jag II really planned for a 1996 release -which would seem a bit rushed, just like the Jag was in 1993)
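(The chip-count arithmetic there: a 64-bit bank built from x16 DRAMs takes 64/16 = 4 chips, so with 2 MB x16 parts the smallest such bank is 4 x 2 MB = 8 MB.)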

 

The prototype is described in the appendix of the tech guide - they mention running at 26.6MHz and the various Oberon bugs.

 

In my opinion, JagII wouldn't match either the PS1 or Saturn - in places it would be close, but really it's more what the Jaguar should have been.

The more I find out about the Jaguar II, the more it seems like it's just a bug-fixed Jaguar with some enhancements to match current market trends (RGB and fast texture mapping). Somehow I'd gotten the impression of the Jag II being a lot more capable, not just from that article I mentioned, but from previous discussions with Gorf.

Wasn't it supposed to include a 3rd JRISC for game logic, AI, and such?


Hmm, maybe I'm missing something again. The Jag used 75 ns FPM DRAM, right? So that would be rated for ~13.33 MHz, right?

It's just a matter of perspective. The page-mode cycle rate is 13.3MHz. To achieve that speed, the DRAM controller runs at 26.6MHz. The ideal Jag II page-mode cycle rate would thus be 16.67MHz. As we discussed before, the prototypes weren't that fast.
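Putting numbers on that, using the figures already in this thread: at 26.6MHz, one 64-bit access every 2 controller clocks gives 13.3M x 8 bytes = ~106.4 MB/s peak; at the hoped-for 33.3MHz it would be 16.67M x 8 bytes = ~133 MB/s.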

 

But couldn't they have kept [an 8MB] design open in case the market made a change prior to release?

The Jaguar (and Jag II) memory map doesn't support that kind of DRAM arrangement. There are two banks of DRAM, each up to 4MB. Each bank must be wired to a separate set of chips. The Jag II documentation states clearly that only 2MB (1 bank) and 4MB (2 banks) are options.

 

However, both the Jag I and II can support 8MB of memory if you use two 8MB banks -- that is, 16MB of chips... half of each chip is wasted. Not exactly the most cost-effective path. ;) If you tried to add external glue to split the two 4MB banks across a single 8MB bank, the chipset would thwart you by trying to leave two pages open at once.

 

External glue would let you hook up 16 chips to reach 8MB, but that is also cost-ineffective. They really didn't plan for that.

 

Somehow I'd gotten the impression of the Jag II being a lot more capable. Wasn't it supposed to include a 3rd JRISC for game logic, AI, and such?

The Jag II is a lot better than the Jag I, no doubt about it. The "3rd JRISC" is the "main RISC CPU" we've been discussing. That existed on paper but not in silicon. It was part of Puck, aka Jerry II.

 

In my opinion, JagII wouldn't match either the PS1 or Saturn - in places it would be close, but really it's more what the Jaguar should have been.

The N64-style bilinear texture mapping is pretty slick. And like the Jaguar I, the Jag II had boatloads of programmability not present on other consoles, allowing for unique game engines. But I agree that a typical 3D game would not reach PS1 performance on the Jag II.

 

I don't think the original Jaguar could have fit all the features of the Jag II. The chips in the Jag II used newer process tech to pack in way more transistors. John Mathieson, designer of the Jag I chipset, said they were already pushing the outer limits of ASIC process tech in 1993.

 

If you're saying the Jag I should have come out 2 years later with more features, well... we're back on the thread topic again at last. ;)

 

With perfect hindsight, I think the Jag I could have been reorganized with an emphasis on C code and texture mapping, without adding any transistors. But quite a bit would be cut out in the compromise, and it would be a much less "interesting" architecture.

 

And to really return to the thread topic, I think even radically improved technology wouldn't have saved the Jag I, but I've said as much in this thread already!

 

- KS

Edited by kskunk

Hmm, maybe I'm missing something again. The Jag used 75 ns FPM DRAM, right? So that would be rated for ~13.33 MHz, right?

It's just a matter of perspective. The page-mode cycle rate is 13.3MHz. To achieve that speed, the DRAM controller runs at 26.6MHz. The ideal Jag II page-mode cycle rate would thus be 16.67MHz. As we discussed before, the prototypes weren't that fast.

Ok, thanks for clarifying that. What's the limit of FPM DRAM though? I seem to recall something around 50 ns, or maybe 45 ns. (granted, they'd need to use a speed that the other chips could be clocked at as well)

 

But couldn't they have kept [an 8MB] design open in case the market made a change prior to release?

The Jaguar (and Jag II) memory map doesn't support that kind of DRAM arrangement. There are two banks of DRAM, each up to 4MB. Each bank must be wired to a separate set of chips. The Jag II documentation states clearly that only 2MB (1 bank) and 4MB (2 banks) are options.

 

However, both the Jag I and II can support 8MB of memory if you use two 8MB banks -- that is, 16MB of chips... half of each chip is wasted. Not exactly the most cost-effective path. ;) If you tried to add external glue to split the two 4MB banks across a single 8MB bank, the chipset would thwart you by trying to leave two pages open at once.

Oh... did they alter the memory map from the Jaguar? I remember it being mentioned that the Jaguar supported up to 8 MB of space for DRAM, but perhaps that was in the context of 2 separate banks of 4 MB as well?

 

If using different RAM types really would have complicated compatibility, perhaps it would have made the most sense to keep the main, slow bank as the only one available for general use, and (rather than a slow bank for sound) add a bank of faster RAM dedicated to the GPU. Or, if not fast RAM, a block of similar speed to main memory that the graphics portions could saturate without contention. Given that EDO DRAM wasn't quite as new anymore (and was being supplanted by SDRAM as the high-end stuff by the late '90s), that might have been possible; I doubt VRAM would have been particularly cost effective though, probably less so than SDRAM by that point.

 

EDIT: I just noticed the Midsummer tech documents list either 64-bit DRAM or 16-bit SDRAM being used, which seems a bit odd. (BTW, it doesn't seem that tech-heavy; in fact, a lot of it seems more straightforward than many online descriptions, including that erroneous one I linked to before)

 

The Jag II is a lot better than the Jag I, no doubt about it. The "3rd JRISC" is the "main RISC CPU" we've been discussing. That existed on paper but not in silicon. It was part of Puck, aka Jerry II.
The 68k still acts to boot the system though, right (the "CPU" isn't self-booting)? Is it the main CPU which has the cache, is it a shared system cache, or were there caches on more than one RISC?

 

 

 

The N64-style bilinear texture mapping is pretty slick. And like the Jaguar I, the Jag II had boatloads of programmability not present on other consoles, allowing for unique game engines. But I agree that a typical 3D game would not reach PS1 performance on the Jag II.

You know, a lot of people complain about the N64's textures, especially the "vaseline" smear-inducing filtering, compared to the pixelated textures of the PSX or Saturn.

I'm more on the fence personally; it was perhaps overused a bit, but some of the textures are so low resolution that it's bad either way. (a number of factors contributed to that on the N64 though: the cartridge size, the texture cache, the restricted support for alternate/custom RSP microcode, etc)

I think the bigger things on the N64 were antialiasing, perspective correction, and perhaps good use of Gouraud shading -though I don't think it really had an advantage over the PSX for the latter, other than in use with lighting effects. (not sure how much the Z buffer contributed) I think the perspective correction is probably the biggest feature, followed by AA.

 

If you're saying the Jag I should have come out 2 years later with more features, well... we're back on the thread topic again at last. ;)

Yep, but that would only have been possible if Atari had had another stable source of revenue leading up to it, and with the computers dead and the Lynx doing poorly (except perhaps in select regions like the UK), there wasn't much chance of that.

 

With perfect hindsight, I think the Jag I could have been reorganized with an emphasis on C code and texture mapping, without adding any transistors. But quite a bit would be cut out in the compromise, and it would be a much less "interesting" architecture.

Yeah, though for an early 90s design, that would never have been obvious, at least not all the features. I think it would have been reasonable to assume the market would not respond kindly to a foreign architecture, supported nowhere else, being intended as a primary component of the system; if extensive programming in C couldn't have been foreseen, at least common assembly languages would be. (with prominent consoles using common CPU architectures like the 650x, Z80, or 68k) I'd imagine that's the reason they chose to use the host processor in the configuration they did, the problem being that the 68000 really doesn't mesh well with the set-up at all. The 020 isn't ideal, but it's a lot better at least; anyone other than Atari (trying to make it super cheap, in a vulnerable position as they were) would probably have gone with a MIPS R3000 instead, given the architectures Flare supported and the timeline. (the cost, and especially the cost/performance ratio, would likely have been worse for any CISC choice past the 68EC020 -especially since most of the low-cost x86 chips lacked onboard cache and/or used a 16-bit external bus)

 

Other things, like texture mapping vs shading and polygons vs other rendering methods, could not have been foreseen at the time either, so that was impossible without hindsight. By the early 90s, it still wasn't clear that polygons would be the main form of rendering for depicting 3D environments: raycasting was quite popular and voxel rendering had come in as well. On top of that, most polygon-based games either used flat shading with decals as textures (or very limited use of textures), or were much more likely to use Gouraud shading extensively if anything; games that did use a lot of texture mapping most notably included Wolfenstein 3D and Doom, neither of which used polygons anyway.

All things considered, it at least got some texture support, though no hardware triangle rasterization. (better than investing in hardware expressly for rasterizing quads, I suppose) However, it might have been reasonable for them to have focused on RGB rendering specifically rather than the proprietary CRY format; not just 15-bit RGB, but perhaps steps could have been taken to facilitate 24-bit RGB as well, like allowing color look-up for textures used in 24-bit rendering. (a 16-bit RGBI mode might have been an interesting feature as well, so you'd have 2 shades of every 15-bit RGB color/shade)

With the unstable nature of rendering choices at the time, going with a flexible system was probably the safer route. (but, as they say: jack of all trades, master of none)

 

 

And to really return to the thread topic, I think even radically improved technology wouldn't have saved the Jag I, but I've said as much in this thread already!

I don't think sheer power limitations had much to do with the Jag failing at all. On the hardware side, it's obviously tied to the difficulty of extracting the power that is there, on top of the lack of dedicated developers -those who might have been able to tap the hardware shied away due to the unstable nature of the company marketing it. (with a few exceptions)

Even with the unconventional hardware not catering to what would later solidify as industry standard, it could have been a good system (if pushed by a huge company, perhaps even altering the industry standard); the hardware bugs, flaws, and general difficulty of working with the system were the problems on that side of things. On top of the hardware, you've got it being championed by a company in rather bad shape at the time, with very limited presence in its main market. Not only did Atari Corp have a limited presence in the US in general, but its emphasis in the most recent past had been on the ST line of computers, not consoles. So on top of financial problems, management problems, and hardware issues, the name had much less value than it once did.

Again, in Europe it wasn't quite the same story, and from that standpoint it might even have been prudent to put Europe over North America in terms of importance. (or at least the countries Atari Corp had had the most prior success in) That might have been their best shot: the more substantial brand recognition, the greater susceptibility to viral marketing, and quite a few notable 3rd party developers all pointed to the European market.

 

Still though, if the hardware had been at least reasonable to work with, not unreasonably demanding for developers to achieve at least acceptable performance, that would have been huge. Atari seems to have been in too poor shape financially to really offer significant funding for 1st party games, or at least a significant number of timely, quality releases, and the 3rd party support was limited due to the hardware problems and the lack of incentive to support the system in general. Without enough funding they were hard pressed to arrange a proper launch of the system or comprehensive marketing, but I must say it seems management in general was problematic as well, and what limited resources there were seem to have been used inefficiently to a fair extent.

 

 

 

 

One past argument was that Atari could have been in decent financial shape if they'd had another product on the market; one previous suggestion was the Panther, but that obviously wouldn't have been a good option, and given the timing, it doesn't seem like Flare could have gotten a pre-Jaguar out reasonably either. 1992 would be a bit late, with the Genesis and SNES strong on the market in both the US and EU, and even if parts of the Jaguar chipset might have been viable (like the Object Processor and/or blitter), the market would be extremely competitive. I mean, imagine if Flare had produced something preceding the mature (or near-mature) Jag chipset, but avoided the fatal flaw of the Panther requiring SRAM -something at least practical to include with a decent chunk of RAM. (and by 1992 that should be more like 512 kB at least)

 

The Lynx made fairly little impact, and even if it had been handled in the best possible way (minimal form factor, no backlight to reduce cost/power usage, and allowing ROM to be accessed directly), I doubt it would have been able to really cut in on the Game Boy market, though perhaps it would have been enough to at least establish a stable market presence and source of revenue. (which Atari sorely needed in the early 90s)

Unless they could have miraculously hung onto the computer market a bit longer (in Europe if anywhere), the only other possibility goes back to the Panther -or rather, a home video game console, but obviously not the Panther hardware- and prior to the early 90s. At the very latest, they'd probably need to get in by '88, before any of the new consoles started to get established in the market. Nintendo owned the market, but with aging hardware, and not in Europe (where it was split between Nintendo, Sega, and home computers).

So one possibility would have been an ST-derived game system, but not the 1985 ST: one with the BLiTTER, and perhaps the STe's sound, though plenty of off-the-shelf options were available for sound (namely Yamaha FM chips). It probably wouldn't have held up especially well against the MD and SNES on hardware terms, and probably not in brand recognition in the US (Europe is another matter), but it might have worked, and I believe it was being considered prior to the Panther. In Europe, it might have been a bit superfluous next to the ST itself -or perhaps not, given the STe's limited popularity meant relatively few games took advantage of its additions. (conversely, that would mean direct ports from the ST could be a problem for such a console) If such a console could have established a significant price advantage over contemporary game consoles, it might have had a better chance. Of course, it really comes down to proper software support and marketing, which is really up in the air. (if Katz had still been there for the launch, that could probably have helped a lot given his experience and proven skills at marketing video games -a bigger what-if would surround his not leaving for Sega)

 

 

 

Of course, this whole thing widens to Atari Corp as a whole, and things like what might have been if Jack Tramiel had continued to head the company. (some would say, or rather have said, that Sam's management contributed to Atari Corp's decline to a significant extent)

Edited by kool kitty89

What's the limit of FPM DRAM though? I seem to recall something around 50 ns, or maybe 45 ns.

50ns isn't unreasonable for an FPM controller (with fast enough DRAM), but it gets much harder every few ns closer you get to the absolute limit. It's much easier to get peak bandwidth out of SDRAM, which is why it took over.

 

I remember it being mentioned that the Jaguar supported up to 8 MB of space for DRAM, but perhaps that was in the context of 2 separate banks of 4 MB as well?

Correct. Jag II had the exact same memory map as Jag I: 2x4MB DRAM, 6MB cart, 2MB ROM/registers. They didn't change anything.

 

EDIT: I just noticed the Midsummer tech documents list either 64-bit DRAM or 16-bit SDRAM being used

Very cool, I never noticed that before. On the surface, it seems like a plausible idea -- one 16-bit SDRAM could supply the bandwidth of 64-bits of FPM DRAM.
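(The arithmetic works out if you assume roughly 66MHz SDRAM, common at the time: 2 bytes x 66.7M transfers/s = ~133 MB/s, the same peak as 8 bytes x 16.67M page-mode accesses on the 64-bit FPM bus.)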

 

But the whole Oberon/Puck bus sharing scheme seems totally incompatible with SDRAM, so maybe it was just an idle fantasy they quickly discarded when implementation started.

 

It doesn't appear they were planning for it when Atari shut down. I have the "final" Oberon/Jag II netlists in front of me and the chip pinout doesn't even contain SDRAM control signals. And in the Midsummer docs, there is only that one little mention buried in a block diagram -- all the actual memory controller details and registers show only FPM DRAM support.

 

The 68k still acts to boot the system though, right (the "CPU" isn't self-booting)? Is it the main CPU which has the cache, is it a shared system cache, or were there caches on more than one RISC?

Only the main RISC (they call it the RCPU) would have had a cache. It's also an instruction cache only (not a data cache). But that's still a very nice performance boost. Sadly the RCPU never made it off paper onto silicon. I wonder how far they got.

 

There's no hint it can boot the system -- more than once, they mention that the 68K has to boot the system.

 

Switching to a 68020 is not an option if you want to retain backward compatibility with the Jag I. So it makes sense they kept the 68K around. But I'm sure they expected Jag II games to completely avoid it and use the RCPU only.

 

You know, a lot of people complain about the N64's textures, especially the "vaseline" smear-inducing filtering, compared to the pixelated textures of the PSX or Saturn.

The N64 texture quality was due to the tiny texture buffer. It was difficult to use large textures. The lack of mass storage didn't help either -- there was really no space for detailed textures, like there were on CD-based consoles.

 

Note that the Dreamcast used very similar texture filtering to the N64/Jag II, but you don't hear anyone complaining about those textures! That was because of the large supported texture size and CD storage.

 

Of course, the Jag II's N64-like texture cache might have encouraged developers to use tiny smeary textures like the N64.

 

- KS

Edited by kskunk

In my opinion, JagII wouldn't match either the PS1 or Saturn - in places it would be close, but really it's more what the Jaguar should have been.

The N64-style bilinear texture mapping is pretty slick. And like the Jaguar I, the Jag II had boatloads of programmability not present on other consoles, allowing for unique game engines. But I agree that a typical 3D game would not reach PS1 performance on the Jag II.

 

I don't think the original Jaguar could have fit all the features of the Jag II. The chips in the Jag II used newer process tech to pack in way more transistors. John Mathieson, designer of the Jag I chipset, said they were already pushing the outer limits of ASIC process tech in 1993.

 

If you're saying the Jag I should have come out 2 years later with more features, well... we're back on the thread topic again at last. ;)

When Atari first showed the JagII documents to the company I worked at, we already had a Sega Saturn devkit - and an import PSX had already shown what was coming (via Ridge Racer and Toshinden). Given the timescale to write games, the JagII would have come out after the N64 - when the PSX and Saturn were already settled into the market.

I agree that a JagII in 93 would be a miracle - but not launching anything until 95 would be just as bad for the company.

 

With perfect hindsight, I think the Jag I could have been reorganized with an emphasis on C code and texture mapping, without adding any transistors. But quite a bit would be cut out in the compromise, and it would be a much less "interesting" architecture.

 

And to really return to the thread topic, I think even radically improved technology wouldn't have saved the Jag I, but I've said as much in this thread already!

 

- KS

 

A complete reorg wouldn't be a Jaguar, though :) - that's why I settled on the idea of just fixing the bugs and letting the machine run more as a dual-processor RISC machine with a sleeping 68000 :) - or even no 68000 at all. John Carmack's idea of a phrase buffer on the blitter output would be nice, though.

 

More software and less cost to Atari is the key though. The machine only sold 125k units during its lifetime, so it's a miracle that there were actually as many good games as there were!

My feeling/hope would be that a 'CD' unit would have picked up more sales at the start, and given developers more confidence in the unit. It would also have allowed Atari to improve relations with reviewers, as review CDs could be sent out much more freely.

 

In a perfect ( Atari related world ) the Jag would be released in 92, not 93 :)


A complete reorg wouldn't be a Jaguar, though :) - that's why I settled on the idea of just fixing the bugs and letting the machine run more as a dual-processor RISC machine with a sleeping 68000 :) - or even no 68000 at all.

Yeah, not to mention they probably wouldn't have had the time/resources for a full reorganization once the market started showing different trends. Really though, it seems like Flare must have been forced into implementing the design in an impractical manner; the 68k doesn't mesh well with the chipset in a single-bus configuration, so I can't imagine that they planned to use such a configuration originally. With a dual-bus design it would make at least some sense, but that wasn't the case. (besides, there are cost trade-offs with a dual-bus layout vs just using a more suitable CPU)

 

Does the Jaguar even support bus interleaving with the 68k in a manner like the Amiga?

 

More software and less cost to Atari is the key though. The machine only sold 125k units during its lifetime, so it's a miracle that there were actually as many good games as there were!

Didn't the final number end up being closer to 100k more than that, if you include all the remaining units dumped at cut prices after discontinuation? (around 250k were supposed to have been produced) It may not have sold a lot of units, but at least it was on the market for a few years, and several games were released after it was discontinued.

In that context, the 32X has a surprising number of good games as well, in spite of being on the market less than a year (though I think it sold a bit better than the Jag in that time); granted, they had some pretty good first party support and it was promoted by a prominent player in the market at the time.

 

My feeling/hope would be that a 'CD' unit would have picked up more sales at the start, and given developers more confidence in the unit. It would also have allowed Atari to improve relations with reviewers, as review CDs could be sent out much more freely.

Kskunk made a good point about the file system though: Atari needed to have a standard file system available for the CD unit. (be it in the main console or an add-on)

 

Another odd thing is that they seemed to be planning on releasing the Jag II without a CD drive; the documents seem to imply that a CD unit would be separate, like with the Jag. (or perhaps they hadn't finalized the decision yet)

