
Is my understanding correct? (Jaguar chip setup)



So I've read quite a bit over the weekend regarding the Jaguar and I've got a few questions.

 

From what I understand, when developing a high-performance game that pushes the Jaguar to the limit, it is best to turn off the 68000 and rely only on the two RISC processors. The question then becomes which processor to use for controlling the game logic. Further reading (mostly from the v8.pdf) suggests that the DSP would be the ideal candidate, because it can access all memory and can itself be accessed from anywhere.

 

It seems the DSP can do all the game logic, play background music if need be, play sound samples, and send requests out to the graphics chip. But since Jerry can do graphics too, it could handle the user interface (character status on screen, text, etc.). Tom could of course have its own code and just sit and wait for instructions.

 

It really seems to be a balancing act: extracting as much performance as possible while making sure both processors are being used to their full potential.

 

I do have a question as I don't see it clearly stated - the Jaguar's screen resolution seems very programmable. What is the maximum resolution the Jaguar can handle, including colors? What are the challenges in maintaining this resolution? How are colors restricted in this equation?

 

The type of application I am thinking of is a first person shooter. However, it will not have background music during the game - only a looping sound file for ambient sound, so that should free up cycles that would otherwise have been spent on a .MOD player. Also, I think 256 colors would be more than enough - but if this makes an impact, it could be reduced to fewer than 256 colors.

 

Is there any released source code for a game that shows heavy RISC usage?


The 68000 was only added to the Jaguar as a means to get the RISC chips running and control simple tasks. They considered a few options, including the 68030, but decided on the 68000 for a number of reasons, including cost and availability.

Both RISC processors are intended to be used together. The GPU is tailored to graphics, and I would suggest this should be its purpose. The other RISC chip is a general-purpose DSP and is intended to be more flexible for non-graphics processing.

 

With regards to code, Lars Hannig has produced some, and there is an aftermarket Jaguar development CD with libraries and source code to help get you rolling.

 

 


The Jaguar is all about trade-offs. There are five major processors in the Jaguar: Three are general purpose CPUs (the M68K and the two RISCs) and two are specialized (the OP and the blitter). The common currency between all of them is the main system bus; figuring out how to slice and dice that is probably the biggest challenge in using the system effectively. The RISCs have a slight advantage in this area in that they have their own local dedicated storage; this gives them the ability to stay off the main system bus if desired.

 

The RISCs are almost identical in their capabilities, and the distinctions between them are pretty trivial: The one designated "DSP" has 8K of local RAM vs. the one designated "GPU", which has only 4K, and they differ only by about eight or so instructions.

 

As far as colors go, the Jaguar can display 24-bit RGB color and can have a horizontal resolution of around 1280 pixels, but such a resolution probably isn't practical. Vertical resolution maxes out at around 300 to 320 lines non-interlaced, or around 600 to 640 interlaced (PAL machines will give you slightly more). Again, it's all about trade-offs; higher resolutions and higher color depths will cost more bus cycles. But the Jaguar can comfortably drive an 8 BPP (256 colors) display.

 

And as far as "pushing the Jaguar to the limit" goes, don't believe a lot of what you read there; a few vocal people seemed to have an interest in pushing their own personal agendas to the detriment of any real science or measurement. And to me, it seems just silly that you wouldn't utilize all five processors and leverage them to their fullest potential if you needed to. Especially in the case of running the M68K on the main system bus while the two RISCs are running in their own local memory.


And since even in 2013 we have to witness replies like "THE 68000 IS EVIL TURN IT OFF TURN IT OFF NOOOOOW ONLY GPU IN MAIN OR YOU'LL BURN IN HELL BURN THE WITCH BURN IT BURN IT BUUUUUURRN!!!!!!!!!!!!!!!!11111111111111111oenoeneonoeneoneoenoenoenoenoneleventeen (foam comes out of mouth)", here's a small quote from someone who sat down and did the math (no pun intended) regarding this whole nonsense (original post here):

 

I think the "evil" of the 68K has been way overstated. It has the lowest priority on the bus so it won't hinder the GPU & Blitter that much.

While I was developing Tube, I made a few tests regarding this.

To render the tunnel effect, the GPU has to access memory rather intensively.

I measured the performance with the 68K polling the GPU memory to know when the GPU is done.

So we have both the GPU and the 68k accessing the bus intensively.

Then I tried with the 68K turned off. I got a +15.5% performance gain. Not bad, but it's not the +300% you might believe.

And this is close to a worst case scenario. In actual gameplay code, the 68K will execute more complex instructions (long jumps, complex addressing modes) and will be off the bus more often, leaving more cycles for the GPU.

In the case of Tube, the gameplay code is very light. When I remove it, the gain is less than 1%.

The problem is when you give too much work to the 68K. For example, in an advanced 3D game, if you compute all the physics calculations on the 68K, you may end up in a situation where the GPU is waiting for the 68K to complete its work.

 

In the case of "Fallen Angels", I made a benchmark to see if it was worth the trouble to do some of the gameplay code using GPU in main.

To do this benchmark I wrote a box collision routine. It isn't very "GPU in main" friendly because it involves frequent jumps. But it is pretty typical of the kind of code I was going to need.

In the benchmark there are 200 collision tests done.

The results are:

voxel rendering alone, 68K turned off: 107 ticks

voxel rendering, 68K turned off, then the 68K does 200 collision tests: 134 ticks

voxel rendering, then 200 collision tests by the GPU in main, 68K turned off the whole time: 124 ticks

voxel rendering with the 68K doing 200 collision tests in parallel, then 68K off: 108 ticks.

 

I made more recent tests by removing most of the gameplay code... the gain is microscopic (less than 1%). So that's why I think moving the whole gameplay code to the GPU would result in worse performance.

 

Maybe the DSP could help a bit but it's already used by the sound engine. Anyway I think the gain would be very small.

 

So there you go. If you have something that runs at 10 fps, you can potentially bump it up to 11.5 fps. And you have to write the logic in the RISCs. And you'll run out of local RAM in record time. And you'll have to jump into the main RAM to execute code. And by then your code will be in such a mess from rewriting and refactoring that you'll have a hard time making any significant progress.

 

Or, you can keep the 68000 turned on and let it help you when it can at the cost of a few fps :).

 

(bottom line of course is: experiment! Don't take anything for granted. The hardware engineers of the VCS thought that the hardware was only good for "Tank" and "Pong" and maybe a couple more games, and look what happened there :))


(bottom line of course is: experiment! Don't take anything for granted. The hardware engineers of the VCS thought that the hardware was only good for "Tank" and "Pong" and maybe a couple more games, and look what happened there :))

Wholeheartedly endorse this comment. Everyone has an opinion, but it should only be used as guidance; you might find something that someone else has missed. The best things I've learned have come through experimentation. Making mistakes along the way and getting frustrated because it fails to work like you thought makes the "YES!" you shout when it finally works all the bigger.

Have fun and share your experiences :)


kskunk has contributed to a thread on js2 with some interesting insight that I think deserves mention here. It's not directly related to the points in this thread, but well worth considering before someone jumps in with both feet. As Tursi said, one part of it is particularly quotable, so I'm going to do just that, along with a bit more from a couple of his posts, as I think the whole 'environment for creating playable, fun games' angle is valuable and often dismissed (though less important when considering porting an existing title). Meh, I'll shut up, here's the good stuff:

 

Tempest 2000...is far from "optimized". Its gameplay was refined and polished on the "forbidden" 68K, eschewing texture mapping for cheap shaded polygons and creative sprite effects. With little time wasted on graphics, it shipped early. Atari derisively called it a "make weight" game, with no idea that solid gameplay would create one of their most successful/famous titles.

 

Other developers broke their backs trying to meet Atari's demand for fancy textured graphics, instead of focusing on gameplay. Check out Fight For Life. Or any of the leaked Atari memos to Jag developers in 1995.

 

Sadly, the Jaguar is a bottomless pit of engine development. Everyone who worked on the Jaguar has a story about how much better their graphics could be "if only" they started over or had another year or whatever. (I'm one of them: When JagChris brought up blitter hacks, I went back to the hardware and found a new texturing hack: In parallel, the blitter can generate addresses while the GPU reorders memory access to exploit page locality. It's faster per-pixel, but you lose so much GPU that small polygons suck. See, hard to go one post without tripping over new Jaguar hacks! If only Carmack had known blah blah 640x480... :P)

 

And that's what's wrong with the docs, homebrew or official. They list a few things to avoid, then explain how the hardware is a blank slate - fully programmable. Want to texture a triangle? Just code a span rasterizer. How? Hey, you're the programmer. Do it however you want. (Each way has its own shortcomings.) And why triangles anyway? Maybe you should use quads. The hardware doesn't care, heck, it doesn't even know what a polygon is.

 

Soon you learn that straightforward approaches suck, and you have a logic analyzer hooked up to the bus, writing microbenchmarks to figure out why the heck everything is slower than implied. You're wondering if polygons were a bad idea, maybe it's time to start over with voxels.

 

This is no way to make an enjoyable game! So the AAA titles that refused to give up graphics or gameplay shipped really late - some would say too late. The ones that had high resolution textured polygons, great performance, and fun gameplay - actually those don't exist, they're only hypothetical. But given a few years and another rewrite, who knows... ;)

 

...

 

Doom is an exception. The generation's leading expert at bending bottlenecked hardware to his will did the Jaguar port purely for his own masochistic fun. In an interview around the time the Jag launched, Carmack said 'nice system, but Atari probably can't make it a success', and 'my coworker derisively calls DOOM jaguar my "reward" for writing DOOM pc.'

 

Of course, it helps to start with a finished game! Like Sonic said, start with gameplay, then deal with the tech. NBA Jam (arcade), Rayman (SNES, unreleased), Doom (PC) were all ports of mostly-finished games. They started with working art, sounds, music, and gameplay, and focused only on technical issues. It's no fun to polish gameplay on a buggy system with poor tools - and many of the Jaguar's original efforts suffered for that.

 

Most who could fund a great game 'knew' Atari would fail. As a teenage game developer, I pitched the Jag to my boss, who explained exactly why Atari would fail. Self-fulfilling prophecy or just adult wisdom? Like the rest of us, I thought he was wrong; it's just a matter of making a great game. Years later, I sat down with the docs, assembler, and logic analyzer. I knew untapped power lurked inside, waiting all these years for someone, finally, to unlock it.

 

After trying, I get it now. The Jaguar is a kit of ingredients by somebody who had never made a cake. The docs describe how great each ingredient will taste in a hypothetical 3D cake, but it's all from scratch, so only talented chefs need apply! Midway into mixing, you realize there's not enough flour and only a teaspoon of sugar and the eggs are rotten. Sure, a really great chef will still come up with something delicious. But it's not going to taste like Sony's instant brownie mix, and it's going to be a lot more work.

 

It's amazing how many tasty games we have, especially once you've seen the ingredients up close... :)

 

- KS


Has anyone figured out the method to the engineers' madness yet? I mean, the architecture seems so fragmented compared to, say, a PlayStation; is it just bad development tools and lack of RAM that hold it back from N64 levels of performance?

 

Not sure it was madness, it seems to have been a pretty good stab at making something better than the existing consoles at the time.

 

N64 performance? No, that would take a huge development budget, 3 more years (a lifetime in hardware terms), knowledge of what games console hardware should be doing and how it should have you do it, etc. All kinds of reasons. The gap between the release of the SNES and the Jaguar is pretty much the same as the gap between the Jaguar and N64, yet nobody feels compelled to compare SNES hardware to Jaguar hardware, and it makes just as little sense to do so.

 

PlayStation was the console that changed everything. Jaguar was a console trying hard to be better at everything that went before. Revolution vs evolution? Dunno, but there was no way something like the Jaguar was ever going to be seen as "cool" and appeal to 20-somethings, clubbers and ravers, all those people ready to throw their cash around for new experiences, which is where the PlayStation and its image were unrivaled.

 

Lack of memory? Not so much; 2MB plus up to 6MB of cart space isn't insignificant compared to what went before. Lack of dedicated graphics memory, lack of sizable local memory for the RISCs, or the inability to effectively use main RAM without jumping through hoops: probably.


And since even in 2013 we have to witness replies like "THE 68000 IS EVIL TURN IT OFF TURN IT OFF NOOOOOW ONLY GPU IN MAIN OR YOU'LL BURN IN HELL BURN THE WITCH BURN IT BURN IT BUUUUUURRN!!!!!!!!!!!!!!!!11111111111111111oenoeneonoeneoneoenoenoenoenoneleventeen (foam comes out of mouth)", here's a small quote from someone who sat down and did the math (no pun intended) regarding this whole nonsense (original post here):

 

 

 

So there you go. If you have something that runs at 10 fps, you can potentially bump it up to 11.5 fps. And you have to write the logic in the RISCs. And you'll run out of local RAM in record time. And you'll have to jump into the main RAM to execute code. And by then your code will be in such a mess from rewriting and refactoring that you'll have a hard time making any significant progress.

 

Or, you can keep the 68000 turned on and let it help you when it can at the cost of a few fps :).

 

(bottom line of course is: experiment! Don't take anything for granted. The hardware engineers of the VCS thought that the hardware was only good for "Tank" and "Pong" and maybe a couple more games, and look what happened there :))

 

It is somewhat sad to see the subject still described in these terms.

 

If anyone wants to return to what I said about turning off the 68K and using GPU in main (as one of the first to talk about how to run GPU in main), back in 2009, it is still on my blog:

 

http://atariowlproject.blogspot.co.uk/2009/10/atari-jaguar-homebrew-whats-this-lay.html

 

In short, Dr Typo's comments reflect pretty much what I said back then. It's not a panacea; it's just a possibility which can be useful in some circumstances, but not always.

 

It's a pity that it has drawn so much negative comment, and it very much makes me regret ever mentioning it.


It's a pity that it has drawn so much negative comment, and it very much makes me regret ever mentioning it.

 

It wasn't you mentioning it that did it. It was the buffoons, idiots and sheep going on about it like it was the second coming of Christ.


I have zero understanding of the various technical issues but..

 

Sounds like you are a bit overqualified to comment compared to some of the experts who have decided this is the way forward despite not even knowing how to use a linker ;)


There are many philosophies when it comes to designing games. Some people prefer to concentrate on the game play first, get that right and let the rest fall into place. Some like to bolt as much existing code together to get where they want and have their fun that way in as short a time as possible. Others produce tech and attempt to reverse engineer a game around it. The latter has arguably the most potential for greatness, but also the highest percentage of fail and broken promises. This is simple common sense.

 

20 year old hardware isn't what you'd think of as being a playground for creative freedom and nurtured design. Specific coding tricks, hardware hacks and general leet coding techniques are great but they don't equal good games. So while it makes perfect sense to understand the Jaguar hardware to a good level, there's no point trying to pole vault before you can crawl. Foundations, basics, groundwork, experience - all important stuff on the way to releases.

 

"I'm even distrustful of tech demos, because as AvP shows, a great tech demo can turn sluggish once you add good level design and gameplay." -kskunk

 

Another gem from the aforementioned js2 thread. How about this for a revelation: forget jumping through hoops to rinse out 0.5% more POWA, and get the "good level design and gameplay" part right first... then see about pushing the envelope on performance and shiny. Because at the end of the day, people play games, not tech demos; one approach may leave you with a slightly lesser-performing game, the other will leave you with a YouTube video and "what could have beens".

 

It's a matter of seeing the bigger picture, planning ahead, and not being stubborn and rigid in your thinking. Aiming for perfection while staring gloriously in raptures of self-importance back up one's jacksie on a bed of broken promises isn't a good look.

 

tl;dr version - tech doesn't make games, people do


Sounds like you are a bit overqualified to comment compared to some of the experts who have decided this is the way forward despite not even knowing how to use a linker ;)

 

Since I'm the only one around here lately struggling with linker issues, I'm assuming this is aimed at me. Where it came from, though, I have no idea.


Since I'm the only one around here lately struggling with linker issues i'm assuming this is aimed at me. Where it came from though I have no idea.

 

I don't think you advocate throttling the 68K. I think the answer may be something in between. I heard the 68K was supposed to coordinate between the processing assets. Nothing wrong with that arrangement, methinks. Taking Atari_Owl's research into account, there must be a way to mitigate the drawbacks.


Since I'm the only one around here lately struggling with linker issues i'm assuming this is aimed at me. Where it came from though I have no idea.

 

Just a general comment aimed at all the armchair experts who have inhabited this forum for the past decade harping on about 'GPU IN MAIN!' while having zero experience or knowledge of what they are talking about.

 

Wasn't aimed at anyone in particular.


 

WHAT ABOUT RUNNING teh program code in a virtual 64bit CPU that gets processed by a nano kernel that splits bytecode up between all the processors?

 

Keep it shhh, BUT.. what people haven't thought of is that between the 64bits, there are 63 gaps.. We can use these to almost double the powa!!!!111oneoneone


The issue is that the 68000 uses a slow 16-bit bus. It never asks for anything other than a word, and requires 4 cycles of the 68000 clock for that single word. There are far better things that could use all those clock cycles... like the BLITTER and OP. The 68000 really needed its own local ram to run in. That's one thing I found really helped 32X games - it has a similar setup: a 68000 controlling two RISC processors. But the 68000 in the MD part can run out of the MD ram without affecting the bus of the two RISC processors in the 32X part. Running the 68000 from rom, which does affect the 32X bus, can cut the game speed in half! Best thing I did for my 32X port of Wolf3D was to move the 68000 code into "local" ram to get it off the rom bus.


The issue is that the 68000 uses a slow 16-bit bus. It never asks for anything other than a word, and requires 4 cycles of the 68000 clock for that single word. There are far better things that could use all those clock cycles... like the BLITTER and OP. The 68000 really needed its own local ram to run in. That's one thing I found really helped 32X games - it has a similar setup: a 68000 controlling two RISC processors. But the 68000 in the MD part can run out of the MD ram without affecting the bus of the two RISC processors in the 32X part. Running the 68000 from rom, which does affect the 32X bus, can cut the game speed in half! Best thing I did for my 32X port of Wolf3D was to move the 68000 code into "local" ram to get it off the rom bus.

 

And here's the point where the thread reached an infinite loop. As the old 80s/90s scrollers from demos used to say at the end of the text, "Let's wrap............"


And here's the point where the thread reached an infinite loop. As the old 80s/90s scrollers from demos used to say at the end of the text, "Let's wrap............"

 

Guys, you do realize that in the interview DrTypo was talking about his experiences and tests in 2D, right? You did notice his caveat for 3D stuff, didn't you?

 

And making loopback statements disregarding ChillyWillie's testimony of his own substantial experience coding on the Jag and similar setups, instead of asking him why he might have got different results than Typo's 2D tests, is perhaps a little rude?

