
Is the -bit argument defunct now?


thomasholzer


So did you even own a Jag ever?
Why is this a prerequisite for having an opinion?

:roll:

 

 

I've never driven a Ferrari, yet I am of the opinion that it is fast.

I've never dropped acid, yet I'm of the opinion it messes up your perception.

I've never been in space, yet I'm of the opinion that it would be cool.

I've never been to Rome, yet I'm of the opinion that it would be historically interesting.

All that is true, but how can it be if I've never done any of those things?

 

Someone who drives a Bugatti Veyron might not think a Ferrari is that fast. Someone who drives any racecar or racebike may not think so either. What about an astronaut who flies tens of thousands of miles per hour? It's all relative... I dropped acid once and it actually IMPROVED my perception, i.e. it's like drinking 10 cups of coffee really fast... I bet a lot of people would never want to go to space (including me) because it's dangerous as hell, and dealing with no gravity would be cool for 1 minute and then it would suck... imo. I've never been to Rome either, but I've seen plenty of great TV programs on it, so I couldn't care less about spending thousands of dollars to go there! ;) All of this is OPINION... not TRUTH!!! :twisted:


Yet that is easily falsified, as none of the general-purpose processors in the Jaguar had a 64-bit architecture.

 

To what extent is the general-purpose processor the bottleneck in something like the Jaguar (or, for that matter, in newer systems)?

 

Suppose one were designing a game for three machines:

 

-1- Jaguar performance, exactly as-is

 

-2- Jaguar, with the 68000 speed cut in half but the other chips' speed increased by 50%.

 

-3- Jaguar, with the 68000 speed increased by 50% but the other chips' speed cut in half.

 

Which machine would yield the best performance on what types of games?


To what extent is the general-purpose processor the bottleneck in something like the Jaguar (or, for that matter, in newer systems)?

 

Suppose one were designing a game for three machines:

 

-1- Jaguar performance, exactly as-is

 

-2- Jaguar, with the 68000 speed cut in half but the other chips' speed increased by 50%.

 

-3- Jaguar, with the 68000 speed increased by 50% but the other chips' speed cut in half.

 

Which machine would yield the best performance on what types of games?

Putting aside for a moment that 64-bit doesn't necessarily mean more performance, the answer is simple: drop the 68000 and increase performance on the other two general-purpose processors. Tom and Jerry were theoretically where all the work was being done anyway, so why keep the 68000 around? As I recall from a discussion with JB, the 68000 was intended more for booting and general control of the machine than anything important. It wasn't supposed to be used for as much as many programmers used it for.

 

Which is still a moot point. Tom was a 32-bit RISC core, Jerry was a 32-bit RISC core, and the Motorola 68000 was a 16-bit core. Not a 64-bit core in sight. The closest things are the Object Processor and the Blitter, which were no more general-purpose processors than an FPU is, and an FPU doesn't make a microprocessor "64-bit".


:rolling: :rolling: :rolling: :rolling: :rolling:

 

You usually strike me as halfway intelligent; are you just trying to spread false information to get the fanboys all riled up?

 

The PS2 cannot do doubles in hardware; its FPU is limited to 32-bit. The Gekko can do 64-bit double operations. Yet the PS2 has a 128-bit data bus, while the Gekko is limited to 64-bit.

Are you strictly defining a processor as N-bit because of the address bus? Because that would be wrong as well.

 

Putting aside for a moment that 64-bit doesn't necessarily mean more performance

 

Sure it does. In MIPS terms, which do you think is faster:

 

lw t0,(t1)     # load low 32-bit word
lw t1,4(t1)    # load high 32-bit word
sw t0,(t2)     # store low word
sw t1,4(t2)    # store high word

 

or

 

ld t0,(t1)     # one 64-bit doubleword load
sd t0,(t2)     # one 64-bit doubleword store

 

?


Now that I know what you're referring to, it's kind of silly to be dragging out a bunch of tech info. But I promised you an explanation, so here we go:

 

You can't be serious. Do you know what a double precision floating point value is?

The user manual for the 750CXe* explains the situation:

 

The 750CX/CXe/CXr is an implementation of the PowerPC microprocessor family of reduced instruction set computer (RISC) microprocessors with extensions to improve the floating point performance. The 750CX/CXe/CXr implements the *32-bit* portion of the PowerPC architecture, which provides 32-bit effective addresses, integer data types of 8, 16, and 32 bits, and floating-point data types of *single and double-precision*.

 

The emphasis is mine. You'll note that the manual states that it's a 32-bit processor despite the double-precision floating point unit. This is actually quite normal. Single and double precision floating point numbers are defined by the IEEE 754 standard, and are thus implemented on nearly all floating point units and coprocessors. The x86 Intel architecture has actually had 80-bit extended-precision floating point support since the creation of the 8087 coprocessor back in 1980.
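
You can see this split between the integer word size and the floating point width on a live machine with a quick C check (a sketch, assuming GCC on x86, where long double maps to the 8087's 80-bit extended format, padded out in memory):

#include <stdio.h>

int main(void)
{
    /* The integer word size of the CPU says nothing about these widths. */
    printf("float:       %zu bits\n", 8 * sizeof(float));   /* 32 */
    printf("double:      %zu bits\n", 8 * sizeof(double));  /* 64 */
    /* On x86 with GCC this prints 96 or 128: the 80-bit x87 format plus padding. */
    printf("long double: %zu bits\n", 8 * sizeof(long double));
    return 0;
}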

 

To explain what we're talking about when someone refers to an 8-bit, 16-bit, 32-bit, or 64-bit processor, here is a diagram of the basic CPU architecture:

 

[Image: CPU_block_diagram.png, a block diagram of a basic CPU]

 

In general, the instruction set can be of any size, so it has little impact on the CPU's ability to compute data. Thus the two most important attributes are the Registers and the Arithmetic Logic Unit (ALU). For most purposes, the two match. (I'll get into an exception in a moment.) The number of bits the registers and ALU can handle simultaneously is referred to as the natural "word size" of the processor. That word size is what the CPU is most efficiently designed to work with. It can be an arbitrary number of bits, but since the standardization of the byte, it has usually been a power of 2.
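
You can get a rough feel for a machine's word size from C (purely illustrative; these sizes are implementation-defined, though on most modern ABIs the pointer width tracks the CPU's natural word size):

#include <stdio.h>

int main(void)
{
    /* On a typical 32-bit system both print 32; on a typical 64-bit system, 64. */
    printf("pointer: %zu bits\n", 8 * sizeof(void *));
    printf("long:    %zu bits\n", 8 * sizeof(long));
    return 0;
}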

 

Thus a 32-bit processor is a CPU with registers and an ALU 32 bits wide. The address space usually follows this size, though not always. The reason the address space and the word size tend to match is that it's more efficient to use the ALU to compute branches and other address modifications than to add special logic to handle an odd address size. If the addressable memory is smaller than the word size allows, no harm is done. However, if the addressable memory is larger than the address size, then the CPU needs to be designed with some sort of workaround to reach the higher addresses. The most common workaround is segmented memory, something I'm sure we all remember from the 8086 days of accessing 640K (actually 1024K, but who's counting?) of memory with a mere 64K of address space.
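
For reference, the 8086's segmented scheme combined two 16-bit values into a 20-bit physical address, like this (the function name is just for illustration):

#include <stdint.h>

/* 8086 real mode: physical address = segment * 16 + offset.
   Two 16-bit values reach a 20-bit (1 MB) address space, even though
   any single offset register can only span 64K. */
uint32_t real_mode_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + (uint32_t)offset;
}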

 

Now, on to that exception I was talking about. There are times when it makes sense to have a register larger than the ALU can handle. The most common example of this is so that the CPU can read in multiple words of data and operate on them using parallel arithmetic or floating point units. The operation is the same across all words, so it's a very specialized case. Thankfully, 3D graphics and multimedia are exactly the sorts of operations that need parallelism. For example, if you've got 30,000 vertices to translate and rotate in 3D space (a pretty common occurrence when you render 3D graphics), it makes sense to use a 128-bit register to compute 4 32-bit words in parallel. At that rate, you'll only need 7,500 operations rather than the 30,000 you'd be looking at otherwise.

 

(In "Big-O" notation, that would O(n/4) time to execute.)

 

Instructions that make use of large registers like this are known as "Single Instruction, Multiple Data" or "SIMD" for short. Since the performance boosts are so obvious, you can find these instructions in nearly every modern processor on the market. Intel has MMX, SSE, and SSE2. The PowerPC has Altivec. The Emotion Engine plays itself off as a 128 bit processor while it's actually executing two 64-bit values in parallel, and the Gekko has been modified to allow for 2 32-bit Integer or 2 32-bit Floating Point values to execute simultaneously.
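
To make the vertex example above concrete, here's a minimal sketch using Intel's SSE intrinsics (the function and data layout are purely illustrative; it assumes the coordinate count is a multiple of 4 and the array is 16-byte aligned):

#include <xmmintrin.h>  /* SSE intrinsics */

/* Translate n coordinates by a constant delta, four 32-bit floats per
   instruction: one 128-bit register holds four words of data. */
void translate(float *coords, float delta, int n)
{
    __m128 d = _mm_set1_ps(delta);                   /* broadcast delta into all 4 lanes */
    for (int i = 0; i < n; i += 4) {
        __m128 v = _mm_load_ps(&coords[i]);          /* load 4 floats at once */
        _mm_store_ps(&coords[i], _mm_add_ps(v, d));  /* add and store 4 at once */
    }
}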

 

Recap

 

1. The "bitness" of a processor is determined by:

  • Register Size
  • ALU
  • Address Space (Optional)

2. Nearly all Floating Point Units support 64-bit, double-precision values, but do not affect the "bitness" of the processor.

 

3. SIMD allows for multiple ALUs and/or FPUs to operate in parallel.

 

--

 

While I'm a bit bemused by this whole thing, I'm honestly surprised no one here knew this. I hope you all find this 10-minute introduction useful. I'm not an expert in CPU design by any means, but I'm happy to share what I know. :)

 

* The 750CXe is the base for the Gekko processor.


Putting aside for a moment that 64-bit doesn't necessarily mean more performance

 

Sure it does. In MIPS terms, which do you think is faster:

 

lw t0,(t1)     # load low 32-bit word
lw t1,4(t1)    # load high 32-bit word
sw t0,(t2)     # store low word
sw t1,4(t2)    # store high word

 

or

 

ld t0,(t1)     # one 64-bit doubleword load
sd t0,(t2)     # one 64-bit doubleword store

 

?

Certainly the second. But that doesn't necessarily (<---key word) mean that the 64-bit CPU is going to be faster. For example, a DMA transfer across a 512-bit wide bus is going to be 8 times faster than a CPU that can move 64 bits every bus cycle. In cases where the CPU can't keep the bus fed at that rate, the DMA transfer would be even faster. That leaves the processor free to do arithmetic, something that isn't going to improve substantially when dealing with 32-bit or smaller values. (In fact, it would probably be slower, since you need to move twice the data across the bus as you do with a 32-bit processor. You would need to double the bus width to compensate for this.)

 

The arithmetic only turns out faster if you're doing 64-bit math. A 32-bit processor needs extra cycles to do those computations correctly, while a 64-bit processor can do the arithmetic in a single operation.
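
A trivial C illustration of that point (the helper is hypothetical, but what compilers emit for it is well known):

#include <stdint.h>

/* On a 64-bit CPU this is a single add instruction. A 32-bit CPU has to
   emit an add plus an add-with-carry to stitch the two halves together. */
uint64_t add64(uint64_t a, uint64_t b)
{
    return a + b;
}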

 

Of course, if your architecture requires the CPU to move massive quantities of data around the memory bus, then you're definitely going to get a performance boost from programs tuned to take advantage of a 64-bit CPU. But the performance advantage is not a given.


The problem is that there IS no definition for "bits" in the context of game systems.

There's not a clear definition in advertising. There *is* a clear definition (as clear as these things can get, anyway) in the technology industry.

Well, there isn't really a clear definition even there. If you say "this processor has a 32-bit data bus", then that's all fine... but saying "this processor is 32-bit" is ambiguous. Another example which no one has mentioned is the CD32... it uses a 68EC020 processor, which has a 32-bit data bus, but the address bus is limited to 24-bit (as far as I know, it's actually the same as a "real" 68020, except 8 pins of the address bus just don't exit the chip). So is the 68EC020 still a "true" 32-bit processor?
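
The effect of those missing pins can be sketched in one line (illustrative only; this is just what a 24-pin address bus does physically):

#include <stdint.h>

/* With 24 address pins, the top 8 bits of a 32-bit address never leave
   the chip, so only 16 MB of distinct addresses exist on the bus. */
uint32_t bus_address(uint32_t logical)
{
    return logical & 0x00FFFFFF;
}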

 

Things like multi-core processors and bitslice processors muddy things up even more. The Cell processor seems rather confusing in this respect too.

 

Atari could get away with advertising it as 64-bit, but that doesn't magically confer 64-bit performance advantages. Goatdan's point (the one I was responding to) was that the Jaguar was 64-bit, yet didn't show 4 times the performance.

Oh, I would never try to say that the Jaguar's graphics were 4 times better than the Genesis or SNES. The Jaguar actually compares pretty well with the Saturn and Playstation when it comes to 2D graphics (although it gets spanked at 3D, since the graphics chips just don't have the abilities that were built into the Saturn and PSX). I'm not sure why they bothered making the chips 64-bit, or if it provides any real advantage whatsoever in the Jaguar.

 

I think it's perfectly valid to judge the general power of game systems based on their graphical abilities, as it's these chips that seem to make the most visible difference. However, the size of the data bus in the graphics chip is not as important as most companies would have you think, and it definitely doesn't define the system as a whole.

 

--Zero


Which is still a moot point. Tom was a 32-bit RISC core, Jerry was a 32-bit RISC core, and the Motorola 68000 was a 16-bit core. Not a 64-bit core in sight. The closest things are the Object Processor and the Blitter, which were no more general-purpose processors than an FPU is, and an FPU doesn't make a microprocessor "64-bit".

 

All right, then. Suppose the three machines were:

 

- Jaguar as is

 

- Jaguar with Tom and Jerry slowed down, while the Object and Blitter processors are sped up

 

- Jaguar with Tom and Jerry sped up, but the Object and Blitter processors slowed down

 

My guess would be that some types of games would benefit from having the 64-bit chips go faster, even if the other chips were slower, while other games would not. System performance may also benefit from the wider bus architecture even if the processors running code don't take advantage of it directly. For example:

 

-1- On some systems, outputting video uses up a fair chunk of available bus throughput. On the Amiga 1000, for example, and all the Amigas with 16-bit chipsets (regardless of whether they have 32-bit processors), displaying a 640x200x16 display will use up most of the memory bandwidth in "chip RAM". Even if one kept the 68000 (16-bit data bus), making the "chip RAM" 32 bits wide would cut the video's bus utilization in half; if video eats roughly 80% of the bandwidth, halving it leaves 60% of the bus free instead of 20%, tripling the speed of other operations that needed the bus.

 

-2- On many 80386 machines (and probably other platforms as well), the memory system is 64 bits wide even though the processor is only 32 bits. This is done because dynamic RAM chips have a recovery time following each access. If successive memory accesses go to opposite sides of the 64-bit bus, one side can be performing an access while the other side is recovering from the previous one.
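
A sketch of that two-way interleaving idea (the helper is hypothetical; real chipsets do this in the memory controller, not in software):

#include <stdint.h>

/* Consecutive 32-bit words alternate between two banks, so while one
   bank services an access the other can finish its DRAM recovery time. */
unsigned bank_for(uint32_t byte_address)
{
    return (byte_address >> 2) & 1;  /* bit 2 selects the bank */
}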

 

Pushing to a 64-bit wide memory bus on the Jaguar may have been motivated by marketing, but it would likely not have been without performance benefits.


All that you posted only matters on paper, not in the real world (e.g. actually programming CPUs in assembly). ;)

 

Nice copy & paste job though. LOL

 

EDIT: P.S. The registers on the R5900 (Emotion Engine) are 128 bits wide, something I miss on the Cell. Although I can still use Altivec instructions to move 128 bits of data at a time. :P


There's not a clear definition in advertising. There *is* a clear definition (as clear as these things can get, anyway) in the technology industry.
Well, there isn't really a clear definition even there. If you say "this processor has a 32-bit data bus", then that's all fine... but saying "this processor is 32-bit" is ambiguous. Another example which no one has mentioned is the CD32... it uses a 68EC020 processor, which has a 32-bit data bus, but the address bus is limited to 24-bit (as far as I know, it's actually the same as a "real" 68020, except 8 pins of the address bus just don't exit the chip). So is the 68EC020 still a "true" 32-bit processor?

Read my post above. I gave an explicitly clear definition which has zip to do with the address bus and little to do with the address space.

 

Things like multi-core processors and bitslice processors muddy things up even more. The Cell processor seems rather confusing in this respect too.

Not really. A multicore processor is just two or more instances of however many bits your word size is. The Cell processor is confusing because it's a mish-mash: a 64-bit Power CPU controlling some number of SIMD-dedicated floating point units (usually about 8), each of which is capable of computing 4 single-precision numbers or 2 double-precision numbers per operation, all connected to a memory controller and two external I/O chips by a high-speed bus that resembles a miniaturized crossbar.

 

That's the long answer, anyway. The short answer is: 64-bits.

 

Atari could get away with advertising it as 64-bit, but that doesn't magically confer 64-bit performance advantages. Goatdan's point (the one I was responding to) was that the Jaguar was 64-bit, yet didn't show 4 times the performance.

I'm not sure why they bothered making the chips 64-bit, or if it provides any real advantage whatsoever in the Jaguar.

Again, the 64-bitness is a misnomer. The coprocessors use 64-bit chunks of data because that's what (presumably) makes sense. It has no material impact on the central processors, however. Modern CPUs are full of 64-bit, 128-bit, 256-bit, and 512-bit interconnects and logic, but that doesn't automatically confer them extra "bitness" status.

 

I think it's perfectly valid to judge the general power of game systems based on their graphical abilities, as it's these chips that seem to make the most visible difference. However, the size of the data bus in the graphics chip is not as important as most companies would have you think, and it definitely doesn't define the system as a whole.

*shrug* In that case I've got a 512-bit powerhouse sitting in front of me. The CPU is only 64-bit, but it's only the graphics that matter, right?

 

All right, then. Suppose the three machines were:

 

- Jaguar as is

 

- Jaguar with Tom and Jerry slowed down, while the Object and Blitter processors are sped up

 

- Jaguar with Tom and Jerry sped up, but the Object and Blitter processors slowed down

And the answer is...

(Drumroll please....)

brrrrrrrrrrrrrrrump!

You have a whole lot of mental masturbation that doesn't accomplish anything. You more than anyone should know that hardware design is a tradeoff: you get more of this by reducing that. The designers of the Jag chose what they chose, and it worked well enough in its day. That still doesn't make the system a "64-bit machine", nor does it matter. The games weren't competitive enough on the market. Period. End of story. End transmission, carrier lost.

 

Now go to bed. It's after midnight. :P


All that you posted only matters on paper, not in the real world (e.g. actually programming CPUs in assembly). ;)

Not entirely true, but close enough. No one ever said that "bitness" was a critical factor. In fact, I thought that was the point we were making in the first place? ;)

 

Nice copy & paste job though. LOL

I hope you're referring to the bit from the IBM manual. Otherwise, I'd be quite offended. I wrote all that myself! :twisted:

 

(Granted, I did steal images from Wikipedia. But that's just par for the course. ;))

 

EDIT: P.S. The registers on the R5900 (Emotion Engine) are 128 bits wide, something I miss on the Cell. Although I can still use Altivec instructions to move 128 bits of data at a time. :P

 

1. Yes, I know. But the point I was making is that the ALU is two 64-bit units in parallel, running out of the same register. (i.e. Superscalar architecture) Which means that the Emotion Engine is a "64-bit" SIMD chip. Not that that really means anything other than the fact that you can't compute 128-bit math. Not exactly a big loss. It's just more paper tigers. :)

 

2. Can you use the SPE's instead of the PPE to do the transfer? The SPE's are basically the same configuration as the Emotion Engine, plus or minus a few data operations. Or is there a bandwidth problem? (Sorry, I haven't studied the cell architecture well enough to know the "when and how" of what data goes where.)


2. Can you use the SPE's instead of the PPE to do the transfer?

 

The SPEs (internally called SPUs, "Synergistic Processing Units") have local memory (256K each; basically fast RAM) which is not connected to the main RAM. To get data in and out of them, you need to use DMA transfers. One nice thing is that they can talk to both the PPU's and the GPU's bus.
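
For flavor, SPU-side code queues and waits on such a transfer roughly like this (a sketch using IBM's SPU intrinsics as I remember them; tag management is simplified and the size/alignment DMA rules are glossed over, so check the SDK headers for the exact details):

#include <spu_mfcio.h>

/* DMA a buffer from main memory (effective address) into the SPE's 256K
   local store, then block until the transfer completes. */
void fetch(void *local_store_buf, unsigned long long effective_addr,
           unsigned size)
{
    const unsigned tag = 0;
    mfc_get(local_store_buf, effective_addr, size, tag, 0, 0);
    mfc_write_tag_mask(1 << tag);   /* watch only our tag group */
    mfc_read_tag_status_all();      /* stall until the DMA is done */
}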

 

The SPE's are basically the same configuration as the Emotion Engine, plus or minus a few data operations. Or is there a bandwidth problem?

 

Actually, they are very different from the Emotion Engine: more powerful in some aspects, but less in others.

 

(Sorry, I haven't studied the cell architecture well enough to know the "when and how" of what data goes where.)

 

Here is a very cool source of information, hosted by a friend of mine (Mike Acton):

 

http://www.cellperformance.com (completely within the legal realm!)


Of course, that just goes to show my point. I know all those systems are 32-bit (only this generation is in a grey area, due to there actually being 64-bit processors now). But... had I said all those systems were 32-bit instead of giving improper 128-bit info, then instead of one person bashing me for it, how many would have?

 

As I said, it's not really relevant. What is a "bit"? Is that a reference to the processor? (I refer to this; as I said, graphics are maxed out till we get new display options anyways.) Or the graphics processor? (Most companies will say it's this, but it doesn't matter how much graphics processing you have if the primary processor can't use it.)

 

Anyhow, it's all a trade off.

 

[Edit: Oh yeah, and there is a one-bit processor. It's called a light switch and runs at 60 Hz. :P]


2 weeks later...

I missed a lot of this, so I might as well bring it back up for a few seconds, since my post was referred to and I never defended it...

 

Again, the Jag offered very little to convince players to purchase a Jag over one of the half a billion other options on the market. That's simply the way it was. Nintendo changed the market forever when they introduced franchise characters. You either got with the program, or you got out of the market. :)

 

No denying that.

 

Rayman could have been a good mascot. However, he belonged to Ubisoft, not Atari. If Atari was smart, they would have gotten their hooks into Ubisoft and ensured that Rayman was theirs. (Much like Sony did with Naughty Dog and Psygnosis.) Unfortunately, there was a very small window between when Rayman was released for the Jag and the Playstation/Saturn. (~2 months if wikipedia is to be believed.) Atari would have had to be more proactive if they wanted to make Rayman "their" mascot.

 

Actually, Rayman came out on the PS1/Saturn first, and then on the Jaguar slightly after that. Rayman was originally stated to be a Jaguar exclusive, and what many Jag fans believe is that Sony and Sega paid a ton to get him off the Jaguar and onto their consoles first, but the truth is probably that Ubisoft saw the writing on the wall for Atari and figured they could make more money by putting him on systems that would have larger installed user bases.

 

Cybermorph really isn't that bad a pack-in game.

Compared to Super Mario World and Sonic the Hedgehog, it's downright atrocious. Think about it from the perspective of a consumer in 1993. They could purchase an SNES for $199 or less and get Mario World for free, purchase a Genesis for a similar price and get Sonic, -OR- they could purchase a Jaguar for $250 and get Cybermorph.

 

Actually, here is the one place where I do differ from you a bit -- Cybermorph was, in many ways, a good game to show off the power of the Jaguar. The reason is that there had never been a game like it -- Cybermorph introduced the go-anywhere, do-whatever-you-want concept in a console game. Before that point, 3D games were all like Virtua Racing, Virtua Fighter, Star Fox and so on -- there was a goal, and you had to pursue that goal. First-person shooters like Doom hadn't made their way to consoles yet, and even those had a distinct end to each level. Cybermorph was something completely new, where you could fly wherever and do whatever you wanted while exploring.

 

Cybermorph hasn't held up against these other games as well because it was a complete pioneer: good as it was, the innovations and changes that quickly followed allowed people to forget about it. Heck, by the time BattleMorph rolled around, there were lots more objectives and things to do. The honeymoon period for Cybermorph was very short, and I think the biggest problem Atari really had with it was that they packed in Cybermorph for far too long. Had Atari decided to sell Cybermorph with consoles for only the first three months and then switched to a different property, it probably would've worked a lot better.

 

For the majority of consumers, the choice was a bit of a no-brainer. Especially for the parent purchasing the consoles for their kids, who were more likely to warm up to the colorful cartoon characters than some dark-looking game with a bald, green lady as the only character. :P

 

I can't deny this. And that was part of the reason Cybermorph aged so quickly. With no characters to latch onto or care about, Cybermorph didn't have the same sort of buzz about it that something like Mario does even years later.

 

The problem is that there IS no definition for "bits" in the context of game systems.

There's not a clear definition in advertising. There *is* a clear definition (as clear as these things can get, anyway) in the technology industry. Atari could get away with advertising it as 64-bit, but that doesn't magically confer 64-bit performance advantages. Goatdan's point (the one I was responding to) was that the Jaguar was 64-bit, yet didn't show 4 times the performance. Yet that is easily falsified, as none of the general-purpose processors in the Jaguar had a 64-bit architecture.

 

See, this is the part of this whole argument that I don't agree with -- the technology industry has clear definitions of things for its own purposes, like manufacturing chips, but when it comes to game systems... who knows? The reason that I believe the Jaguar to be 64-bit is that you could pass a 64-bit command over the bus at once. Now, regardless of where this command was going or whether it was even practical to use (it wasn't), to me and to many people who program, the ability to pass a 64-bit command makes the machine capable of handling 64 bits.

 

Advertising a system as having 64-bit "advantages" doesn't give it any advantages, but realistically the advantage a larger bus gives isn't necessarily the most important spec anyway. This is where the marketing for Atari went wrong, and was the point I was trying to make -- consumers were taught to believe that 64-bit meant the graphics would be four times better than the SNES and eight times better than the NES, based on what Atari was advertising, and when it didn't look like that, consumers automatically believed it was because Atari had lied about the bits that it used. This problem only grew when Atari used the bits argument to say that they were twice as powerful as the PS1 and Saturn, when it was obvious that they were not.

 

It didn't matter what size the chips were; it was the reality of how it looked that mattered.

 

Also, the problem with game systems is that while you do have strong and firm rules about chips and all that fun stuff, what does define a game system? Is it the graphics chip, as many systems would have us believe (if I recall right: Genesis, TurboGrafx, Jaguar, Dreamcast, N64)? Or is it the main processor? And if it is the main processor, doesn't that make the Intellivision half the power of the Xbox? Well, obviously not -- because there is so much more that goes into it. The fact was and is that Atari's use of the 64-bit claim was just as justified as many of the claims that came before it, so saying that Atari couldn't call the Jaguar a 64-bit system doesn't hold up, when it was already a well established practice to use the graphics hardware as the benchmark in advertising.

 

So, while the clear definition in the technology industry may say to you that the Jaguar was only 16 or 32 bits, the clear definition in the gaming industry based on what came before it states that the Jaguar was 64 bit.

 

And... at the end of the day... who cares? As I originally stated, the Jaguar didn't have the power that Atari advertised it as having, and the system suffered because Atari kept advertising it that way. Had Atari brought out some franchise characters, or some big exclusive titles, or some peripherals like the Jaguar VR, Atari might have 'won the war'. But Atari instead focused on an at-best flawed advertising campaign, and the Jaguar never got off the ground because of it.

Link to comment
Share on other sites
