
Is the -bit argument defunct now?


thomasholzer


blah... blah... blah...

 

I'm sorry, were you saying something, trollboy? ;) :P

 

 

:D

Nope, not really. Just fiction.

 

But really, after that things started getting foggier with the 32X add-on, the 3DO, the Jaguar, and the Saturn, since they combined so many different kinds of hardware.

 

IIRC, the Jag was the last system to try to use bits as a selling point. I think that because games and systems were evolving so rapidly around this time, and because CD-based systems introduced a lot more storage that let games go places previously unheard of, it was better to sell on the possibilities instead of raw power. I don't think arguing bits mattered at all to any company; it was just mudslinging, kinda like all those political ads we see every year.

 

Sony and MS have both had pissing contests over raw power for the past 5 or 6 years, but it has really been ineffective and is mostly left out of advertising. Now just one good game makes the difference, and I doubt anyone even cares to know about raw power.

 

I have to say I haven't really heard the word 'innovative' being used THAT much.

 

I have no idea when using the word became so popular. I know that for at least the past 6 or 7 years, it's been attached to anything. Not a day goes by that I don't hear or see a car ad, read a game review, check out technology news, or stumble across an infomercial that talks about innovation. And that's just a few examples. It's kinda like the word bling. It's so watered down now that it has very little impact when you hear or read it. It's used in many instances where it shouldn't be. All of the examples above I really did read or hear at some point lately, and I don't think any of them are innovative. Useful, creative, or cool maybe, but none are groundbreaking. I know it's just a buzzword, clever marketing used to make something sound unique or interesting or maybe even elite, but I find it hard to believe that everything out there is suddenly innovative.


You know, I never gave a shit about bits or polygons etc. I like fun games, no matter what platform. Which is why I still play my 2600 in addition to my 360.

 

I realize that it's a marketing strategy to get people to buy games, but if you really love video games, you play every fun game you can afford to get your mitts on, regardless of the system. That's what I can't understand about fanboys and system bashers. If you aren't playing the 360 and the Wii, you are missing out. Just like if you didn't play the Genesis and the SNES, you were only having half the fun.


The Jaguar was 64 bit, but its graphics suck compared to the N64's (at least the 3D stuff). The Dreamcast was 128 bit, but the 32 bit Xbox sure looked better than it.

 

The bit argument was made by a bunch of companies who wanted to set themselves apart by using a stat they could brag was better than the competition's. This all came crashing down with the Jaguar, which is why many believe it wasn't a 64-bit machine -- because the jump from 16 to 64 should've made the Jaguar four times as powerful as the SNES, yet games like Donkey Kong Country were clearly on an equal power footing with the 2D Jaguar offerings, and games like Star Fox were nearly equal to things like Cybermorph for the Jaguar.

What the h--l? There was not a single general purpose processor in the system that was 64 bit. Even the primary GPU was a 32 bit RISC processor! The "64 bit advantage" of the Jaguar was nothing more than Atari marketing hype over a few extra chips related to graphics coprocessing. By that logic, my PC is a 256 bit monstrosity! :roll:

 

The reality of the situation was that the 64 "bit" thing was a complete non-starter. People didn't fail to purchase the Jag because of its "bitness" or failure to deliver on "4D gaming experiences!", but because it didn't have any killer games to sell the system. Nintendo had Mario, Sega had Sonic, Sony had Crash Bandicoot, and TurboGrafx had Bonk. The Jag OTOH was promoted by the likes of "Cybermorph" and "Blue Lightning", both of which were blasted by critics left and right. Not a great foot to start out on. :|

 

Remember the early '90s slogan, "It's the economy, stupid"? It's pretty much that way for game machines: "It's the games, stupid!" You can have the most incredible technology in the world, and it won't sell your machine unless you have the games to back it up. And I don't mean just any game, I mean the ones that make people plonk down hundreds of dollars just to play that one game. RPGs are perhaps the most recent example of games that drive sales. Zelda, Final Fantasy, and Blue Dragon are games that are selling today's systems. Without those games, consumer interest in a platform tends to be poor at best.

 

Or to put it another way, there is no doubt in my mind that the Jag could have pulled off a game like Starfox. The graphical quality may or may not have been as good, but it could have done it. Of course, Nintendo wasn't going to port it to the Jag because that was one of those games that was selling the N64 instead. ;)

 

FWIW, the N64 was a true 64 bit machine equipped with a 64 bit central processor. However, the choice of a 64 bit processor was one of scaling down the MIPS architecture rather than attempting to offer up distinct performance advantages. The processor was mostly utilized as if it were a 32 bit machine. It's quite likely that if Nintendo had replaced the CPU with a 32 bit version with similar performance characteristics, no one would have noticed the difference.

 

--

 

Getting back to the main topic, the reason why "bitness" stopped being an issue is that it stopped holding any real meaning in either a technical sense or a marketing sense. When the microprocessor industry was young, being able to load 16 bits, perform arithmetic on 16 bits, and then store 16 bits in the same number of cycles that it took an 8 bit processor to do the same, meant that the 16 bit processor was going to be superior in any case when 16 bit numbers were necessary. Since 8 bits couldn't hold very much, the 16 bit advantage was easy to realize. The same was true of the move from 16 bit to 32 bit, but with a lesser impact. Programmers had gotten quite good at using only 16 bits, so the move to 32 bits required a lot of new software to take advantage of it.
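To put a rough sketch behind that (this is just my own illustration in C with made-up function names, not code from any real machine): an 8-bit CPU has to build a 16-bit addition out of two byte-wide adds with a carry in between, while a 16-bit CPU does the whole thing in one instruction.

```c
#include <stdint.h>
#include <stdio.h>

/* What a 16-bit CPU does in a single add instruction. */
static uint16_t add16_native(uint16_t a, uint16_t b)
{
    return (uint16_t)(a + b);
}

/* What an 8-bit CPU has to do: add the low bytes, then add the
 * high bytes plus the carry from the first add (ADD, then ADC). */
static uint16_t add16_on_8bit(uint16_t a, uint16_t b)
{
    uint8_t a_lo = a & 0xFF, a_hi = a >> 8;
    uint8_t b_lo = b & 0xFF, b_hi = b >> 8;

    uint16_t lo_sum = (uint16_t)(a_lo + b_lo);
    uint8_t  carry  = (lo_sum > 0xFF) ? 1 : 0;
    uint8_t  hi_sum = (uint8_t)(a_hi + b_hi + carry);

    return (uint16_t)((hi_sum << 8) | (lo_sum & 0xFF));
}

int main(void)
{
    printf("%u == %u\n", add16_native(0x12FF, 0x0001), add16_on_8bit(0x12FF, 0x0001));
    return 0;
}
```

Both functions return the same answer; the point is just that the narrower machine spends more instructions (and cycles) to get there.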

 

When 64 bit processors came along, they were actually quite useless to the consumer market. As a result, they stayed in the high-end workstation and server market where the need to crunch very large numbers came in handy. Of course, a lot of the technologies developed to crunch those numbers slowly became useful in the consumer market. So 32 bit chips started showing 64-bit features like double-wide Floating Point Units and SIMD instructions designed to work on as many bits as possible per clock cycle. The result is that today's processors are no longer straight 32-bit or 64-bit, but exhibit different tendencies depending on what instructions are run, how much memory needs to be addressed, whether streaming is needed, etc., etc., etc.
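A concrete illustration of that kind of feature, and one that comes up again below with the Pentium III: the SSE instruction set operates on 128-bit registers holding four 32-bit floats at once, even though the CPU itself is nominally 32-bit. This is just my own sketch, and it assumes an SSE-capable x86 compiler and processor:

```c
#include <xmmintrin.h>  /* SSE intrinsics (Pentium III era and later) */
#include <stdio.h>

int main(void)
{
    /* Pack four 32-bit floats into each 128-bit SSE register. */
    __m128 a = _mm_set_ps(1.0f, 2.0f, 3.0f, 4.0f);
    __m128 b = _mm_set_ps(10.0f, 20.0f, 30.0f, 40.0f);

    /* A single instruction adds all four pairs at once --
     * a 128-bit-wide operation on a "32-bit" chip. */
    __m128 sum = _mm_add_ps(a, b);

    float out[4];
    _mm_storeu_ps(out, sum);
    printf("%.0f %.0f %.0f %.0f\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```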

 

So as you can imagine, the metrics moved on to more important points. First it was the number of raw polys pushed. Then it was the number of texels. The latest fad seems to be how much I/O a machine can push in comparison to the size of its z-buffer and/or framebuffer. I'm sure that as technology pushes onward, the metrics will change again. :)


IIRC, the Jag was the last system to try to use bits as a selling point.

 

Nintendo made a big deal about the GBA being a 32 bit handheld (it's even printed right on the box, IIRC). I'd call it the last system to be marketed in this way. Before that, they also obviously made a big deal about the N64 being 64 bits... and that came after the Jaguar.

 

And it was just all about marketing; it gave people an easy to understand benchmark for comparison. A 64 bit system was not necessarily twice as good as a 32 bit system but to many people, it seemed logical that it would be - even if that logic was wrong. So it made for good PR.

 

And I don't think that impact has been lost over the years, I think it's more that systems are so complex these days that any of them could legitimately claim to be whatever ridiculous number of bits. They probably all have 256 bit parts in them, even the Wii.

 

I personally think HD and supported resolutions has replaced "bits" for marketing purposes, especially given that HD is an easily quantifiable thing that's fairly easy to understand (unlike bits) and which can be seen by any layperson. Of course, that doesn't mean people can't argue about it... whether it matters at all, whether a system can really advertise "1080p" support when 99% of its games run at 720p or 1080i, etc. But this is really the new measure of a system's power, it seems. It's even replaced polygons. Do you know how many polygons the PS3, Xbox 360 or Wii can push? Probably not - but I'll bet you know the highest resolution each one supports.


Waits for the Jaguar fanboys to come tromping thru. :lol:

Hi, sorry I'm late :)

 

Really though, "bits" is completely meaningless thanks to the fact that it never really applied to anything consistent across all consoles. Some companies used the width of their processor's data bus, some used the address bus, some took some number from their graphics chip, and some just added numbers together (I'm talking about SNK here, NOT the Jaguar :P ). And even if everyone DID get together and decide that it was going to be the width of the processor's data bus, it would still be completely meaningless, because all the other parts of a system generally have more to do with its actual processing power. It's like judging the power of a car purely on engine displacement, when obviously there are many other factors involved.

 

So basically, it's stupid, always has been stupid, and always will be stupid. Everyone please stop calling things n-bit. It just serves to make you look ignorant to those that actually know what's going on inside the systems.

 

--Zero


>>>>>Ze_ro said: So basically, it's stupid, always has been stupid, and always will be stupid. Everyone please stop calling things n-bit. It just serves to make you look ignorant to those that actually know what's going on inside the systems.<<<<<

 

So Nintendo and Sega are stupid then? (Made them millions, though.)


What the h--l? There was not a single general purpose processor in the system that was 64 bit. Even the primary GPU was a 32 bit RISC processor! The "64 bit advantage" of the Jaguar was nothing more than Atari marketing hype over a few extra chips related to graphics coprocessing. By that logic, my PC is a 256 bit monstrosity! :roll:

 

From Wikipedia:

  • "Tom" (contains 3 video-related processors), 26.59 MHz
    • Graphics processing unit (GPU) – 32-bit RISC architecture, 4K internal cache, provides wide array of graphic effects
    • 64-bit object processor – programmable; can behave as a variety of graphic architectures
    • 64-bit blitter processor – high speed logic operations, z-buffering and Gouraud shading
    • 64-bit DRAM controller (not a processor)

    [*]"Jerry" , 26.59 MHz

    • Digital Signal Processor – 32-bit RISC architecture, 8k internal cache
    • CD-quality sound (16-bit stereo)
      • Number of sound channels limited by software
      • Two DACs (stereo) convert digital data to analog sound signals
      • Full stereo capabilities

      [*]Wavetable synthesis, FM synthesis, FM Sample synthesis, and AM synthesis

      [*]A clock control block, incorporating timers, and a UART

      [*]Joystick control

    [*]Motorola 68000 (processor #5)

    • General purpose control processor, 13.295 MHz

Despite my previous complaints about people claiming anything is n-bit, Atari DID have valid reasons to refer to the Jaguar as being 64-bit, even if it was mostly a marketing ploy. This is the Jaguar's main graphics processor they're referring to here; it's hardly a trivial part of the system. The object processor and blitter are responsible for pretty much all of the Jaguar's best effects. Just because the graphics don't look as good as the N64's doesn't mean that Atari lied about the specs of its graphics processors. They're a different design, and they can easily be 64-bit even without providing better graphics. And before anyone blames Atari for the whole "quoting numbers from the graphics processor" thing, they were definitely NOT the only company to do it, and they weren't even the first.

 

I was one of the guys who bought a Jaguar when they were new, so this whole thing is a huge pet peeve of mine.

 

And it was just all about marketing; it gave people an easy to understand benchmark for comparison. A 64 bit system was not necessarily twice as good as a 32 bit system but to many people, it seemed logical that it would be - even if that logic was wrong. So it made for good PR.

Bingo. I find it especially funny when people claim that the Atari 2600 is "4-bit" because it's obviously nowhere near as good as an NES. Trying to explain the logistics of a 4-bit processor to people is exhausting.

 

I personally think HD and supported resolutions has replaced "bits" for marketing purposes, especially given that HD is an easily quantifiable thing that's fairly easy to understand (unlike bits) and which can be seen by any layperson. Of course, that doesn't mean people can't argue about it... whether it matters at all, whether a system can really advertise "1080p" support when 99% of its games run at 720p or 1080i, etc. But this is really the new measure of a system's power, it seems.

I don't know about that... I see the HD resolutions are more of a "we support this" kind of thing rather than something that's actually taken as a measurement of processing power. It seems to me that these days there really aren't any indications of processing power being thrown around... it's mostly just relative. That is, "Xbox > Gamecube > PS2", and "PS3 > 360 > Wii" and such, without any actual numbers being bandied back and forth.

 

So Nintendo and Sega are stupid then?

No, in fact I think their marketing departments did a great job with this. Even though I think the whole thing is stupid, it certainly was effective.

 

--Zero


Despite my previous complaints about people claiming anything is n-bit, Atari DID have valid reasons to refer to the Jaguar as being 64-bit, even if it was mostly a marketing ploy. This is the Jaguar's main graphics processor they're referring to here; it's hardly a trivial part of the system. The object processor and blitter are responsible for pretty much all of the Jaguar's best effects. Just because the graphics don't look as good as the N64's doesn't mean that Atari lied about the specs of its graphics processors. They're a different design, and they can easily be 64-bit even without providing better graphics.

Congratulations on proving my point. :roll:

 

The 64 bit object, blitter, and memory controller chips do not a 64 bit system make. As I said, by that logic, my PC is a 256 bit monstrosity. (Some graphics cards even have 512 bit capabilities.) The CPU is the core of the system, and it's what "bitness" usually refers to. It was 32 bit. You can try stretching it and say that the GPU was a general purpose core. It was 32 bit too. The end result is, you can only get 64 bits if you "do the math" (32 + 32 = 64).

 

Of course, we could retcon the definition. But then I get to claim that my Pentium III is a 128 bit chip! Whoohoo! 128 bit power in the palm of my hands! Now you're playing with POWER! Streaming SIMD Extension POWER!

 

And before anyone blames Atari for the whole "quoting numbers from the graphics processor" thing, they were definitely NOT the only company to do it, and they weren't even the first.

For example?


NEC - TurboGrafx 16 - It is on every game box.

Fair enough. It still doesn't make the Jag a 64 bit machine, but at least it (somewhat) excuses their advertising. ;)

 

As I said, though: It's all about the games. The Jag didn't win or lose because of its hardware, it lost because of its games. (Or lack thereof.) The silver lining is that we can now go back to these systems for dirt-cheap prices and evaluate their merits in a market vacuum. Which means that fun can be found even in games that never would have sold a system. :cool:


I personally think HD and supported resolutions has replaced "bits" for marketing purposes, especially given that HD is an easily quantifiable thing that's fairly easy to understand (unlike bits) and which can be seen by any layperson. Of course, that doesn't mean people can't argue about it... whether it matters at all, whether a system can really advertise "1080p" support when 99% of its games run at 720p or 1080i, etc. But this is really the new measure of a system's power, it seems.

I don't know about that... I see the HD resolutions are more of a "we support this" kind of thing rather than something that's actually taken as a measurement of processing power.

 

The only reason *not* to support HD is that the system doesn't have the rendering power to do it. So the implication is always: the lower the resolution, the less power. That may not *always* be true, but it's the same as the whole bit argument - a 32 bit system was not necessarily less powerful than a 64 bit system, but many people thought it was. The same is true of HD vs. non-HD. (I think it's more true of the latter, though.)

 

A lot of people - including Sony themselves - made a big deal of the fact that the PS3 could do 1080p and the Xbox 360 could only do 1080i. That's not just a "we support it" thing, that's saying the PS3 has the power to push around double the amount of data that the Xbox 360 can. (I'm not totally confident in my math, but just as a square with sides twice as long has four times the surface area, I'm pretty sure that doubling just the vertical resolution - leaving the horizontal res intact - gives you double the total amount of data.)
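For what it's worth, the back-of-the-envelope numbers do support that in terms of pixels pushed per second (this is just my own sketch in C, assuming a 60 Hz refresh; per frame the two formats carry the same resolution, as pointed out further down the thread):

```c
#include <stdio.h>

int main(void)
{
    /* Assumptions: 1920x1080 output, 60 Hz refresh.
     * 1080i draws one 540-line field per refresh;
     * 1080p draws the full 1080-line frame every refresh. */
    const long width = 1920, height = 1080, hz = 60;

    long pixels_per_sec_1080i = width * (height / 2) * hz; /*  62,208,000 */
    long pixels_per_sec_1080p = width * height * hz;       /* 124,416,000 */

    printf("1080i: %ld pixels/s\n", pixels_per_sec_1080i);
    printf("1080p: %ld pixels/s\n", pixels_per_sec_1080p);
    return 0;
}
```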

 

Eventually MS released a firmware update that lets the 360 do 1080p also, but it's been problematic in various ways and from what I understand, hardly any games support it. Then again, hardly any games do 1080p on the PS3 either. So there's an argument there.

 

But the argument *is* about power, not just about standards support. High resolutions at high frame rates have always been a measure of power, even when bits were the main marketing point... but now that bits are no longer marketed and HD is gaining in popularity, resolution seems to be the main measure of power cited in both marketing and in online fanboy rants.


That's a really good question. I've never really paid that much attention to the bittage of game systems. For one thing, bittage, while it does matter, isn't the only thing that matters. You can have a high-bit console that pushes numbers rather slowly, and a low-bit console can keep up rather nicely by having a high clock rate on its low-bit processor. There's also memory; lots of memory is usually better. And does the thing have a graphics chip, or does it use the processor for that (which eats processing power)?
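As a quick illustration of that trade-off (the numbers here are completely made up, just to show the arithmetic): a narrow chip with a fast clock can outrun a wider chip with a slow one.

```c
#include <stdio.h>

int main(void)
{
    /* Hypothetical CPUs, invented purely for illustration:
     * the 8-bit part needs two instructions (add, add-with-carry)
     * per 16-bit addition, while the 16-bit part needs only one. */
    const double clock_8bit_hz  = 8e6;  /* 8 MHz */
    const double clock_16bit_hz = 3e6;  /* 3 MHz */

    double adds_per_sec_8bit  = clock_8bit_hz  / 2.0; /* 4,000,000 */
    double adds_per_sec_16bit = clock_16bit_hz / 1.0; /* 3,000,000 */

    printf("8-bit  @ 8 MHz: %.0f 16-bit adds/s\n", adds_per_sec_8bit);
    printf("16-bit @ 3 MHz: %.0f 16-bit adds/s\n", adds_per_sec_16bit);
    return 0;
}
```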

 

Anyhow, if you think Bits is all that matters, look at the last gen. PS2, 128 bit. GameCube, 128 bit. X-Box, 32 bit.

 

I'm not sure if the 360 is 32 bit (I've heard 64, but I think it's 32) and the PS3 is 128. I have no idea about the Wii, but I figure since it's a modified Cube, it's probably the same bittage as the Cube.

 

In the 8 bit days, you had the Atari 2600 and the Coleco, all 8 bit, and the Intellivision, 16 bit. Then back to Nintendo and SMS, 8 bit.

 

Bittage never really made much difference, but it's always been the thing people went by when buying stuff.

 

It's kind of like graphics cards in computers. Saying a card is X bit is kind of moot, since a 2D display can't show any better an image whether you run 32 bit 'graphics' or 10,000 bit graphics. The human eyeball can only process so much itself, and computers are already way beyond that. The bittage of cards now doesn't actually have to do with their graphics output, but with processing. A card may have a 256 bit processor, but it's still putting out 24 or 32 bit graphics.
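To make the output side of that concrete (my own tiny sketch; the ARGB layout is just one common packing, not any particular card's format): "32 bit graphics" simply means 8 bits each of alpha, red, green and blue packed into one pixel, which says nothing about how wide the card's internal processor or buses are.

```c
#include <stdint.h>
#include <stdio.h>

/* Pack 8-bit alpha, red, green and blue channels into one
 * 32-bit framebuffer pixel -- this is all "32-bit graphics"
 * refers to on the display side. */
static uint32_t pack_argb(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)a << 24) | ((uint32_t)r << 16) |
           ((uint32_t)g << 8)  |  (uint32_t)b;
}

int main(void)
{
    /* Opaque orange as a single 32-bit pixel value. */
    printf("0x%08X\n", (unsigned int)pack_argb(0xFF, 0xFF, 0x80, 0x00));
    return 0;
}
```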

 

Anyhow, just a thought


Anyhow, if you think Bits is all that matters, look at the last gen. PS2, 128 bit. GameCube, 128 bit. X-Box, 32 bit.

 

I'm not sure if the 360 is 32 bit (I've heard 64, but I think it's 32) and the PS3 is 128. I have no idea about the Wii, but I figure since it's a modified Cube, it's probably the same bittage as the Cube.

 

The Gamecube, XBox, PS3, 360, and Wii are 32 bit. Only the PS2 has a 128 bit CPU, due to its Digital Signal Processing design. None of which really matters, as they're all capable of churning through 128-bit SIMD instructions at incredible rates.


The Gamecube, XBox, PS3, 360, and Wii are 32 bit. Only the PS2 has a 128 bit CPU, due to its Digital Signal Processing design. None of which really matters, as they're all capable of churning through 128-bit SIMD instructions at incredible rates.

 

Pretty much all of that is incorrect. :ponder:

 

The discussions do crack me up though, keep going. :D


Pretty much all of that is incorrect. :ponder:

Apologies. The 360 and the PPE of the PS3 use the updated, 64-bit Power architecture.

 

I stand by the rest, with the exception that the public intel on the Broadway chip (Wii) may be incorrect. It's understood to be a G3-based processor, but perhaps you know better, CPUWiz?


The only reason *not* to support HD is that the system doesn't have the rendering power to do it. So the implication is always: the lower the resolution, the less power. That may not *always* be true, but it's the same as the whole bit argument - a 32 bit system was not necessarily less powerful than a 64 bit system, but many people thought it was. The same is true of HD vs. non-HD. (I think it's more true of the latter, though.)

 

A lot of people - including Sony themselves - made a big deal of the fact that the PS3 could do 1080p and the Xbox 360 could only do 1080i. That's not just a "we support it" thing, that's saying the PS3 has the power to push around double the amount of data that the Xbox 360 can. (I'm not totally confident in my math, but just as a square with sides twice as long has four times the surface area, I'm pretty sure that doubling just the vertical resolution - leaving the horizontal res intact - gives you double the total amount of data.)

 

Huh? I've been doing 1080i on my xbox for years now.

 

Oh, and the only difference between 1080i and 1080p is that one is interlaced and one is not. The resolution is the same.


I believe the "bit" argument was defunct before it ever began. Doesn't the Intellivision have more horsepower than the 5200 or ColecoVision? Yet they were still released to compete with one another.

 

I think "bits" came around as a selling point after the NES's heyday.

 

In my not so humble opinion, this thread needs to be locked just as badly as any other "bit" thread in which the Jaguar is mentioned.

 

Jaguar + definition of bits + Atariage forums = bad.


I believe the "bit" argument was defunct before it ever began. Doesn't the Intellivision have more horsepower than the 5200 or ColecoVision? Yet they were still released to compete with one another.

At a whopping 984 kilohertz, the Intellivision had significantly less horsepower. It was able to compete because of good graphics hardware. It did have more "bits", though. :)

