
The Bit Wars: Was it all BS?



Hi Everyone,

 

Growing up, I remember a lot of marketing hype concerning a system's "bits". The NES, and pretty much everything else around that time, was an 8-bit machine. But the Genesis and SNES were so much more powerful with their 16 bits! Remember when the Jag came out? "Do the math!" Systems were even named after their respective bits, such as the TurboGrafx-16 and N64.

 

Is it all BS? The Intellivision was 16 bits, but I do not see it outdoing the performance of the NES, 7800 or SMS. The PS3, I believe, is a 32-bit system. It certainly outdoes its bit brothers, the PS1, Sega 32X and Game Boy Advance. So what was the big deal? Did the bit wars really amount to anything after all?


As we transitioned from "8-bit" computers to "16-bit" to "32-bit", nearly every part of the machine doubled in capability. This made each successive generation about twice as powerful per megahertz as the previous generation. In this sense, "bits" was an important way to measure performance at the time.

 

Graphics actually scaled along with the rest of the system: "8-bit" computers usually had "2-bit" graphics, "16-bit" computers had roughly "4-bit" graphics, and "32-bit" computers started with roughly "8-bit" graphics*. This may seem like a good thing (the graphics certainly looked richer), but it actually cancelled out the advantages conferred by having "more bits." See, if a pixel has twice as many bits, it takes twice as long to process :). This was less of an issue for game consoles than home computers, since 2D** game consoles didn't usually process pixels in isolation.
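
To put a rough number on that "twice the bits, twice the work" point, here's a toy C calculation. The 320x200 resolution is just an illustrative assumption, not any particular machine:

```c
#include <stdio.h>

/* Toy illustration: the bytes a CPU must touch to repaint a full frame
 * grow linearly with bits per pixel, so doubling the pixel depth
 * doubles the work. The 320x200 resolution is an arbitrary example. */
int main(void) {
    const int width = 320, height = 200;
    const int depths[] = { 2, 4, 8 };   /* rough "8-bit", "16-bit", "32-bit" era pixel depths */
    for (int i = 0; i < 3; i++) {
        long bytes = (long)width * height * depths[i] / 8;
        printf("%d bpp -> %ld bytes per frame\n", depths[i], bytes);
    }
    return 0;
}
```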

 

But once we hit "32-bit" there was no obvious advantage to scaling up every aspect of the machine one more time. A "32-bit" machine can address only 4GB of memory, but no game system has ever had even half as much memory. A "32-bit" machine can do math on numbers that can have only about four billion unique values, but in games we don't need more precision than that. Upping the whole system to "64-bits" would have provided no practical advantages and several disadvantages, such as wasting memory on big integers and pointers. So, as of 2011, nobody has made a wholly "64-bit" game system.
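
For anyone who wants the arithmetic behind those figures, it's nothing console-specific, just powers of two; a quick sketch:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* A 32-bit pointer can select one of 2^32 byte addresses. */
    uint64_t addressable = 1ULL << 32;
    printf("32-bit address space: %llu bytes (4 GiB)\n",
           (unsigned long long)addressable);

    /* A 32-bit integer likewise has 2^32 distinct values, about 4.3 billion. */
    printf("largest unsigned 32-bit value: %llu\n",
           (unsigned long long)UINT32_MAX);
    return 0;
}
```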

 

Machines that we know as "64-bit" had a few "64-bit" parts but remained largely "32-bit". The first truly all-around "64-bit" game machines are recent PCs, though a recent GPU still isn't completely "64-bit."

 

A recent GPU has 32-bit pointers and 32-bit integers, but its registers are 1024-bit or 2048-bit and its pixels can be up to 128-bit. It wouldn't make sense to pick any of those numbers and claim that the GPU "has that many bits."
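
To make the "128-bit pixel" part concrete, one common way to get there is four 32-bit floating-point channels. This is just an illustrative layout, not a claim about any specific GPU:

```c
#include <stdio.h>
#include <stdint.h>

/* One way a pixel gets to 128 bits: four 32-bit float channels. */
typedef struct { float r, g, b, a; } pixel_rgba128f;

/* By contrast, a classic "truecolor" pixel is 32 bits total. */
typedef struct { uint8_t r, g, b, a; } pixel_rgba32;

int main(void) {
    printf("float RGBA pixel: %zu bits\n", sizeof(pixel_rgba128f) * 8);
    printf("byte  RGBA pixel: %zu bits\n", sizeof(pixel_rgba32) * 8);
    return 0;
}
```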

 

*yes, there are exceptions.

 

**hoo boy, "2D" console vs. "3D" console. that can be as ambiguous as "32-bit" vs. "64-bit."


Think about the current change among Windows-based PCs. I bought a Win7 machine a while back and the OS is 64-bit. Everything from about Win98 up was a 32-bit OS (the FAT32 file system of that era, incidentally, takes its name from its 32-bit allocation table entries rather than from the OS).

 

Bits do not automatically equate to superior performance. The programs have to be written to take full advantage of the improved data transfer capabilities.

 

I was told more than once to think of bits as the number of lanes on a highway. Obviously, a 16-bit operating system (or game console) can transfer twice the data of an 8-bit one in the same amount of time. Think of the superhighways around some major cities: it is apparent that 6 lanes in a single direction can move more passengers (data bits) than 2 lanes.
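
Here's a minimal C sketch of that lanes analogy: moving the same 64 KB either one byte per step or eight bytes per step. The buffer size is arbitrary; the point is that the wide version makes one-eighth as many trips, and only because the code was written to use the wider path.

```c
#include <stdint.h>
#include <stddef.h>

#define SIZE 65536                       /* arbitrary 64 KB buffer */

static uint64_t src[SIZE / 8], dst[SIZE / 8];

/* One "lane" at a time: 65,536 trips across the bus. */
void copy_narrow(void) {
    const uint8_t *s = (const uint8_t *)src;
    uint8_t *d = (uint8_t *)dst;
    for (size_t i = 0; i < SIZE; i++)
        d[i] = s[i];
}

/* Eight "lanes" at a time: 8,192 trips for the same data. */
void copy_wide(void) {
    for (size_t i = 0; i < SIZE / 8; i++)
        dst[i] = src[i];
}

int main(void) {
    copy_narrow();
    copy_wide();
    return 0;
}
```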

 

Intellivision and Genesis were both 16-bit machines, but the programming on one of them obviously took far better advantage of those 16 bits. Add to this comparison the fact that processor chips, system RAM and VRAM play very important roles in overall performance, and there are many reasons for the observed differences between the 16-bit game systems.


It's more accurate to say that Intellivision was an 8-bit machine, because its overall capabilities were more comparable to 8-bit machines. Intellivision's CPU had 16-bit registers, but Genesis' CPU had 32-bit registers and we don't call it a 32-bit system either.

 

The most striking difference between Intellivision and Genesis would be the graphics processors: Intellivision had 1-bit+ pixels, while Genesis had 4-bit+.*

 

You could have swapped the GPUs in an Intellivision and Genesis and every Intellivision game would have looked as good or better, while not one Genesis game would have looked as good.

 

*If you count "tile" bits as fractional because they spread out over a tile's worth of pixels...


The bit wars are a crock.

 

Case in point:

 

C64 uses a 6502

Atari 2600 uses a 6502

PC Engine uses a 6502

 

PC Engine games are way better than Atari 2600 and C64 games.

 

Music-wise, maybe you like the SID more

RPG-wise, it's up to preference (Western vs. Japanese)

 

but in visuals/action/everything else it destroys the C64...

 

even though they are both 8-bit machines with the same CPU architecture.

 

Yeah, the PCE's CPU is a bit of a coked-out 6502, but still! 8 bits.

 

The whole thing was a frigging joke. Bits don't mean anything.

 

Jaguar was 64 bits, and games on the frigging Genesis were way cooler/funner/better.


The bit wars are a crock.

 

They make sense only when one focuses on the GPU, where bits-per-pixel in games is a reasonable proxy for "bits in the system" in marketing literature:

 

1-2 bits: "8-bit" (2600, intellivision, colecovision, 5200, nes)

4-8 bits: "16-bit" (genesis, snes, turbografx-16)

16-32 bits: "32-bit" (ps1, n64, dreamcast, ps2, game cube, xbox, wii)


"Bits" do have meaning, but that meaning was not translated well by the marketing guys. One reason, possibly the main reason, is that it applies to so many components of the computer: CPU, buss, memory, graphics, ... and they don't have to be the same.

 

You could only really call a particular device a "16 bit machine", for example, if every component operated at 16 bits, and very few were that simple.

 

It's analogous to comparing motor vehicles' performance by measuring the fuel burn rate, when the driver's experience really depends on many factors such as horsepower, weight and weight distribution, tires, weather, terrain, and especially the purpose to which the driver is putting the vehicle. Plowing a field is much different than winning at Daytona.


IIRC, Genesis claimed to be 16-bit but was 32 (68000), Jaguar claimed to be 64 bit but was also 32 (68000), Dreamcast claimed to be 128 bit but was 32? SNES actually was 16-bit (65816). TurboGraFX/16 claimed to be 16-bit and was 8-bit (custom 65C02). So yeah, I'd say a lot of it was probably bull.

 

Not really true.

 

The 68k is a 16-bit CPU that is 32-bit internally, hence the Atari ST: the "ST" stood for Sixteen/Thirty-two.

 

So the Genesis/Megadrive is 16-bit exactly as Sega advertised it.

 

In the Jaguar the 68k is just a boot chip; all it was meant to do was menial tasks like reading the joypads and assisting with math. The Jaguar has a custom chipset with no conventional CPU, but the GPU, which is the chip that does the most work in the machine, is 64-bit, and the data bus is 64-bit too.

 

Like you say, the Turbografx 16 was actually 8-bit but looked 16-bit because it had a 16-bit graphics chip, and the same can be said for the Lynx, which had the same setup.


and this is why the bit wars were stupid.

 

Everyone just picked the best number they could find and went IT'S GOT THIS.

 

Games are more important than bits. Bally Astrocade! WOO

 

That's why people stopped talking about bits a long time ago. Nowadays they talk about "generations", which is a better term because it clearly relates to marketing and has no technical basis.

 

Every old system had all kinds of "bits" that didn't match (the Atari 800 had an 8-bit data bus, a 16-bit address bus, and <3 bits per pixel at 160 wide), and so the argument that every part of a system must be at least N bits or else the whole system wasn't N bits has always been crazy. You could always find a part with fewer bits somewhere in the system. The Amiga's sound was 8 bits per sample, its graphics were 6 bits per pixel, and its CPU registers were 32 bits, for example. Was the Amiga a "16-bit computer" anyway? By any non-crazy measure, yes it was.

 

The Turbo Grafx-16 is a great example of a 16-bit system with 8-bit parts, in this case an 8-bit CPU. Graphics is what people really care about, and that's why the TG16 with its 16-bit GPU competed with the SNES and Genesis, not with the NES or Atari 2600, which had the same 8-bit CPU. It never really mattered how many bits the CPU "had."


To me "bits" represents the CPU's internal data format. Hence, 68000 is "32-bit" (ask an Amiga fanboy and he'll prolly give you the same explanation), and any system using a 68000 is (as far as I'm concerned) also "32-bit".

 

This is what I've always thought, too. But everyone is correct that bits don't make the system. Well, they do, but it takes the whole system (all its parts) to make that system's performance what it is.


Even as a naive 12- or 13-year-old kid, I started to strongly suspect that the "bits" thing was BS. Reason being, the real-world “power” of the various consoles seemed to have, at best, a very loose correlation to “bit rating”.

 

As it turns out, even after living through all console generations from VCS to modern day, and trying to educate myself on the basic technical details along the way, I still don’t have a definitive answer for what “bit rating” means in the context of game consoles. The closest thing I have to a definitive answer is more like a non-answer. Namely, that bit ratings are nothing but marketing ploys—disingenuous, reductive, and almost arbitrary.

 

Really, it doesn’t matter if you’re talking game systems, computers, audio gear, recording gear, etc., the underlying marketing principle is the same: higher numbers beget the perception of superiority.

 

It’s amazing how willing the common consumer is to swallow these numbers whole without ever bothering to gain even a basic understanding of what the specs actually mean or what benefit they provide in the real world.

 

Graphics is what people really care about, and that's why the TG16 with its 16-bit GPU competed with the SNES and Genesis, not with the NES or Atari 2600, which had the same 8-bit CPU.

 

Exactly. This overly simplified view that graphical capability and overall system capability are one and the same is probably what allowed the marketing guys to perpetuate the "bits" bullshit for so long. The whole thing quickly starts to lose significance as soon as one realizes that various systems which ostensibly have the same "number of bits" appear to have capabilities that are markedly different.


Exactly. This overly simplified view that graphical capability and overall system capability are one and the same is probably what allowed the marketing guys to perpetuate the "bits" bullshit for so long.

 

Though you're right that "bits" didn't refer to any specific quantity, the term wasn't meaningless. If anyone had tried to market a "16-bit" game system with three-color sprites, people would have laughed. There was a quantum leap between three-color sprites and fifteen-color sprites, where suddenly you get multiple shades for skin, hair, and clothing. Games went from looking like a coloring book to looking like a comic book. Everyone could feel that when the "16-bit" systems arrived with their fifteen-color sprites.
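
For what it's worth, the three-vs-fifteen numbers fall straight out of the pixel depth, assuming one palette entry is reserved for transparency:

```c
#include <stdio.h>

/* An n-bit-per-pixel sprite indexes 2^n palette entries; one entry is
 * typically reserved for "transparent", leaving 2^n - 1 visible colors. */
int main(void) {
    for (int bpp = 2; bpp <= 4; bpp++)
        printf("%d bpp -> %d palette entries -> %d colors + transparency\n",
               bpp, 1 << bpp, (1 << bpp) - 1);
    return 0;
}
```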

 

People were willing to pay for that quantum leap, and if the marketers had said "fifteen colors!" instead of "16-bit," Atari could have marketed its 2600 as "128 colors!", which misses the point entirely!

 

Later, when the next generation after "16-bit" started rolling in, there was confusion about the next quantum leap: would it be truecolor sprites, CD video or textured triangles? There was a period between SNES and PS1 when consoles tried all three. In the end, the market decided against truecolor sprites. You look at 2D games on Jaguar vs. Genesis and the lack of colors on Genesis doesn't turn you off, if you're like most people. People didn't like CD video either, because it wasn't pretty like a movie OR interactive like a game.

 

So the next quantum leap became textured triangles, and by that time the marketers had found the term "3D", so the talk about bits began to die. Nintendo and Jaguar kept talking 64 bits, but nobody knew what they were talking about, since one looked worse than PS1 and the other looked better.

 

So aside from a brief period when NES and SNES were on the market, "bits" was pretty useless and confusing. Don't know why we'd talk about it today! ;)


...It’s amazing how willing the common consumer is to swallow these numbers whole without ever bothering to gain even a basic understanding of what the specs actually mean or what benefit they provide in the real world.

Sometimes Marketing gets it right: Microsoft successfully doubled the power of Windows 3.2 by taking advantage of hardware that was THOUSANDS of times faster when they introduced Windows 7.

They make sense only when one focuses on the GPU, where bits-per-pixel in games is a reasonable proxy for "bits in the system" in marketing literature:

 

1-2 bits: "8-bit" (2600, intellivision, colecovision, 5200, nes)

 

Sega Master System was 8-bit and had 4bpp tiles ^^

 

To ask the question... have any 64-bit games appeared on ANY current-gen system (incl. Mac or PC)?

 

I believe the PS3 and 360 both have 64-bit CPUs


They make sense only when one focuses on the GPU, where bits-per-pixel in games is a reasonable proxy for "bits in the system" in marketing literature:

1-2 bits: "8-bit" (2600, intellivision, colecovision, 5200, nes)

Sega Master System was 8-bit and had 4bpp tiles ^^

 

Read: "reasonable proxy." After all, this is a marketing term. "SMS was 8-bit" is focusing on CPU register width, which itself was never important to game marketers or their market. People wanted the 15-color sprites and "16-bit" was a catchphrase for that. Was SMS 16-bit? I don't remember how it was marketed but I think they could have gotten away with it (Turbo Grafx-16 did.) The SMS hardware palette was a little cartoony compared to the others. Not too many flesh tones to choose from.

 

To ask the question... have any 64-bit games appeared on ANY current-gen system (incl. Mac or PC)?

I believe the PS3 and 360 both have 64-bit CPUs

 

The games use the 32-bit and 128-bit registers, but not the 64-bit ones. It's safe to say that the CPUs are capable of 64-bit, should anyone ever want to go there.


Exactly. This overly simplified view that graphical capability and overall system capability are one and the same is probably what allowed the marketing guys to perpetuate the "bits" bullshit for so long.

 

Though you're right that "bits" didn't refer to any specific quantity, the term wasn't meaningless. If anyone had tried to market a "16-bit" game system with three-color sprites, people would have laughed. There was a quantum leap between three-color sprites and fifteen-color sprites, where suddenly you get multiple shades for skin, hair, and clothing. Games went from looking like a coloring book to looking like a comic book. Everyone could feel that when the "16-bit" systems arrived with their fifteen-color sprites.

 

Maybe my memory is failing me, but honestly, I don't remember any talk whatsoever about bits until the Genesis emerged, emblazoned with the iconic "16-bit" tag on its chassis. Sure, maybe tech-geeks talked about such things during the NES era and earlier, but it wasn't spoken about at all by the layperson until the Genny/TG16/SNES era.

 

Regardless, I think we're kind of saying the same thing here. I agree that the ruse could only be stretched so far before people would laugh at it. My only point was that so long as the average consumer was seeing "better graphics" from one console to the next, the companies were able to get away with touting ambiguous system specs (in this case "bits" of unclear context) to their advantage in marketing, and nobody really questioned it (my system has 16 BITS and yours only has 8, na na na na). But then, after the "16-bit" era, the number of "bits" wasn't as hot a topic in marketing anymore, probably because the companies couldn't find a deceptively simple way of making the "bit count" correlate to performance. As has been mentioned, Atari rode the bit-wagon right into the rocks with their embarrassing Jaguar ad campaign, all for a system which pretty much sucked the high hard one (at least, to everyone except those rare video game completist types that have an irrational fascination with the underdog) despite the fact that it was sporting way more "bits" than the competition.

 

Sometimes Marketing gets it right: Microsoft successfully doubled the power of Windows 3.2 by taking advantage of hardware that was THOUSANDS of times faster when they introduced Windows 7.

 

Ha. Well, look at it this way: 7 is numerically a little more than double 3.2, so applying the prevailing logic of the 8-to-16 bit transitional era, it makes perfect sense that the power would double. :)


Bits are only important when dealing with PCs/Macs. Bit count for graphics is important. No photographer would want to work with a video card that can only display 8-bit color, because an image wouldn't look sharp due to color dithering. You need 24-bit color depth to work with photos.

[Images: the same photo at 8-bit color depth (visible dithering) vs. 24-bit color depth (truecolor)]

Also, when installing memory it is important to know how many bits your Windows version is, because installing 8 GB under a 32-bit version of Windows is useless: a 32-bit operating system can only address 4 GB of memory max.

 

But with consoles, it's all about the games. If Sony or Microsoft had released their current console with only the original Pong installed on it, and no option to play other games, you would never buy the system. You would be happier with a 2600 and would say it has better games than the new systems.
