
How many bits is a 2600?



8 bit all the way. In fact - both the Atari 2600 and the NES use 6502-family processors. The Atari's is just a cost-reduced version (the 6507) with the same core inside - it just lacks the upper three address pins (and the interrupt pins). So the 6507 can only address 8K of total memory, and the cartridge slot gets a 4K window of that space. The rest is used for the Atari's internal RAM (128 bytes), the TIA, the RIOT, etc.
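
If you want to see what losing those pins means in practice, here's a rough C sketch of the 6507's address decoding. The masks follow the standard 2600 memory map, but this is just an illustration I put together, not emulator-accurate chip-select logic:

    #include <stdint.h>
    #include <stdio.h>

    /* 13 address pins -> everything above $1FFF simply doesn't exist */
    #define ADDR_MASK 0x1FFF

    static const char *region(uint16_t cpu_addr)
    {
        uint16_t a = cpu_addr & ADDR_MASK;   /* missing pins = forced mirroring */
        if (a & 0x1000)    return "cartridge ROM (4K window)";
        if (!(a & 0x0080)) return "TIA registers";
        if (!(a & 0x0200)) return "RIOT RAM (128 bytes)";
        return "RIOT I/O and timer";
    }

    int main(void)
    {
        printf("$F000 -> %s\n", region(0xF000)); /* mirrors into the 4K cart window */
        printf("$0080 -> %s\n", region(0x0080)); /* zero-page RAM */
        return 0;
    }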

 

-Ian

Kinda/sorta 16-bit. Officially it's a 16-bit CPU, but the cartridge hardware is only 10 bits wide...

 

The registers in the CPU are 16 bits wide, but the opcodes and (most) carts are only 10 bits wide.

So? The 8088 has an 8-bit bus, but everyone still thinks of it as a 16-bit CPU.

 

The CP1610 will happily execute from 16-bit-wide ROM, make use of 16-bit immediates and 16-bit displacements on branch instructions, and read/write 16-bit data in 16-bit-wide RAM. The 10-bit opcodes are a clever hack, though, to save cost on mass-produced ROMs. Space Patrol actually uses a 16-bit-wide ROM and makes use of the handful of extra cycles it gives you by not having to read 16-bit data from two consecutive ROM locations. (I needed every cycle I could get.) Arnauld Chevallier's games also use 16-bit-wide ROM. There's no advantage to restricting ourselves to 10 bits anyway.
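
To make the ROM-width point concrete, here's a little C toy model of the difference - pulling a 16-bit constant out of narrow ROM takes two consecutive fetches with 8 useful bits each, while 16-bit-wide ROM hands it over in one. The function names and array contents are made up for illustration:

    #include <stdint.h>
    #include <stdio.h>

    /* SDBD-style: two consecutive narrow locations, 8 useful bits each */
    static uint16_t fetch_narrow(const uint16_t *rom, int *pc)
    {
        uint16_t lo = rom[(*pc)++] & 0xFF;
        uint16_t hi = rom[(*pc)++] & 0xFF;
        return (uint16_t)((hi << 8) | lo);
    }

    /* 16-bit-wide ROM: the whole constant arrives in a single fetch */
    static uint16_t fetch_wide(const uint16_t *rom, int *pc)
    {
        return rom[(*pc)++];
    }

    int main(void)
    {
        uint16_t rom10[] = { 0x34, 0x12 };  /* $1234 split across two locations */
        uint16_t rom16[] = { 0x1234 };
        int pc = 0;
        printf("narrow ROM: $%04X in 2 fetches\n", fetch_narrow(rom10, &pc));
        pc = 0;
        printf("wide ROM:   $%04X in 1 fetch\n", fetch_wide(rom16, &pc));
        return 0;
    }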

 

The Atari 2600, though, is thoroughly 8 bit, as is the NES. The faster instruction rate of the 6502 / 6507 for register-register ops makes up for the narrower ALU width much of the time. The dearth of registers (i.e., relying heavily on the zero page in lieu of actual registers) brings the instruction rate back down for complex code, though. *shrug*

Edited by intvnut

With a tool or weapon you can make lots of bits. Enough to rival modern systems. But modern systems are sometimes larger than the 2600, so this is not 100% accurate.

 

 

"A cashier said that I had to present two pieces of identification. So I broke it in half and said, 'I can make as many pieces as you need'".

The NES is 8-bit, but does anyone ever associate bit counts with a 2600?

 

 

8-bit, just like Colecovision, NES, 5200, 7800, XEGS, Sega Master System, etc.

 

Processing power isn't really what differentiates them in a meaningful way; the real differences are in graphics architecture and available memory.

  • 6 months later...

I have seen so many people call the Atari 4-bit or 1-bit, but those kinds of processors couldn't do anything compared to the 8-bit processor that the Atari has. Speaking of 1-bit, how many bits is the old Pong machine? I have heard so many people say that it is 1-bit, but I think that's not right.


Pong doesn't have a CPU, so it's hard to answer that.

I believe that the whole bit-counting started when the consoles with 16-bit CPUs showed up. I guess they emphasized the term "16-bit" a lot, and they also had to emphasize that the inferior products were just "8-bit" systems. It was pure marketing. Bits are everywhere in a computer. Memory can be counted in bits, the address bus has a number of bits, as does the data bus. Many computers and consoles have multiple processors in them, each with different specifications. With all of this, it's hard to tell how many bits a system actually has. Companies will use whatever sounds best.

 

Before the 16-bit consoles we didn't assign bits to consoles; that's why the 2600 is not known as an "X-bit console". The NES and the SMS, for example, were labeled as 8-bit consoles because they were still pretty much alive when the 16-bit era started. So when someone nowadays tries to label the 2600, they assume that since it came before the NES, it must have fewer bits.

 

Before the 8-bit CPUs there were in fact 4-bit CPUs, but they were very limited and mostly used in calculators and similar devices; they wouldn't have done a very good job controlling a complex system with video, audio, and the like. They never caught on in computers or consoles because they were too limited for that kind of work.

 

So yeah, the CPU on the 2600 is pretty much the same as the one in the NES. The NES CPU runs about 50% faster than the 2600's, but what makes the real difference is the video chip. The TIA needs constant CPU attention while rendering the screen in order to show anything meaningful, so there is very little free time left to process game logic. The NES has enough video memory for a whole frame, meaning that it can generate a full video frame without any CPU intervention. Video changes must be made during vertical blank, but the CPU is otherwise free for game logic while the screen is being rendered by the PPU (Picture Processing Unit).
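
The numbers behind that paragraph are easy to check. Here's a small C snippet that works them out from the well-known NTSC clock rates (the vblank figure assumes the usual ~20-scanline window):

    #include <stdio.h>

    int main(void)
    {
        double a2600_cpu = 3.579545e6 / 3.0;  /* 6507: ~1.19 MHz (NTSC) */
        double nes_cpu   = 1.789773e6;        /* NES 2A03: ~1.79 MHz */

        /* "about 50% faster" */
        printf("NES/2600 CPU speed ratio: %.2f\n", nes_cpu / a2600_cpu);

        /* 2600: 228 color clocks per scanline, 3 per CPU cycle ->
           just 76 cycles per line, mostly spent feeding the TIA */
        printf("2600 CPU cycles per scanline: %d\n", 228 / 3);

        /* NES: the PPU draws the frame on its own; VRAM updates only
           have to fit in vblank (~20 lines of 341 PPU dots, 3 dots
           per CPU cycle) */
        printf("NES vblank budget: ~%.0f CPU cycles\n", 20 * 341 / 3.0);
        return 0;
    }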

I believe that the whole bit-counting started when the consoles with 16-bit CPUs showed up.

I agree, though I think it probably started with 16-bit computers before being applied to consoles.

 

Marketing people needed a concise way to explain the benefit of a newer system that primarily operates on 16 bits at a time, so they started calling the machines "16 bit". They couldn't convey the advantage just by talking about how many MHz the processor ran at.

 

The term became increasingly fuzzy as time went on and architectures got more complex. By the time of "64 bit" it had become meaningless.

 

I saw some kid refer to modern game consoles as "512 bit". Who knows what the hell they're talking about with that nonsense now.

Edited by gdement

I'm still trying to figure out how many licks it takes to get to the tootsie roll center of a tootsie pop.

I thought that smart-ass owl cleared that up years ago.

 

Maybe we should get a second opinion. Youtube video forthcoming...

The TIA needs constant CPU attention while rendering the screen in order to show anything meaningful [...] The NES has enough video memory for a whole frame, meaning that it can generate a full video frame without any CPU intervention.

 

For those of you who collect computers, this is the same thing as comparing a Commodore VIC-20 to a Timex Sinclair 1000.

 

The VIC-20 has a powerful video chip that does the graphics processing and rendering separately from the CPU.

 

The TS-1000 is set up without such a chip and the screen rendering is done by the CPU.

 

Run a BASIC program on both and the VIC will run circles around the TS-1000 even though the TS-1000's clock speed is over 3x that of the VIC-20.

  • 10 years later...
On 9/15/2009 at 9:17 PM, tokumaru said:

Before the 8-bit CPUs there were in fact 4-bit CPUs, but they were very limited and mostly used in calculators and similar devices [...]

Sorry for replying 11 years later. The only "console" I can think of that actually had a 4-bit CPU was the Nintendo Game & Watch series.
