
Atari 8bit is superior to the ST


Marius

Atari 8bit is superior to the ST  

211 members have voted

  1. Do you agree?

    • Yes; Atari 8bit is superior to ST in all ways
    • Yes; Atari 8bit is superior to ST in most ways
    • NO; Atari ST is superior to 8bit in all ways
    • NO; Atari ST is superior to 8bit in most ways
    • NO; Both systems are cool on their own.


Recommended Posts

Yes, even the Pentium is deterministic if you know when the CPU throttles, what state the cache is in, what speed the memory is, etc.

 

No, it isn't (assuming an original Pentium 1).

For instance, you'd also need to know the state of the branch target buffer (used to predict the outcome of branch instructions), AND the seed used for the replacement algorithm, because entries in the BTB are "randomly" replaced rather than least recently used.

 

And as soon as we get past the Pentium MMX, we get into micro-ops, register renaming, out-of-order execution, etc.


Atari 8-bit did have backward compatibility (more so than the PC)

 

No, they didn't. The XL and then the XE line were, for the most part, the same machine with different plastic around it. The XL brought us the PBI (but dropped two of the four joystick ports and the RAM/ROM modules, and changed the OS!), and the XE brought us 128K. Very little changed in the essential guts of the machine, and in fact some of the changes broke backward compatibility. When they went 16-bit with the ST, they didn't support backward compatibility at all (it was a "rip and replace" upgrade).

 

On the PC though (and I was a PC user from the 8086 until today), you had faster processors, better graphics cards, bigger hard drives, joystick ports, better sound cards, and all the while you could still run the old Lotus 1-2-3 spreadsheet you had paid $$ for. It's only recently become difficult to do backward compatibility because the processors are now too fast, but the silver lining is that virtual computing can now be done, and we can emulate or virtualize almost any other system and OS that we want (now that is backward compatibility!) :)


Saying it's actually 3bpp in real-life usage is like saying your car has 8 cylinders but the manufacturer restricted it to 5. That's the point, not whether somewhere internally those 2bpp from the PF and 1 MSB from the char index get munged together into 3bpp, because if that were the end of the story that mode would be 8 colours, not 4/5.

 

In RAM, each pixel is represented by two bits, with one of the combinations (11) being modified by the MSB of the character code, which then changes every instance of the "11" pixel in that 4x8 block to use a different color register. I wouldn't call that 3bpp either, and I think it's closer to what Andy mentioned, a "poor man's" C64 attribute system.
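Since this keeps coming up, here's a minimal C sketch of that lookup as I understand it (the register names BAK/PF0..PF3 are the usual ones; the example byte values are mine, purely for illustration):

```c
#include <stdio.h>
#include <stdint.h>

/* Colour-register indices as discussed above. */
enum { BAK, PF0, PF1, PF2, PF3 };

/* Decode one bitmap byte (one row of a 4x8 ANTIC mode 4 cell).
 * char_code is the full screen-RAM byte; its MSB (bit 7) redirects
 * every "11" bit pair in the cell from PF2 to PF3.
 * out[0..3] receives a colour-register index per 2-bit pixel. */
static void decode_mode4_row(uint8_t bitmap, uint8_t char_code, int out[4])
{
    for (int px = 0; px < 4; px++) {
        int pair = (bitmap >> (6 - 2 * px)) & 3;   /* leftmost pixel first */
        switch (pair) {
            case 0: out[px] = BAK; break;
            case 1: out[px] = PF0; break;
            case 2: out[px] = PF1; break;
            case 3: out[px] = (char_code & 0x80) ? PF3 : PF2; break;
        }
    }
}

int main(void)
{
    static const char *names[] = { "BAK", "PF0", "PF1", "PF2", "PF3" };
    int row[4];
    decode_mode4_row(0x1B, 0x00, row);   /* bit pairs 00 01 10 11, MSB clear */
    for (int i = 0; i < 4; i++) printf("%s ", names[row[i]]);
    printf("\n");
    decode_mode4_row(0x1B, 0x80, row);   /* same row, MSB set: 11 -> PF3 */
    for (int i = 0; i < 4; i++) printf("%s ", names[row[i]]);
    printf("\n");
    return 0;
}
```

Note the MSB changes nothing about pixels 00/01/10; it only renames where the 11 pair points, which is why it feels like an attribute rather than a third bitplane.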


Saying it's actually 3bpp in real-life usage is like saying your car has 8 cylinders but the manufacturer restricted it to 5. That's the point, not whether somewhere internally those 2bpp from the PF and 1 MSB from the char index get munged together into 3bpp, because if that were the end of the story that mode would be 8 colours, not 4/5.

 

In RAM, each pixel is represented by two bits, with one of the combinations (11) being modified by the MSB of the character code, which then changes every instance of the "11" pixel in that 4x8 block to use a different color register. I wouldn't call that 3bpp either, and I think it's closer to what Andy mentioned, a "poor man's" C64 attribute system.

 

It not only kind of sucks that it only works on one bit pair, but it's a bit nasty being limited to 128 chars per char bank, which means splitting the screen and wasting RAM repeating graphics in more than one charset. All round sucky, but kind of necessary for anything I've been looking at porting.

 

 

Pete


No, they didn't. The XL and then the XE line were, for the most part, the same machine with different plastic around it. The XL brought us the PBI (but dropped two of the four joystick ports and the RAM/ROM modules, and changed the OS!), and the XE brought us 128K. Very little changed in the essential guts of the machine, and in fact some of the changes broke backward compatibility. When they went 16-bit with the ST, they didn't support backward compatibility at all (it was a "rip and replace" upgrade).

 

On the PC though (and I was a PC user from the 8086 until today), you had faster processors, better graphics cards, bigger hard drives, joystick ports, better sound cards, and all the while you could still run the old Lotus 1-2-3 spreadsheet you had paid $$ for. It's only recently become difficult to do backward compatibility because the processors are now too fast, but the silver lining is that virtual computing can now be done, and we can emulate or virtualize almost any other system and OS that we want (now that is backward compatibility!) :)

 

There is another problem that PCs largely eliminate. Not only does upgrading a PC make your existing software run better (usually...), PC games aren't coded to the lowest-common-denominator machine. Typical A8 configurations could be anywhere from 16K to 128K of RAM, and the XEs could additionally point the CPU and ANTIC at different banks of memory. A8 games early in the lifecycle targeted the 16K 400, and later on usually didn't assume more than 48K available. Even the 16K under ROM on the 800XL wasn't a frequent target. Our C-128 friends saw most things targeting the C-64; IIgs users often had to be content with II stuff; ST games tended to presume a 512K 520ST, with the blitter and any additional RAM and features on later machines disregarded. I suspect most Amiga titles presumed an A500.

 

Once we hit the late eighties, PC publishers started assuming a state of rolling upgrade. Game targets tend to skew towards the current high end when development starts, since they figure most gamers will have upgraded by the time of release. Once the 320x240x256 square-pixel mode, OPL2/3 or Sound Blaster, and a true 32-bit processor were in most every machine and we started seeing titles like DOOM, the practice of coding games to high-spec machines took hold and has continued ever since.

 

Apple had an opportunity to get that process started, since most II-series machines had slots and the architecture was mostly open. But even they tended to shy away from incremental upgrades, and by the time we had the IIGS it really didn't matter anymore.


There is another problem that PCs largely eliminate. Not only does upgrading a PC make your existing software run better (usually...), PC games aren't coded to the lowest-common-denominator machine.

 

I don't know about now, because I gave up on PC gaming when the consoles started satisfying me - somewhere around the Sega Dreamcast era. But I remember when I got my first VGA setup, lots of PC games were still EGA. Seems like for a while, the Adlib and original Sound Blaster (8-bit) were still being targeted after I'd upgraded to an SB-16 and then whatever "wavetable synth" card (can't remember now) I got after that. Later, when I got my first setup that would run 1600 x 1200 (and I don't remember what it was), it seemed that not a lot of stuff used it - probably just as well, because it would have been slow anyway.

What finally caused me to quit was (as mentioned previously) I put in Sega Rally (PC) and it wouldn't run on my new system, because its check for DirectX couldn't recognize the later version. So they don't always make your existing software run better. Some games would even try to overwrite your newer DirectX with the older one that was on the installation disc. Perhaps things have improved during the interim, but back in the 3Dfx/PowerVR era, it seemed you could buy a new whiz-bang video card and the game would still be pitched at the masses, not you.

 

Once we hit the late eighties, PC publishers started assuming a state of rolling upgrade. Game targets tend to skew towards the current high end when development starts, since they figure most gamers will have upgraded by the time of release. Once the 320x240x256 square-pixel mode, OPL2/3 or Sound Blaster, and a true 32-bit processor were in most every machine and we started seeing titles like DOOM, the practice of coding games to high-spec machines took hold and has continued ever since.

I guess since consoles continually gain game market share (at least according to news blurbs I seem to remember), the PC gaming market has shrunk comparatively, and a larger percentage of what's left are "hardcore" PC gamers with expensive, late-model equipment. So perhaps these days high-spec machines make up a larger share of the remaining PC gamers, and things may be different now.


There is another problem that PCs largely eliminate. Not only does upgrading a PC make your existing software run better (usually...), PC games aren't coded to the lowest-common-denominator machine. Typical A8 configurations could be anywhere from 16K to 128K of RAM, and the XEs could additionally point the CPU and ANTIC at different banks of memory. A8 games early in the lifecycle targeted the 16K 400, and later on usually didn't assume more than 48K available. Even the 16K under ROM on the 800XL wasn't a frequent target. Our C-128 friends saw most things targeting the C-64; IIgs users often had to be content with II stuff; ST games tended to presume a 512K 520ST, with the blitter and any additional RAM and features on later machines disregarded. I suspect most Amiga titles presumed an A500.

I agree that devs tended to support the lowest common denominator, but that was more of an issue on 8-bits than on 16-bit machines.

Many Amiga titles may have been written so they *could* run on the A500 with 512K, but that did not mean they didn't take advantage of more RAM if you had it. RAM was the most common upgrade, and 1MB was very common.

I do think many later programs chose to support the original chipset instead of adding AGA support for the 1200/4000. That is more like what was experienced with the C128 or IIgs. But then some older programs were upgraded to add AGA support.


I agree with what most people are saying about games geared towards each system. It seems like Atari & Commodore had games with minimal system requirements, whereas when VGA & Sound Blasters became available for the PC, games started to take advantage of them. I remember DOS games where you had to select whether you wanted to play the game in CGA, EGA, or VGA, with PC speaker, Adlib, or Sound Blaster. I think software manufacturers could have made games that included options for more enhanced abilities on the ST & Amiga. I have a 130XE and am disappointed that very few games ever took advantage of 128K with separate ANTIC & CPU access.


If you make the shaded image I posted fullscreen (the way it was meant to be) and view the ST image full screen, you can see all the missing shades and how crappy it actually is. I don't know how the other images got into the comparison-- I was giving an example of a shaded image that looks worse on the ST.

 

I posted two ST versions of your picture. Did you look at both of them? I think the 2nd one actually looked pretty close (if not identical).

 

The only way to get the additional shades is to flicker them, which won't count as a fair comparison (60Hz vs. 30Hz) since the A8 can also do the same.
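To make that concrete, here's a toy C sketch (my own illustration, not anybody's real code) of why flicker buys you shades: alternating two of the ST's 8 grey levels (3 bits per gun) on successive frames averages out to an in-between perceived level, taking 8 steady shades to 15 at the cost of 30Hz flicker:

```c
#include <stdio.h>

/* Perceived grey levels from alternating two of the ST's 8 hardware
 * levels on successive frames.  Steady levels give 8 shades; pairing
 * each level with its neighbour adds 7 more, for 15 perceived shades
 * -- at the cost of visible 30Hz flicker. */
int main(void)
{
    for (int a = 0; a < 8; a++) {
        printf("steady %d         -> perceived %.1f\n", a, (double)a);
        if (a < 7)
            printf("flicker %d <-> %d  -> perceived %.1f\n",
                   a, a + 1, (a + (a + 1)) / 2.0);
    }
    return 0;
}
```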


More computer-vs-computer bashing ongoing here. People are treating it just like the Atari 8 vs Commodore 64 crap, coming in here and saying it's much better than the A8. I do say it's not only the hardware, but how the programmer utilizes all the available features. Most of us experienced programmers can come up with something on the Atari 8-bit system that is difficult to do on systems like the C64 or 520ST. Vice versa, there are some things hard to do the other way. With the ST's higher res + color depth, someone can make some nice games. If I understand people correctly, the first ST had issues with side-scrolling type games and had to do much of it via the CPU. It comes down to the type of game being played on the system. Boulderdash is superior on the Atari 8-bit because that game utilizes the hardware. You can come up with different games looking superior on other systems.

...

True, except that the one with more hardware support will win.

 

Once the PC got VGA (AGP, PCI Express) cards with a built-in BLiTTER, 3D acceleration, & independent video memory, we had a versatile system that had everything needed to make great games.

 

Except for a few things, like joystick controllers and the ability to program directly to the hardware to guarantee response times on all systems.


Is this the RoF ship? If so, then why are people still banging on about an image looking worse because it has fewer shades, when it's been converted from a low-quality image in the first place? Someone model or draw a 320x200x16 version, then scale that to 80x200x16 and 320x200x8, and see which looks best. Oh, no need, I already posted a similar comparison pages back.

 

 

Pete


Yes, even the Pentium is deterministic if you know when the CPU throttles, what state the cache is in, what speed the memory is, etc.

 

No, it isn't (assuming an original Pentium 1).

For instance, you'd also need to know the state of the branch target buffer (used to predict the outcome of branch instructions), AND the seed used for the replacement algorithm, because entries in the BTB are "randomly" replaced rather than least recently used.

 

And as soon as we get past the Pentium MMX, we get into micro-ops, register renaming, out-of-order execution, etc.

 

There's nothing random. Even the BTB has some algorithm behind its so-called "randomness". It's just that it's too difficult to determine the state of the machine, so it's considered indeterminate. I'm sure if someone knew all the algorithms the circuits implement and knew the state of the machine, he could tell EXACTLY how many cycles the next instruction will take.
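Exactly the point about "random" replacement still being an algorithm: in silicon it's typically something like an LFSR, which is completely deterministic once you know the seed. A generic sketch in C (NOT Intel's actual circuit, just the principle):

```c
#include <stdio.h>
#include <stdint.h>

/* A "random" replacement-way picker built on a 16-bit Fibonacci LFSR.
 * This is a generic illustration, not the Pentium's real logic: given
 * the same seed, the victim sequence is exactly reproducible, which
 * is the sense in which nothing here is truly random. */
static uint16_t lfsr_step(uint16_t s)
{
    /* taps 16,14,13,11 -- a standard maximal-length polynomial */
    uint16_t bit = ((s >> 0) ^ (s >> 2) ^ (s >> 3) ^ (s >> 5)) & 1;
    return (uint16_t)((s >> 1) | (bit << 15));
}

int main(void)
{
    uint16_t state = 0xACE1;   /* the "seed" the poster mentions */
    for (int i = 0; i < 8; i++) {
        state = lfsr_step(state);
        printf("evict way %u\n", state & 3);   /* pick 1 of 4 ways */
    }
    return 0;
}
```

Run it twice with the same seed and you get the same eviction order both times; "indeterminate" only means nobody outside the chip knows the seed and state.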

 

Of course, on the A8 it's very simple to tell, and on the Amiga the Copper enforces synchronization by having high priority over the 68000.


I'm so glad this has degraded into Atari vs C64 Part II. Now I can stop reading it.

...

It hasn't. It's more A8 vs. ST, only some guy "randomly" comes in every 24-48 hours and makes claims about the C64 to try to throw the discussion off-topic. I doubt he even uses a real C64-- bet he's running a C64 emulator so he doesn't see the chip failures or experience the slow loading times. He may even have set up the emulator to show 256 colors to keep the ugly-looking graphics from ruining his eyes.

 

The ST isn't going to be better at all things either, so that narrows it down to "most ways" or "on their own".

If you are honest about it, I think "most" leans more towards the ST than the 8-bit. But most?

I think that's open to interpretation.

 

But "most" doesn't matter as we know from C64 scenario. Most people bought the piece of crap because of price. Which has more things being done in hardware is the better machine (better hardware) regardless of what software exists for it.


I find it easier not to argue that point in this case - it just muddies things :) - and to display 5 colours I have to go up to 3 bitplanes. (Funnily enough, it was actually quicker to go directly to 4 planes for the Boulderdash scroll code.)

 

Your worst case is yet to come-- combining multiple items from the list.

 

It's 3bpp for those who know how AN0..AN2 work per color clock.

 

Sounds interesting :)

 

CTIA doesn't really know about graphics modes, so everything is just 3bpp interpreted as one of: BAK, PF0, PF1, PF2, PF3, Set Hires, Clr Hires, Vsync. Of course, GTIA modes take two consecutive 3bpp values and form 6bpp at 80*240. Only 320*200 takes two of the three bits as two hires pixels.
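For anyone who wants that spelled out, here's the lookup as a trivial C sketch. The ordering of the eight codes is taken straight from the sentence above; the actual AN0..AN2 bit assignments in silicon are my assumption (and the 6bpp claim is questioned a few posts down):

```c
#include <stdio.h>

/* The 3-bit AN0..AN2 code ANTIC presents each colour clock, in the
 * order the post lists the meanings.  A sketch of the lookup only --
 * the real chips do this in silicon. */
static const char *an_meaning[8] = {
    "BAK",        /* 000 */
    "PF0",        /* 001 */
    "PF1",        /* 010 */
    "PF2",        /* 011 */
    "PF3",        /* 100 */
    "Set Hires",  /* 101 */
    "Clr Hires",  /* 110 */
    "Vsync",      /* 111 */
};

int main(void)
{
    for (int an = 0; an < 8; an++)
        printf("AN=%d%d%d -> %s\n",
               (an >> 2) & 1, (an >> 1) & 1, an & 1, an_meaning[an]);
    return 0;
}
```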


In RAM, each pixel is represented by two bits, with one of the combinations (11) being modified by the MSB of the character code, which then changes every instance of the "11" pixel in that 4x8 block to use a different color register. I wouldn't call that 3bpp either, and I think it's closer to what Andy mentioned, a "poor man's" C64 attribute system.

 

No, the way the hardware works is what I wrote in my previous message. ANTIC repeats the bit from the MSB of the char value to form 3bpp for GTIA. They could easily have had a different bit for each color clock, and GTIA wouldn't know the difference and would happily show 160*240*5 linear graphics. So you are as wrong as Andy, with the exception that Andy also made a fool of himself by declaring that I was lying.


I'm so glad this has degraded into Atari vs C64 Part II. Now I can stop reading it.

...

It hasn't. It's more A8 vs. ST, only some guy "randomly" comes in every 24-48 hours and makes claims about the C64 to try to throw the discussion off-topic. I doubt he even uses a real C64-- bet he's running a C64 emulator so he doesn't see the chip failures or experience the slow loading times. He may even have set up the emulator to show 256 colors to keep the ugly-looking graphics from ruining his eyes.

 

The ST isn't going to be better at all things either, so that narrows it down to "most ways" or "on their own".

If you are honest about it, I think "most" leans more towards the ST than the 8-bit. But most?

I think that's open to interpretation.

 

But "most" doesn't matter as we know from C64 scenario. Most people bought the piece of crap because of price. Which has more things being done in hardware is the better machine (better hardware) regardless of what software exists for it.

 

When the best you can do is keep saying "piece of crap" and "ugly colours" over and over, you just prove how much of an A8 fanboy you are.

 

 

Pete


In RAM, each pixel is represented by two bits, with one of the combinations (11) being modified by the MSB of the character code, which then changes every instance of the "11" pixel in that 4x8 block to use a different color register. I wouldn't call that 3bpp either, and I think it's closer to what Andy mentioned, a "poor man's" C64 attribute system.

 

No, the way the hardware works is what I wrote in my previous message. ANTIC repeats the bit from the MSB of the char value to form 3bpp for GTIA. They could easily have had a different bit for each color clock, and GTIA wouldn't know the difference and would happily show 160*240*5 linear graphics. So you are as wrong as Andy, with the exception that Andy also made a fool of himself by declaring that I was lying.

 

Unless I'm mistaken, 3 bits = 0-7 = 8 values, so it still doesn't matter what the hardware does with the MSB: it STILL only gives FIVE colours, not eight; ergo it's not a full 3-bit colour register.
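That counting argument in a tiny C enumeration (same register names as earlier in the thread, mine purely for illustration): eight input combinations go in, only five distinct registers come out, because the MSB only touches the 11 pair.

```c
#include <stdio.h>

/* Enumerate what a mode 4 cell can actually display: 4 bit-pair
 * values times 2 MSB states = 8 inputs, but only 5 distinct colour
 * registers come out, since the MSB only redirects the 11 pair. */
int main(void)
{
    const char *reg[4][2] = {
        { "BAK", "BAK" }, { "PF0", "PF0" },
        { "PF1", "PF1" }, { "PF2", "PF3" },  /* only this row differs */
    };
    for (int msb = 0; msb < 2; msb++)
        for (int pair = 0; pair < 4; pair++)
            printf("pair=%d%d msb=%d -> %s\n",
                   (pair >> 1) & 1, pair & 1, msb, reg[pair][msb]);
    return 0;
}
```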

 

 

Pete


Yes, even the Pentium is deterministic if you know when the CPU throttles, what state the cache is in, what speed the memory is, etc.

 

No, it isn't (assuming an original Pentium 1).

For instance, you'd also need to know the state of the branch target buffer (used to predict the outcome of branch instructions), AND the seed used for the replacement algorithm, because entries in the BTB are "randomly" replaced rather than least recently used.

 

And as soon as we get past the Pentium MMX, we get into micro-ops, register renaming, out-of-order execution, etc.

 

There's nothing random. Even the BTB has some algorithm behind its so-called "randomness". It's just that it's too difficult to determine the state of the machine, so it's considered indeterminate. I'm sure if someone knew all the algorithms the circuits implement and knew the state of the machine, he could tell EXACTLY how many cycles the next instruction will take.

 

Of course, on the A8 it's very simple to tell, and on the Amiga the Copper enforces synchronization by having high priority over the 68000.

 

More oversimplification. It's only easy to tell on the A8 IF you've specifically written code that takes every branch, every LD/ST over page boundaries, etc. into account and evens them all out over a scanline. To do that with anything other than some demo routines is nigh on impossible. Yes, you can tell what the DMA is doing, but you can on the C64 as well as other machines; people have been doing it for years on those, not just talking about it.
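For the curious, here's the kind of bookkeeping that "evening out" implies, as a toy C sketch with made-up body cycle counts (the branch timings are standard 6502: 3 cycles taken on the same page, 2 not taken; NOP costs 2):

```c
#include <stdio.h>

/* Toy illustration of "evening out" a 6502 branch as described above:
 * a taken branch costs 3 cycles (4 if it crosses a page), not taken
 * costs 2, so the shorter path needs padding before the two paths
 * rejoin.  Real cycle-exact code does this per routine, by hand. */
int main(void)
{
    int taken = 3, not_taken = 2;           /* same-page branch costs  */
    int body_taken = 12, body_skipped = 9;  /* hypothetical path costs */

    int path_a = taken + body_taken;
    int path_b = not_taken + body_skipped;
    int diff = path_a - path_b;

    printf("taken path: %d cycles, fall-through: %d cycles\n",
           path_a, path_b);
    printf("pad fall-through with %d cycles (%d NOPs + %d spare)\n",
           diff, diff / 2, diff % 2);       /* NOP = 2 cycles on 6502 */
    return 0;
}
```

And that's just one branch; doing it for every branch and every page-crossing load/store in a whole game loop is exactly why it's "nigh on impossible" outside demo routines.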

 

 

Pete


 

How thick do you 3 look? I would suggest you all apply for disability benefit so your hospital stays can be paid for in the asylum :)

 

So you also discriminate against disabled people; why am I not surprised.

 

He cannot answer the arguments (at least 19 of them, on top of the proof that it's a piece of crap), so he just has to rely on calling people names.


If you make the shaded image I posted fullscreen (the way it was meant to be) and view the ST image full screen, you can see all the missing shades and how crappy it actually is. I don't know how the other images got into the comparison-- I was giving an example of a shaded image that looks worse on the ST.

 

I posted two ST versions of your picture. Did you look at both of them? I think the 2nd one actually looked pretty close (if not identical).

 

The only way to get the additional shades is to flicker them, which won't count as a fair comparison (60Hz vs. 30Hz) since the A8 can also do the same.

 

You posted that picture, and didn't say anything about 60/30Hz :) - I'm willing to let you try 30Hz to reproduce the other picture if you like.

 

The other way would be to false-colour the palette slightly, so the in-between shades are slightly off-hue.

RGBs {0,0,0},{0,1,0},{1,1,1},{1,2,1} .... {6,6,6},{6,7,6},{7,7,7} - or some other calibrated palette.
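That pattern is easy to generate. A quick C sketch of the 15-entry false-grey palette described above (3 bits per gun assumed, 0..7 per channel; picking the green gun for the half-steps is my reading of the example, since green dominates perceived brightness):

```c
#include <stdio.h>

/* Generate the "false grey" palette suggested above: the 8 true
 * greys plus 7 in-between shades that nudge only the green gun up
 * one step, giving 15 entries that fit in the ST's 16-slot palette. */
int main(void)
{
    int n = 0;
    for (int i = 0; i < 8; i++) {
        printf("entry %2d: {%d,%d,%d}\n", n++, i, i, i);
        if (i < 7)
            printf("entry %2d: {%d,%d,%d}\n", n++, i, i + 1, i);
    }
    return 0;
}
```

The output reproduces the sequence in the post, {0,0,0},{0,1,0},{1,1,1},{1,2,1} and so on up to {7,7,7}.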

 

There won't be an exact match, and your picture is slightly biased already, as it was made for the resolution rather than converted from a true-colour original.


CTIA doesn't really know about graphics modes, so everything is just 3bpp interpreted as one of: BAK, PF0, PF1, PF2, PF3, Set Hires, Clr Hires, Vsync. Of course, GTIA modes take two consecutive 3bpp values and form 6bpp at 80*240. Only 320*200 takes two of the three bits as two hires pixels.

 

Does GTIA really give 6bpp? How does that work - I thought it's only 4 bits max?


It not only kind of sucks that it only works on one bit pair, but it's a bit nasty being limited to 128 chars per char bank, which means splitting the screen and wasting RAM repeating graphics in more than one charset. All round sucky, but kind of necessary for anything I've been looking at porting.

 

 

Pete

 

Given the way Player pairing works, it would have been cool if the high bit had ORed PF3 in with PF0/PF1/PF2 rather than just swapping PF3 for PF2... I imagine that would have been really easy to implement as well.
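Just to illustrate the wish (purely hypothetical, this is not what GTIA does): ORing the PF3 colour value into whichever register the bit pair selected, rather than substituting PF3 for PF2, would look something like this in C, with made-up COLPF values:

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical mode 4 behaviour wished for above: with the char MSB
 * set, OR the PF3 colour value into the selected PF0/PF1/PF2 colour,
 * the way overlapping players can OR with the playfield -- instead
 * of the real behaviour, which only swaps PF3 in for PF2 on the 11
 * bit pair. */
static const uint8_t colpf[4] = { 0x26, 0x48, 0x8A, 0x0E };  /* COLPF0..3 */

static uint8_t wished_colour(int pair, int msb)   /* pair = 1..3 */
{
    uint8_t c = colpf[pair - 1];                  /* PF0/PF1/PF2 */
    return msb ? (uint8_t)(c | colpf[3]) : c;     /* OR in PF3 if MSB set */
}

int main(void)
{
    for (int pair = 1; pair <= 3; pair++)
        printf("pair=%d: msb=0 -> $%02X, msb=1 -> $%02X\n",
               pair, wished_colour(pair, 0), wished_colour(pair, 1));
    return 0;
}
```

Every non-background pair would then get an MSB-modified variant instead of just the 11 pair, which is the appeal.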

