
Space Harrier XE



This indicates that artifacting is "well-behaved."

 

http://www.atariarchives.org/c1bag/page203.php

 

Are you speaking of total phosphors across the screen or just within the border when you say 600-800?

 

BTW, Compute's book (this section is by David Diamond) is over-simplified and wrong in its description of the cause of artifacting. Artifacting happens because a pattern of on-off pixels creates a waveform at the 3.58MHz color carrier frequency and fools the TV into thinking the signal contains color information. It has nothing to do with the placement of red, green, and blue phosphors on the tube.
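The arithmetic behind this is worth spelling out: on NTSC Ataris the hi-res (320-pixel) pixel clock is twice the colour subcarrier frequency, so a strict on/off pixel pattern produces a waveform at exactly the subcarrier frequency. A minimal Python sketch (standard NTSC numbers assumed):

```python
# Why an alternating hi-res pixel pattern lands exactly on the NTSC
# colour subcarrier and fools the TV into decoding it as colour.

COLOR_CARRIER_HZ = 3.579545e6          # NTSC colour subcarrier
PIXEL_CLOCK_HZ = 2 * COLOR_CARRIER_HZ  # Atari hi-res (320 px) pixel clock

# An on-off-on-off pixel pattern completes one full cycle every 2 pixels,
# so the waveform it produces has frequency pixel_clock / 2.
pattern_freq_hz = PIXEL_CLOCK_HZ / 2

print(pattern_freq_hz == COLOR_CARRIER_HZ)  # True
```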


I haven't seen what it looks like on PAL, but that could be a third option.

The 3rd and 4th photos in this post show an example of PAL artifacting.

 

I wasn't speaking against VBXE but about standard A8 modes using RGB. You don't gain much with an RGB output vs. standard RF/S-Video/composite A8 signals. Perhaps, if you want to simulate an 80-column display (which the Atari wasn't meant for), yeah, artifacting would stink things up. The standard 40-column text mode on the A8 was set up with colors that minimize artifacting (since there it was undesirable), whereas in other scenarios it was desirable.

Even with Atari's low 160px colour resolution, RGB output can give an advantage. With S-Video the chroma signal is quadrature-modulated, with a carrier frequency of 3.58MHz. Because of this modulation, when the colour changes between 2 pixels, the change is not instant - the boundary between those pixels is slightly "fuzzy". This should be most visible with a pattern of 2 alternating hues with identical luminance.
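A toy Python sketch of that fuzziness (normalized numbers, not real NTSC timing): quadrature-modulate a hue flip onto a subcarrier, synchronously demodulate, and low-pass filter the result. The filter smears the hue boundary over several samples, which is the fuzzy pixel edge described above.

```python
import math

FS = 16   # samples per subcarrier cycle
N = 128   # total samples; the hue flips at the midpoint
u = [+1.0] * (N // 2) + [-1.0] * (N // 2)  # one chroma component, equal luma

carrier = [math.cos(2 * math.pi * i / FS) for i in range(N)]
mod = [u[i] * carrier[i] for i in range(N)]            # quadrature modulation
dem = [2 * mod[i] * carrier[i] for i in range(N)]      # synchronous demodulation
lp = [sum(dem[i:i + FS]) / FS for i in range(N - FS)]  # one-cycle moving average
                                                       # (crude stand-in for the
                                                       # TV's chroma filter)

# Far from the boundary the chroma is recovered exactly (+1 or -1);
# windows straddling the boundary give intermediate values - a fuzzy edge.
print(round(lp[10], 2), round(abs(lp[N // 2 - FS // 2]), 2))
```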

 

In RGB output no modulation takes place, the display is driven directly by voltage straight from the RGB cable - so change between colours in consecutive pixels is razor-sharp.

 

 

Does using only palette entries from the grayscale (8 CTIA grayscales, 16 GTIA) section of the palette avoid artifacts?

With composite output? Not really. After all, artifacting is achieved by displaying an alternating pattern of high-res pixels, and high-res is always monochrome on the Atari. When displaying lower horizontal resolutions, slight artifacts on pixel boundaries might be visible too - it probably depends on the quality of the TV's demodulator.

 

(I know some other machines had modes disabling the color carrier for higher-resolution grayscale/monochrome while still using a standard-definition 15.7 kHz hsync video signal - like CGA)

In CGA it disables the whole colour signal including colorburst - thus indicating to a TV that the signal is black & white. On Atari you can't turn colorburst off.

Edited by Kr0tki

Given most monitors are at least color-clock resolution, I would have to see it to believe it. And unless it works with those Commodore-type dual composite/RGB monitors, you would need to carry an extra monitor as well.

 

And about artifacting, most software using 320+ resolution assumes it exists, so it's not that useful to not have it.

 

Does using only palette entries from the grayscale (8 CTIA grayscales, 16 GTIA) section of the palette avoid artifacts? Obviously if you have the Y/C outputs, you could simply use only luma - including plugging luma into a composite monitor, but I was wondering more about the standard composite video/RF out. (I know some other machines had modes disabling the color carrier for higher-resolution grayscale/monochrome while still using a standard-definition 15.7 kHz hsync video signal - like CGA)

 

The CTIA/GTIA palette is YUV based, isn't it? (so, technically speaking, Y'PbPr output might make more sense, though transcoding to RGB isn't much different other than the fact that newer SDTVs in the US support Y'PbPr but not RGB -the TMS9928 outputs natively in Y'PbPr iirc)

 

However, by that logic, grayscale images using a dedicated luma line should look no better in RGB (or Y'PbPr) than the native Y/C output, assuming the same monitor is used.

 

...

Yeah, any noise caused by conversions would be minimal with separation of the audio, luma, and chrominance. Just an aside, I think using chroma and luma components on Amiga's HAM mode would have worked out better than using RGB as that would be a more compressed representation of the colors and allow modifying chroma or luma on a per pixel basis instead of R,G,B separately.

 

1084 isn't colour-clock resolution, it's well over double. It displays Amiga and VBXE resolutions just fine.

It's just an SDTV monitor (standard ~15 kHz hsync); being analog, there's no hard limit on horizontal resolution. Color is limited by the carrier signal, but the only thing limiting horizontal resolution for luma is beam precision and phosphor dot pitch, which should indeed be far higher than 160 dots across. (Even old TVs should have over 400 across; newer ones tend to be closer to 600-800, I believe - obviously HDTVs and VGA/SVGA monitors are far higher. Perhaps some really small/cheap old sets have lower, but I doubt even close to 160, and probably few lower than 300.)

...

There's a limit, although it may not be "hard" depending on the TV and other factors. I don't know where you are getting those 600-800 numbers. Obviously, for digital encoding you can encode whatever resolution you want, but once you send it through the NTSC standard's 3.58 MHz subcarrier, you run into resolution limits. Also, B&W encoding can use higher frequencies than 3.58 MHz, so you can get higher resolutions in luma mode. All you have to do is try putting in some high-frequency color changes and see what it looks like. The letter "m" is a good test since that's pretty high frequency in a 6*6 or 8*8 grid.
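That bandwidth limit can be ball-parked in a few lines: one signal cycle carries at most one light-dark pixel pair, and the visible part of an NTSC line lasts roughly 52.6 µs. A rough Python sketch (textbook NTSC figures assumed, ignoring Kell-type derating):

```python
# Horizontal resolution cap implied by signal bandwidth across one
# NTSC active line. One cycle = one light-dark pixel pair.

ACTIVE_LINE_S = 52.6e-6  # approximate NTSC active line time

def max_pixels(bandwidth_hz):
    """Upper bound on distinct pixels per line at a given bandwidth."""
    return int(2 * bandwidth_hz * ACTIVE_LINE_S)

print(max_pixels(3.58e6))  # detail near the subcarrier frequency: ~376 px
print(max_pixels(4.2e6))   # full NTSC luma bandwidth: ~441 px
```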

 

And MPEG-type video encoding tends to have chroma at 1/2 the luma resolution anyway. (Though that generally far exceeds the NTSC or even PAL color limits, for DVD at least; for 320x240 VCD, the chroma resolution would correspond quite well to NTSC.)

...

Humans see luminance more than chrominance, so even if you do 160*240 chroma (4:1 subsampling), it looks pretty good.
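As an illustration of 4:1 horizontal chroma subsampling, here is a toy Python sketch (hypothetical helpers, not any real codec): luma would stay at full resolution while each group of 4 chroma samples is stored as one average.

```python
# Toy 4:1 horizontal chroma subsampling: store one chroma sample
# per 4 pixels, then reconstruct by repetition.

def subsample_chroma(chroma_row, factor=4):
    """Average each group of `factor` chroma samples into one."""
    return [sum(chroma_row[i:i + factor]) / factor
            for i in range(0, len(chroma_row), factor)]

def upsample_chroma(subsampled, factor=4):
    """Reconstruct by repeating each stored sample (nearest-neighbour)."""
    return [c for c in subsampled for _ in range(factor)]

row = [10, 10, 10, 10, 40, 40, 40, 40]
print(subsample_chroma(row))          # [10.0, 40.0]
print(upsample_chroma([10.0, 40.0]))  # [10.0, 10.0, 10.0, 10.0, 40.0, 40.0, 40.0, 40.0]
```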

 

That's why it looks like crap: the software assumes artifacting, and it doesn't occur (for the most part) on PAL. Also, when you adjust the 1084 monitor's hsize/vsize, you are rescaling the image to another grid that most likely doesn't correspond to the exact number of RGB triads on the physical screen, so there's some distortion being added there even in RGB mode. Only on LCDs with an exact mapping of incoming pixels to output pixels can you obtain a perfect image.

I disagree: with a high-precision beam and very fine dot pitch, the picture can indeed be sharper than (or near the same as) a native-resolution LCD/Plasma/OLED digital display, as is the case with good VGA/SVGA monitors. (Any physical advantage in pixel sharpness is tempered by the higher contrast and black level a CRT is capable of - less so for plasma, but that's not really an issue in the context of computer monitors, and plasma has other disadvantages - namely pixel size, cost, and power consumption.)

...

You can approach the same exactness as a 1:1 LCD, but I don't see how you can get a better picture using interpolated/scaled pixels.

 

Of course, you can still display higher resolution than the native dot pitch and not require precise alignment, but the coarser the dot pitch, the more difficult it is to display really sharp graphics. (There are also other factors, like dot pattern: i.e. square grid, staggered square dots, or a delta pattern - a fine-pitch delta pattern possibly being the best, though the difference is really minimal.)

...

If you view a block of pixels together, the analog scaling is better than an LCD doing the scaling digitally, but a 1:1 LCD is the best you can get.

 

With RGB signal the whole concept of colour clocks becomes irrelevant. The signal is sent from the source straight to the screen without any modulation or YUV encoding.

Doesn't CTIA/GTIA use a native YUV-derived palette (or a digital representation thereof, or the similar but distinct YCbCr)? The CRT monitors themselves use RGB natively for the display, but that's another issue. (And transcoding YUV/Y'PbPr/RGB really doesn't introduce perceptible loss.)
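For what transcoding between those spaces looks like, here is a Python sketch using the BT.601 analog YUV coefficients (an assumption for illustration - the exact constants the GTIA palette maps to may differ). The round trip is algebraically lossless, so any real-world loss comes from quantization, not from the colour-space change itself.

```python
# RGB <-> YUV transcoding with BT.601 coefficients (assumed for
# illustration); the round trip recovers RGB to floating-point precision.

def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

def yuv_to_rgb(y, u, v):
    r = y + v / 0.877
    b = y + u / 0.492
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

rgb = (0.8, 0.4, 0.2)
back = yuv_to_rgb(*rgb_to_yuv(*rgb))
print(all(abs(x - y) < 1e-9 for x, y in zip(rgb, back)))  # True
```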

 

Yeah, you've got a bunch of encoding/decoding/scaling going on in every domain on CRT monitors. Here's a link to resolution stuff - note that even at 4.2 MHz they are claiming resolutions of 320..650.

 

http://www.maxim-ic.com/appnotes.cfm/an_pk/734


I haven't seen what it looks like on PAL, but that could be a third option.

The 3rd and 4th photos in this post show an example of PAL artifacting.

 

Enclosed is a picture of artifacting on an XEGS machine. Looks pretty useful to me for easily getting colors in high resolution without resorting to PMGs or DLIs.

 

I wasn't speaking against VBXE but about standard A8 modes using RGB. You don't gain much with an RGB output vs. standard RF/S-Video/composite A8 signals. Perhaps, if you want to simulate an 80-column display (which the Atari wasn't meant for), yeah, artifacting would stink things up. The standard 40-column text mode on the A8 was set up with colors that minimize artifacting (since there it was undesirable), whereas in other scenarios it was desirable.

Even with Atari's low 160px colour resolution, RGB output can give an advantage. With S-Video the chroma signal is quadrature-modulated, with a carrier frequency of 3.58MHz. Because of this modulation, when the colour changes between 2 pixels, the change is not instant - the boundary between those pixels is slightly "fuzzy". This should be most visible with a pattern of 2 alternating hues with identical luminance.

 

In RGB output no modulation takes place, the display is driven directly by voltage straight from the RGB cable - so change between colours in consecutive pixels is razor-sharp.

...

I don't notice much difference at that resolution, but at higher resolutions it does blur the pixels. And also the smoothing is a good thing; the blockiness makes the 160*200 images look terrible on emulators. Enclosed is a sample image at higher resolution that looks pretty good.

post-12094-127368649918_thumb.jpg

post-12094-1273686506_thumb.jpg


I don't notice much difference at that resolution, but at higher resolutions it does blur the pixels. And also the smoothing is a good thing; the blockiness makes the 160*200 images look terrible on emulators. Enclosed is a sample image at higher resolution that looks pretty good.

How did they get all those colors on the checkmark (right picture) - surely that is not a result of artifacting.


I don't notice much difference at that resolution, but at higher resolutions it does blur the pixels.

You mean, that S-Video blurs the pixels, or that RGB does it?

 

Enclosed is a sample image at higher resolution that looks pretty good.

I'm seeing a lot of blur, also in the vertical direction (which definitely cannot be caused by S-Video nor by RGB) - is it because the camera is out of focus, or does the display really look like this?

 

How did they get all those colors on the checkmark (right picture) - surely that is not a result of artifacting.

Whaddya mean? It's all within a standard Amiga's possibilities, no magic here.

Edited by Kr0tki

I don't notice much difference at that resolution, but at higher resolutions it does blur the pixels. And also the smoothing is a good thing; the blockiness makes the 160*200 images look terrible on emulators. Enclosed is a sample image at higher resolution that looks pretty good.

How did they get all those colors on the checkmark (right picture) - surely that is not a result of artifacting.

 

It's a miggy, probably just a copper list changing the colour per line. Never actually thought about it before despite seeing that for years, but it'd be the easiest way to do it.

 

 

Pete


I don't notice much difference at that resolution, but at higher resolutions it does blur the pixels. And also the smoothing is a good thing; the blockiness makes the 160*200 images look terrible on emulators. Enclosed is a sample image at higher resolution that looks pretty good.

How did they get all those colors on the checkmark (right picture) - surely that is not a result of artifacting.

 

I was addressing the point about artifacting, as well as non-RGB video being pretty clear. The first image is from the Atari's artifacting. The 2nd is from an Amiga 600.


It's a miggy, probably just a copper list changing the colour per line. Never actually thought about it before after seeing that for years but it'd be the easiest way to do it.

 

 

Pete

Yeah - I see that now. I originally thought it was an A8 picture.


Since there is some discussion if the standard Atari 8-bit graphics modes benefit from VBXE's RGB output, I made a couple of pictures of the three possible output modes (composite video, chroma/luma & RGB) so you can decide for yourself.

 

The pictures can be found in my blog.

 

Robert

 

I can't even run that Bomb Jack on my system. I'll take composite and S-Video over RGB any day for the stock A8 modes.


This is the same argument that the arcade guys have: LCD vs. CRT. I find myself in the purist camp once again. (I have a few arcade machines as well.)

 

I'd much rather use the s-video connection on my Commodore 1702 monitor over RGB. It just looks better to me - yes, I know it's not as sharp, but the sharper image doesn't look right. And one of my favorite games, Ultima, would just look crappy on RGB since it relies on artifacting to get its colors.

 

Thanks for the comparison shots!


I can't even run that Bomb Jack on my system.

 

Do you still use a 16KB 600XL? :D The BombJake cart runs fine on my unexpanded/unmodified 800XL :P

 

Robert

 

I tried it on a 64K 800XL and a 64K 600XL, and I also have a 256K XL, but the message was asking for 320K. And I'm not sure what type of expansion specification it's looking for. Here's what mine looks like:

post-12094-127409585865_thumb.jpg


I can't even run that Bomb Jack on my system.

 

Do you still use a 16KB 600XL? :D The BombJake cart runs fine on my unexpanded/unmodified 800XL :P

 

Robert

 

I tried it on a 64K 800XL and a 64K 600XL, and I also have a 256K XL, but the message was asking for 320K. And I'm not sure what type of expansion specification it's looking for.

 

I was under the impression that you said you can't run BombJake because you don't want to upgrade your Atari beyond the standard Atari-provided features. That's why I answered that you don't need to upgrade your Atari if you use the cart version of BombJake. Similarly, you don't need a memory expansion to run the XE carts like Crime Buster, Airball, Crossbow, etc., even though they are also big games. But you do need a memory expansion if you want to run the hacked disk versions of those.

 

The BombJake cart only needs a 64KB Atari, while the disk version needs 64KB + 256KB of extended memory (thus 320KB in total). If your Atari XL only has 64KB + 192KB of extended memory (256KB in total), then you need the cart version of BombJake to run it.

 

Robert

Edited by rdemming
