a8isa1 Posted February 24, 2007 (edited)

I have had a video problem which I lived with for quite a while, but it's still annoying. It occurs with both my 800 (unmodified) and 800XL (SuperVideo XL 2.0 upgrade, but without the composite video on/off switch). Both computers are NTSC models. The problem is more pronounced with the 800XL, but both machines have it all the same. If I can't easily solve the problem, that's OK. However, it would be nice to at least have an understanding of what is happening.

The display devices are two VGA CRTs. One is driven by an ATI All-in-Wonder video card, the other by an outboard TV tuner for VGA, a ViewSonic VB50HRTV. The input is Atari Chroma+Luminance to S-Video. I have no other display options at the moment.

Any insights would be appreciated. Please look below to see my issue. Thanks.

Regards,
Steve Sheppard

Edited February 24, 2007 by a8isa1
Gunstar Posted February 25, 2007 (edited)

It looks to me like even though you have S-video input from the 800 and 800XL, it's being reverted back to composite before being sent out to your VGA monitors. That's what my screen (1200XL with SuperVideo XL 2.0 mod, also without the composite switch) looks like on my Commodore 1084S monitor (which has an S-video/video switch that reverts S-video to composite when pushed) in composite mode. When I switch to S-video mode the image is sharper (sharper than your image shown here) and there is less of that "bleeding," although it is still there.

It's basically normal for NTSC video, but it's more pronounced for you because it's being displayed on VGA monitors rather than through a normal composite/S-video monitor or TV. Any time you convert a low-def image source to a high-def TV or high-def VGA monitor, the faults of the low-def source will be more pronounced. Basically, that is the best you are going to get on VGA monitors. It will be slightly better on a standard-def TV or video monitor connected through S-video.

Edited February 25, 2007 by Gunstar
Cybernoid Posted February 26, 2007

This could also be caused by the All-in-Wonder hardware delaying the chroma signal by a pixel. I used to test video hardware, and it looks like the chroma signal is phase misaligned from the luma (that is, the chroma is delayed by a pixel or more). There might be a luma and/or chroma phase adjust hardware register setting in the All-in-Wonder.

Also, try using the DScaler software (it works with the ATI All-in-Wonder) and adjust the BDelay value (I think this is the burst delay; its hot keys are "D" and "Shift-D"). DScaler here: http://deinterlace.sourceforge.net/ More on its hot keys here: http://deinterlace.sourceforge.net/Help/keyboard.htm

--C
Fröhn Posted February 26, 2007

That's simply what happens with PAL or NTSC decoding. S-Video will only make color artifacting impossible and make the luminance a lot sharper, but it will not be able to remove color bleeding, since the color carrier STILL needs to be decoded and the resolution of the color information is much lower than the resolution of the luminance.
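The bandwidth point above can be sketched in a few lines of Python. This is a toy model, not real NTSC filtering: the decoder's chroma lowpass is approximated by a crude moving average, and the 5-pixel smear width is an assumed, illustrative figure.

```python
# Toy model: why S-video still shows color bleed. NTSC chroma is
# bandwidth-limited far below the luma bandwidth, so a chroma edge is
# smeared over several pixels even when the luma edge stays sharp.
# The decoder's chroma lowpass is approximated by a moving average.

def lowpass(signal, width):
    """Crude moving-average lowpass; 'width' pixels of smearing."""
    half = width // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

# A hard vertical boundary: no-color (gray) on the left, saturated
# color (chroma = 1.0) on the right, with flat luma throughout.
chroma = [0.0] * 8 + [1.0] * 8          # chroma steps at pixel 8

# Luma path is wideband (width 1 = no smearing); chroma is narrowband.
# The 5-pixel width is an assumed figure for illustration only.
decoded_chroma = lowpass(chroma, 5)

# The chroma transition now ramps over several pixels -- that ramp is
# the visible "bleed" at a vertical color boundary.
for i in range(5, 11):
    print(f"pixel {i}: chroma = {decoded_chroma[i]:.2f}")
```

The hard step at pixel 8 comes out as a gradual ramp from pixel 6 to pixel 10, which is exactly the horizontal color smear being discussed; narrowing the filter width sharpens the edge, widening it makes the bleed worse.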
cwilkson Posted February 27, 2007

Yeah, that's just something you have to live with. It's an artifact of having the hue encoded as a phase relative to the color burst.

Minor hand waving: The chroma info is broken into hue and saturation components. Hue is encoded as phase and saturation (not important here) is encoded as amplitude of a sine wave, both relative to the color burst. Assume the display paints a color pixel coincident with the peak of the sine wave. As you change hue around the color wheel, the delay changes and the color pixel will move left and right. Any time you change colors on the same scanline you get a discontinuity in the sine wave, and therefore in the apparent color on screen.

I don't know the 800 hardware well but I assume it's basically 2600-like in the low-level details. For the 2600, I have 15 hues to choose from (plus gray). If I draw a single colored pixel, the associated color, ahh... "blip" can appear centered in any one of 16 positions within the pixel. That blip will still be just as wide as the pixel. But the gray blip and the color blip will be skewed relative to one another.

Now say I had 5 pixels per scanline. And say I wanted to paint those pixels red (1/4 delay), yellow (no delay), green (3/4 delay), and 2 more reds. The line might look something like this:

+---------------+---------------+---------------+---------------+---------------+ pixel positions
************************************************************************************* actual colored line

Each color blip begins at a certain time and it lasts exactly one pixel width unless it's overridden by another color that is scheduled to draw. Notice the loss of color before the first red and before the green. The red is the first colored pixel and it has a 4-space delay... so no color. The yellow completes its drawing, then there's no color until the green blip starts drawing (with a 12-space delay). The yellow (0-space delay) overrides the first red blip, and the second red blip overrides the green one. Thus the first red and the green blips appear shortened. There is no overriding and no delay between the last two red blips... they're the same color, so we get a seamless transition. And the last red pixel is the last blip to be drawn, so it gets to finish its allotted time, resulting in a red bleed at the right edge of the screen. Also notice the green bleeds into the 4th pixel. Bleh. </HAND WAVING>

These same effects occur when you're watching TV. But you don't notice them as much because... how often do you have perfectly vertical boundaries between two uniform color regions in a typical TV program? Also the image is overscanned, so you can't see the edges of the image. Those would in fact look really bad because of sync recovery. But I digress....

-Chris
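The "hue is a phase, so the blip shifts left or right" idea above can be sketched numerically. This is a toy model with a made-up sample count per colorburst cycle, not real Atari or 2600 timing:

```python
# Toy sketch of "hue = phase": the chroma for each hue is a sine at the
# colorburst frequency, and changing the hue changes only the phase --
# which moves the waveform's peak left or right in time. The 16 samples
# per cycle here are illustrative, not real hardware timing.

import math

SAMPLES_PER_CYCLE = 16   # one colorburst cycle == one "pixel" here

def chroma_peak_offset(hue_degrees):
    """Return the sample index (0..15) where the chroma sine peaks."""
    samples = [math.cos(2 * math.pi * i / SAMPLES_PER_CYCLE
                        - math.radians(hue_degrees))
               for i in range(SAMPLES_PER_CYCLE)]
    return samples.index(max(samples))

# A hue 90 degrees "later" on the color wheel peaks a quarter cycle
# later, so its visible color blip is skewed relative to a 0-degree hue.
print(chroma_peak_offset(0))    # peak at sample 0
print(chroma_peak_offset(90))   # peak a quarter cycle later (sample 4)
print(chroma_peak_offset(180))  # peak half a cycle later (sample 8)
```

The point of the sketch: two adjacent pixels with different hues have their chroma peaks at different offsets, which is the skew between blips described above, and the abrupt phase jump between them is the discontinuity that reads as a color glitch at the boundary.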
a8isa1 Posted February 28, 2007 (Author)

[quoting Cybernoid's DScaler suggestion above]

Hmm, unfortunately it won't run on this old clunky PC. "Can't create overlay surface.."
a8isa1 Posted February 28, 2007 (Author)

[quoting cwilkson's explanation above]

Great information! Thanks!
supercat Posted February 28, 2007

I would describe things a little differently. Each line of NTSC effectively has a striped color filter laid on top of it that looks something like this:

******************************

although it's actually continuous rather than made up of fixed hues as in my sample. On a broadcast signal, or on better computer/game machines, alternate lines are staggered like this:

******************************
******************************
******************************
******************************

but the way the timing references are set up, they can also be striped, as on the Atari and Apple computers, as well as earlier Commodore 64's (later 64's used the checkerboard).

To display red, one should output a lighter signal in the places where the "color mask" is reddish, and a darker signal in the areas where it is bluish or greenish. The greater the difference between the light and dark areas, the more saturated the color.

One problem with composite video is that if the source image data happens to contain alternating light and dark areas which are spaced so as to match the spacing of the colors on the color mask, this will generate false colors. Although some computers exploit this by design (e.g. the Apple II series) and others by accident allow it to be exploited (e.g. the A800), it's generally seen as a bad thing. S-video allows lightness information to be output to the display while bypassing the color decoder; it then gets added to the output of the color decoder to yield the final picture.

A few more notes: on the 2600 and many other machines, the chroma signal is a square wave which is on half the time and off half the time. So when showing red, it would turn on when the "color mask" was yellow and turn off when it's blue. Viewing the output of a machine on a monochrome display will show this quite clearly. Using the right two colors and the right two shades of gray, one could generate a 640x200 black and white picture using an Atari 7800.

Also, on a black and white television, solid colors will appear as vertical stripes. Changes in the hue will shift the line pattern left and right. I don't know if any games exploited this behavior of black and white sets, but some types of games could be greatly enhanced by it.
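The striped-filter/false-color idea can be sketched as a toy quadrature decoder. The sample rate and waveforms below are illustrative assumptions, not real NTSC numbers: the decoder correlates the composite signal with sine references at the burst frequency, so a luminance pattern that alternates at that same spacing is indistinguishable from genuine color.

```python
# Toy sketch of the "striped color filter" point: a decoder that
# multiplies the composite signal by references at the burst frequency
# will read any luminance pattern alternating at that frequency as if
# it were color -- the false "artifact colors" machines like the
# Apple II exploit. 4 samples per burst cycle is an assumed figure.

import math

SAMPLES_PER_CYCLE = 4

def decode_chroma(composite):
    """Correlate with in-phase and quadrature burst references;
    return the decoded color saturation (0 = no color)."""
    n = len(composite)
    i_sum = sum(composite[k] * math.cos(2 * math.pi * k / SAMPLES_PER_CYCLE)
                for k in range(n))
    q_sum = sum(composite[k] * math.sin(2 * math.pi * k / SAMPLES_PER_CYCLE)
                for k in range(n))
    return math.hypot(i_sum, q_sum) * 2 / n

# Flat gray: no energy at the burst frequency -> no color decoded.
flat = [0.5] * 16
# Light/dark pixels spaced to match the "mask": decoded as solid color.
stripes = [1.0, 1.0, 0.0, 0.0] * 4

print(round(decode_chroma(flat), 3))     # ~0.0  (gray stays gray)
print(round(decode_chroma(stripes), 3))  # clearly nonzero (false color)
```

This is also why S-video helps with artifacting but not with bleed: routing the luma around this decoder prevents the stripe pattern from being misread as color, but whatever does go through the chroma path is still bandwidth-limited.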
potatohead Posted February 28, 2007

I've done the 640x200 bit on an 8-bit 800XL. On one of the older high-persistence monochrome monitors, it looked really good. I used a VBI to stagger the color changes by scanline. Slowed the machine down, but it did generate nice high-resolution displays. In color mode, you could get quite a few artifacted colors as well.