
Why do many scandoubler/TV grabber cards/LCD-TV suck on a8?


Beetle


 

 

With some trickery I did manage to get a screen capture of just what I see on the All-in-Wonder. I can't explain the color variation.

 

 

 

That is showing 2 frames, blended together.

 

Every adjacent character is offset vertically as the card is assuming it's a normal interlaced display.

 

 

That's using VIVO on a normal AGP card I assume? Capture cards, especially ones integrated into AGP cards, are very prone to interference.

 

I find the best results usually come by using S-Video on a dedicated capture card - try to have it in a PCI slot well away from any other devices.

 

You might want to see if there's a "Spread spectrum" option in your PC's BIOS. It can help: it slightly skews the system clocks and supposedly smooths the waveforms of the clock output, which can cut down the effects of RFI generated inside the computer.

Just wanted to point out that the offset appears to be a single scan line. Hias explained this to my satisfaction. It's just that this video card seems to handle it differently than some scan converters.

 

The video card is a PCI card. It's a first-generation ATI All-in-Wonder. It does video capture, TV tuning, VIVO, all of that, but it is only PCI.

 

The color issue was surprising because it is not what I see when looking at the screen. The explanation is kind of long.

 

This old video card did not include a tool that would let me capture the output of Flickerterm properly. My first image shows what I did get: just one field of the frame. I was going to include a video clip showing what I actually see; however, again, the tools supplied with the video card were old and limited. Most of the video formats in which it can capture are apparently no longer in use today, at least not by any players/codecs I have installed on my machines. The only formats that were usable were uncompressed video and "Indeo". The latter file, the smaller of the two, was still too big to include here (at 38 MB). The old PC, in which the All-in-Wonder resides, does not have any tools to re-encode the video, so I copied the video file to another machine. I didn't have much success there either, but I discovered that I at least now had the ability to do a still image capture of the video playback, and this is the point where the color of the image changed. So you see, that color shift was not due to any interference in the original PC.

 

The point I wanted to make by including the images was twofold. 1) A single field did not seem to be repeated twice each. (I may have been wrong there.) 2) The vertical offset (as you called it) indicated to me that both even and odd scanlines are represented. My two scan converters (the AIW and an outboard device) both present the image in the same manner, which you see in my 2nd image.

 

Other people with other equipment see something different, namely no offset.

 

That is all. I didn't (and still don't) have enough understanding to make a conclusion from what I thought I was seeing. I was hoping someone else would. I just wanted to show what I, myself, see.

Edited by a8isa1

Actually, a simple system to jinx the A8 output to proper interlaced output would be great.

 

It could open up a whole new world of 320x480 graphics using page-flipping.

 

But there is no simple way.

 

Interlacing the signal externally is possible. This would work for static or slow screens only; otherwise you will get all sorts of distortion, including motion interference. That won't be too bad for LCD screens, because the vertical frequency is altered anyway.

 

Actually generating an interlaced signal internally would be great. You could indeed double the vertical resolution. But it would need to be controlled by software. Doesn't look simple at all to implement. The whole syncing and ANTIC interrupt timing would need to be changed. In effect you would need some kind of extended ANTIC.


The point I wanted to make by including the images was twofold. 1) A single field did not seem to be repeated twice each. (I may have been wrong there.) 2) The vertical offset (as you called it) indicated to me that both even and odd scanlines are represented. My two scan converters (the AIW and an outboard device) both present the image in the same manner, which you see in my 2nd image.

 

This is just an artifact of the capture hardware and software.

 

If you think about how the computer works internally and how ANTIC generates the sync timing, then everything becomes obvious.


Here are some links:

 

http://www.pembers.freeserve.co.uk/World-T...html#Horizontal (scroll down)

 

http://www.pembers.freeserve.co.uk/World-T...rds/VBI-525.pdf - graphical representation of NTSC sync over time

 

http://www.pembers.freeserve.co.uk/World-T...VBI-625-PAL.pdf - PAL

 

I would guess that the Atari generates the same field sync progression for every frame.

 

I think the simplest way to alter the video signal would be an external device.

One trick that might work would be to insert an early sync pulse, and remove the last sync pulse every second field. Problem is, you then need trickery to count sync pulses since the last field sync so that you know when to insert an early pulse.

 

Another possibility could be to allow software control by using a device which allowed an extra sync pulse to be generated via PORTA and a joystick port output.


Just wanted to point out that the offset appears to be a single scan line. Hias explained this to my satisfaction. It's just that this video card seems to handle it differently than some scan converters.

 

The video card is a PCI card. It's a first-generation ATI All-in-Wonder. It does video capture, TV tuning, VIVO, all of that, but it is only PCI.

 

It is offset by just one scanline. This is the exact problem I have with my VGA upscaler. With a standard interlaced signal, every other field would be dropped down by one scanline. On the Atari, every field is the same, so the de-interlacing hardware on the upscaler is incorrectly de-interlacing the Atari's signal.

 

The only upside to the upscaler box I have is no flicker in "page-flip" modes like HIP / RIP graphics use. The downside is that with any fast scrolling I get the comb effect, and I can't use Flicker-Term because every other character is dropped by one scanline.

 

Any recommendations on a better VGA upscaler?

 

Stephen Anderson

Edited by Stephen

Not sure if anyone hit this exactly yet.

 

In a 'normal' NTSC picture, the odd lines are all scanned first. The device generating the image outputs a different sync signal for the even field (it's the half scan line you read about in the documentation), which basically causes the TV to skip the first scan line, the result being that it paints the even-numbered lines instead of the 'default' odd ones.

 

The Atari never outputs the half scan line, so instead of doing 30 interlaced frames of 525 lines per second, the A8 gives 60 progressive frames at half that resolution. Thus the A8 signal isn't really NTSC compliant, but most TVs don't care, since they keep working normally without the line skip. The black bars between the Atari scan lines are the even NTSC lines that are never painted.
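The line counts in this explanation can be sanity-checked with some quick arithmetic. This is just an illustrative sketch using the standard NTSC rates, not anything from the thread:

```python
# Sanity check of the line counts above, using standard NTSC rates.

FSC = 315e6 / 88            # color subcarrier, ~3.579545 MHz
H_RATE = FSC * 2 / 455      # line rate, ~15734.27 Hz

# Proper interlaced NTSC: 525 lines/frame = two fields of 262.5 lines.
lines_per_field = 525 / 2
frame_rate = H_RATE / 525               # ~29.97 interlaced frames/s

# The A8 emits 262 whole lines per field, so every field lands on the
# same scan positions: a 60 Hz progressive picture, not interlaced.
atari_field_rate = H_RATE / 262         # ~60.05 progressive frames/s

print(lines_per_field)                  # 262.5
print(round(frame_rate, 2))             # 29.97
print(round(atari_field_rate, 2))       # 60.05
```

The half line (262.5) is exactly the piece of the sync pattern the A8 omits, which is why both its "fields" paint the same scan positions.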

 

Most capture cards expect to have to de-interlace the input to generate a progressive image (I doubt anyone runs an interlaced PC monitor these days). The cards are expecting to see the half-scan-line signal so they know which field is which. Cards that can't cope with the missing sync will have unpredictable results.

 

LCD TVs face a similar problem... they need to create a normal image out of an expected interlaced picture which isn't actually interlaced. Your results may vary =P

 

Note that the Atari really needs to work the way it does, otherwise, each 1/2 frame would only be updated 30 times/sec. It would flicker like 2600 PacMan. If you've ever worked on an interlaced PC monitor, you've seen exactly the effect.


In a 'normal' NTSC picture, the odd lines are all scanned first.

 

That is not exact. There is no true concept of a first and second field. The two fields aren't necessarily part of the same image.

 

Some video equipment uses 30/25 frames per second, displaying one half before the other. In this case you can indeed say that a given field is the first one.

 

But other equipment uses 60/50 fields per second, each field being a different actual instant in the video. That is, assuming there is movement on every field, there is never a full frame being displayed. You can't capture a full still frame because there is no such thing. In these cases, there is no actual first or second field.

 

Note that the Atari really needs to work the way it does, otherwise, each 1/2 frame would only be updated 30 times/sec.

 

Again, not necessarily. Modern consoles use an interlaced signal and they don't flicker like the 2600 PacMan, do they?

 

But an interlaced signal makes the hardware more complicated, and depending on the case it might be much more complicated for the software. It doesn't make sense unless you can really make use of the extra vertical resolution, which is the main gain of using interlacing.

 

Ideally, you want the software to have full control and decide if it wants interlaced or not.


That is not exact. There is no true concept of a first and second field. The two fields aren't necessarily part of the same image.

 

While it is true that the two fields don't necessarily have to be part of the same image, the NTSC standards explicitly specify which is field 1 and which is field 2. This is critical for things like Closed Captioning, which uses line 21 of field 1, and for VCRs which record thirty frames per second (if a movie was telecined using a 2-2-2-4 or 2-3-3-2 field sequence, a step-frame button will show clear stills).
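Those telecine field sequences can be illustrated with a small sketch. The `pulldown` helper here is hypothetical, just for illustration; it shows the classic 2-3 pulldown that maps 24 fps film onto ~60 fields/s video:

```python
def pulldown(film_frames, pattern):
    """Expand film frames into video fields using a telecine pattern.
    pattern[i % len(pattern)] = number of fields film frame i contributes."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * pattern[i % len(pattern)])
    return fields

# Classic 2-3 pulldown: 4 film frames -> 10 fields = 5 interlaced video
# frames, turning 24 fps film into ~30 fps (60 fields/s) video.
fields = pulldown(list("ABCD"), [2, 3])
print("".join(fields))   # AABBBCCDDD
print(len(fields))       # 10 fields = 5 video frames
```

A step-frame that lands on an AA or DD pair shows a clean still; the mixed BC-style frames are where combining fields from different film frames would smear.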

 

But other equipment use 60/50 fps, each field being a different actual time in the video. That is, assuming there is movement on every field/frame, then there is never a full frame being displayed. You can't capture a full still frame because there is no such a thing. In this cases, there is no actual first or second field.

 

Many cameras, especially older ones, worked that way, but any device looking at the output signal would still clearly recognize alternating field 1 and field 2 even though a clear freeze-frame would require using one field or the other.

 

Again, not necessarily. Modern consoles use an interlaced signal and they don't flicker like the 2600 PacMan, do they?

 

There would not have been any particular problem with having the 2600 interlace its video signal (indeed, it can do so under software control) but there's not really much point either. Interlacing tends to create flicker with hard-edged objects which are fully present on one scan line and entirely absent on the next. Many newer games use somewhat softer-edged objects which can greatly reduce flicker, but allowing for such things would have greatly complicated the 2600 hardware.

 

But an interlaced signal makes the hardware more complicated, and depending on the case it might be much more complicated for the software. It doesn't make sense unless you can really make use of the extra vertical resolution, which is the main gain of using interlacing.

 

Given that the 2600 often only used half of the available vertical resolution anyway, doubling the vertical resolution wouldn't seem to serve much purpose in most cases. Andrew's interlaced Chronocolor would seem to be an exception (there, the 2600 is effectively used at 1/3 vertical resolution, and using interlacing improves it to 2/3).


While it is true that the two fields don't necessarily have to be part of the same image, the NTSC standards explicitly specify which is field 1 and which is field 2

...Many cameras, especially older ones, worked that way, but any device looking at the output signal would still clearly recognize alternating field 1 and field 2 even though a clear freeze-frame would require using one field or the other.

 

Of course modern equipment can clearly distinguish and recognize the fields. That was not the point. The point is that there is no true concept of which is the first and which is the second, because they don't necessarily belong to the same still image. So "first" and "second" simply don't make sense in this context.

 

There would not have been any particular problem with having the 2600 interlace its video signal

 

I think we were talking about the A8. Producing a true interlaced signal on the A8 wouldn't be a minor task. The hardware complications (ANTIC would need to be enhanced) aren't as bad as the software complications in some cases.


Of course modern equipment can clearly distinguish and recognize the fields. That was not the point. The point is that there is no true concept of which is the first and which is the second, because they don't necessarily belong to the same still image. So "first" and "second" simply don't make sense in this context.

 

Given a pulser which runs at Hsync*2, a divide-by-525 counter with triggers at certain counts, and a divide-by-two counter, one could generate a valid NTSC sync train without having to explicitly count even and odd fields. On the other hand, during even fields the LSB of the divide-by-525 counter would have one phase relationship with the divide-by-two counter, and during odd fields it would have the opposite. Anything that's going to generate a valid interlaced sync train has to have some aspect of its state which distinguishes even and odd fields.
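The parity argument can be sketched numerically. The pulser runs at Hsync*2, so a field of 262.5 lines lasts exactly 525 pulses; because 525 is odd, the counter state at the start of each field necessarily alternates:

```python
# Half-line pulser sketch: at Hsync*2 there are two pulses per scan line,
# so one field (262.5 lines) is exactly 525 pulses.  525 is odd, so the
# parity of the cumulative pulse count at the start of each field flips
# every field: the divider chain's state distinguishes even/odd fields.

PULSES_PER_FIELD = 525
lsb_at_field_start = [(f * PULSES_PER_FIELD) % 2 for f in range(6)]
print(lsb_at_field_start)   # [0, 1, 0, 1, 0, 1]
```

That alternating bit is exactly the "aspect of its state" the post refers to.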

 

I suppose one could generate both the vertical and horizontal signals using independent analog sources; if they were trimmed properly one could end up with an interlaced signal that would kinda sorta work. I wouldn't be surprised if some cameras did such a thing back in the days of vacuum tubes, but I would think any decent camera, even in the days of vacuum tubes, would have used a rate divider chain. Start with a 14.31818MHz crystal and a circuit to kick out 14,318,182 pulses/second. Feed that into a circuit which will kick out an output pulse when it receives an input pulse at least 873ns after the last output pulse (and will ignore input pulses received at any other time). I believe such a circuit may be built with half a dual-triode tube, a transformer, and some resistors/caps/etc. The output of that circuit would be precisely Hsync*70. So feed that into two more similar circuits to get Hsync*10 and Hsync*2. Then four more to get Hsync*2/525, and one more (fed off Hsync*2) to get Hsync. So probably four vacuum tubes would generate all the timing signals off a single crystal.
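The divider chain works out numerically if we assume the standard NTSC 4x-colorburst crystal (14.31818 MHz, which is 910 times the line rate); a quick check:

```python
# Checking the divider chain against standard NTSC rates, assuming the
# common 4x-colorburst crystal: 14.31818 MHz = 910 * Hsync.

CRYSTAL = 315e6 * 4 / 88     # 14,318,181.8 pulses/second
HSYNC = CRYSTAL / 910        # line rate, ~15734.27 Hz

h70 = CRYSTAL / 13           # pulse-swallower stage -> Hsync * 70
h10 = h70 / 7                # Hsync * 10
h2 = h10 / 5                 # Hsync * 2
field_rate = h2 / 525        # Hsync*2/525 -> ~59.94 Hz field rate
hsync_out = h2 / 2           # Hsync itself

print(round(h70 / HSYNC))        # 70
print(round(field_rate, 2))      # 59.94
print(round(hsync_out, 2))       # 15734.27
```

Since 910 = 13 * 7 * 5 * 2, the /13, /7, /5 and /2 stages land exactly on Hsync, and the /525 stage (525 = 5 * 5 * 3 * 7) yields the ~59.94 Hz field rate.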

 

There would not have been any particular problem with having the 2600 interlace its video signal

 

I think we were talking about the A8. Producing a true interlaced signal on the A8 wouldn't be a minor task. The hardware complications (ANTIC would need to be enhanced) aren't as bad as the software complications in some cases.

 

There was discussion of both, I think. A key to being able to jinx the A8 into producing an interlaced signal would be the ability to jinx the scan mid-line. On the 2600, a store to RSYNC will cause the TIA's horizontal sync circuitry to get asynchronously reset. I have no idea if the A8 has anything similar or how it works.


In a 'normal' NTSC picture, the odd lines are all scanned first.

 

That is not exact. There is no true concept of a first and second field. The two fields aren't necessarily part of the same image.

 

 

While the reality may be that each field is different, the NTSC spec clearly defines a frame as consisting of odd and even fields. You have to remember that the spec dates back to the 1930s, with color added in the '50s. A proper interlaced NTSC picture will conform to the spec. The A8 signal is not a proper NTSC signal; it takes advantage of the way the spec was implemented in order to work.

 

You need odd and even fields to properly deinterlace an NTSC picture, period. If you don't know which field you are on, you have to guess which ones to combine. It's what we are stuck with for the next few years, and, yes, it is exactly why your capture cards don't like A8 stuff. The cards expect an NTSC signal and they are not getting it.

 

As to newer consoles providing a proper interlaced picture: yes, they do, and no, they don't flicker. They render to a back buffer and copy it to the display buffer in sync with the television signal. You really couldn't do that on an A8 in 1977: memory was too expensive, and you had a 1.79 MHz processor! (For reference, the NTSC pixel clock is about 15 MHz)

 

Stupid trivia: the PC's 18.2 Hz clock tick and the first PC's clock speed (4.77 MHz) are derived from the NTSC colorburst clock.
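That trivia checks out numerically; a quick sketch using the standard rates:

```python
# Both the original PC's CPU clock and its 18.2 Hz timer tick derive
# from the NTSC colorburst frequency.

COLORBURST = 315e6 / 88            # 3.579545 MHz NTSC color subcarrier
pc_master = COLORBURST * 4         # 14.31818 MHz PC motherboard crystal
cpu_clock = pc_master / 3          # 4.77 MHz 8088 clock
pit_clock = pc_master / 12         # 1.193182 MHz timer (PIT) input
tick = pit_clock / 65536           # 16-bit timer overflow: ~18.2 Hz

print(round(cpu_clock / 1e6, 2))   # 4.77
print(round(tick, 4))              # 18.2065
```

Using the cheap, mass-produced colorburst crystal (and simple /3 and /12 dividers) was cheaper than adding a second crystal.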


Actually generating an interlaced signal internally would be great. You could indeed double the vertical resolution. But it would need to be controlled by software. Doesn't look simple at all to implement.

What if we add a circuit that shuts off the clock to ANTIC for half of a scanline? (Or is ANTIC one of those chips that will forget its state after a short while with no clock?) Then a program could set up a display list with two parts, one for each field, and at the beginning of one part there would be a DLI during which the CPU toggles a PIA bit (or whatever) and triggers the external circuit that provides the delay.

 

Or is horizontal timing controlled by GTIA and not ANTIC? I know vertical timing is controlled by ANTIC, but I can't remember how GTIA "knows" when to read the player/missile graphics and stuff like that.

 

You need odd and even fields to properly deinterlace an NTSC picture, period. If you don't know which field you are on, you have to guess which ones to combine.

You don't need to guess; it's clear from the timing which field is being displayed. Combining two fields into one "frame" is still problematic, though, because as the other poster mentioned, the fields may be 1/60th of a second apart in time (not part of the same frame). Have you ever tried ripping a DVD and converting it to a progressive-scan AVI without dropping any resolution? It often can't be done.

 

(For reference, the NTSC pixel clock is about 15 MHz)

Is there really an official NTSC pixel clock? 15MHz wouldn't seem to make any sense as far as that goes.


Is there really an official NTSC pixel clock? 15MHz wouldn't seem to make any sense as far as that goes.

 

I remember at least one digital component video standard used chroma*3.5 (i.e. 12.5284 MHz) and one digital composite video standard used chroma*4 (i.e. 14.31818 MHz), but I'm not sure what people have used since.
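For what it's worth, the later ITU-R BT.601 digital studio standard settled on 13.5 MHz luma sampling for both 525- and 625-line systems, a rate chosen so that each line holds a whole number of samples:

```python
# ITU-R BT.601 sampled luma at 13.5 MHz for both 525- and 625-line
# systems; 13.5 MHz divides into a whole number of samples per line in
# both cases, which is a big part of why that rate was chosen.

F_601 = 13.5e6
NTSC_H = 4.5e6 / 286       # NTSC line rate, ~15734.27 Hz
PAL_H = 15625.0            # PAL line rate (64 us lines)

print(round(F_601 / NTSC_H))   # 858 samples per line (525-line systems)
print(round(F_601 / PAL_H))    # 864 samples per line (625-line systems)
```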


The best method IMO would be external circuitry. At least then it would be an easier and more available upgrade.

 

ANTIC controls Hsync/Vsync; it uses the AN0 - AN2 lines to notify GTIA when sync should be generated.

 

http://www.atarimax.com/jindroush.atari.org/atanttim.html (scroll to bottom)

 

The problem, though, would be inserting the early sync pulse every second VBLANK.

 

A software/hardware solution might be the way to go, although when an early sync pulse is inserted, the last one has to be removed.


A key to being able to jinx the A8 into producing an interlaced signal would be the ability to jinx the scan mid-line. On the 2600, a store to RSYNC will cause the TIA's horizontal sync circuitry to get asynchronously reset. I have no idea if the A8 has anything similar or how it works.

 

There is nothing like that on the A8, and AFAIK, there is no way whatsoever to alter sync timing by software.

 

While the reality may be that each field is different, the NTSC spec clearly defines a frame as consisting of odd and even fields...You need odd and even fields to properly deinterlace an NTSC picture, period. If you don't know which field you are on, you have to guess which ones to combine.

 

The specs don't define a frame as being a "picture". And the specs certainly don't define how to deinterlace a picture, because, again, there is no concept of "picture".

 

It isn't just "reality" that both fields could be different. That's how the whole TV signaling, shooting and broadcasting was designed in the first place.

 

Deinterlacing to capture a picture is a modern concept, and if the video was produced at 60 fields per second (and not just at 30 frames per second), then you can't combine the fields without some kind of distortion.


What if we add a circuit that shuts off the clock to ANTIC for half of a scanline? (or is ANTIC one of those chips that will forget its state after a short while with no clock?)

 

I've been thinking about this. Even if you can stop ANTIC without harmful consequences (which I don't know), I'm not sure this would work.

 

Scan lines are supposed to come at a constant rate. Interlacing is not really generated by delaying hsync by half a scan line; it is the vertical sync rate not being a multiple of the horizontal rate that produces the vsync-hsync shift between fields.
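This point can be sketched numerically: the interlace offset falls out of the vsync-to-hsync phase, which shifts only when the field length is not a whole number of lines:

```python
# Interlace comes from the vsync-to-hsync phase, not from delaying
# individual hsync pulses.  With 262.5 lines per field the hsync phase at
# each vsync alternates by half a line; with the A8's 262 whole lines it
# never moves.

def hsync_phase_at_vsync(lines_per_field, fields=4):
    """Fractional scan-line position of each field's vsync, mod one line."""
    return [(f * lines_per_field) % 1.0 for f in range(fields)]

print(hsync_phase_at_vsync(262.5))  # [0.0, 0.5, 0.0, 0.5] -> interlaced
print(hsync_phase_at_vsync(262))    # [0.0, 0.0, 0.0, 0.0] -> progressive
```

So a half-scan pause inserted once per field would move the phase, but only if every other hsync interval stays exactly constant, which is the worry raised below about how receivers would take it.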

 

OTOH, a signal that follows the specs strictly has equalization pulses. This means that not all hsync pulses are separated by the same exact time period. But displayed scan lines, even across both fields, do have hsync at a constant rate.

 

So I'm not sure every piece of receiving equipment (TV/monitor/capture card) would handle this half-scan delay "correctly".

