
Atari SM147 Monochrome Monitor - Impressions?



sm147.jpg

 

 

I've searched threads.

 

I searched atari-forum.com.

 

I Googled.

 

 

I found many references to the SM147 monochrome monitor. I found old Atari press releases, descriptions, etc.

 

Basically, all I learned was that it's a replacement for the SM124, it's a flat-screen, the screen is allegedly a little bigger, it's a paper-white VGA monitor with an Atari plug, and it has no built-in speaker/amplifier.

 

So what's the obvious question I could not find an answer to?

 

DOES IT MASK DOWN THE ST MONOCHROME PICTURE WITH A HUGE BORDER LIKE THE SM124???

 

I can't believe that was never posted about, so I'm guessing it does? I know this is way too early for the monitor to have a little computer inside to handle auto-sizing and positioning. But I was hoping to hear that it has thumbwheels for picture size/position, and that you can get it to go full-screen.

 

I've been using an eBay-sourced ST-to-VGA adapter and a fine $5 thrift-store 17" monitor. It would be kind of cool to have an Atari-branded monitor, but I can't stand the underscan of the SM124 (especially after seeing edge-to-edge on the thrift-store monitor), so I was hoping the SM147 was better in that regard. The SM124 underscan is deplorable; why the hell didn't they adjust it out of the monitor at the factory?

 

Thanks!

Edited by wood_jl

The SM124 underscan is deplorable; why the hell didn't they adjust it out of the monitor at the factory?

 


 

Because of the crazy ST video signal: the ST borders are actually part of the picture from the monitor's perspective.



Don't all those analog monitors have calibration support via pots? (It's just that most are inside the casing, so you have to open it up to adjust them.)

 

External pots would have been great, not just for fine-tuning to one's personal preference of overscan, but also for allowing perfectly square pixels (which would necessarily leave a noticeable vertical border).

Also, it would allow you to stretch it even further for Mac emulation. ;) (you'd want both overscan and positioning controls though)

 

 

Also, can't you just wire an ST's monochrome lines to most/all VGA/SVGA monitors in general (most of which have overscan control)? Unlike the color modes, which use 15 kHz h-sync and would only work with standard-def (or multimedia) RGB monitors and some early VGA monitors specifically designed to support EGA/CGA sync rates. (If you managed to find one of the latter, you could even rig up a simple switch with a custom/hacked cable to toggle between the mono and RGB video lines from the ST.)

 

 

 

 


 

Cause of the crazy ST video signal. The ST borders are actually part of the picture from the monitor's perspective

It's all about calibration. There's no hard limit for overscan (until you hit the actual edge of practical vblank or hblank for synchronization purposes), and indeed on standard definition monitors with calibration like those of most SDTVs, you can very well end up losing some of the picture due to excessive overscan (ie you might have a 320x240 image that ends up with only 296x224 on-screen). With user adjustable controls, that's a non-issue though, and something that's even more necessary for later multi-sync monitors. (actually, it would have been cool if Atari had supported grayscale multi-sync monitors with grayscale versions of the 200 line modes to at least allow users to run color-specific software in grayscale -heh, that's what I was stuck with on a low-end VGA grayscale monitor as a little kid in the early 90s; hell, maybe some developers would even have supported a grayscale mode for games with different colors used to cater to that ;))

Edited by kool kitty89

  • 2 weeks later...


 

A monitor "properly" calibrated for the ST video timing will show large borders when connected to an ST. It is largely because the ST's pixel clock is not matched to the screen timing: the pixel clock runs too fast, so all the displayed pixels are squirted out in less time than would otherwise happen, and the border color is displayed where there are no pixels.

 

This works fine for an analog display, but is unacceptable for an LCD. They did it this way because 16 and 32 MHz clocks are readily available in the ST, and are nice multiples of the bus speed, allowing the pixel data to simply be shifted out of the video chip registers (which is why they call it SHIFTER).

It is both elegant and horrific. :evil:
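The border math described above can be sketched in a few lines, assuming the commonly cited ST high-res mono timings (32 MHz dot clock, ~35.7 kHz line rate, 71.2 Hz frame rate); those figures are my assumption, not from the post itself:

```python
# Rough border math for ST high-res mono.
# Assumed timings (not from the post): 32 MHz dot clock,
# ~35.7 kHz horizontal sync, 71.2 Hz vertical refresh.

DOT_CLOCK_HZ = 32_000_000
H_SYNC_HZ = 35_714          # ~35.7 kHz line rate
V_REFRESH_HZ = 71.2         # frame rate

clocks_per_line = round(DOT_CLOCK_HZ / H_SYNC_HZ)    # total dots per scanline
lines_per_frame = round(H_SYNC_HZ / V_REFRESH_HZ)    # total lines per frame

active_h = 640 / clocks_per_line   # fraction of each line carrying pixels
active_v = 400 / lines_per_frame   # fraction of each frame carrying pixels

print(f"{clocks_per_line} clocks/line, {lines_per_frame} lines/frame")
print(f"active picture: {active_h:.0%} of the line, {active_v:.0%} of the frame")
```

With those numbers, only about 71% of each scanline and 80% of each frame carry pixels; everything else is border, which is exactly why a monitor calibrated to the full signal shows a frame of border around the picture.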


  • 2 weeks later...


 

A monitor "properly" calibrated for the ST video timing will show large borders when connected to an ST. It is largely because the ST's pixel clock is not matched to the screen timing: the pixel clock runs too fast, so all the displayed pixels are squirted out in less time than would otherwise happen, and the border color is displayed where there are no pixels.

Yes, but what defines the standards for such calibration? How much is "normal" for off-screen/hblank (and vblank) area? Etc, etc.

 

For a lot of multi-sync analog monitors, you have to manually calibrate things for certain resolutions (some resolutions have common overscan borders, some are considerably off).

 

And given that the ST's resolutions are all direct multiples of one another, a single monitor calibrated with that in mind (either to minimize the border or to compromise for square pixels in 320x200 or 640x400) would be the best by default. (Preferably with easily accessible pot knobs for manual adjustment; technically you could have 640x200 with square pixels and a huge vertical border.)

 

 

For normal TV's there's a fairly wide range of "standard" calibration. The most common for modern sets (late 80s onward) is around 224/448i vertical lines and approximately 75% of the horizontal scan visible (25% is in overscan), but the "correct" calibration for NTSC (or 60 Hz/15.7 kHz SDTV of any sort) is technically 240/480i lines and ~80% of the horizontal scan visible.

Of course, some older (and even not-so-old, but cheap/low-quality) sets will be further off and possibly even have aspect ratio issues. (Some sets don't even show a vertical border with 192 lines.)

 

As such, the Amiga will also show a significant border: the vertical one just as large as the ST's (for NTSC; PAL can optionally go up to 256 lines, non-interlaced), but a smaller (yet still noticeable on most sets) horizontal border similar to the 2600/A8. (The ST's horizontal border is more like the C64's; apparently the C64 uses a 4/8 MHz dot clock rather than being tied to the NTSC color clock.)

The 5.37 MHz dot clock of the TMS9918/NES/PCEngine/Genesis(some modes), SNES, etc gives almost exactly 256 pixels on NTSC TVs at the common (75% visible) calibration while the Genesis's 6.67 MHz mode gives almost exactly 320 pixels for that common calibration. (it also gives a good middle ground between PAL and NTSC pixel aspect ratios -so games that have art designed for square pixels will look OK on both, not perfect, but not too bad -unlike 5.37 MHz which looks OK on NTSC but super wide in PAL)

7.16 MHz gives almost square pixels in PAL, but way too tall in NTSC, while 6.25 MHz does the same for NTSC. (the 6 MHz used on the Neo Geo is pretty close to that, but very slightly wide -it also only ends up showing about 288 pixels on "normal" TVs, and is, of course, pretty wide in PAL)
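The visible-pixel counts above follow from simple arithmetic. A minimal sketch, assuming an NTSC scanline of ~63.556 µs with 75% of the scan visible at the common calibration (both figures are my assumption):

```python
# Visible pixels for a given dot clock on a "commonly calibrated" NTSC set.
# Assumptions (not from the post): 63.556 us total scanline, 75% visible.

NTSC_LINE_S = 63.556e-6     # total NTSC scanline period, seconds
VISIBLE_FRACTION = 0.75     # common late-80s calibration

def visible_pixels(dot_clock_hz: float) -> float:
    """Pixels that land inside the visible portion of one scanline."""
    return dot_clock_hz * NTSC_LINE_S * VISIBLE_FRACTION

for mhz in (5.37, 6.0, 6.67, 7.16):
    print(f"{mhz:5.2f} MHz -> ~{visible_pixels(mhz * 1e6):.0f} visible pixels")
```

This reproduces the figures in the discussion: 5.37 MHz lands almost exactly 256 pixels in the visible area, 6 MHz about 286 (the Neo Geo's "about 288"), and 6.67 MHz about 318, i.e. nearly the Genesis's full 320.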

 

Of course, with a computer, you can't afford any loss to the border, so the Amiga/A8 resolution was pretty much right-on for NTSC. (320x200 at 7.16 MHz was pretty much right-on for any good-quality TV from the mid 80s onward, though it might be problematic for some old/cheap sets without resorting to recalibrating; albeit a lot of older sets had external pots for adjustment, even some really low-end sets like GE's Portacolor, which even has external RGB pots on top of H/V scan adjustment.)

7.16 MHz is also pretty close to the limit for practical viewing via RF, at least for text (and only with good RF at that). Both composite and RF would be limited by chroma interference for color, though with a luminance-only signal you'd be about as good as RGB through composite, though RF noise would still limit things. With a decent TV with composite input (which all A8s with monitor ports provide), you could reasonably manage much more than ~7-8 MHz, closer to double that, limited only by beam precision and phosphor dot pitch.

 

They did it this way because 16 and 32 MHz clocks are readily available in the ST, and are nice multiples of the bus speed, allowing the pixel data to simply be shifted out of the video chip registers (which is why they call it SHIFTER).

It is both elegant and horrific. :evil:

Yes, just like other systems used common NTSC color clock derived frequencies. (had the ST used composite video by default, it would have had to have a 3.58 MHz compatible oscillator on all models rather than only some)

As it is, it would have been nice if they'd used a faster master oscillator (like 16, 24, 32, or 48 MHz), they could have had a lot more options for potential dot clock rates for the SHIFTER, and more options for different CPU clocks at launch (or at least later on). Hell, even using different dot clock versions of existing modes (just with different shaped pixels) could have been really useful, namely lower res versions of current modes. (like 640x200 4 color with wider pixels -with more and more into overscan as dot clock dropped, or the same for 320x200 4 colors)

 

As it is, they at least could have offered a 4 MHz low-res mode (exactly the same as the C64's pixel resolution) with 320x200, but only about 192 pixels (or a bit less) visible on-screen with normal calibration.

 

 

Actually, what might have been really interesting is if they'd used a 50 MHz master clock from day 1, used 10 MHz CPUs as the base standard (rather than 8 MHz), and offered 12.5 and 16.67 MHz models too (maybe 12.5 initially as the high-end version and 16.67 MHz as it became widely available). Maybe use 8.33 MHz if that was an acceptable range of overclocking for 8 MHz-rated parts. (Otherwise the next step down would be 7.14 MHz.)

Faster oscillators (especially high resolution ones) would generally be more costly though, especially ones at non common speeds. (I think 50 MHz was pretty common though, probably a lot more so than something like the 53.7 MHz Sega was using with the Mk.III/Master System in '85/86)
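For illustration, the CPU-clock options in the hypothetical 50 MHz scenario above are just integer divisions of the master clock; the particular divisor choices here are my own:

```python
# CPU clock candidates from a hypothetical 50 MHz master oscillator,
# taken as simple integer divisions (divisor choices are illustrative).

MASTER_HZ = 50_000_000

cpu_options = {div: MASTER_HZ / div / 1e6 for div in (3, 4, 5, 6, 7)}

for div, mhz in sorted(cpu_options.items()):
    print(f"50 MHz / {div} = {mhz:.2f} MHz")
```

Dividing by 5 gives the 10 MHz base, by 4 and 3 the 12.5 and 16.67 MHz upgrades, and by 6 and 7 the 8.33 and 7.14 MHz fallbacks mentioned in the post.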


  • 10 years later...
On 4/18/2011 at 5:28 AM, kool kitty89 said:

7.16 MHz is also pretty close to the limit for practical viewing via RF, at least for text (and only with good RF at that). Both composite and RF would be limited by chroma interference for color, though with a luminance-only signal you'd be about as good as RGB through composite, though RF noise would still limit things. With a decent TV with composite input (which all A8s with monitor ports provide), you could reasonably manage much more than ~7-8 MHz, closer to double that, limited only by beam precision and phosphor dot pitch.

When I compare the same graphics mode (ANTIC mode F, 320x192, vs. ST-LOW, 320x200) on my 7 MHz Atari 8-bit and my 8 MHz Atari 16-bit, the first one has a huge moiré (I mean additional colors, not noise).

Even in 640x200 mode the text is quite readable on the ST. Therefore, for PAL I would choose 8 MHz.

 

 

 

