
Stella 3.6 released


stephena


It seems safe to say that anything P4-class at 1100 MHz and up will work fine.

A PIII at 1 GHz is about as fast as a PIV at 1.4 GHz. So if the PIII is struggling at 1.4 GHz, you need at least a 2 - 2.5 GHz PIV.

 

If the CPU is the bottleneck.

 

I understand the shorter pipeline in the P3 is a good thing. I also know that my first test system, with the mobile Dothan core, has much better IPC than NetBurst on a regular P4, and better than the P3.

 

My guess is the P3 I tested is lagging due to missing instructions, or perhaps it's the ISA sound card or the slow-ass 100 MHz bus.


Just another quick update. First the bad news: I've pushed the release date back to June 1. But barring any major issues that pop up at work, I won't be going past that date.

 

Now the good news. I think I've fixed the speed regression from 3.6 to 3.7. It was necessary to fix this, since we've moved from 16-bit to 32-bit colour, and the new phosphor effect will use blending, which stresses the video card a little. So I wanted to reduce the CPU usage as much as possible, since it'll increase with new features.

 

So here's another test release. Note that this one still doesn't have the phosphor effect enabled for TV modes, but it does use a 32-bit framebuffer. So it's just like the last release, except it should be faster. Testing on my systems shows that it's just as fast in 32-bit colour mode as 3.6 was in 16-bit colour mode.


I also see ZERO speed decrease when switching between BLIT and FLIP modes in my graphics card driver option menu. In the previous beta (3.7.2) there would be a slowdown in BLIT mode. Also, I believe the 3.6.1 > 3.7.x slowdown has been stomped on. I don't have time to do an in-depth test, but it looks promising. I will run a quick test that I believe will tell the truth once and for all...

 

Wait...


Absolutely thrilling!

 

I eco-clocked my CPU down to 600 MHz and cut the integrated on-board Intel Extreme Graphics II core to 133 MHz; basically half speed on the GPU. We were getting a full head of steam and had clock cycles to spare. I think I could run this as low as 475 MHz before we'd see a slowdown. Righteous!


I took the afternoon off to play with this shit and finish up my commentary on the Paddle emulation issues (which you probably don't want to hear about).

 

I regressed this crap back to 3.6.1 and did another side-by-side comparison, again with the CPU clocked down low to see some granularity.

 

3.6.1 and 3.7.3 are indeed now the same - within 2-3% as far as I can see in the graphs.


Hey guys, how do I get these command-line arguments working?

 

It would appear that -video <soft|gl> is working fine. Good.

But -ntsc_filter <1|0> and -gl_filter <nearest|linear> seem busted. There's no effect.

 

I haven't played with other command-line options, so maybe there are more that are broken?


It also shows up in some other games like MazeCraze and Adventure, so it's not game-specific.

Seems to happen when I have gl_fs_stretch on and 1024x768 resolution, which happens to be the same as my desktop.

 

If I select 640x480 or 800x600, or turn off fs_stretch, then it goes away, but then the image size isn't quite full screen - border to border.

 

I also got the old beater PIII going at full speed with room to spare.

And I dropped my other system's clock speed to 468 MHz with filtering on, before it started slowing down.


The 'artifacts' are from performing what is called a non-integral stretch. Basically, this is stretching an image by a non-integer amount (so instead of 1x, 2x, etc., it's probably something like 2.66 or 3.4). As such, there's no guarantee that the image will come out looking precisely the same as its non-stretched representation. This is also dependent on video cards and drivers; some do it better than others.
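To make that concrete, here's a standalone sketch (not Stella's code; the 160 and 426 pixel widths are just example numbers): with nearest-neighbour sampling at a ~2.66x stretch, some source pixels end up 2 output pixels wide and others 3 wide, which is where the uneven columns come from.

#include <cstdio>

// Standalone illustration of a non-integral stretch (not Stella's code).
// Map a 160-pixel source line onto a 426-pixel destination line
// (426/160 = 2.6625x) with nearest-neighbour sampling, and count how
// many destination pixels each source pixel occupies.
int main()
{
    const int srcW = 160, dstW = 426;
    int count[srcW] = {0};

    for(int dx = 0; dx < dstW; ++dx)
        ++count[dx * srcW / dstW];   // source pixel covering this dest pixel

    int twoWide = 0, threeWide = 0;
    for(int sx = 0; sx < srcW; ++sx)
        if(count[sx] == 2) ++twoWide; else ++threeWide;

    // Prints "2-wide: 54  3-wide: 106" -- the columns aren't all the
    // same width, which is exactly the artifact described above.
    printf("2-wide: %d  3-wide: %d\n", twoWide, threeWide);
    return 0;
}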

 

One other thing that occurred to me is that I haven't tested much on 4:3 displays, since all I have access to here is 16:9 and 16:10 widescreens. This issue will have to be pushed post-3.7, since I'm not even sure I can fix it.

 

As for the dead command-line arguments, I'm assuming you're looking at the 3.6.1 documentation, because those items aren't actually present in 3.7. Have a look in the 'docs' directory of beta3, which contains more up-to-date documentation. Keep in mind that you're downloading WIP releases, and of course that includes the documentation.


OK, I'm making good progress on the phosphor effect. One initial issue is that I was trying to make it look like software mode, where there is absolutely no flicker at all. However, I realized that this was just an idealized approximation; the colours there are precomputed to completely remove flicker, and the phosphor 'blend' simply lightens or darkens the output.

 

But when you think about it, this isn't how a real TV works; there is always some flicker, but it's not as noticeable as on an LCD. The real issue is that in Stella, flickering happens because you're first drawing a completely black area, then a coloured one, then completely black again, and so on. So what needs to happen is that coloured areas 'persist', and black doesn't. But you can't just completely throw away the black, since then the image is all colour, which (while probably nice looking) doesn't emulate what a real TV looks like.

 

Anyway, long story short, the phosphor blend for OpenGL modes (including TV effects) will now act as a real blend percentage. That is, a phosphor blend of 100% will show complete colour and throw away black completely. A phosphor blend of 0% will alternate between black and colour every frame (IOW, just like turning it off). And anything in between will give you varying levels of persistence, which results in varying levels of flicker. With the default of 77%, you can just see some slight flicker; it isn't completely hideous, but it isn't completely gone either. Which is how a real TV looks.
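In code terms, the behaviour described above boils down to something like this per-channel blend; a rough sketch with made-up names, not necessarily what's in the Stella source:

#include <algorithm>
#include <cstdint>

// Rough sketch of the persistence blend described above (made-up names,
// not necessarily Stella's actual code). 'blend' is the phosphor blend
// level as a fraction, 0.0 .. 1.0 (so the 77% default would be 0.77).
//
// Each channel keeps the brighter of the current frame's value and the
// previous output decayed by the blend factor:
//   blend = 0.0 -> previous frame discarded, full alternation (effect off)
//   blend = 1.0 -> colour persists fully, black is thrown away
//   in between  -> varying persistence, hence varying flicker
uint8_t phosphorBlend(uint8_t current, uint8_t previous, double blend)
{
    const uint8_t decayed = static_cast<uint8_t>(previous * blend);
    return std::max(current, decayed);
}

Applied per channel, per pixel, every frame, this makes a lit pixel fade out over several frames instead of dropping straight to black.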


And some snapshots to illustrate what I mean. These are with TV effects turned off; I haven't integrated Blargg filtering and phosphor mode yet:

 

Software rendering (colours are precomputed, no flicker at all, 77% 'blend' level):

[attached snapshot: post-1512-0-57363900-1337782245_thumb.png]

 

OpenGL rendering (actually uses OpenGL blending, 80% blend level):

[attached snapshot: post-1512-0-19997500-1337782246_thumb.png]

 

Note how in the second snapshot you can actually see parts of the previously drawn image. That's what I mean by 'persistence', sometimes also known as motion blur.


The fading factor is the blend level, and that is already configurable (it defaults to 77%). Although that brings up another issue. The current phosphor settings (whether it's enabled/disabled and the blend level) are stored in per-ROM settings. This was required in the past, since enabling phosphor for ROMs that didn't need it gave too much blur, and disabling it for those that did need it resulted in too much flicker.

 

The real problem is that the old phosphor effect is too coarse, so it needed to be specific to the ROM. The new phosphor effect really shouldn't be like that, since when you think about it, phosphor blending is a function of the TV, not the ROM! Put another way, if I enable phosphor effect at (say) level 80%, it should look fine in all types of ROMs. For those that don't have 30Hz flicker, there will be some slight smearing. And for those that do, it will greatly reduce flicker. In either case, it is exactly what a TV would output.

 

So now I need to figure out how to deal with the settings. I would like to just kill the per-ROM settings entirely, but there's still software rendering mode (arrrgh!@#) which complicates things and necessitates keeping them around.


If you want to follow MAME behavior, you would have a global setting for ALL games and the ability to override this setting for specific games with different values...

 

But global settings for all games should be fine, since they're all running on Atari hardware. This is like the settings in MESS for each system.

 

In MESS you can set the values differently for each system, but not for each game on that system...

 

Don't be too hard on yourself...

 

I.G
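In code, the MAME-style scheme described above is just a two-level lookup; a sketch with hypothetical names, not Stella's actual settings API:

#include <map>
#include <string>

// Sketch of the MAME-style settings scheme suggested above (hypothetical
// names, not Stella's actual settings API): one global default, with
// optional per-ROM overrides consulted first.
struct PhosphorSettings
{
    double globalBlend = 0.77;                  // global default for all games
    std::map<std::string, double> romOverride;  // per-ROM overrides, if any

    double blendFor(const std::string& romMd5) const
    {
        const auto it = romOverride.find(romMd5);
        return it != romOverride.end() ? it->second : globalBlend;
    }
};

A ROM with no entry in the override map simply falls through to the global value, so retiring a per-ROM setting is just deleting its entry.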


I have both 16:10 and 4:3 displays. 16:10 is good only for technical work, data recovery ops, and watching movies. 4:3 is for everything else including classic gaming, especially classic gaming! Regarding 16:9?? That's some kind of Hollywood thing that just happens to be the most efficient proportion when cutting the glass plates at the factory where LCDs are made. I'm personally a fan of 16:10 and 4:3.

 

With a little bit of y-positioning and y-stretching I "tuned out" the "miniature golf glitch", and in the process got it to be 100% perfect size on a 4:3 LCD. It's probably my cheap-o Intel graphics causing the glitch in the first place; my GeForce 4 does not show the problem. So don't make a big stink about it, because I don't know what enthusiast these days is using 10-year-old shit Intel graphics, except for the hundreds of millions of business-class laptops and desktops.

 

I'll have to try this out on the i7 with integrated graphics, and on the older 945G graphics as well.

 

Anyhow, when it comes to screen formatting, it's always been a sackful of tediousness for the programmer and the end-user, both of whom have to work at it to ensure the right proportions are achieved.

 

As far as the beta 3.7.x goes, it's so good it's easy to think it's a finished product. I reviewed the documentation, btw.

 

Briefly, back to my Miniature Golf issue and how I fixed it.

A short list of settings:

4:3 monitor (best CRT equivalent!)

OpenGL

Zoom3x

Resolution Auto (or any 4:3 ratio)

GL_FS_Stretch ON

 

This gives me a perfect edge-to-edge image with no glitches, filling the entire screen top to bottom and left to right, provided I use the Game Properties / Display / Ystart & Yheight tweak. Demon Attack is just fine the way it is and needs no tweaks.

 

I'm sure we could spend hours and hours tweaking each individual game to perfection, but my point is that we need the GAME PROPERTIES <<-- you need to keep them. Keep them until the PC display hardware can be made to behave exactly like a VCS + CRT combo.

