MAME PC Questions


Tempest


A family member gave me their old PC that they were no longer using.  Since it's slower than the PC I'm currently using, I've decided to turn it into a dedicated MAME box.  While the specs aren't amazing, I think it should work for what I want to do:

 

Gateway DX4860-UB32P

Core i3 2120

Intel HD Graphics

6GB RAM

1TB Hard Drive

 

I plan on using the built-in HDMI out to plug it into my TV (a 43" 1080p) and hopefully play some arcade games from the comfort of my couch.  I do have some questions, though, for those who are knowledgeable about such things:

 

1. Given the above specs, what's the latest year I can run games from?  I'm thinking maybe stuff from the late 90s/very early 2000s.

 

2. What's a good controller to use for this?  Since I'll be sitting on my couch, I need something that can sit on my lap.  I know they make fancy controllers for that (which I might eventually invest in; any suggestions there?), but for now I want to use something I might already have lying around.  I'm thinking my Xbox 360 controller or maybe my PS3 controller.

 

3. I'd like to minimize keyboard use if possible.  I'm going to have the PC boot directly into MAMEUI, but I'm not sure I can get around keeping a keyboard around for things like inserting coins and managing save states.  Then again, the 360/PS3 controllers have a lot of buttons...

 

4. Would it be worth upgrading the video card with something cheap that supports HDMI?  This is all for fun, so I don't want to dump a lot of money into it, but if the built-in graphics are really poor I might have to.


Just a note: you may want to mention which tier of Intel HD Graphics you've got there.  Before I bought this gaming laptop 7 1/2 years ago, I'd been using a pretty solid ASUS I got at Fry's back when the Sandy Bridge i5s dropped (11 or so years ago); it was one of the first models with Intel's HD 3000 graphics.  That unit was strangely uneven: it would crawl to a halt on the Call of Duty (MW1) engine or anything similar that used hardware T&L, but other stuff ran better than it had any right to, to the point you could run Civ5 at a mix of low-to-medium graphics settings while maintaining a good frame rate.

 

Knowing which HD graphics tier you ended up with, along with the speed of that i3, it'd be safe to say you may be able to run a bit more (or less) than you think, depending on those figures.


30 minutes ago, Tanooki said:

Just a note: you may want to mention which tier of Intel HD Graphics you've got there.  Before I bought this gaming laptop 7 1/2 years ago, I'd been using a pretty solid ASUS I got at Fry's back when the Sandy Bridge i5s dropped (11 or so years ago); it was one of the first models with Intel's HD 3000 graphics.  That unit was strangely uneven: it would crawl to a halt on the Call of Duty (MW1) engine or anything similar that used hardware T&L, but other stuff ran better than it had any right to, to the point you could run Civ5 at a mix of low-to-medium graphics settings while maintaining a good frame rate.

 

Knowing which HD graphics tier you ended up with, along with the speed of that i3, it'd be safe to say you may be able to run a bit more (or less) than you think, depending on those figures.

I think it's the 2000.  I do have a spare GTX 260, which is way better, so maybe I'll use that, assuming the system has the slot for it.  (EDIT: it appears to.)


I would, that's a definite improvement.  The HD 3000 machine I had was surprisingly capable; it could even hold a steady online co-op session of Guild Wars no matter how many people were in the hub towns.  The HD 2000 was the second model of that style and a step back in capability, so that 260 would be a sharp improvement.  I'd add it and then disable the integrated graphics outright, or, if the software allows, relegate it to general desktop use only, so it doesn't sponge off your RAM.  Integrated graphics share system memory: out of that 6GB, the chip will usually grab 1 to 1.75GB as workspace for its own tasks, leaving far less for the PC.  Keep it disabled and you save an extra 1 to nearly 2GB of RAM for the emulator to use, for far better performance.

 

My current gaming laptop, the one I mentioned, has the Nvidia 980M with 8GB of its own RAM (effectively the desktop 970 in performance, but with extra memory; it was the first Nvidia laptop chip that didn't lose something like 50% of its power versus the desktop version).  But it also has the built-in Intel graphics all boards have, and years ago that was causing me problems with some games and emulators: the system would try to run them on the Intel chip and they'd get spotty sharing system memory (16GB in my case).  I forced it to never be used, ever, putting everything on the 980M and its 8GB of RAM, and my performance increased a lot, especially in games; even this old thing can run PS4 Pro-level games at the same 1080p resolution with notably better, steadier frame rates.


Once I get the GTX 260 in, I'll see if I can disable the integrated graphics so it doesn't eat RAM, but honestly I doubt I'll be playing anything that needs that much.  I'm mostly sticking with the 80s and maybe some early 90s stuff, from before everything went to fighting games.  I wouldn't mind trying to run something like that 3-D Gauntlet, though.


I understand; it's not so much the RAM it eats as how it messes with the rest of the system: the OS itself, and the fluctuating leftovers the emulator has to work with.  If the integrated chip keeps grabbing and releasing RAM for that on-board junk, it can have an impact.  Surely not on some 80s arcade game, but get into the 90s and early 00s and I bet it could make a difference, at least from a stability standpoint.


3 minutes ago, Tempest said:

How do you disable the internal graphics in Windows 10?

It depends on your graphics card, but for my Nvidia card, you right-click on an empty area of the desktop to get to the Nvidia Control Panel.


Zoyous is right: if you right-click on empty desktop space you can pop into the Nvidia Control Panel.  The 3D settings page has just a couple of tabs; one has a drop-down where you can pick a specific app or game and force it to use a particular video card, and above that there's a global override where you can force everything to use the Nvidia card instead of the lame on-board video.


Looks like the GTX 260 is a no-go.  It needs a 500W power supply, and this thing only has a 300W supply with SATA power connectors and one 4-pin Molex.

 

Cost-wise am I better off getting a beefier power supply or getting a better graphics card that will work with a 300W power supply (better than the Intel HD 2000 it has)?


21 minutes ago, Tempest said:

Looks like the GTX 260 is a no-go.  It needs a 500W power supply, and this thing only has a 300W supply with SATA power connectors and one 4-pin Molex.

 

Cost-wise am I better off getting a beefier power supply or getting a better graphics card that will work with a 300W power supply (better than the Intel HD 2000 it has)?

It's probably just simpler to get a beefier power supply.  Newer graphics cards require less power, but then you've got the pricing craziness of the brand-new ones to deal with.  And if you're looking for a newer-but-used card, you have both a research project on your hands to determine how much power it needs, plus the possibility that it's been frying in some crypto farm continuously for months.
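For a rough sanity check on sizing, you can just sum the parts' estimated draw and add headroom. A back-of-the-envelope sketch (all wattages below are assumptions taken from typical reference TDP figures, not measurements of this exact Gateway):

```python
# Ballpark PSU sizing: sum estimated component draw, then add ~30% headroom.
# All figures are assumptions (typical reference TDPs), not measurements.
parts = {
    "Core i3-2120 (65W TDP)": 65,
    "GTX 260 (reference TDP)": 182,
    "Motherboard + 6GB RAM": 50,
    "HDD, optical, fans": 30,
}

load = sum(parts.values())        # estimated peak draw in watts
recommended = round(load * 1.3)   # ~30% headroom for spikes and capacitor aging

print(f"Estimated load: {load} W, recommended PSU: ~{recommended} W")
```

Under those assumed numbers the estimate lands in the 400-500W range, which is why a stock 300W unit is marginal for a card like the 260 even though the system would idle far lower.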


Looking through my box o' parts, I have what appears to be an OEM Quadro card (looks like this: https://www.ebay.com/itm/264628161730).  Would that be better than the Intel HD 2000?  I don't need screaming performance; I just want to play games up through the late 90s.

 

EDIT: No dice.  It needs a PCIe power connector too.


29 minutes ago, Zoyous said:

Try using this power supply calculator, it should give you a good idea... and it wouldn't hurt to go a little above; it won't use it if you don't need it, but if you do need it you gotta have it: https://www.newegg.com/tools/power-supply-calculator/

There's a 600W version for $5 more, so maybe I'll do that.  The GTX 260 doesn't even appear on that list.

 

EDIT: I found something with the same power requirements as the GTX 260, and the calculator said a 400-500W power supply would be fine.


Ok, I tried my current setup out on my TV and was surprised that it worked really well.  There were some problems, though, that I'd like to pick everyone's brains about, since I really haven't played with MAME since the early 2000s. @Tanooki @Zoyous

 

1. I noticed that horizontally scrolling games had some screen tearing.  I didn't see it when I turned on the screen bezels, which shrink the picture down a ton (seriously, how do people play with those enabled?), so I'm assuming the anemic Intel HD Graphics 2000 chip can't quite handle full screen on a 42" TV (though oddly there's no slowdown).  Is there anything I can do to fix that, or do I just need a better video card?  Or could this be an issue with my TV?

 

2. Speaking of a new video card, I realized my GTX 260 only has DVI out; there are no HDMI ports (though it does have S-Video, oddly enough).  I have a DVI-to-HDMI cable, but that won't carry sound, and I'm not sure how I'll get sound to the TV, since the TV/receiver assume the sound is coming in over HDMI.  Has anyone done something like this before?  Maybe I'll have to use the S-Video out?

 


2 hours ago, Tempest said:

Ok, I tried my current setup out on my TV and was surprised that it worked really well.  There were some problems, though, that I'd like to pick everyone's brains about, since I really haven't played with MAME since the early 2000s. @Tanooki @Zoyous

 

1. I noticed that horizontally scrolling games had some screen tearing.  I didn't see it when I turned on the screen bezels, which shrink the picture down a ton (seriously, how do people play with those enabled?), so I'm assuming the anemic Intel HD Graphics 2000 chip can't quite handle full screen on a 42" TV (though oddly there's no slowdown).  Is there anything I can do to fix that, or do I just need a better video card?  Or could this be an issue with my TV?

 

2. Speaking of a new video card, I realized my GTX 260 only has DVI out; there are no HDMI ports (though it does have S-Video, oddly enough).  I have a DVI-to-HDMI cable, but that won't carry sound, and I'm not sure how I'll get sound to the TV, since the TV/receiver assume the sound is coming in over HDMI.  Has anyone done something like this before?  Maybe I'll have to use the S-Video out?

 

1. My initial thought is that it's probably related to your TV's refresh behavior.  I'm assuming it's a newer HDTV (not a CRT)?  Some TVs have a "Game" mode that cuts out image processing for the least display lag.  I don't think screen tearing would be caused by a particular resolution setting; that would more likely cause "shimmering" when the pixels are rendered at inconsistent sizes due to nearest-neighbor scaling.
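To put rough numbers on the refresh-rate point (the 59.64 Hz figure below is an assumed example; many arcade boards ran just shy of 60 Hz), the slow drift between the game's rate and a fixed 60 Hz panel produces a periodic hitch or tear:

```python
# Sketch: how often a tear/stutter appears when the emulated game's refresh
# rate doesn't match the TV's. Both rates below are assumed examples.
game_hz = 59.64   # e.g. some classic arcade hardware (assumption)
tv_hz = 60.0      # fixed panel refresh

drift_per_second = tv_hz - game_hz        # frames of slip accumulated per second
seconds_per_slip = 1 / drift_per_second   # time until a full frame of slip

print(f"One full frame of drift every {seconds_per_slip:.1f} s")
```

So with those assumed rates, roughly every three seconds something has to give: a dropped or repeated frame, or a visible tear if vsync is off.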

 

2. I would route the audio out of your computer's speaker/headphone jack if there's an audio input next to the TV's HDMI input; some TVs put them next to each other.  If not, S-Video might be the way to go, but I'd expect a somewhat softer image.  Here's a general reference from Sony, though you might be able to find something more specific to your TV: https://www.sony.com/electronics/support/articles/00010109


9 minutes ago, Zoyous said:

1. My initial thought is that it's probably related to your TV's refresh behavior.  I'm assuming it's a newer HDTV (not a CRT)?  Some TVs have a "Game" mode that cuts out image processing for the least display lag.  I don't think screen tearing would be caused by a particular resolution setting; that would more likely cause "shimmering" when the pixels are rendered at inconsistent sizes due to nearest-neighbor scaling.

 

2. I would route the audio out of your computer's speaker/headphone jack if there's an audio input next to the TV's HDMI input; some TVs put them next to each other.  If not, S-Video might be the way to go, but I'd expect a somewhat softer image.  Here's a general reference from Sony, though you might be able to find something more specific to your TV: https://www.sony.com/electronics/support/articles/00010109

It could be my TV.  It's a Vizio from 2016 or so, and it wasn't a top-of-the-line model.  I can try looking for a game mode, or I can just hook it up to the 20" CRT I have here.


Yeah, I'd look into game mode; it should be there, as I had a set from around then.  If not, you can manually kill all the video and audio filters and processing in the menus.

As far as tearing goes, it could be that you need vsync enabled, or, as noted, it could be a mismatch between the TV's refresh rate and the not-quite-60Hz many arcade games used.

 

The audio isn't a problem.  I split signals that way for over 15 years with an old mono NES cable: one plug (video) went to the TV, and the other ran through a gold-plated Monster Y-cable into the left/right inputs on a receiver, and it was perfect.  Something of that sort here: the DVI-to-HDMI cable handles video, as you said, but it won't do audio, so you'll need to run the audio separately to the source of your choice.


18 minutes ago, Tanooki said:

Yeah, I'd look into game mode; it should be there, as I had a set from around then.  If not, you can manually kill all the video and audio filters and processing in the menus.

As far as tearing goes, it could be that you need vsync enabled, or, as noted, it could be a mismatch between the TV's refresh rate and the not-quite-60Hz many arcade games used.

 

The audio isn't a problem.  I split signals that way for over 15 years with an old mono NES cable: one plug (video) went to the TV, and the other ran through a gold-plated Monster Y-cable into the left/right inputs on a receiver, and it was perfect.  Something of that sort here: the DVI-to-HDMI cable handles video, as you said, but it won't do audio, so you'll need to run the audio separately to the source of your choice.

I checked, and Game Mode is already on.  Here's a video of what's happening; you can see that weird line moving vertically down the screen.  Maybe there's an option in MAME I can turn on to help?


That's vsync, the bane of emulators going back to the mid 90s: if it's not enabled, you get the tear.  The trick will be finding wherever MAME hides the option and enabling it.  Locking the image to the display refresh makes the emulator thirstier for hardware, so keep that in mind on some of the upper-tier games you may attempt.


In MAME, you can go to Configure Options > Video Options, where you'll find the various settings people play around with to eliminate or minimize screen tearing.  Some people say changing between OpenGL and D3D has helped them; others turn on triple buffering.  It's a YMMV situation that just requires some trial and error.  The ideal solution, according to the MAME devs, is a G-Sync/FreeSync monitor; everything else introduces varying degrees of lag (perhaps so minor as to not be noticeable or impactful on gameplay).
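For anyone driving MAME from a config file instead of the UI, the same switches live in mame.ini. The option names below are from recent Windows builds and are an assumption for older versions; `mame -showusage` lists what your copy actually supports:

```ini
# mame.ini fragment (option names assumed from recent Windows builds of MAME)
video          d3d   # or "opengl"; switching renderers sometimes cures tearing
waitvsync      1     # sync frame flips to the display's refresh
triplebuffer   1     # Windows/D3D; softens the input-lag cost of vsync
```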


I did look in the options and turned on Wait for VSync and triple buffering, but I don't think that helped.  There are a whole ton of options in there that I don't quite understand; maybe I need to ask on a MAME-specific board?

 

I hope I can get this fixed, because it actually works quite nicely for the 80s/early 90s games I usually play.  Interestingly, I didn't notice the issue on single-screen games.

 

Maybe I need to try triple buffering by itself and see if that fixes things.

 

