Keatah Posted December 3, 2016

Seems MAME is caving in and going against what they once said they'd never do: use the GPU. In my opinion, they have to, given that processor development and increases in speed have been all but at a standstill these last couple of years.
Shift838 Posted December 3, 2016

> Seems MAME is caving in and going against what they once said they'd never do: use the GPU. In my opinion, they have to, given that processor development and increases in speed have been all but at a standstill these last couple of years.

I think if the developers can use the GPU to relieve some cycles on the CPU and still retain the original graphics quality, that will be great. MAME is a resource hog on the CPU. For my needs I had to move to MESS, a much more stripped-down version, in order to free up CPU cycles. I am running a BBS via MESS emulation, and when running MAME the CPU usage was so high that the clock within the emulated system drifted by a day within a week. Moving to MESS solved that problem. Of course, I don't think the developers expected anyone to run MAME/MESS for an extended period. I think utilizing the GPU is a step in the right direction.
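For a sense of scale, the drift described above is easy to quantify. A quick illustrative calculation (numbers taken from the post, nothing else assumed):

```python
# A clock that ends up a day wrong after a week of continuous running
# is losing roughly one seventh of real time.
drift = 24 * 3600            # one day of error, in seconds
elapsed = 7 * 24 * 3600      # one week of wall-clock time, in seconds
error = drift / elapsed
print(f"timing error: {error:.1%}")   # prints "timing error: 14.3%"
```

A 14% timing error is enormous for an emulated machine, which makes the CPU-contention explanation plausible.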
MrMaddog Posted December 12, 2016

I've always thought there should be a version of MAME that uses the hundreds of cores in a GPU for processing, in place of the handful in a CPU, to emulate mid-90s arcade games. This is not the same as just using 3D graphics acceleration like PSXMAME does.
witchspace Posted January 28, 2017

> Seems MAME is caving in and going against what they once said they'd never do: use the GPU. In my opinion, they have to, given that processor development and increases in speed have been all but at a standstill these last couple of years.

The way you word it, it sounds almost like they had a deep philosophical issue with using GPUs (for anything but rendering/scaling the final image)? I guess that's because in the early days of GPU computing everything was proprietary, unstable, and limited to a few vendors. Today, OpenCL and OpenGL compute shaders are mature open standards supported by any GPU vendor worth its salt, even embedded ones. Integer arithmetic is now available, which relieves the precision concerns around floating-point handling on some GPUs. Depending on the algorithm, though, it is usually more work to write code for the GPU, and that still limits its use in open source projects. Having to implement things twice, once for the CPU and once for the parallelized GPU path, doesn't really help either.
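The precision concern mentioned above is worth a concrete illustration. This sketch uses Python's float (IEEE-754 double precision, exact for integers only up to 2**53) to show the failure mode; GPU pipelines that offered only 32-bit floats hit the same wall much earlier, at 2**24, which is why native integer arithmetic matters for emulation work that needs exact results:

```python
# Floating-point types can only represent integers exactly up to a
# limit; past it, adding 1 is silently lost to rounding.
big = float(2**53)
print(big + 1.0 == big)      # True: the +1 disappears in float math
print(2**53 + 1 == 2**53)    # False: exact integer math has no such loss
```

An emulated CPU whose registers silently drop increments like this would be visibly broken, so the arrival of proper integer support on GPUs removed a genuine blocker.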
Keatah Posted January 28, 2017 (Author)

> The way you word it, it sounds almost like they had a deep philosophical issue with using GPUs (for anything but rendering/scaling the final image)?

That's correct. They did and still do, but they are slowly coming around. I probably have archived chat conversations and saved webpages discussing it.
BillyHW Posted January 28, 2017

MAME should have three emulator cores for each machine/game: one built for accuracy, one built for performance, and a third that is the best compromise between the two on contemporary computers. I know that's asking a bit much, though.
English Invader Posted January 28, 2017

My concern here is that they might be trading one set of problems for another. GPUs produce quite variable results, and which GPUs are they intending to support? Are systems with integrated graphics going to be included, or are we talking dedicated GPUs only? And is there really that much to gain? Isn't the MAME library already plentiful enough, with thousands of arcade machines successfully emulated entirely on the CPU? I think the best compromise would be to give the user the option of enabling the GPU in the emulator settings. That way, those who want to stick to tried-and-tested methods can do so, while those who want to experiment can do that as well.
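For what it's worth, MAME already follows this opt-in pattern for rendering: the video backend is a user setting. A sketch of the relevant mame.ini fragment (option names as found in recent MAME builds; verify against "mame -showusage" for your version, and note this controls only rendering, not any hypothetical GPU-compute emulation):

```ini
# Choose the renderer in mame.ini; GPU use stays the user's decision.
video   bgfx    # GPU-accelerated renderer
# video soft    # pure-CPU software rendering, the tried-and-tested path
```

Extending the same toggle style to GPU-assisted emulation would let cautious users keep the current behavior by default.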
Trebor Posted January 28, 2017

> MAME should have three emulator cores for each machine/game: one built for accuracy, one built for performance, and a third that is the best compromise between the two on contemporary computers.

That has been tried with another emulator: byuu dabbled with it for a while in his SNES emulator. It may work well in the short term, but over the long haul it adds complexity and work (of course), including the extra burden of maintaining three separate target builds. Byuu has long since dropped the one built for performance (bsnes-performance) as well as the compromise between accuracy and performance (bsnes-compatibility), and kept the one built for accuracy (bsnes-accuracy) as the goal... just like MAME.
Keatah Posted January 28, 2017 (Author)

The goal of every emulator should be accuracy; that point isn't open for debate. It is the user's responsibility to provide enough host computing power. Besides, if you want higher-performing emulation, just run an older version, and don't complain about the inaccuracies.
Flojomojo Posted January 28, 2017

^^ This is how I feel as well. So long as older versions remain readily available (and they will), I can't find anything to complain about in this minor change of direction.
fujidude Posted January 28, 2017

I agree that accuracy should ideally always be the priority. But because there can be a considerable wait for host hardware powerful enough to emulate other hardware accurately, I can sympathize with a decision to forgo some accuracy, for example when doing so means the difference between having some emulation and having none at all. But if better accuracy is achievable on hardware that is available, then that should be the path. Those who can't (or choose not to) afford hardware of sufficient performance will just need to wait until they can.