
Retroblox


omnispiro

Recommended Posts

You can get technical all you want, fact is games always feel better on FPGA simulation, not software emulation, and that's what matters to the end player. Emulators have always been crap, and will always be crap compared to original hardware, good riddance.

Those are some strong words, "always" and "crap." Got any evidence to back them up? Besides "look at Kevtris," please. Consumer-level FPGA is still future tech from where I sit, and emulation is plenty good enough for me, especially since it's working here and now.

  • Like 4
Link to comment
Share on other sites

And to get the real usage I subtracted the 12 watts which the test setup consumes when standing idle,

 

I think we should keep the idle load when comparing across platforms, because that's what it's going to cost to have it running for an hour. No doubt my PC rig is not optimized at all, and as you say there might be ways to cut down power usage.

 

 

It includes drawing the screen and playing sound through small speakers at low-moderate volume levels.

 

3W-10W for Doom in DosBox

9W for the Commodore 64 playing Gyruss or Frantic Freddie

8W for the Atari VCS with some TV effects and playing Video Pinball, Combat, or Scramble DPC+

6W for Sony PlayStation 1 playing Tempest or Xevious 3D/G

5W for Atari 400/800 playing Star Raiders or MemoPad

4W for Amiga 500 sitting at the Workbench Desktop

4W for classic arcade games like Gyruss, Tempest, Zaxxon, Discs of Tron, Blasteroids, Sky Raiders, Assault

And an amazing 1W for the Apple II playing A2-FS1

 

Does this include the monitor and speakers too, or are those in the base 12W?

 

The comparison I can suggest is against a Raspberry Pi, which uses much less than a desktop PC, and is viable for many 8-bit emulators. Power usage is still higher than an FPGA though.
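Just to put those watt figures in perspective, a quick back-of-the-envelope sketch; the 12W idle baseline is the one quoted above, while the electricity price and the Pi/FPGA wattages are assumed ballparks, not measurements from this thread:

# Rough energy-cost comparison for the draws discussed above.
# The $0.15/kWh price and the Pi/FPGA wattages are assumptions for illustration.
IDLE_W = 12.0          # desktop test rig sitting at the desktop (quoted above)
PRICE_PER_KWH = 0.15   # assumed electricity price, USD

setups = {
    "PC + Altirra (idle + 5 W delta)": IDLE_W + 5.0,
    "Raspberry Pi (assumed ballpark)": 4.0,
    "FPGA board (assumed ballpark)":   3.0,
}

for name, watts in setups.items():
    kwh_per_hour = watts / 1000.0
    yearly_cost = kwh_per_hour * 24 * 365 * PRICE_PER_KWH   # if left on 24/7
    print(f"{name:35s} {watts:5.1f} W -> ${yearly_cost:6.2f}/yr if never switched off")

Even in the worst case it's pocket change per year, which is kind of the point of the whole exercise.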

 

 

Side note 1: What I uncovered was a flaw in the Stella emulator for the VCS. The ROM selection screen takes an extra 3 watts above and beyond what is consumed while actually running the emulation. Must be some sort of faulty programming with funky loops in the sort routine going on.

They probably call some external library vs. custom handling on screen for the emulation.

 

Side note 2: (...) You can use Stella or MAME to deny/allow such snoozing and directly observe the states being entered or not. And you can observe corresponding decreases/increases in power usage.

 

I'll have to look into that, but I think it will be hard to beat an FPGA in this department. I will check again, but I remember it was less than 4-5W for running the Amiga AGA core (to be compared with running UAE or similar). And that's total load, i.e. without subtracting idle load.

 

Another observation: DosBox power usage can be all over the graph depending on the task given to it. Granted I have not checked all the CPU options it has and it may be possible to stabilize it. Whereas things like the VCS or Apple II, which are totally alien to PC hardware, are rather steady no matter what type of software you run, bouncing around as little as 0.5-1 watt.

I'd expect that emulating an x86 and DOS is much more complex than the humble 6502 of the VCS and Apple II :)

 

The more accurate the emulator, the more isolated virtual load fluctuations are from the host CPU.

Not sure. Have you tried Higan? (ex bsnes) That should be a good way to test that...

 

And therein lies a complex load balance. And don't forget the "art of programming" too; it's not always a science. Look at the emulated VCS vs. Apple II (AppleWin) figures. Amusing, eh?

Yes, a lot of factors go into it... but from my own measurements I've convinced myself FPGAs have an objective advantage here. Happy to test that conclusion though, and to find out how low we can go with CPU emulation. After all, the lower the draw, the easier it will be to make things portable. Or to run dozens at once as a mini arcade..! :D Edited by Newsdee
Link to comment
Share on other sites

You can get technical all you want, fact is games always feel better on FPGA simulation, not software emulation, and that's what matters to the end player. Emulators have always been crap, and will always be crap compared to original hardware, good riddance.

Tell the authors of the software emulation that their work is crap and see if you gain any brownie points. If you're lucky they may take your "criticisms" as a suggestion to improve things.

 

Software Emulation has a rich heritage of sophisticated code 20 years in the making. It's here, now, and it works. It has many different features and capabilities you're simply not seeing.

 

FPGA rigs are still in early development and don't sport the levels of refinement that software emulation does. Not yet. In critically and honestly comparing the two technologies you will find software emulation to be more versatile and easier to live with overall. And in some cases SE is even BETTER than the original hardware.

 

 

  • Like 1
Link to comment
Share on other sites

Monitor and speakers and everything is in the base number. ~12 watts. Not a very big monitor, and not loud speakers either. That is the power consumed when you're sitting at the desktop thinking about what emulator to run next.

When you start up Altirra and play Star Raiders, for example, the power consumption jumps by 5 watts. So the emulator is adding 5 watts of load, so to speak, and the total consumed is now 17 watts. Likely that's because the host CPU is activating dormant parts, applying voltage across them, and tickling them with a clock. It's more complex than that, of course: you can figure in how long a signal takes to transition from high to low across a gate, and while transitioning it generates heat in the middle of the curve. Intel does all those calculations when they design the chip. For our purposes a watt-meter and sound measurement techniques should suffice. After all... power goes in, video comes out!
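If you want to put that hand-waving into a formula, the usual first-order model for CMOS switching power is P ≈ α·C·V²·f. A minimal sketch with made-up numbers; none of these values are measurements from this thread:

# First-order dynamic power of CMOS logic: P = alpha * C * V^2 * f
# All numbers below are illustrative guesses, not measurements.
def dynamic_power(alpha, capacitance_f, voltage_v, frequency_hz):
    """alpha = activity factor (fraction of the switched capacitance toggling per clock)."""
    return alpha * capacitance_f * voltage_v ** 2 * frequency_hz

# Example: 2 nF of switched capacitance at 1.0 V and 3 GHz, 10% of it active.
print(f"{dynamic_power(0.1, 2e-9, 1.0, 3e9):.2f} W")   # ~0.60 W

The point being that a few extra watts of "emulator load" is exactly what you'd expect from waking up a small fraction of a big chip.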

I don't run Higan on this machine; it's not quite up to it and struggles. But it works fine on my main rig, which is a full-blown i7 at 4.5GHz with power to spare. It will be an interesting test to run it on the 25-watt 6700T processor. That's an i7 at 3.6GHz, but it's also part of the low-voltage family! It's on my purchase list.

4-5W for Amiga on FPGA sounds right. Won't argue there. The x86 test system consumes 17W all inclusive with WinUAE running Workbench desktop, monitor, speakers, mouse. There may be some software that could ask more of the emulator and push it up another watt. But so far I'm not seeing big variances in 8 or 16 bit systems.

I won't deny FPGA is the lowest power consumer for playing the vintage games and classic computer material. I suppose the point of my little-a'speriment was to see just how bad, how hot'n'heavy x86 is in comparison to those FPGA rigs. And I was pleasantly surprised. It's nowhere near as bad as people make it out to be.

Since FPGA is rather bare-bones, only the emulated system and nothing but the emulated system, it makes sense that it will have a power advantage. The x86 system is going to have an OS to handle. And I wouldn't be caught dead without one; it provides so many features and niceties.

It seems that through either software emulation or FPGA we can achieve very significant power savings over classic hardware in most cases. A lot of these savings come simply from the reduced pitch/geometry used in modern manufacturing, power supplies, and management of the whole computing platform. FPGA achieves that naturally by cutting the fat and "being just the game". x86 gives you more, like better game selection screens, more customization capability, and a debug environment. And while those things can be turned off and pushed into the background, they cost some energy, in some way, to keep at the ready.

 

Certainly without question both technologies offer huge (beyond huge) power savings when compared against arcade cabinets. A lot of chips in those beasts are of 1um to 8um construction: 8 for the first 6502, 5 for the Z80, 3 for the 68000. And then there are all the individual 5V TTL logic chips and ROM. That monster monitor and gorilla power supply with a 6-pound transformer. God only knows the inefficiencies happening there. And main RAM, like 4116 chips, needed +5V, +12V, and -5V. The memory dissipated 1/2 watt per chip when accessing data, half that in standby. And you have double-sized motherboards, or multiple motherboards, populated with hundreds of chips, in card cages in those cabs. Crazy!! MAME doesn't seem so power hungry now, does it?

 

The power supply rails in my head are sagging already so I must end this..

Link to comment
Share on other sites

FPGAs don't provide just the game though, you can code a "housekeeping" firmware to run inside a tiny CPU that will handle IO and other amenities. That can run inside the FPGA itself or as a separate chip on the board. That doesn't need a lot of power because it is just e.g. handling USB or SD card files.
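As a rough illustration of that split, the housekeeping firmware basically just shuffles files into the core. Everything below (the CoreBus interface, the reset behavior, the file name) is hypothetical, sketched to show the idea rather than any real core's API:

# Sketch of a "housekeeping" MCU loop: read a ROM from storage and stream it
# into the emulation core through a simple write port. All names are invented.
class CoreBus:
    """Stand-in for the FPGA core's hypothetical loader interface."""
    def __init__(self):
        self.memory = bytearray()
    def write_byte(self, value: int) -> None:
        self.memory.append(value)          # in hardware this would poke core RAM
    def assert_reset(self, held: bool) -> None:
        print("core reset", "held" if held else "released")

def load_rom(bus: CoreBus, path: str) -> None:
    bus.assert_reset(True)                 # hold the console core in reset
    with open(path, "rb") as rom:
        for byte in rom.read():            # stream the ROM into the core
            bus.write_byte(byte)
    bus.assert_reset(False)                # release reset; the core boots the game

if __name__ == "__main__":
    load_rom(CoreBus(), "game.sfc")        # hypothetical file on the SD card

None of that needs much horsepower, which is why a tiny soft CPU or a small ARM chip is enough.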

 

The NT Mini and many MiST cores (e.g. the PC Engine) implement this MCU inside the FPGA, but the MiST and other boards provide a physical one as a tiny ARM chip (it gets mostly bypassed by cores that have their own IO handling).

 

The one thing holding back FPGA is a lack of developers, and fewer open-source projects going around to invite collaboration. But this is slowly changing; for example, 3-4 guys are making progress on the Genesis right now.

Edited by Newsdee
Link to comment
Share on other sites

You can get technical all you want, fact is games always feel better on FPGA simulation, not software emulation, and that's what matters to the end player. Emulators have always been crap, and will always be crap compared to original hardware, good riddance.

I would not be so quick to discredit emulation. It does a nice job getting 98% of the way there, but some discerning retrogamers demand 100% and zero latency, and that level of perfection demands cycle-accurate replication of the original hardware. Only very recently has this been realized through FPGA.
  • Like 1
Link to comment
Share on other sites

I find it quite amusing that people complained about the sharp squared-off edges that strict software emulation generates by default, but all of a sudden it's perfectly acceptable on FPGA simulations.

 

The irony is that while FPGA hardware emulation can achieve the closest thing to 100%, that comes with some of the bugs/artifacts that were in the original hardware (such as sprite limits and stray lines in the case of the Atari and NES). Software emulators, however, always have to use performance hacks to get reasonable performance on a desktop, because not everyone is going to be running a 4GHz processor. So when it comes to the entire chain of input latency plus output scaler latency, you will never achieve 100% on a desktop computer, though it will be better than underpowered devices like the RPi. Some people will not even realize the software emulator is inaccurate because they never had the original hardware.

 

That is why you see so many YouTube videos with square pixels. Getting the correct aspect ratio requires adding two stages of buffering latency in software, so in most cases the games are not actually playable while scaled to the monitor's native resolution. The best you can do on a desktop is to not scale in software but instead leverage the GPU, which can bring it down to just one stage if the emulator writes directly to the texture and not to a software buffer first. Never mind the other CRT effects.
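To put rough numbers on those buffering stages: each whole-frame buffer adds about one refresh period, so at 60Hz (a quick sketch; the stage counts are just the ones mentioned above):

# Added latency from whole-frame buffering stages at a given refresh rate.
def buffer_latency_ms(stages: int, refresh_hz: float = 60.0) -> float:
    return stages * 1000.0 / refresh_hz

print(buffer_latency_ms(2))   # software scaling path: ~33.3 ms added
print(buffer_latency_ms(1))   # GPU texture path:      ~16.7 ms added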

 

That's why a desktop's software emulation is about as close as you're going to get to accurate in software. The latency on the desktop is about the same as capturing it from the real hardware with a high-quality capture card.

Link to comment
Share on other sites

Which emulators and games are unplayable? On a moderate to fast desktop... Never mind the R-Pi things.

 

Also even the most basic GPU can scale 2x, 3x or whatever is needed. Typically up to 8x before they fail. They've been doing that since the mid-2000's. In fact it's a default and you need to turn it off to request software scaling.

 

I also believe you're making the latency situation sound worse than it is. This summer, look for a test like the power consumption tests I did, but for latency. It's real simple: a discrete external timer starts when you press the fire button and stops when a photocell detects a change on-screen. It takes a bit of setup and a controlled gaming environment, but we'll have some solid hard numbers then. It will measure the entire path, button-to-action. If I get sidetracked please remind me.

 

As an alternate, less accurate method you can press the fire button, light an LED, and watch the on-screen action with a fast camera. Then count the frames or fractions of a frame and compute the time.
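The camera method is just division: latency = frames counted / camera frame rate. A tiny helper; the 240fps camera rate is an arbitrary example, not a recommendation from this thread:

# Convert "frames counted between button LED and on-screen reaction" into ms.
def latency_ms(frames_counted: float, camera_fps: float = 240.0) -> float:
    return frames_counted * 1000.0 / camera_fps

print(latency_ms(12))    # 12 frames at 240 fps -> 50.0 ms
print(latency_ms(4.5))   # fractional frames work too -> ~18.8 ms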

  • Like 1
Link to comment
Share on other sites

Which emulators and games are unplayable? On a moderate to fast desktop... Never mind the R-Pi things.

 

 

Look at what CPUs are in everything. Unless you're willing to buy a $2000 PC, what you're typically getting are 1.6-2.0GHz dual-core systems that deliver about 1/2 to 2/3rds of the performance point you need to run Higan, so it's very likely that such a laptop or desktop can't run Higan at more than half speed. Most of the games I throw at it top out at 45 frames per second, and I have an i7-4770.

 

SNES emulators have been out since the '90s. The accuracy and compatibility back then was pretty bad. While those legacy emulators' (ZSNES, Snes9x) compatibility has improved since then, they still resort to the same timing hacks that were needed in 1999.

 

Which is why I have to keep mentioning that the problem is that many of the people who play games on these have no idea that's not how the game is supposed to behave. One of the earliest attempts at running a SNES emulator that I remember was to play the FFV translation, but because ZSNES at the time didn't have a 16-bit color mode, you'd get into the ship graveyard and find it impossible to navigate without turning off backgrounds. You can't report a bug in a game if you haven't actually played the original game that far.

 

Just recently I bought an SFC cart of a game that I had played in Japanese and in English before, and the first thing I noticed was that the text color was not always white. Was this a bug in the emulator? A bug in the translation? A bug in the cart dump? Did I just not notice before? It's things like this that cause people to mistakenly believe that the emulator is correct when it is not. While a color being wrong is not the end of the world, it goes right back to the accuracy question. Many people just see emulation and piracy as a way to play free games, and thus emulator developers have no way to verify whether a bug someone reports in a game is a bug in the emulator, or whether the game was actually supposed to do that on all versions of the original hardware, or only on a version with a certain hardware bug.

 

But when something like the Raspberry Pi (RetroPie) comes out and the color isn't even remotely correct in the first place, it kinda makes you wonder why you spent money on something that not even the developer cared enough about to ensure it worked properly.

Edited by Kismet
  • Like 1
Link to comment
Share on other sites

What FPGA cores run SNES with the required accuracy?

Kevtris should simply set up a GoFundMe account to get the SNES chips decapped. Maybe do the 1CHIP model, as fewer chips = cheaper; although those have certain inconsistencies with the original units, any minute differences could likely be worked out. It's a far cry from the multitude of Mega Drive / Genesis hardware variations. All of the patents are expired anyway, and the masks aren't getting copied into the FPGA, so there are no copyright issues.

Link to comment
Share on other sites

Kevtris should simply set up a GoFundMe account to get the SNES chips decapped. Maybe do the 1CHIP model, as fewer chips = cheaper; although those have certain inconsistencies with the original units, any minute differences could likely be worked out. It's a far cry from the multitude of Mega Drive / Genesis hardware variations. All of the patents are expired anyway, and the masks aren't getting copied into the FPGA, so there are no copyright issues.

 

There could be copyright issues for an FPGA core of the SNES.

The SNES and a lot of its parts could themselves have been designed with FPGA/HDL tooling, which would add a copyright along with the patent.

 

Which would mean that if someone designed an FPGA core, it could fall under derivative work.

 

 

What FPGA cores run SNES with the required accuracy?

 

 

I believe the above is stopping people from releasing a public FPGA core, since it's unknown whether it's copyrighted and it's difficult to track some of the parties down.

 

 

 

Link to comment
Share on other sites

What FPGA cores run SNES with the required accuracy?

 

There are at least 3 known projects/attempts at this, and I linked to them earlier. Until such time as one of them is publicly available, we just have to take the developers' word that it's correct.

 

In the case of jwdonel's VeriSNES, that's not his first crack at an FPGA emulator, but it is the first one for which nobody else has published a working FPGA hardware core. Last I checked he had not implemented sprites or color math (transparency). We only know that the other two worked to some degree because the developers told us so, and in the case of the one with a video, they actually have a posted fitting report (http://pgate1.at-ninja.jp/SNES_on_FPGA/rpt_DE2-115.htm), so we do have some idea how much space on an FPGA is required for the SNES alone, but not enough to include upscaling like in the NT Mini. The Z3K proposes to do this by putting the upscaling logic into another chip. But note that all of these were built on dev boards with much larger FPGAs; the DE2-115 is actually a $600+ board with a $350 FPGA.

 

You can see what FPGA is needed for upscaling in the OSSC already (https://www.niksula.hut.fi/~mhiienka/ossc/diy-v1.3/bom_v1.3.xls), which uses the EP4CE15E22 (Cyclone IV E, 15,408 LE). If that much is needed to upscale to 1080p without a frame buffer, then on top of the 28,417 LE needed on a Cyclone IV E for the SNES, the combination needs around 44K total; kevtris already said he wants to use a 49K FPGA for the system and a MAX 10 for the scaler. If you need 16K LE for the scaler, the cheapest MAX 10 with that amount is $34 for the 10M16SAU169C8G, and the next one up is 25K LE at $49. Now, assuming the SNES is the biggest thing you wanted to fit and you wanted to do the same thing the NT Mini jailbreak does, the SA-1 would have to be emulated, which is another CPU; going back to that fitting report, the CPU alone is 2,160 LE. Then we have to consider things like the Sufami Turbo games, DSP chips, CX4, BS-X, MSU-1, and the Super Game Boy. Each of these devices requires emulating at least one extra chip. Probably the only game that might not be doable in an FPGA is the one with the ST018 chip.
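Just tallying the logic-element figures quoted above (treating the SA-1 as costing roughly the same as the main CPU block in the fitting report, which is my own guess):

# Rough LE budget using the numbers quoted above (Cyclone IV E logic elements).
snes_core   = 28_417   # from the pgate1 DE2-115 fitting report
ossc_scaler = 15_408   # EP4CE15E22 used by the OSSC
sa1_guess   =  2_160   # assumption: SA-1 costs about the same as the main CPU block

print("SNES + scaler:       ", snes_core + ossc_scaler)              # ~43,825 ("around 44K")
print("SNES + scaler + SA-1:", snes_core + ossc_scaler + sa1_guess)  # ~45,985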

 

But going back to the point of the question, short of someone uploading raw video comparing games running on the real SNES to the complete FPGA SNES we won't know how accurate they are, yet.

Link to comment
Share on other sites

 

There could be copyright issues for an FPGA core of the SNES.

The SNES and a lot of its parts could themselves have been designed with FPGA/HDL tooling, which would add a copyright along with the patent.

 

Which would mean that if someone designed an FPGA core, it could fall under derivative work.

 

 

 

 

I believe the above is stopping people from releasing a public FPGA core, since it's unknown whether it's copyrighted and it's difficult to track some of the parties down.

 

 

 

 

At least one project uses a WDC-sourced CPU core, and for all intents and purposes we don't really know what the Japanese one did; it almost seems like they just guess-and-checked.

 

If you run http://pgate1.at-ninja.jp/SNES_on_FPGA/ through a machine translator, it sounds like this one was designed without probing an original SFC. Note the start date was 2005; they now appear to be working on the expansion chips in 2016.

 

Incidentally, if you read the note around this video

It says SNES FPGA with the original SPC, because it's a DE1 dev board with no room for it.

 

A note later on the page says that they would need two DE1's, and that just seems too expensive.

Edited by Kismet
Link to comment
Share on other sites

There's always the option not to upscale and let people hook up their favorite thing to it. While many would complain of the lack of HDMI, it would be good to get an open source core out there while we wait for cheaper and larger FPGAs to become available.

You'd still need a DVI-I or HDMI output anyway to send it through an upscaler, but HDMI does not support anything below 640x480p60. DVI-A to VGA works in this case, BUT VGA may not support 15kHz either.

 

But the problem is, SNES and NES aside, if you are dealing with an FPGA device that can output multiple consoles, analog-only output is going to be a problem that isn't future-proof in the slightest. 4K monitors do not come with VGA inputs, and 4K TVs may or may not have any analog inputs on them depending on your part of the world. So it comes back to being able to produce a device that people can use without latency-adding boxes.

 

At any rate, I think kevtris is definitely on the right track, but 4K is ultimately going to be a problem one way or the other. We may not have an FPGA solution for 4K ($9000 for such FPGAs) and may need to press TV manufacturers to do line-by-line pixel scaling, and not these gross bilinear buffer filters. If you read one of Altera's solutions for doing this:

https://www.altera.com/products/reference-designs/all-reference-designs/broadcast/ref-4k-video-upscaling.html

 

You actually see that what they do is concatenate four 1080p streams horizontally. But they use a buffer.
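The "four streams" choice falls straight out of the arithmetic: a 4K frame is exactly four 1080p frames' worth of pixels per refresh. A quick sketch:

# Pixel throughput: why 4K upscaling is often split into four 1080p streams.
def pixels_per_second(width, height, refresh_hz=60):
    return width * height * refresh_hz

p1080 = pixels_per_second(1920, 1080)   # ~124 Mpix/s
p2160 = pixels_per_second(3840, 2160)   # ~498 Mpix/s
print(p2160 / p1080)                    # 4.0 -> four parallel 1080p pipes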

Edited by Kismet
Link to comment
Share on other sites

 

There are at least 3 known projects/attempts at this, and I linked to them earlier. Until such time as one of them is publicly available, we just have to take the developers' word that it's correct.

 

In the case of jwdonel's VeriSNES, that's not his first crack at an FPGA emulator, but it is the first one for which nobody else has published a working FPGA hardware core. Last I checked he had not implemented sprites or color math (transparency). We only know that the other two worked to some degree because the developers told us so, and in the case of the one with a video, they actually have a posted fitting report (http://pgate1.at-ninja.jp/SNES_on_FPGA/rpt_DE2-115.htm), so we do have some idea how much space on an FPGA is required for the SNES alone, but not enough to include upscaling like in the NT Mini. The Z3K proposes to do this by putting the upscaling logic into another chip. But note that all of these were built on dev boards with much larger FPGAs; the DE2-115 is actually a $600+ board with a $350 FPGA.

 

You can see what FPGA is needed for upscaling in the OSSC already (https://www.niksula.hut.fi/~mhiienka/ossc/diy-v1.3/bom_v1.3.xls), which uses the EP4CE15E22 (Cyclone IV E, 15,408 LE). If that much is needed to upscale to 1080p without a frame buffer, then on top of the 28,417 LE needed on a Cyclone IV E for the SNES, the combination needs around 44K total; kevtris already said he wants to use a 49K FPGA for the system and a MAX 10 for the scaler. If you need 16K LE for the scaler, the cheapest MAX 10 with that amount is $34 for the 10M16SAU169C8G, and the next one up is 25K LE at $49. Now, assuming the SNES is the biggest thing you wanted to fit and you wanted to do the same thing the NT Mini jailbreak does, the SA-1 would have to be emulated, which is another CPU; going back to that fitting report, the CPU alone is 2,160 LE. Then we have to consider things like the Sufami Turbo games, DSP chips, CX4, BS-X, MSU-1, and the Super Game Boy. Each of these devices requires emulating at least one extra chip. Probably the only game that might not be doable in an FPGA is the one with the ST018 chip.

 

But going back to the point of the question, short of someone uploading raw video comparing games running on the real SNES to the complete FPGA SNES we won't know how accurate they are, yet.

FPGA prices are going to drop dramatically. In another couple of years, a 100k LE FPGA might be a reasonably affordable solution.

 

If they ever get large enough to incorporate the RAM into the FPGA, then the SDRAM latencies will be a non-issue. It's a pity 1-1-1 parallel RAM is not commercially available in sufficient sizes to handle the myriad of games from the 16-bit era (16-bit parallel RAM would be adequate, and multiple chips could handle multiple busses, or a higher bus width by ganging modules in pairs). Then you wouldn't have to worry about whether a 150MHz SDRAM module has low enough latency to provide direct byte-by-byte access within one clock cycle at ~7MHz speeds.
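To sanity-check that worry, here's the ratio between a 150MHz SDRAM clock and the ~7MHz bus speed mentioned above; the access-latency figure is an assumed typical value, not from any particular part:

# How many 150 MHz SDRAM clocks fit inside one ~7 MHz bus cycle of the emulated
# system, and whether a typical access latency fits. Latency figures are assumed.
sdram_hz = 150e6
bus_hz   = 7e6
sdram_clocks_per_bus_cycle = sdram_hz / bus_hz
access_latency_clocks = 3 + 3          # assumed: row activate + CAS latency

print(sdram_clocks_per_bus_cycle)                             # ~21.4 SDRAM clocks per bus cycle
print(access_latency_clocks <= sdram_clocks_per_bus_cycle)    # True -> a single access fits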

Link to comment
Share on other sites

You'd still need a DVI-I or HDMI output anyway to send it through an upscaler, but HDMI does not support anything below 640x480p60. DVI-A to VGA works in this case, BUT VGA may not support 15kHz either.

 

Isn't the VGA port on the NT Mini technically a 15kHz VGA signal, provided your monitor is one of those rare models that support this sync rate? This could easily be adapted for JAMMA too, I think. It would have been great to allow line doubling over VGA for use with 480p tube monitors.

Link to comment
Share on other sites

 

Isn't the VGA port on the NT Mini technically a 15kHz VGA signal, provided your monitor is one of those rare models that support this sync rate? This could easily be adapted for JAMMA too, I think. It would have been great to allow line doubling over VGA for use with 480p tube monitors.

Not really.

 

 

 

It's designed to be able to use pre-existing cables like these:

https://www.monoprice.com/product?p_id=562

 

 

 

VGA (HD15) to 5 BNC Component (RGBHV) adapter cable.
For adapting VGA to 5 BNC connectors typical on Sony and broadcast quality video equipment.
For RGBHV only. Will not work with 3 BNC component video (Y, Pr, Pb)
Beige color cable and connector housings.

 

 

This cable is intended for use with projectors that use a VGA connector for its component (YPbPr) video connection. This cable DOES NOT CONVERT VGA signals to component video or vice versa - it functions only with devices that use the VGA connector for component video.

This cable has a DE15 (HD15) VGA/SVGA connector on one end and three color-coded RCA connectors on the other end. The connectors and pins are gold plated for smooth, corrosion-free connections.

 

https://support.analogue.co/hc/en-us/articles/115000923948-Using-Analog-Video-output-with-the-Nt-mini

 

At any rate, VGA CRTs do not support 15kHz 240p60; they typically support 320x240@70 (QVGA, 31kHz), so not many monitors would support it, but run it through a scan doubler and you can push it through the 640x480@60 mode.

 

  • Like 1
Link to comment
Share on other sites

It's trivial (not many LEs) to make the FPGA double the lines and output 480p. The problem is that technique isn't fully VGA compliant in many cases, as the sync frequency might be slightly out of VGA spec. HDMI is even less tolerant.
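You can see the "slightly out of spec" part directly from the numbers: doubling a console's ~15.7kHz scan rate lands close to, but not exactly on, the 31.469kHz VGA standard. A quick sketch using typical NTSC-console timings (my assumption, not measured from any specific console):

# Line-doubled 240p vs. standard VGA 640x480@60 horizontal scan rates.
lines_per_frame = 262          # typical NTSC-ish 240p console frame (assumed)
field_rate_hz   = 60.10        # many consoles run slightly above 60 Hz (assumed)
hsync_240p    = lines_per_frame * field_rate_hz    # ~15.75 kHz
hsync_doubled = 2 * hsync_240p                     # ~31.49 kHz
hsync_vga     = 31_469.0                           # VGA 640x480@60 spec

print(f"doubled: {hsync_doubled/1000:.3f} kHz vs VGA {hsync_vga/1000:.3f} kHz")

Close enough that many monitors accept it, but not exactly the rate the spec calls for, which is where the compatibility gamble comes in.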

 

Kevtris' idea for the Z3K is to ship with an internal upscaler, but technically it's not much different from hooking an upscaler board to the RGB out. Of course it's more convenient to have it all in one box, but the caveats regarding framebuffers and such still apply.

 

That said, if you buy yourself a 4K TV, getting an independent high quality upscaler is not much of a stretch if you really care about latency.

Link to comment
Share on other sites
