
FPGA Based Videogame System


kevtris

Interest in an FPGA Videogame System  

682 members have voted

  1. I would pay....

  2. I Would Like Support for...

  3. Games Should Run From...

    • SD Card / USB Memory Sticks
    • Original Cartridges
    • Hopes and Dreams
  4. The Video Interface Should be...




 

Here's the quote Wolf's referring to:

 

"bsnes accuracy is VERY close to 100%, with a 65816 CPU core that is - are you sitting down? - accurate to the clock cycle. "

(from http://nonmame.retrogames.com/)

That makes the emulator great, but it doesn't remove the 1-2 frames of lag.

 

Lemme explain.... higan/bsnes runs on modern HW, which can only render whole frames (there's no access to per-line sync, AFAIK), so the previous frame must first be buffered in its entirety before being displayed (this happens while the current frame is being displayed). Then there's the difference in refresh rate between the original HW and the HW you run the emulator on: if they are not multiples of each other, you get tearing unless you're careful, and to avoid tearing you need buffering. One would think you could lump the two together [full frame render + difference in sync], but it's not that easy, depending on how much or how little the sync difference is.

That said, if you can simply make the emulating host HW run at double or triple the original HW framerate, you're good and can sync at the sub-frame level, but I am not sure how many people get to play on an LCD/LED display at 120/180/240Hz to begin with.

Add to that that you're going to need to speed up or slow down the original system to match an exact multiple of the display's refresh (no console ever ran at exactly 60Hz, for example) and you see how it's really hard to avoid that 1 full frame of delay.
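To put rough numbers on the whole-frame-buffering argument above, here's a small sketch (the NES field rate is the commonly cited approximation; treat the exact figures as assumptions): one host refresh period of added latency, and a check of how close common host refresh rates come to an integer multiple of the source rate.

```python
# Rough arithmetic for the full-frame-buffer lag described above.
# Assumption: the host must hold one complete frame before scan-out,
# so the minimum added latency is one host refresh period.
NES_FPS = 60.0988  # approximate NTSC NES field rate

for host_hz in (60, 120, 240):
    buffer_ms = 1000.0 / host_hz   # one host frame of buffering
    multiple = host_hz / NES_FPS   # how close to an integer multiple?
    print(f"{host_hz} Hz host: {buffer_ms:.1f} ms buffer, "
          f"{multiple:.4f}x the NES rate")
```

Note that none of the multiples come out as exact integers, which is precisely why tearing or stutter creeps in without buffering.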

 

Kevtris, in his FPGA, speeds the original system logic (all of it) up or down ever so slightly, forcing the original system to render video at a strict 60Hz so he can buffer just a couple of lines when using HDMI... but that would not be cycle perfect (it can still be cycle accurate, in that each and every cycle mimics the real HW; it just has slightly more or fewer cycles per second than the original). The impact of that slight increase/decrease is usually negligible, but it may affect some godforsaken games (they will behave just a tad differently at the beginning, but in some pathological cases it may accumulate over time until the game behaves very differently). Whether that matters to you or not is a different story (I likely wouldn't care).
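A sketch of the scale of that adjustment, using the commonly cited NTSC NES numbers (master clock and native field rate; treat them as assumptions rather than anything kevtris has published):

```python
# Sketch of the slight speed adjustment described above.
MASTER_HZ = 21_477_272   # commonly cited NTSC NES master clock
NATIVE_FPS = 60.0988     # approximate native field rate
TARGET_FPS = 60.0        # strict HDMI rate

scale = TARGET_FPS / NATIVE_FPS          # ~0.99836: slow everything down
scaled_master = MASTER_HZ * scale
drift_s_per_hour = 3600 * (1 - scale)    # wall-clock drift vs. real hardware

print(f"scaled master clock: {scaled_master:,.0f} Hz")
print(f"drift after 1 hour: {drift_s_per_hour:.1f} s")
```

The key point: every clock in the system is scaled by the same factor, so all the internal cycle relationships stay intact; only the wall-clock mapping shifts, by roughly six seconds per hour.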


That makes the emulator great, but it doesn't remove the 1-2 frames of lag.

Lemme explain.... higan/bsnes runs on modern HW, which can only render whole frames (there's no access to per-line sync, AFAIK), so the previous frame must first be buffered in its entirety before being displayed (this happens while the current frame is being displayed). Then there's the difference in refresh rate between the original HW and the HW you run the emulator on: if they are not multiples of each other, you get tearing unless you're careful, and to avoid tearing you need buffering. One would think you could lump the two together [full frame render + difference in sync], but it's not that easy, depending on how much or how little the sync difference is.

That said, if you can simply make the emulating host HW run at double or triple the original HW framerate, you're good and can sync at the sub-frame level, but I am not sure how many people get to play on an LCD/LED display at 120/180/240Hz to begin with.

Add to that that you're going to need to speed up or slow down the original system to match an exact multiple of the display's refresh (no console ever ran at exactly 60Hz, for example) and you see how it's really hard to avoid that 1 full frame of delay.

Kevtris, in his FPGA, speeds the original system logic (all of it) up or down ever so slightly, forcing the original system to render video at a strict 60Hz so he can buffer just a couple of lines when using HDMI... but that would not be cycle perfect (it can still be cycle accurate, in that each and every cycle mimics the real HW; it just has slightly more or fewer cycles per second than the original). The impact of that slight increase/decrease is usually negligible, but it may affect some godforsaken games (they will behave just a tad differently at the beginning, but in some pathological cases it may accumulate over time until the game behaves very differently). Whether that matters to you or not is a different story (I likely wouldn't care).

No: even though the entire thing is slightly slower, it's still cycle accurate. The game has no way of knowing that it's running slower or faster than real time without some kind of time reference from the outside world, so to speak. Kind of the same argument that we're all living in what amounts to a giant simulation vs. a "real" universe. The FPGA (or other emulator) is the same type of idea. You're recreating the "universe" that the game runs in, and the accuracy of said universe isn't dependent on speed.

 

Now, if you were to feed button presses "open loop" into a system that's running slightly slower/faster than it should, of course it will desynch, but that's just because you're feeding it signals open loop. In fact, the crystal in your typical videogame system is not perfect, and feeding a long run (e.g., a speed run) "open loop" into a videogame system could easily cause it to desynch over time because it's running slightly faster or slower.

 

This is why, if you wish to "play back" a speed run on, say, an NES, you have to synchronize it to the vblank or controller-read pulses if you want the playback to work reliably. This style of playback should work fine on the Nt Mini running at 60.0 fps too. I don't think it's possible for a human player to "detect" 60.0 fps vs. 60.09 fps either, without some kind of outside help. They will definitely notice it's running on a flat screen vs. a CRT, though.
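The closed-loop idea above can be sketched in a few lines. This is a toy illustration, not anything from a real TAS tool: inputs are replayed per emulated vblank rather than per wall-clock tick, so a console running slightly fast or slow stays in sync by construction. All class and method names here are invented for the example.

```python
# Toy sketch of closed-loop playback: one recorded input per vblank,
# synchronized to the console's own timing, not to wall-clock time.
def play_back(recorded_inputs, console):
    """Feed one recorded input frame at each vblank the console raises."""
    for frame_input in recorded_inputs:
        console.wait_for_vblank()        # sync point: the console's own clock
        console.set_controller(frame_input)

class FakeConsole:
    """Stand-in console that just logs what it receives."""
    def __init__(self):
        self.log = []
    def wait_for_vblank(self):
        pass                             # a real console would block here
    def set_controller(self, state):
        self.log.append(state)

console = FakeConsole()
play_back(["A", "A+Right", "Right"], console)
print(console.log)
```

Because the sync point is the console's vblank and not an external timer, it doesn't matter whether the machine runs at 60.0 or 60.09 fps; the same input lands on the same frame either way.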


That makes the emulator great, but it doesn't remove the 1-2 frames of lag.

Lemme explain.... higan/bsnes runs on modern HW, which can only render whole frames (there's no access to per-line sync, AFAIK), so the previous frame must first be buffered in its entirety before being displayed (this happens while the current frame is being displayed). Then there's the difference in refresh rate between the original HW and the HW you run the emulator on: if they are not multiples of each other, you get tearing unless you're careful, and to avoid tearing you need buffering. One would think you could lump the two together [full frame render + difference in sync], but it's not that easy, depending on how much or how little the sync difference is.

That said, if you can simply make the emulating host HW run at double or triple the original HW framerate, you're good and can sync at the sub-frame level, but I am not sure how many people get to play on an LCD/LED display at 120/180/240Hz to begin with.

Add to that that you're going to need to speed up or slow down the original system to match an exact multiple of the display's refresh (no console ever ran at exactly 60Hz, for example) and you see how it's really hard to avoid that 1 full frame of delay.

Kevtris, in his FPGA, speeds the original system logic (all of it) up or down ever so slightly, forcing the original system to render video at a strict 60Hz so he can buffer just a couple of lines when using HDMI... but that would not be cycle perfect (it can still be cycle accurate, in that each and every cycle mimics the real HW; it just has slightly more or fewer cycles per second than the original). The impact of that slight increase/decrease is usually negligible, but it may affect some godforsaken games (they will behave just a tad differently at the beginning, but in some pathological cases it may accumulate over time until the game behaves very differently). Whether that matters to you or not is a different story (I likely wouldn't care).

Also, if it does, you can just use the RGB out, and the system cycles, timings, and fps will be exactly the same as on the original hardware.

 

I believe the only time something like that could come into play would be in speed runs: the game clock stays properly synced, so your game adjusts to the extremely slight difference, but it will still add up over prolonged play time.


No: even though the entire thing is slightly slower, it's still cycle accurate. The game has no way of knowing that it's running slower or faster than real time without some kind of time reference from the outside world, so to speak. Kind of the same argument that we're all living in what amounts to a giant simulation vs. a "real" universe. The FPGA (or other emulator) is the same type of idea. You're recreating the "universe" that the game runs in, and the accuracy of said universe isn't dependent on speed.

Now, if you were to feed button presses "open loop" into a system that's running slightly slower/faster than it should, of course it will desynch, but that's just because you're feeding it signals open loop. In fact, the crystal in your typical videogame system is not perfect, and feeding a long run (e.g., a speed run) "open loop" into a videogame system could easily cause it to desynch over time because it's running slightly faster or slower.

This is why, if you wish to "play back" a speed run on, say, an NES, you have to synchronize it to the vblank or controller-read pulses if you want the playback to work reliably. This style of playback should work fine on the Nt Mini running at 60.0 fps too. I don't think it's possible for a human player to "detect" 60.0 fps vs. 60.09 fps either, without some kind of outside help. They will definitely notice it's running on a flat screen vs. a CRT, though.

 

All anyone had to say was "PAL is 50Hz, and PAL regions got 17% intentionally slower devices."

 

Basically, the difference between 60 and 60.09 only matters to the display device, not to the player. It's likely that you could tweak the clock on a real NES/SNES to be exactly 60 or 59.94 and get it to work better on frame doublers/LCDs as well.


 

All anyone had to say was "PAL is 50Hz, and PAL regions got 17% intentionally slower devices."

Basically, the difference between 60 and 60.09 only matters to the display device, not to the player. It's likely that you could tweak the clock on a real NES/SNES to be exactly 60 or 59.94 and get it to work better on frame doublers/LCDs as well.

 

The devices weren't 17% slower; usually the CPU clock speed difference between an NTSC machine and a PAL machine is between 1% and 3%. Straight ports of NTSC games to PAL consoles could be 17% slower, but a good port could minimize any decrease in speed.
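The distinction above is easy to check with arithmetic. A sketch, where the "17%" comes from frame-synced game logic running at 50/60 of its intended speed, while the CPU clock gap is far smaller (the SNES master clock values used here are the commonly quoted figures, included as an assumption):

```python
# Where the "17% slower" figure for lazy PAL ports comes from: a game
# whose logic is tied to the frame interrupt runs at 50/60 speed on 50 Hz.
slowdown = 1 - 50 / 60
print(f"unoptimized port slowdown: {slowdown:.1%}")

# The CPU clock gap is much smaller, e.g. the often-quoted SNES values:
ntsc_master = 21.477e6   # NTSC SNES master clock, Hz (assumed figure)
pal_master = 21.281e6    # PAL SNES master clock, Hz (assumed figure)
print(f"CPU clock difference: {1 - pal_master / ntsc_master:.2%}")
```

So the frame rate, not the CPU clock, accounts for nearly all of the perceived PAL slowdown in unadjusted ports.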


No: even though the entire thing is slightly slower, it's still cycle accurate. The game has no way of knowing that it's running slower or faster than real time without some kind of time reference from the outside world, so to speak. Kind of the same argument that we're all living in what amounts to a giant simulation vs. a "real" universe. The FPGA (or other emulator) is the same type of idea. You're recreating the "universe" that the game runs in, and the accuracy of said universe isn't dependent on speed.

 

....

This is hypothetical so bear with me.

 

Let's assume there's a game that, during VBLANK, mostly computes the enemy AI and buckets it (say hard/medium/easy) based on how quickly it perceives the user responding to events.

In this case, depending on how sensitive the bucketing is, the fact that the whole machine runs a little slower or faster than when it was programmed translates into measuring the human reaction slightly differently (assuming the human stays the same). Depending on how long the measurement interval is (slowing the system down will make the interval look shorter, as the human appears quicker in system time), the game may decide the human is too slow or too fast and adjust the difficulty.
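To make the hypothetical concrete, here is a toy sketch (all thresholds and numbers invented for illustration) of how a fixed human reaction time, measured by the game in frames, can land in a different difficulty bucket when the whole machine is slowed from ~60.1Hz to a strict 60Hz:

```python
# Hypothetical difficulty bucketing by reaction time measured in frames.
# The thresholds are made up; the point is the boundary crossing.
def difficulty(reaction_s, fps):
    frames = reaction_s * fps            # reaction as the game measures it
    if frames <= 18:
        return "hard"                    # player looks fast: crank it up
    elif frames <= 22:
        return "medium"
    return "easy"

human = 0.300                            # a constant 300 ms human reaction
print(difficulty(human, 60.1))           # ~18.03 frames on real hardware
print(difficulty(human, 60.0))           # exactly 18 frames on the slowed core
```

The slowed core counts fewer frames for the same real-world reaction, so the player looks faster to the game and can get bumped across a threshold, exactly the kind of edge case described above.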

 

The issue with speeding up/slowing down was not that the VBLANK is no longer in sync with the CPU and hence more or fewer instructions are executed; that's not it. The issue is with respect to any measurements the game makes of human reactions: those would be a little off and can lead to the game behaving differently. Obviously, if you slow down, you make it easier for the human to react "faster" according to the game, and vice versa.

 

I do not know how many games were programmed like that, but it is possible; maybe none. I remember one of the first Pac-Man games on DOS: once the CPU went from 8088 to 8086 with turbo, then to 286/386/486, it reached a point at which the game was unplayable (you could hear the start note and you were dead)... yes, in that case the game was not synced to VBLANK at all, which exacerbated the phenomenon.

 

I understand your speed-up/slow-down intuition is genius (it really is; you should patent it, I don't think anyone did it like that before) and the only way to make it work with no lag over HDMI, so I am not complaining, and I realize that very few people, on even fewer games, may ever notice (likely not me). But the slow-down/speed-up is built in, and that makes it not cycle perfect down to the frequency (as you said, even the original crystals had to be within some tolerance, but I believe those tolerances were far smaller than even the puny 1 in 600 of your NES core).

 

As I said, this is all hypothetical and in no way geared towards saying your cores over HDMI are not accurate, but if the original thing can perform ~601 frames in 10 secs and yours can do 600, there's a difference, isn't there?

Go to 100 secs and you get 10 more VBLANKs on the original....

 

EDIT: to make the point.

Nt Mini analog output at ~60.1Hz vs. Nt Mini HDMI output at 60Hz. (For this example I consider the Nt Mini analog output a perfect frequency match for a real NES.)

As I said, in 100 secs one does 10 VBLANKs more than the other; or, if you prefer, over 1 hour one has an advantage of 6 secs over the other (the fast one at the 1-hour mark has 6 secs of advantage, or 0.17%). If both were running the 24 Hours of Le Mans, the analog output would beat the HDMI output by more than 2 minutes.
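The arithmetic above checks out; a quick sketch for anyone who wants to verify the drift between a ~60.1Hz analog output and a 60Hz HDMI output:

```python
# Checking the drift arithmetic: 60.1 Hz analog vs 60.0 Hz HDMI.
fast, slow = 60.1, 60.0

extra_frames_100s = (fast - slow) * 100        # extra vblanks per 100 s
lead_1h_s = 3600 * (fast - slow) / fast        # seconds gained per hour
lead_24h_min = 24 * lead_1h_s / 60             # minutes gained per 24 h

print(f"{extra_frames_100s:.0f} extra vblanks per 100 s")
print(f"{lead_1h_s:.1f} s lead after 1 hour")
print(f"{lead_24h_min:.1f} min lead after 24 hours")
```

That comes out to about 6 seconds per hour and roughly 2.4 minutes over a 24-hour run, consistent with the figures quoted in the post.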

So in HDMI mode it can't be cycle perfect, as the cycle itself is shorter or longer, but it can still be cycle accurate in that it performs the exact same sequence in the exact same proportion as the original.


Is the LCD display device itself buffering a full frame, or is it rendering as the data comes in?

The way I had it explained to me was that, regardless of whether your input matches your display resolution, all LCDs still process everything at the same speed as content that didn't match the display resolution and had to be upscaled, because it would be a lot more effort to add a check:

if input resolution == display resolution:
    display
else:
    upscale

 

And have some kind of input processor dedicated to checking the input, separate from the one in the upscaler. That would simply add cost to appease less than 0.00001% of their market share, as almost no one other than hardcore retro gamers gives a crap; everyone else is used to the delay, and modern games are designed with it in mind. You have to remember that when you make a few million displays, cutting out a few cents by removing extra features that appeal to extremely niche markets matters; cost cutting is just the way of the world. If some niche product like the XRGB Mini wants to come out and sell 7k units a year to do upscaling better for $400, the people targeting the major demographics and making billions probably don't even know it exists to laugh at it. In the future, it may be ironic when there is no input delay and a new wave of retro gamers complains that the slower games respond too quickly by a fraction of a second, making their games too easy.

 

Also, early LCDs had the problem of accepting analogue video in, which meant the input needed to be converted and upscaled regardless of whether the resolutions matched; that's where you would see 5-6 (or more) frames of delay added, which was a more significant problem. It is still a problem to this day, but most quality devices today keep a digital input at about 1 frame of delay. I believe many of the retro gamers who despise LCDs probably had a really awful experience with early-gen expensive LCD tech.

 

Either way, the impact a quality modern LCD has on your gaming experience is virtually nonexistent. (That said, I'm sure the technology will continue to improve, and one day it won't be an issue.)

Edited by Wolf_

This is hypothetical so bear with me.

Let's assume there's a game that, during VBLANK, mostly computes the enemy AI and buckets it (say hard/medium/easy) based on how quickly it perceives the user responding to events.

In this case, depending on how sensitive the bucketing is, the fact that the whole machine runs a little slower or faster than when it was programmed translates into measuring the human reaction slightly differently (assuming the human stays the same). Depending on how long the measurement interval is (slowing the system down will make the interval look shorter, as the human appears quicker in system time), the game may decide the human is too slow or too fast and adjust the difficulty.

The issue with speeding up/slowing down was not that the VBLANK is no longer in sync with the CPU and hence more or fewer instructions are executed; that's not it. The issue is with respect to any measurements the game makes of human reactions: those would be a little off and can lead to the game behaving differently. Obviously, if you slow down, you make it easier for the human to react "faster" according to the game, and vice versa.

I do not know how many games were programmed like that, but it is possible; maybe none. I remember one of the first Pac-Man games on DOS: once the CPU went from 8088 to 8086 with turbo, then to 286/386/486, it reached a point at which the game was unplayable (you could hear the start note and you were dead)... yes, in that case the game was not synced to VBLANK at all, which exacerbated the phenomenon.

I understand your speed-up/slow-down intuition is genius (it really is; you should patent it, I don't think anyone did it like that before) and the only way to make it work with no lag over HDMI, so I am not complaining, and I realize that very few people, on even fewer games, may ever notice (likely not me). But the slow-down/speed-up is built in, and that makes it not cycle perfect down to the frequency (as you said, even the original crystals had to be within some tolerance, but I believe those tolerances were far smaller than even the puny 1 in 600 of your NES core).

As I said, this is all hypothetical and in no way geared towards saying your cores over HDMI are not accurate, but if the original thing can perform ~601 frames in 10 secs and yours can do 600, there's a difference, isn't there?

Go to 100 secs and you get 10 more VBLANKs on the original....

EDIT: to make the point.

Nt Mini analog output at ~60.1Hz vs. Nt Mini HDMI output at 60Hz. (For this example I consider the Nt Mini analog output a perfect frequency match for a real NES.)

As I said, in 100 secs one does 10 VBLANKs more than the other; or, if you prefer, over 1 hour one has an advantage of 6 secs over the other (the fast one at the 1-hour mark has 6 secs of advantage, or 0.17%). If both were running the 24 Hours of Le Mans, the analog output would beat the HDMI output by more than 2 minutes.

So in HDMI mode it can't be cycle perfect, as the cycle itself is shorter or longer, but it can still be cycle accurate in that it performs the exact same sequence in the exact same proportion as the original.

With all due respect, the NT Mini loses one frame relative to a real NES every 600 frames. So if you play for ten minutes, the NT Mini is one second slower than original hardware. No human is going to notice a one-second delay over a ten-minute span. Maybe speed runners give a damn, but IMO speed runs should be measured in frames rather than hours:minutes:seconds. And the vast majority of recording devices (except VHS) will drop frames to match 60fps or 59.94fps, so it's a moot point.

With all due respect, the NT Mini loses one frame relative to a real NES every 600 frames. So if you play for ten minutes, the NT Mini is one second slower than original hardware. No human is going to notice a one-second delay over a ten-minute span. Maybe speed runners give a damn, but IMO speed runs should be measured in frames rather than hours:minutes:seconds. And the vast majority of recording devices (except VHS) will drop frames to match 60fps or 59.94fps, so it's a moot point.

I understand your point, and I agree that unless a game is programmed so that after, say, 1 hour something needs to happen (and being 6 seconds off may ruin the "surprise!!!!"), nobody cares.

 

My whole and complete point is that supporting the HDMI standard's fixed 60Hz lag-free/stutter-free (which is a highlight of the Nt Mini, imho) REQUIRES a compromise; how small or big depends on the actual console/game system. The NES seems to be at 1/600, and as you state, really just a jiffy.

 

An MVS runs stock at 59.18Hz, which if accelerated to 60Hz is now at 591/600 vs. the NES's 600/601: about 9 times bigger a difference, at 9/600 or 1.5% (compare with the 0.17% of the NES). Again, plenty acceptable to me in order to have lag-free/stutter-free HDMI; I'm just stating that there's a difference when you have to slow down or speed up the console, and that such difference depends on the console/game.

 

This brings me to FPGA arcade support over HDMI... and R-Type and its 55Hz... now we're talking an acceleration (for 1080p60) or deceleration (for 1080p50) of about 10% either way (both p60 and p50 being acceptable HDMI standards worldwide)... definitely something everyone will notice, and why that particular game, lag-free/stutter-free over current 1080p HDMI, won't happen (I heard they were working on allowing some flexibility in signal frequencies, but then it depends on whether the hi-def panels would support the whole range or just buffer and convert... ouch).
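A quick sketch of how big the "run it at an HDMI-standard rate" compromise is per system, using the native rates discussed in this thread (the figures are approximate and the MVS value in particular is an assumption based on the commonly cited ~59.18Hz):

```python
# Speed change required to hit an HDMI-standard refresh rate, per system.
# Native rates are approximate values from the discussion above.
systems = {
    "NES":    (60.0988, 60.0),
    "MVS":    (59.1856, 60.0),
    "R-Type": (55.0,    60.0),   # or 50.0 as the 1080p50 target
}
for name, (native, target) in systems.items():
    change = (target - native) / native
    print(f"{name}: {change:+.2%}")
```

The spread is the whole point: the NES needs well under a fifth of a percent, the MVS about 1.4%, and R-Type roughly 9-10% in either direction, which is why the same lag-free HDMI trick is painless on one system and very audible/visible on another.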

 

Luckily, the analog outputs of the Nt Mini are perfect, and kevin already stated that over analog there's no speed-up/slow-down... so people playing on a CRT get the exact timing, down to that last 1/600, for the NES.

When kevtris tackles the MVS in the Z3K project, I bet he will do the same. SNK already twiddled with the MVS video timing when they made the AES ( https://wiki.neogeodev.org/index.php?title=Framerate ), pushing the frequency to 59.59 and already altering the experience by 0.75%; another 0.75% should not matter if it buys lag-free/stutter-free HDMI output (it won't matter to me).

 

Back to the uber-point: software emulators have the issue that modern PCs do not allow per-line timing (at least AFAIK), so even if they are accurate down to the last cycle they still have to put up with 1 frame of lag. FPGAs, in order to win back that 1 frame (or more, for bad emulators) over a digital connection, have to change the timing of what they reproduce (big or small doesn't matter here). To each his own, but the current digital standard for image reproduction (PC monitors, HD TVs and whatnot) simply does not allow for a perfect recreation of the analog of yore (no fault of the people making either emulators or FPGA recreations; it is just the way current digital signaling works to stay within the standards). Actually, the "tolerance" of the old CRTs is what created the issue in the first place (allowing each console manufacturer to vomit signals that are off here and there but "work" nonetheless), but that's a topic for another day, given that the same "tolerance" allowed 240p as well.


Either way, people are making mountains out of molehills. Except for world-class speedrunners, who occupy a tiny fraction of the retrogaming community, the loss of one frame out of 600 is a non-issue. Games like R-Type could best be handled by allowing the core to let the player decide whether 50Hz or 60Hz is an acceptable speed, i.e., an option to either speed up or slow down gameplay. There are no lost cycles when the entire time scale is stretched by fractions of a percent.


Either way, people are making mountains out of molehills. Except for world-class speedrunners, who occupy a tiny fraction of the retrogaming community, the loss of one frame out of 600 is a non-issue. Games like R-Type could best be handled by allowing the core to let the player decide whether 50Hz or 60Hz is an acceptable speed, i.e., an option to either speed up or slow down gameplay. There are no lost cycles when the entire time scale is stretched by fractions of a percent.

Maybe, but for the MVS the difference is 9 frames every 600, or 0.9 every 60; that's almost a full frame added per second... that's why it depends on the console. I think it's great the NES runs at 60.1, greatly diminishing the impact.


Even one frame gained or lost per second is just barely perceptible to average players.

I know I play my non-native-HDMI consoles via the XRGB Mini, and that adds at least 1.5 frames anyhow.

 

We are rat-holing. I am not criticizing kevtris's work in the least (I actually think he is a genius to have thought of it, likely a few years ago at that); I'm just stating that there's a technological limit to how close we can get to those analog systems when we move the output to the current standards of HDMI/DVI/DisplayPort/what-have-you. In order to even do it, we need to make a compromise: either there's lag/stutter (which is worse, imho) or we alter the source material. Either way, something's gotta give. The amount of alteration is what a lot of people focus on; I was just pointing out that for lag-free/stutter-free it MUST be altered to begin with. No mountain, no biggie, just a statement of facts.


It's small stuff, really. Not like the PAL vs. NTSC issues gamers suffered with bitd. Playing the speed-optimized version of SMB in the PAL 3-in-1 on my NTSC system is a riot. It's like Mario's been hitting the 'roids... :grin:

 

One could go as far as to say a console isn't stock if the regulator/caps have been upgraded, or it has an A/V mod, or it isn't using the janky RF switch it shipped with 30 years ago. These newer FPGA systems exist to make old games accessible again to modern audiences, without the shortcomings of emulation or barely-working hardware.


It's small stuff, really. Not like the PAL vs. NTSC issues gamers suffered with bitd. Playing the speed-optimized version of SMB in the PAL 3-in-1 on my NTSC system is a riot. It's like Mario's been hitting the 'roids... :grin:

Again it depends.

Think about the MVS and its ~1 frame of difference per second (a speed-up in that case, and it's really more like 0.8, but to simplify let's use 1). MVS games have an attract mode that runs continuously; if you let them run side by side (CRT-based and sped-up HDMI-based), in 60 seconds the HDMI one is ahead by 1 sec (if instead you had a 1-frame-lag HDMI version, it would always lag 1-2 frames and have stutter, but it would not move ahead). Again, no biggie; I'm just putting this in context with the statement that we can have FPGA-based reproduction consoles that are identical to the original AND support lag-free/stutter-free HDMI... not possible. If you drop the lag-free/stutter-free requirement OR the HDMI requirement OR the "identical" requirement, then it's plenty possible.

Agreed that the small penalty for lag-free/stutter-free HDMI is usually acceptable (either that, or have an option to switch to a mode with some lag and some stutter if people think it's better, and it may well be for slow, plodding, non-twitch strategy games or chess).


I brought my NT Mini to a friend's house yesterday, just to show him what it could do on his huge HDMI living room TV, and while it worked great for the most part, we got a "surprise" when we tried Mega Man Xtreme with the GBC core. Colors got weird during the intro sequence, and during the actual game, everything worked perfectly at first, but after passing a save point, all the sprites became dark single-colored shadows. It's like the game does something weird with the color palette, and the GBC core doesn't reproduce the expected effect properly. Has anyone else experienced this issue with this particular game? :)


The other stroke of genius is that the Nt Mini supports all manner of analog signals at the proper 60.1fps NES speed. If someone wants to try for a speedrun, then they can always use the analog output and be content.

Good point. But then you've got display lag and upconversion artifacts. I still think the best method would be to "split" the composite feed from the NES (RCA splitters aren't impedance-matched and will only darken the image) or use the RF out to a CRT display and record from the composite. In the case of the NT Mini: composite to CRT, and S-Video to upscaler to capture device. Ideally the NT Mini could output multiple analog signals, such as RGB to the upscaler and composite to the CRT (the output display gets a lag-free image and the capture device gets the highest-quality image), but unfortunately the NT Mini doesn't have a separate composite out, so multiple outputs aren't possible.

 

There is an awful lot of diarrhea in this thread.

Nonsense. I recommend Facebook for that! :roll:

The old-fashioned way works as well: connect the RF cable to a VCR's input, then connect the VCR's output to a TV and record to tape. It won't look very good, but it is accurate. Of course, I doubt anyone would want to do this except for fun.

Edited by Great Hierophant

 

I call it drama.

None of the above.

 

Simply stating that SW-based emus are pegged by the refresh rate of the host machine: they will always lag by at least 1 full framebuffer at the host refresh rate (I am not aware of a modern system that gives easy SW control at a per-line level... but my knowledge is a little old, so I may be wrong). At 60Hz that is ~1 full source frame; at 120Hz it is half that (as the host can present the full buffer of the previous frame in half the time and simply double it); and so on. So the best lag you can have depends on the host rendering frequency; at the extreme, if you could have 14400Hz (240 x 60) you'd be lagging by at most one line (as it takes that little time for the host to render the previous frame in its entirety, after which it simply replicates it 240 times until the next one is ready).

As for the stutter, it all depends on how misaligned the two frequencies are, and on whether you care to speed up or slow down the emulated system.
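The "lag shrinks with host refresh rate" argument above can be put in numbers. A sketch, expressing one host refresh period of buffering in NES scanlines (262 lines per field at ~60.1Hz; the figures are the usual approximations, not measured values):

```python
# Minimum whole-frame-flip lag, expressed in NES scanlines.
NES_LINES = 262          # scanlines per NTSC NES field (approximate)
NES_FPS = 60.0988        # approximate field rate
line_time_s = 1 / (NES_LINES * NES_FPS)   # ~63.5 microseconds per line

for host_hz in (60, 120, 240, 14400):
    lag_lines = (1 / host_hz) / line_time_s
    print(f"{host_hz:>5} Hz host: lag of about {lag_lines:.1f} NES lines")
```

At a 60Hz host the minimum lag is a whole source frame's worth of lines; at a hypothetical 14400Hz host it drops to roughly one line, matching the extreme case described above.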

 

With respect to FPGAs, the situation is similar if you want HDMI support, but the lag does not have to be a minimum of 1 full frame of the host renderer, as kevtris proved; a few lines is all that's needed. And it can also be cycle perfect and lag-free/stutter-free over analog (SW emus on modern systems can't do that, as the analog output of the gfx cards that still support it also comes from a framebuffer, AFAIK, and to have no tearing you need to buffer the whole frame in its entirety).

 

That's really it, and I agree that no one who cares about lag-free/stutter-free over HDMI would care about the small speed-up/slow-down inherent to it.

Again, no drama. I am eagerly awaiting the Z3K and its support for 16-bit systems; if there were a cheaper variant of the Nt Mini, I wouldn't mind throwing my support behind kevtris's 8-bit work as well (not that he needs it, of course). As it stands, I just do not like 8-bit systems enough anymore (nothing to do with kevin's work, which is truly outstanding; just my personal preference).


... Ideally the NT Mini could output multiple analog signals, such as RGB to the upscaler and composite to the CRT (the output display gets a lag-free image and the capture device gets the highest-quality image), but unfortunately the NT Mini doesn't have a separate composite out, so multiple outputs aren't possible.

If a device emits RGB and proper sync, one can always use an external color encoder to obtain simultaneous multiple analog outputs.

For example, a Sony CXA2075 can be fed RGB plus the relative sync and colorburst externally (different for PAL/NTSC), and it supports simultaneous output over composite, S-Video, and RGB (it's not entirely RGB pass-through, but it's good).

There are other ICs that do the same, obviously.

I do not know if the sync output of the Nt Mini in RGB mode is enough to pull this off; maybe kevtris can chime in.

