

> I've never seen anyone figure out all the best settings to reduce it meaningfully, thus I don't think anyone here has figured it out, but if you figure it out, let us know.

Well, as I said: it's not about reducing input lag in absolute terms. If someone thinks the games already lag with the default settings, then I can't help him either. What I was referring to was this: if we take MAME with its default options as the baseline, how much will the other options increase input lag? If the game plays fine for you, input-wise, but you want to remove the tearing when the game scrolls, then you might ask yourself which options you can set without adding input lag.

If you have your screen set to 60 Hz and you play "Vs. Super Mario Bros." (which also runs at 60 Hz), and Mario stands on the ground and you press the jump button, then it will take 3 frames until he jumps. So, from pressing the button to the first frame of the jump animation on the screen, it's 3 frames. And now let's see how many frames the whole procedure needs with additional options.
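
To get a feel for what that means in time: at 60 Hz one frame lasts 1/60 s, i.e. about 16.7 ms, so the 3-frame baseline works out to:

    3 frames x ~16.7 ms/frame = ~50 ms from button press to visible jump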

At first: don't use triple buffering. Using triple buffering adds 2 to ~3.5 frames on top of the general 3 frames before the output reflects the input. As I said above (and if I've understood triple buffering correctly), that's natural: if you render two screens in memory before displaying them, the visible output is always two frames behind the actual game scene. Even the most perfect hardware cannot do anything against it: if MAME shows scene 1 while rendering scenes 2 and 3 in memory, then your input will refer to scene 3 while you are still seeing scene 1 on the screen. Input lag is an integral part of the technique, and my tests confirmed it. That's why you shouldn't use triple buffering.
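
For the record, triple buffering is just a toggle on the Windows builds, so an A/B test is a one-option change. A sketch (-video and -triplebuffer/-notriplebuffer are MAME's standard Windows video options; the ROM name is a placeholder):

    mame <romname> -video d3d -notriplebuffer    (the default: no triple buffering, ~3 frames)
    mame <romname> -video d3d -triplebuffer      (adds 2 to ~3.5 frames on top of that)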

The other option to remove tearing: vsync. If you use Direct3D, vsync produces about ~3.3 additional frames. (So, since the duration without any options is 3 frames, with vsync under Direct3D Mario takes ~6.3 frames until he jumps.)

But if you use vsync with DirectDraw, the added lag is merely ~0.3 frames, i.e. Mario takes ~3.3 frames until he jumps. I don't know why vsync under Direct3D lags so much more than vsync under DirectDraw. My PC is from 2010, an Intel quad-core at 2.33 GHz with 4 GB RAM, which should be more than enough to run MAME smoothly, so it's probably not the hardware. I even checked the whole thing with MAME 0.127 from 2008.
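
Expressed as command lines, the two vsync setups I'm comparing would look roughly like this (again a sketch: -waitvsync and -video ddraw/d3d are the standard Windows options of that era, the ROM name is a placeholder, and the millisecond figures are just the frame counts divided by 60 Hz):

    mame <romname> -video d3d -waitvsync      (~6.3 frames, i.e. ~105 ms until the jump is visible)
    mame <romname> -video ddraw -waitvsync    (~3.3 frames, i.e. ~55 ms until the jump is visible)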

RetroArch 0.9.8 verbose log (Nestopia core):

RetroArch CMD: retroarch C:/Users/Owner/Documents/Gaming/Emulators/Nestopia/NES Roms/Gimmick!.nes -c C:\Users\Owner\Desktop\retroarch-win64-0.9.8\\retroarch.cfg -v -no-patch
RetroArch: Loading config from: C:\Users\Owner\Desktop\retroarch-win64-0.9.8\\retroarch.cfg.
phoenix_last_rom = "C:/Users/Owner/Documents/Gaming/Emulators/Nestopia/NES Roms/Gimmick!.nes"
RetroArch :: system_directory is not set in config. Assuming system directory is same folder as game: "C:/Users/Owner/Documents/Gaming/Emulators/Nestopia/NES Roms/".
RetroArch: Loading dynamic libretro from: "C:/Users/Owner/Desktop/retroarch-win64-0.9.8/libretro-144-nestopia-x86_64.dll"
RetroArch: Loading ROM file: C:/Users/Owner/Documents/Gaming/Emulators/Nestopia/NES Roms/Gimmick!.nes.
RetroArch: Environ SET_PIXEL_FORMAT: XRGB8888.
RetroArch: Set audio input rate to: 44063.25 Hz.
RetroArch: Setting multimedia scheduling for DWM.
RetroArch: Detecting screen resolution 1680x945.
I/O warning : failed to load external entity ""
RetroArch :: Couldn't find any valid shaders in XML file.
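
A side note on that log, since it ties into the vsync topic: the last two lines just mean RetroArch couldn't load an XML shader and, further up, that it had to guess the system directory. If anyone wants to silence the warning and control vsync explicitly, the relevant retroarch.cfg entries would look something like this (a sketch: key names as I remember them from the 0.9.x-era configs, the path is a placeholder; check the commented defaults in your own retroarch.cfg):

    # folder for BIOS/system files (placeholder path)
    system_directory = "C:/Users/Owner/Documents/Gaming/Emulators/system/"
    # toggle vsync on or off
    video_vsync = true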
