• 1 Post
  • 23 Comments
Joined 2 years ago
Cake day: June 21st, 2023

  • Because the “delayed” or real input does not correspond to the image you see on the screen. That’s why FG is most useful when you already have a high base framerate, as the input latency gets significantly lower and the discrepancy between the felt input and the perceived image narrows.

    Example:

    30FPS is 33.3ms frame to frame latency (+ something extra from mouse to displayed image for input)

    With 2x FG you get at most 60FPS, assuming there’s no performance penalty for FG. So you see 16.6ms frame to frame (+ mouse to display), but input remains at 33.3ms (+ mouse to display).

    Same from a 60FPS base (16.6ms) to 120FPS with FG: 8.3ms perceived, but input remains at 16.6ms+.

    Same from a 120FPS base (8.3ms) to 240FPS with FG: ~4.2ms perceived…

    As you can see, the difference in input latency between the base FPS and the FG FPS gets smaller and smaller as you increase the base framerate.

    This is however a perfect scenario that does not represent real-world cases. Usually your base FPS fluctuates due to CPU- and GPU-intensive scenes, and during those fluctuations you will get big input delay spikes that can be felt a lot, as they suddenly widen the gap between the perceived image and the real input… Couple that with the fact that FG almost always has a performance penalty, since it puts more strain on the GPU, so your base framerate will be lower and your input latency therefore higher than in the ideal numbers above (see the sketch below).
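
To make the numbers above concrete, here is a minimal sketch (plain Python, just the arithmetic from this comment, assuming an ideal 2x FG with zero overhead) that computes the perceived frame time versus the real input-side frame time and the gap between them:

```python
# Frame-generation latency arithmetic from the comment above.
# Assumes ideal 2x FG with no GPU overhead; real FG costs some performance.

def frame_time_ms(fps: float) -> float:
    """Frame-to-frame time in milliseconds at a given framerate."""
    return 1000.0 / fps

for base_fps in (30, 60, 120):
    fg_fps = base_fps * 2                  # ideal 2x frame generation
    perceived = frame_time_ms(fg_fps)      # what you see on screen
    real_input = frame_time_ms(base_fps)   # input still advances at the base rate
    gap = real_input - perceived           # the mismatch you can feel
    print(f"{base_fps} FPS base -> {fg_fps} FPS FG: "
          f"perceived {perceived:.1f} ms, input {real_input:.1f} ms, gap {gap:.1f} ms")

# e.g. "30 FPS base -> 60 FPS FG: perceived 16.7 ms, input 33.3 ms, gap 16.7 ms"
```

The gap halves each time the base framerate doubles, which is exactly why FG feels much better when you start from a high base framerate.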





  • Baldur’s Gate 3 AFAIK does not officially support FSR4, but this works with it via OptiScaler (I’ve tried it on Steam Deck). I wanted to try it on PC as well, but the game has updated to the version with official Linux support and this does not work with it because it’s Vulkan-only now. My internet is slow, so I can’t be bothered to redownload almost 100GB just to downgrade the game version. I will probably have to check what else is in my library.



  • there is a modified .dll you can use to replace the one in a game folder… AMD leaked it accidentally when they were releasing some open source stuff

    I can send you a link tomorrow or upload it, I’m not at my PC right now

    edit:

    here is the link: https://gofile.io/d/fiyGuj

    You need to rename it to amd_fidelityfx_dx12.dll and replace the one in the game folder, and it should work (in Cyberpunk). I had to use OptiScaler for Hogwarts Legacy, as just replacing the .dll made the game crash on launch and it was necessary to spoof it as DLSS.
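
For reference, here is a rough sketch of the file swap itself (Python; the paths are hypothetical examples rather than the actual install locations, and it simply backs up the original .dll before overwriting it so you can restore it if the game crashes):

```python
# Hedged sketch: back up the game's amd_fidelityfx_dx12.dll and drop in the
# replacement. Both paths below are made-up examples; point them at your install.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\Cyberpunk 2077\bin\x64")       # hypothetical game folder
new_dll = Path(r"C:\Downloads\amd_fidelityfx_dx12.dll")   # the renamed leaked .dll

target = game_dir / "amd_fidelityfx_dx12.dll"
backup = target.with_name(target.name + ".bak")

if target.exists() and not backup.exists():
    shutil.copy2(target, backup)   # keep the original so it can be restored
shutil.copy2(new_dll, target)      # overwrite with the replacement
print(f"Replaced {target} (backup at {backup})")
```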



  • Yes, it’s the INT8, not FP8 version.

    Why would FSR have anything to do with input lag? The only reason input lag would increase is that FSR4 is more demanding to run on RDNA2, which means lower FPS, and FPS is directly tied to input lag.

    But we are talking about 120FPS vs 150FPS here when comparing the Quality preset, so I doubt you could even tell (see the quick frame-time arithmetic below). And even if you can, just lower the preset; it will still look better and get you to the same performance.

    From the multiple games I’ve tested so far, my conclusion is that I am almost always CPU-limited, even with a 5800X3D (in CP2077, Hogwarts Legacy, Kingdom Come: Deliverance 2). Most areas are CPU-heavy due to the large number of NPCs, and FPS drops in those areas enough that my GPU is bored; the only benefit of FSR there is that FSR4 looks better, but it won’t yield any performance benefits.
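
To put the 120FPS vs 150FPS comparison above into perspective, here is the same kind of frame-time arithmetic as in the frame-generation comment earlier (the FPS figures are just the ones quoted above, not measurements):

```python
# Frame-time difference between 150 FPS and 120 FPS - the figures quoted above.
fps_fast, fps_slow = 150, 120
ms_fast = 1000 / fps_fast   # ~6.7 ms per frame
ms_slow = 1000 / fps_slow   # ~8.3 ms per frame
print(f"{fps_fast} FPS = {ms_fast:.1f} ms/frame, {fps_slow} FPS = {ms_slow:.1f} ms/frame, "
      f"difference = {ms_slow - ms_fast:.1f} ms")
# 150 FPS = 6.7 ms/frame, 120 FPS = 8.3 ms/frame, difference = 1.7 ms
```

A per-frame difference of well under 2ms supports the point that you would be unlikely to notice it.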