I wish they included the window compositor as something that can introduce latency because I'd like to learn more about it.
When I switched from Windows to Linux on the same hardware I noticed a lot of keyboard input latency when playing games, at least 150ms. This only happens to me with niri, KDE Plasma (Wayland) feels identical to Windows. So did Hyprland. I'm able to reproduce it on multiple systems when I have a 4k display running at 1:1 native scaling. On AMD cards, turning off v-sync helped reduce it but it didn't remove it. With an NVIDIA card, turning off v-sync made no difference. I believe it's semi-related to that 4k display because when I unplug that display and use my 2560x1440 monitor, it's much less noticeable despite getting a solid 60 FPS with both monitors. All that to say, there's certainly a lot more than your input device, GPU and display playing a role.
If anyone played Quake on a dial-up connection with client side prediction turned off, that is the exact same feeling. It's pressing a key and then seeing the screen update X ms afterwards.
Windows' solution to this is exclusive fullscreen, which bypasses the compositor.
You can try Gamescope [1] from Valve; that's what the Steam Deck uses. I think it's a compositor designed to minimize latency while still supporting the few things games need. Some compositors like KDE Plasma's KWin support a direct scanout mode, which is the same idea as Windows' exclusive fullscreen. You might need to look for support for something similar in niri.
Thanks, I have tried gamescope but it kills the performance of games for me. All games have a lot of stuttering when I use it. It also didn't reduce the input latency. Same hardware is liquid smooth on Windows.
As far as I know, niri enables direct scanout by default. It's an option you can disable if you want: https://niri-wm.github.io/niri/Configuration%3A-Debug-Option.... I don't have this set, which indicates direct scanout is enabled.
It's interesting because the latency is only when pressing keys on the keyboard. Mouse movement and button press latency feels as good as Windows, I can't perceive any delay. I tried 3 keyboards, it's all the same. I'm also not running anything like keyd or anything that intercepts keys. It's a vanilla Arch Linux system on both of the systems I tested.
Input lag is one of those things you feel before you can explain it. Good to finally have a resource that breaks down the full chain — controller, engine, display — instead of just blaming the monitor like everyone does.
The engine section is the part most developers seem to ignore. A locked 60 fps doesn't mean 16 ms of latency, and that gap surprised me.
I used to get into arguments all the time about how triple-buffering reduces latency, and I think it's because we lacked resources like this; people assume it adds the additional back buffer to a queue, when the traditional implementation "renders ahead" and swaps the most recently-completed back buffer. It's a subtle difference but significantly reduces the worst-case latency vs. a simple queue.
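The difference can be sketched with a toy latency model (all numbers and function names here are made up for illustration; assumes a 60 Hz display and a GPU that renders faster than the refresh rate):

```python
# Toy worst-case latency model. In a FIFO-style queue the display shows
# frames in submission order, so a new frame waits behind everything
# queued ahead of it. With "swap the most recently completed back
# buffer" (mailbox-style), older queued frames are dropped, so a frame
# waits at most one refresh interval after it finishes rendering.

def fifo_worst_case_ms(render_ms, vblank_ms, queue_depth):
    # The frame sits behind (queue_depth - 1) earlier frames, each
    # occupying one refresh interval on screen, plus its own interval.
    return render_ms + queue_depth * vblank_ms

def mailbox_worst_case_ms(render_ms, vblank_ms):
    # The frame finishes just after a vblank and waits at most one
    # full refresh interval before being scanned out.
    return render_ms + vblank_ms

# With a 4 ms render time and a 16.7 ms refresh interval, a depth-3
# FIFO queue lands around ~54 ms worst case, vs. ~21 ms for mailbox.
fifo = fifo_worst_case_ms(4, 16.7, 3)
mailbox = mailbox_worst_case_ms(4, 16.7)
```

This is of course a simplification (it ignores input sampling time and scanout duration), but it shows why "render ahead and swap the newest" beats a simple queue on worst-case latency.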
I think most people get their information from help blurbs in settings menus for PC games, which are often hilariously vague or incorrect.
Vulkan's presentation API makes this distinction explicit: VK_PRESENT_MODE_MAILBOX_KHR is the "replace if already queued" mode that actually reduces latency, while VK_PRESENT_MODE_FIFO_KHR is the pipeline-queue variant that queues frames ahead of time. OpenGL never standardized the difference, so "triple buffering" meant whatever the driver implemented -- usually vendor-specific extension behavior that varied between hardware. The naming confusion outlived OpenGL's dominance because the concepts got established before any cross-platform API gave them precise semantics.
1. It doesn't help that Windows' "Triple buffering" option actually means forced three-frame FIFO buffering. So people had pre-established PTSD from that dreadfully laggy smoothing.
2. Triple buffering does not reduce latency compared to unsynced tearing. It's a spatial vs. temporal tradeoff: whether to let frequency mismatches manifest as tearing or as jitter. For passive consumption of motion, losing temporal consistency in exchange for spatial cohesion is the better tradeoff, so triple buffering is appropriate. For active control of motion and its feedback, temporal consistency is absolutely critical, whereas spatial cohesion while in motion is far, far less important, so triple buffering is unacceptable in this use case.
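The Vulkan distinction a couple of comments up comes down to a small piece of selection logic at swapchain creation time. A minimal sketch (the enum values mirror VkPresentModeKHR from the Vulkan spec; the actual surface query via vkGetPhysicalDeviceSurfacePresentModesKHR is elided):

```python
# Enum values as defined by VkPresentModeKHR in the Vulkan spec.
VK_PRESENT_MODE_IMMEDIATE_KHR    = 0  # no sync: lowest latency, tears
VK_PRESENT_MODE_MAILBOX_KHR      = 1  # "replace newest": low latency, no tearing
VK_PRESENT_MODE_FIFO_KHR         = 2  # queued vsync: always available per spec
VK_PRESENT_MODE_FIFO_RELAXED_KHR = 3  # vsync, but tears on missed deadlines

def choose_present_mode(available):
    # Prefer mailbox ("swap to the newest completed frame") for low
    # latency without tearing; fall back to FIFO, which the spec
    # guarantees every implementation supports.
    if VK_PRESENT_MODE_MAILBOX_KHR in available:
        return VK_PRESENT_MODE_MAILBOX_KHR
    return VK_PRESENT_MODE_FIFO_KHR
```

In a real application `available` would come from the surface query; the fallback to FIFO is what makes the laggier queued behavior the default on hardware or drivers that don't expose mailbox.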
'Input lag' should really be called 'output lag', as most of it usually comes from the display device and/or graphics pipeline, not input devices.
One area of focus missing here is game streaming / remote play (Steam Link, Moonlight, etc. over a local network).
I've come to accept input lag, but mostly play games where it doesn't matter (simple platformers, turn-based games, etc.). I know Steam Link from my home desktop to my ~5-year-old smart TV is adding latency to my inputs – though I can't tell if it's from my router, desktop, or TV – but I've come to accept it for the convenience of playing on the couch (usually with someone watching next to me).
I know some blame is on the TV, as often if I just hard-reset the worst of the lag spikes go away (clearly some background task is hogging CPU). And sometimes the sound system glitches and repeats the same tone until I reset that. Still worth putting up with for the couch.
Build an SFF PC and have it by the TV :)
Quite a few syntactical errors on this website. I'd suggest running it through an LLM and telling it to fix the mistakes without altering anything else!
Good to know it's human written