I was recently lucky enough to buy an OLED monitor and it’s great. What is not so great is the amount of flickering I get in Gnome now when I have the experimental VRR setting enabled.
Now all OLED monitors have a certain amount of VRR flicker, but I am comparing it to my Windows dual boot and it’s absolutely terrible under Gnome - a noticeable increase in the amount of flicker in both games and on the desktop versus Windows. The only way I get Windows to flicker as much on the desktop is if I turn on “dynamic refresh rate”, which kind of appears to be what Gnome is doing all the time. I can turn on the refresh rate panel on my monitor and under Gnome it fluctuates all over the place, even on the desktop, whereas Windows is steady at max refresh (again, once I turn off dynamic refresh rate, which is a separate setting from VRR).
For games the flicker is way worse using Proton under Wayland (which GE supports). Hunt: Showdown, which I play a lot, looks incredibly flickery when vsync and Wayland are turned on; it basically has a strobing effect.
Anyone else seen this in action? Any suggestions for a fix? Should I swap over to KDE for a bit until Gnome gets this straightened out or will Plasma have the same problems?
Yes, I’ve been bothered by VRR flicker on my OLED monitor (LG 27GR95QE) since I started actively gaming on it with my Linux build a couple of months ago; it was never an issue with consoles for me.
I’m on KDE FWIW, and the flicker is more pronounced during games with mouse cursor on screen afaict. I can’t compare to Windows.
I think VRR flicker is less of an issue when running games within a gamescope session, but it’s not ideal either.

Kind of a bummer to hear - I was hoping KDE’s VRR implementation might avoid the issue. It may be a Wayland problem, in which case it would be unavoidable.
Edit: did some testing with a live image tonight - at least on my machine KDE seems much better when it comes to flicker
What is not so great is the amount of flickering I get in Gnome now when I have the experimental VRR setting enabled.
The only way I get Windows to flicker as much on the desktop is if I turn on “dynamic refresh rate”, which kind of appears to be what Gnome is doing all the time.
I don’t totally get what you’re trying to accomplish. If you don’t want VRR in the desktop environment, are you wanting VRR only to be active when a fullscreen game or movie player is running or something?
EDIT: I’d also add that my understanding is that brightness fluctuation is kind of part and parcel with VRR on current OLED display controllers. I don’t think that it’s a fundamental limitation - you could probably make a display controller that did a better job - but I’ve read articles comparing OLED monitors, and all of the ones I’ve read about suffer from this. Like, if I got an OLED monitor today myself, I’d probably just set a high static refresh rate (which, fortunately, is something that OLED does do well).
Setting a high refresh rate is somewhat of a given, but won’t address the thing VRR helps with - screen tearing. If you’re always playing with VSync on and getting constant frame rates, that’s not an issue, but that’s also far from the usual experience.
Setting a high refresh rate is somewhat of a given, but won’t address the thing VRR helps with - screen tearing.
I mean, I’d just turn on vsync; that’s what it’s for. VRR is to let you push out a frame at the instant that it finishes rendering. The benefit of that declines as the monitor refresh rate rises, since there’s less delay until the next frame goes to the monitor.
If you’re always playing with VSync on and getting constant frame rates, that’s not an issue
looks blank
Constant framerates? You’re saying that you get tearing with vsync on if whatever program you’re using can’t handle rendering at whatever the monitor’s refresh rate is? I mean, it shouldn’t.
Running a static refresh rate with vsync will add a tiny bit of latency until the image shows up on the screen relative to VRR, but that’s a function of the refresh rate; that falls off as the refresh rate rises.
https://www.reddit.com/r/XboxSeriesX/comments/t3fn6l/can_someone_explain_vrr_like_im_5_what_it_does/
Ok, so let’s say your TV is a typical 60 Hz TV; that means it updates 60 times a second, regardless of the game’s frame rate. A 60 fps game will be in perfect sync with your TV, as will 30 fps, because each frame will just be displayed twice. When your game is running at a frame rate in between, it’s not in sync with the display any more and you end up with screen tearing, as the image being sent to the TV changes partway through the image being displayed.
VRR stands for Variable Refresh Rate. It basically means the display’s refresh rate can vary to match the source of the image, so that it always stays in sync.
This is a pretty good explanation of what VRR is doing. It basically makes it so you can drop frames and it still feels smooth.
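To make the “in between” case concrete, here’s a rough sketch in plain Python (no real compositor or game involved; the 50 fps figure is just an arbitrary example rate) showing how frames from a game that doesn’t match the refresh rate finish at a different point in each scanout, which is where the tear line comes from without vsync or VRR:

```python
# Toy timing sketch: where frames from a 50 fps game finish relative to a
# fixed 60 Hz display. Without vsync, a buffer swap partway through a
# scanout shows up as a tear line.
refresh_hz = 60
game_fps = 50  # arbitrary "in between" frame rate, purely for illustration

scanout_ms = 1000 / refresh_hz  # ~16.7 ms per display refresh
frame_ms = 1000 / game_fps      # 20 ms per rendered frame

for n in range(1, 5):
    finish = n * frame_ms                # when frame n finishes rendering
    into_scanout = finish % scanout_ms   # how far into the current scanout it lands
    print(f"frame {n}: done at {finish:5.1f} ms, "
          f"{into_scanout / scanout_ms:4.0%} of the way into a scanout")
```

Each frame lands further out of phase with the scanout than the last, so the point on screen where the old and new frames meet moves around; VRR sidesteps this by starting the scanout when the frame is ready.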
Right. What I’m saying is that the benefit that VRR provides falls off as monitor refresh rate increases. From your link:
If a game on console doesn’t deliver a new frame on time, two things can happen.
The console can wait for a new TV frame, delaying display time by about 16.7 ms (VSYNC), which leads to an effect called stuttering and uneven frame pacing…
If you have a 60 Hz display, the maximum amount of time a finished frame can end up waiting before it goes to a static refresh rate screen is 1/60th of a second.
But if you have a 240 Hz display, the maximum amount of time a finished frame can end up waiting before it is sent to a static refresh rate screen is 1/240th of a second.
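As a quick sanity check on those numbers, here is the same worst case vsync wait (one full refresh period) worked out for a few common refresh rates - just arithmetic in Python, nothing display-specific:

```python
# Worst case extra wait under vsync on a fixed refresh rate display:
# a frame that just misses a scanout sits for up to one full refresh period.
for hz in (60, 120, 144, 240, 480):
    print(f"{hz:3d} Hz: up to {1000 / hz:5.2f} ms until the next scanout")
```

At 240 Hz or 480 Hz the worst case is only a few milliseconds, which is the sense in which the benefit of VRR shrinks as the refresh rate climbs.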
OLED monitors have no meaningful physical constraint on refresh rate from the OLED elements themselves; that limit traditionally comes from LCD panels (well, I mean, you could drive LCDs at higher rates, but the liquid crystal elements can only respond so quickly). If the controller and the display protocol can handle it, an OLED monitor can basically display at whatever rate you want. So OLED monitors out there tend to support pretty good refresh rates.
Looking at Amazon, my first page of OLED monitor results shows all of them capable of 240 Hz or 480 Hz, except for one at 140 Hz.
That doesn’t mean that there is zero latency, but it’s getting pretty small.
Doesn’t mean that there isn’t value to VRR, just that it declines as the refresh rate rises.
The reason I bring it up is that I’d been looking at OLED monitors recently myself, and the VRR brightness issues with current OLED display controllers were one of the main concerns that I had (well, that and burn-in potential), and I’d decided that if I were going to get an OLED monitor before the display controller situation changes WRT VRR, I’d just run at a high static refresh rate.
It does sound like there’s a way to ask GNOME to use VRR only when fullscreen stuff is running:
I prefer Gnome+Dash2Dock - there I have set it to do VRR only in fullscreen apps (aka games).
But the user doesn’t specify what he’s done to enable that setting, and I’m not familiar with GNOME’s (mutter’s?) Wayland settings. If you are okay with VRR only for fullscreen apps, though, looking into that might address the issue.