Why doesn't screen tearing happen at 5 fps?

4 comments, last by Hodgman 10 years, 2 months ago

This is one of those stupid questions... but yeah, I couldn't get it out of my head so I had to ask.

Normally, screen tearing happens when more than 60 fps are rendered, since the flip is done before the monitor finishes scanning out what is on the front buffer.

But why doesn't it happen below 60 fps too? How is it that it's always synchronized, so it NEVER happens that a flip is done while the monitor is still scanning out the front buffer?


If VSync is off it does happen below 60 fps, even at 5. However, the tear disappears again on the next refresh, roughly 17 ms later, even though the image itself is only updated every 200 ms, so you don't really notice it.

At 5 fps, a single frame gets drawn 12 times on a 60 Hz display, so at most 1/12 of the refreshes can tear. It's a lot harder to see tearing in 1 of every 12 refreshes than in 12 of every 12. Basically, you need screen after screen to tear before your brain lets you see it. Just think of how amazing it is that your brain handles interlaced video, and it will seem like nothing that it handles 1/12 of refreshes tearing so well.
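To put rough numbers on that, here's a quick back-of-the-envelope sketch in C++ (my own illustration, not from anyone in the thread; the 5 fps and 60 Hz values are just assumptions):

#include <cstdio>

// With vsync off, a rendered frame can tear at most once -- on the refresh
// during which the flip lands -- and is then shown cleanly on every later
// refresh until the next flip. These numbers assume steady render and
// refresh rates.
int main()
{
    const double refreshHz = 60.0;  // display refresh rate
    const double renderFps = 5.0;   // game render rate

    const double refreshesPerFrame = refreshHz / renderFps;   // 12 at 5 fps
    const double tornFraction      = 1.0 / refreshesPerFrame; // at most 1/12

    std::printf("Each rendered frame is scanned out about %.0f times.\n",
                refreshesPerFrame);
    std::printf("At most %.1f%% of refreshes can contain a tear line.\n",
                tornFraction * 100.0);
    return 0;
}

Push renderFps up to or past refreshHz and that fraction reaches 100% (or more, with several tear lines per refresh), which is why tearing is so much more noticeable at high frame rates.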

Normally, screen tearing happens when more than 60 fps are rendered, since the flip is done before the monitor finishes scanning out what is on the front buffer.

Tearing can happen at any frame rate.

All it means is that part of the new image has been drawn to the screen while part of the old one remains.

Even systems like PowerPoint and word processors can tear, and in fact this happens frequently. Have you ever seen a blinking text cursor where half of the cursor lit up and then an instant later the other half appeared? Or have you ever bogged down the system and then started scrolling web pages while they were loading, causing parts to be updated but other parts to remain stationary? Both of these are forms of tearing.

How is it that it's always synchronized, so it NEVER happens that a flip is done while the monitor is still scanning out the front buffer?

In games, when you have vsync enabled and are rendering to a back buffer, tearing is kept to a minimum. The video card will wait until the entire image is prepared and then flip from one image to the next during the vsync period. The vsync period is the relatively long delay in the video signal that allows a CRT beam to jump vertically from the bottom of the screen back to the top. There are also shorter hsync periods, delays that let the CRT beam reset horizontally, which can be used for various purposes as well.
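As a side note, here's a minimal sketch of how a game typically opts into that behaviour; I'm assuming GLFW and OpenGL purely for illustration (neither is mentioned in this thread). With a swap interval of 1, the buffer swap is held back until the vertical blanking period, so the scan-out never reads a half-updated buffer:

#include <GLFW/glfw3.h>

// Minimal sketch: swap interval 1 means glfwSwapBuffers waits for the next
// vsync before flipping the front and back buffers, so the flip happens
// inside the vblank rather than mid-scan-out.
int main()
{
    if (!glfwInit())
        return -1;

    GLFWwindow* window = glfwCreateWindow(640, 480, "vsync demo", nullptr, nullptr);
    if (!window)
    {
        glfwTerminate();
        return -1;
    }

    glfwMakeContextCurrent(window);
    glfwSwapInterval(1);  // 1 = wait for vsync on every swap, 0 = flip immediately

    while (!glfwWindowShouldClose(window))
    {
        glClear(GL_COLOR_BUFFER_BIT);  // draw into the back buffer
        glfwSwapBuffers(window);       // flip deferred to the vblank
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}

Setting the interval to 0 gives you the vsync-off behaviour described above: the flip happens immediately, wherever the scan-out beam happens to be.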

Even using both vsync and double buffering, it is still possible to have image tearing in some exceptional circumstances. The word "NEVER" doesn't apply; replace it with "RARELY" and your statement is better.

Speaking of vsync, I wonder if this stuff will catch on. I think it's about time we universalized this kind of stuff; CRTs are a thing of the past in most cases.

Speaking of vsync, I wonder if this stuff will catch on. I think it's about time we universalized this kind of stuff; CRTs are a thing of the past in most cases.

The funny thing about G-Sync is that there's apparently already a VESA standard for variable refresh rates (in the DisplayPort specification).

Some LCD panels already support this spec, so if your GPU also does, then you can get variable VBLANK timings without buying a fancy new "G-Sync capable" monitor.

It seems that what we actually want is DisplayPort-capable monitors.

With that revealed, it seems that buying/using G-Sync chips is just a way that panel manufacturers can choose to implement the full DisplayPort specification -- with the caveat that the chip contains a DRM system that will disable these extra features if it's connected to a non-nVidia GPU...

Variable VBLANK capability makes me happy; DRM-ing the capability makes me sad.

This topic is closed to new replies.
