
Why doesn't screen tearing happen at 5 fps?



#1 noatom   Members   -  Reputation: 707


Posted 29 January 2014 - 06:15 AM

This is one of those stupid questions... but yeah, I couldn't get it out of my head so I had to ask.

 

Normally, screen tearing happens when more than 60 fps are rendered, since the flip is done before the monitor finishes scanning out what is in the front buffer.

 

But why doesn't it happen below 60 fps too? How is it always synchronized, so that it NEVER happens that a flip is done while the monitor is still scanning out the front buffer?


Alexander Turda - The place where I talk about games, coding, movies, and whatever passes through my mind.



#2 Erik Rufelt   Crossbones+   -  Reputation: 2777


Posted 29 January 2014 - 06:34 AM

If VSync is off it does happen below 60 fps, even at 5. However, the tear disappears again after about 17 ms (one refresh), even when the image is only updated every 200 ms, so you don't really notice it.
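The "17 ms" figure comes straight from the refresh rate; a quick back-of-the-envelope check (assuming a 60 Hz display):

```python
# Back-of-the-envelope check of the "17 ms" figure (assuming a 60 Hz display).
REFRESH_HZ = 60
refresh_ms = 1000 / REFRESH_HZ   # time between two consecutive scanouts
print(round(refresh_ms, 1))      # 16.7

# A tear is visible for one scanout only: the next refresh, ~16.7 ms later,
# redraws the whole screen from a single buffer, so the artifact vanishes
# long before the next 200 ms frame (5 fps) arrives.
```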



#3 richardurich   Members   -  Reputation: 1160


Posted 29 January 2014 - 10:03 AM

At 5 fps, a single frame gets drawn 12 times on a 60 Hz display, so at most 1 in every 12 refreshes can show a tear. It's a lot harder to see tearing in 1 of every 12 refreshes than in 12 of every 12. Basically, you need screen after screen to tear before your brain lets you see it. Just think of how amazing it is that your brain handles interlaced video, and it will seem like nothing that it copes so well with 1/12 of refreshes tearing.
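The ratio above can be sketched in a couple of lines (assuming a 60 Hz display showing a 5 fps game):

```python
# Sketch of the "1 in 12" ratio (assuming a 60 Hz display and a 5 fps game).
REFRESH_HZ = 60
FPS = 5
refreshes_per_frame = REFRESH_HZ // FPS   # each frame is scanned out 12 times
torn_fraction = 1 / refreshes_per_frame   # at most 1 of those shows the tear
print(refreshes_per_frame)                # 12
print(round(torn_fraction, 3))            # 0.083
```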



#4 frob   Moderators   -  Reputation: 16154


Posted 29 January 2014 - 11:49 AM

Normally, screen tearing happens when more than 60 fps are rendered, since the flip is done before the monitor finishes scanning out what is in the front buffer.

 

Tearing can happen at any frame rate. 

 

All it means is that part of a new image is drawn on screen while part of the old image remains.

 

Even systems like PowerPoint and word processors can tear, and in fact this happens frequently. Have you ever seen a blinking text cursor where half of the cursor lit up and then an instant later the other half appeared? Or have you ever bogged down the system and then started scrolling web pages while they were loading, causing parts to be updated but other parts to remain stationary? Both of these are forms of tearing.

 

 

How is it always synchronized, so that it NEVER happens that a flip is done while the monitor is still scanning out the front buffer?

 

 

In games when you have vsync enabled and are rendering to a back buffer tearing is kept to a minimum. The video card will wait until the entire image is prepared and then flip from one image to the next during the vsync period. The vsync period is the relatively long delay in the video signal that allows a CRT beam to jump vertically from the bottom of the screen to the top of the screen. There are also shorter hsync periods that are a delay for the CRT beams to reset horizontally which can be used for various purposes as well.
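That flip rule can be modeled in a few lines. This is a toy sketch with made-up names (`flip_tears`, a microsecond timeline) and the simplifying assumption that vblank is the instant a refresh boundary passes, not a real graphics API:

```python
REFRESH_US = 16667  # one 60 Hz refresh in microseconds (~16.7 ms)

def flip_tears(flip_us, vsync):
    """Does a buffer flip at time flip_us land mid-scanout?

    With vsync on, the driver holds the flip until the next vblank,
    i.e. rounds the flip time up to the next refresh boundary.
    """
    if vsync:
        flip_us = -(-flip_us // REFRESH_US) * REFRESH_US  # ceil to boundary
    # A flip exactly on a refresh boundary happens inside the vblank
    # interval; any other instant interrupts an in-progress scanout.
    return flip_us % REFRESH_US != 0

# A frame that finishes at an arbitrary moment tears without vsync...
print(flip_tears(205_000, vsync=False))  # True
# ...but not when the flip is deferred to the next vblank.
print(flip_tears(205_000, vsync=True))   # False
```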

 

Even using both vsync and double buffering it is still possible to get image tearing in some exceptional circumstances. The word "NEVER" doesn't apply; replace it with "RARELY" and your statement is better.


Check out my personal indie blog at bryanwagstaff.com.

#5 Vortez   Crossbones+   -  Reputation: 2189


Posted 30 January 2014 - 08:43 PM

Speaking of vsync, I wonder if this stuff will catch on. I think it's about time we standardized this kind of thing; CRTs are a thing of the past, in most cases.


My 3D Engine.

#6 Hodgman   Moderators   -  Reputation: 23970


Posted 30 January 2014 - 11:51 PM

Speaking of vsync, I wonder if this stuff will catch on. I think it's about time we standardized this kind of thing; CRTs are a thing of the past, in most cases.

The funny thing about G-Sync is that there's apparently already a VESA standard for variable refresh rates (in the DisplayPort specification).

Some LCD panels already support this spec, so if your GPU also does, then you can get variable VBLANK timings without buying a fancy new "G-Sync capable" monitor.

It seems that what we actually want is DisplayPort-capable monitors.

 

With that revealed, it seems that buying/using G-Sync chips is just one way that panel manufacturers can choose to implement the full DisplayPort specification -- with the caveat that the chip contains a DRM system that will disable these extra features if it's connected to a non-nVidia GPU...

Variable VBLANK capability makes me happy; DRM-ing the capability makes me sad.


Edited by Hodgman, 30 January 2014 - 11:55 PM.







