
Screen tearing

  • 20-08-2005 10:19am
    #1
    Closed Accounts Posts: 1,248 ✭✭✭


    Have an X850 Pro and while watching a DVD I noticed tearing across the screen. This also happens sometimes while gaming. What could be causing this?


Comments

  • Registered Users, Registered Users 2 Posts: 1,557 ✭✭✭dubdvd


    Maybe it's a heat problem. What temps is the card running at, and what heatsink do you have on it? If it's not heat related then it could be software ("drivers"). Are you using the likes of the DNA or any of the optimised drivers? More info please.


  • Registered Users, Registered Users 2 Posts: 15,817 ✭✭✭✭po0k


    Vsync.


  • Registered Users, Registered Users 2 Posts: 7,816 ✭✭✭Calibos


    SyxPak wrote:
    Vsync.

    i.e. maybe you should think about using V-Sync.

    Does the OP have a flat-panel LCD monitor?

    Anyway, here's a post I made over at OCUK on the subject of tearing in games. Dunno if tearing on DVD is related.

    My basic understanding of tearing, V-Sync and triple buffering is this...

    Tearing is caused when the frame rate is higher than the monitor refresh rate. That's why you get less tearing in FarCry than in HL2: FarCry is harder on the graphics card than HL2, so the card can't push as many frames per second, and it rarely manages to pump out more FPS than your refresh rate. For example, say you have a CRT monitor set at a refresh rate of 85Hz. In any given game on that PC and monitor, your graphics card would have to pump out more than 85 frames per second for you to see tearing. In this scenario the card may exceed 85FPS a lot more often in HL2, because it is less graphically intensive than FarCry, where the card can rarely manage more than 85FPS. Say in FarCry your FPS averages 80. Because that's less than the 85Hz refresh rate in this example, you get no tearing. Now if you lowered your monitor refresh to 75Hz, FarCry's average of 80FPS would be higher than the refresh rate, and all of a sudden you would see a lot more tearing. Another way you might suddenly see as much tearing in FarCry as in HL2 would be if you got a new graphics card. Again, say your monitor refresh rate is 85Hz. With your existing graphics card pumping out an average of only 80FPS you get little or no tearing, as explained earlier. Now you put in your new super-duper graphics card, which is capable of an average of 95FPS. The new 95FPS is greater than the 85Hz refresh rate, so all of a sudden you start to see as much tearing in FarCry as in HL2.
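
    To put rough numbers on that rule, here's a little Python sketch. The FPS figures are just the examples from the paragraph above, not measurements, and the rule itself is the simplified one described here.

    def will_tear(avg_fps, refresh_hz):
        # Simplified rule from above: tearing shows once FPS beats the refresh rate.
        return avg_fps > refresh_hz

    print(will_tear(80, 85))   # False - FarCry at 80FPS on an 85Hz CRT
    print(will_tear(80, 75))   # True  - same 80FPS once refresh drops to 75Hz
    print(will_tear(95, 85))   # True  - new card doing 95FPS at 85Hz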

    Now obviously you want to run your CRT monitor at as high a refresh rate as you can for the resolution you are running, to minimise flicker. This is what most people do, and it's why tearing doesn't seem to be as much of a problem for CRT owners as for LCD users: they generally run their monitors at a much higher refresh rate, which is less likely to be beaten by the FPS their graphics card can pump out in a particular game. It's only CRT users with a more top-end system and graphics card, capable of pushing out more FPS than their refresh rate, who see the tearing. Needless to say, most people, no matter what type of monitor they have, have more modest systems whose FPS doesn't beat the refresh rate. So a minority of CRT users with top-end systems means a minority of CRT users who see tearing.

    On the other side of the coin, LCD users' refresh rate is usually 60Hz. LCD panels don't actually refresh like a CRT display, but Windows treats them as having the equivalent of a 60Hz refresh rate. You might ask why LCD users aren't angry that they can't use high refresh rates like CRT owners to reduce flicker and are stuck with 60Hz. Well, like I said, an LCD's method of refresh is different anyway and we don't see flicker even at 60Hz, whereas most CRT users couldn't bear the flicker they would see if they set their CRTs to 60Hz. So we LCD users use 60Hz. Our graphics cards only have to pump out more than 60FPS for us to see tearing, and it takes only a modest PC and graphics card to manage that. There are a lot more people with modest systems than top-end systems, i.e. a lot more LCD users out there with modest systems, therefore a lot more LCD users see tearing. It's not that tearing is a problem with LCD technology; it's that a lot more PCs are capable of beating an LCD's refresh of 60Hz than are capable of beating a CRT's refresh of, say, 85Hz or more.

    Now V-Sync, as we know, is a method of combating tearing. Basically it 'syncs' the graphics card's FPS with the monitor's refresh rate so that the card never pumps out more FPS than the refresh rate, i.e. the FPS is always kept less than or equal to the refresh rate. To reiterate, tearing happens when the FPS is greater than the refresh rate; V-Sync stops the FPS in a given game ever being more than the refresh rate, so... no tearing. The downside to V-Sync is that, in its attempt to keep the FPS the same as the refresh rate, if the graphics card can't generate the FPS equivalent of the refresh rate in a particularly busy part of the game, it drops the FPS down to half the refresh rate. If it can't maintain that FPS either (say there was a lot of action on screen), it drops it down to half of that figure again. I don't know why it has to do this, but I am sure there is some techy reason.

    Anyway, to reuse our HL2/FarCry example on an LCD monitor running at a refresh of 60Hz: in most of the game the graphics card can happily maintain 60FPS to match the 60Hz refresh, i.e. no tearing because V-Sync is on, and a nice smooth playable 60FPS. Along comes a section of the game with a lot happening on screen. The graphics card can't maintain 60FPS, so V-Sync drops the FPS to 30. Even if the graphics card were capable of 55FPS here, V-Sync won't let the FPS merely drop to 55; it drops it down to half the refresh of 60, i.e. 30FPS. Still a reasonably playable FPS all the same. Unfortunately, along comes a huge group of Antlions on screen and the graphics card can't make 30FPS either; it could only manage 28FPS. Guess what? The FPS is halved again by V-Sync, down to 15FPS. Now the frame rate is unplayable and everything is jerky. ARRGGHH!! This is why people are prepared to put up with tearing rather than turn on V-Sync to stop it.

    Using our example but on a CRT user's system running at 85Hz: say he has a top-end system pumping out more FPS than his refresh of 85Hz. He has tearing, so he turns on V-Sync. His FPS is synced at 85FPS to match his refresh. He gets to a busy part of the game and his graphics card can't manage 85FPS, so like our LCD example his FPS drops to half the refresh rate. The thing is, though, he is starting from a higher figure, i.e. 85Hz rather than 60. Half of 85 is 42.5FPS. Very playable. Again the Antlions arrive and his graphics card can't manage the 42.5, so it halves to about 21FPS. Now that's not a great FPS, and it's not exactly smooth, but it's nowhere near as bad as the 15FPS the LCD user had to suffer. So what we see is that a CRT user is less likely to have tearing, for the reasons explained earlier, and is thus less likely to need V-Sync. And even if he does need V-Sync to combat any tearing he might have, the negative effects of V-Sync are less of a problem anyway on his system with a CRT monitor running at its higher refresh rate. That's why you see fewer CRT users moaning about tearing and/or V-Sync; it's mostly us LCD users moaning.
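
    Here's that halving behaviour as a sketch, assuming the simple "halve until the card can keep up" model described above (real drivers may behave differently):

    def vsync_effective_fps(card_fps, refresh_hz):
        # Keep halving the target rate until the card can actually sustain it.
        rate = float(refresh_hz)
        while rate > card_fps:
            rate /= 2
        return rate

    print(vsync_effective_fps(55, 60))   # 30.0  - LCD user, busy scene
    print(vsync_effective_fps(28, 60))   # 15.0  - Antlion swarm, jerky
    print(vsync_effective_fps(70, 85))   # 42.5  - CRT user, still playable
    print(vsync_effective_fps(40, 85))   # 21.25 - rough, but better than 15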

    Now LCD monitors have been improving through the years, and many of the benefits CRT monitors had over LCD panels have begun to disappear. Colours are arguably as good now on LCDs as on CRTs. The same goes for contrast levels. LCDs have a sharper image and zero refresh-rate flicker. Even ghosting on LCD panels is becoming a thing of the past. So many of the reasons CRT owners had for preferring CRT displays over LCDs are being eroded. One of the few reasons left for CRT owners to stick with CRT over LCD is... the more common problem of tearing on LCDs and the need for V-Sync, with the problems explained earlier.

    And this is where 'triple buffering' comes into the story! Basically, triple buffering gets rid of V-Sync's downside of halving the FPS. As before, V-Sync tries to match the FPS to the monitor's refresh rate. But now, when a busy section of the game comes along and the graphics card can't maintain 60FPS to match the LCD's 60Hz refresh, say it only manages 55FPS like my earlier example, instead of V-Sync ignoring the fact that the card can actually do 55FPS in this part of the game and dropping the FPS down to 30 (half), it actually lets the card do the 55FPS. So you have eliminated tearing and you are able to keep your higher FPS. Woohoo! We are only now starting to see triple buffering make an appearance. At the moment none of the current games have a triple-buffering setting in their options menu, be they DirectX or OpenGL based. If you have an NVidia card and are using the latest drivers, there is now an option in the graphics card driver to 'force' triple buffering in OpenGL-based games. There is no option in these drivers to force triple buffering in DirectX games, and there is no option in the latest ATI drivers to force triple buffering in either OpenGL or DirectX games as yet, AFAIK.
    There is one way of forcing triple buffering in DirectX at the moment, but it involves installing Microsoft's .NET and a third-party program, it only works on some games, and the procedure has to be done each time you start the game, so it's a bit of a pain.
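
    A quick sketch of what triple buffering changes in that same toy model. The assumption here is simply that the card gets to run at whatever rate it can manage, capped at the refresh rate, instead of being halved:

    def triple_buffered_fps(card_fps, refresh_hz):
        # With triple buffering the card keeps rendering ahead, so you get
        # whatever rate it can manage, capped at the refresh - no halving.
        return min(card_fps, refresh_hz)

    print(triple_buffered_fps(55, 60))   # 55 instead of V-Sync's 30
    print(triple_buffered_fps(95, 60))   # capped at 60, still no tearing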

    This limited availability of triple buffering should come to an end with future graphics card driver releases, and most if not all new games will come with a triple-buffering option in-game.
    Then CRT owners will have one less excuse for not moving over to LCDs.

    The one thing you can do right now to prevent tearing and the need for V-Sync is to bump the game's detail levels and resolution up so that your graphics card can never manage to pump out more FPS than your refresh rate... assuming of course that the FPS you do get at those higher resolutions and detail levels isn't unplayable. :grin:
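
    As a last sketch, that workaround amounts to picking the highest settings whose average FPS stays under the refresh rate but above whatever you consider playable. The per-setting FPS figures here are made up purely for illustration:

    refresh_hz = 60
    playable_floor = 40

    benchmarks = {          # hypothetical average FPS at each preset
        "medium": 110,
        "high": 75,
        "very high": 52,
        "very high + 4xAA": 34,
    }

    ok = {name: fps for name, fps in benchmarks.items()
          if playable_floor <= fps < refresh_hz}
    print(ok)   # {'very high': 52} - tear-free and still playable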


  • Registered Users, Registered Users 2 Posts: 15,817 ✭✭✭✭po0k


    Or just turn on Vsync and run your monitor at 85Hz or higher.
    If anything it'll make gaming easier as the framerate won't be fluctuating as much.


  • Registered Users, Registered Users 2 Posts: 7,816 ✭✭✭Calibos


    Not an option if he has an LCD which can only run at 60Hz.


  • Closed Accounts Posts: 2,279 ✭✭✭DemonOfTheFall


    Just turn on Vsync and the tearing will disappear. As for when playing DVDs... I'd say check the DVD-playing program's options and make sure it's using the best renderer available, probably VMR9 (Renderless). This will do all the rendering with your graphics card rather than the CPU and should help a bit.

