
2 monitors use more power than one?

Options
  • 17-11-2008 8:04pm
    #1
    Closed Accounts Posts: 19,082 ✭✭✭✭


    I'm having some overheating problems with my graphics card. Will running 2 monitors from it put more of a strain on it than just running one? A noticeable strain in terms of power consumption / heat generation?

    Thanks


Comments

  • Registered Users Posts: 9,926 ✭✭✭trout


    I run two monitors from one card at home ... I've never noticed any extra overhead on the graphics card.

    Have you overclocked the graphics card in any way ? Are you running at very high res / refresh rates ?


  • Registered Users Posts: 17,371 ✭✭✭✭Zillah


    Refresh rate has nothing to do with the graphics card. As for two monitors, I don't know for sure but it sounds quite reasonable that directing a second signal could add to a card's heat generation.


  • Registered Users Posts: 82,185 ✭✭✭✭Overheal


    What card is it?


  • Registered Users Posts: 9,926 ✭✭✭trout


    Zillah wrote: »
    Refresh rate has nothing to do with the graphics card. As for two monitors, I don't know for sure but it sounds quite reasonable that directing a second signal could add to a card's heat generation.

    You sure about the refresh rate ?


  • Registered Users Posts: 17,371 ✭✭✭✭Zillah


    Pretty certain. Refresh rate is a monitor function, it doesn't require processing resources in the way that resolution/polygons/textures do.

    For example, you could hook a monitor up to a device with no graphics card at all, and the refresh rate would still be dependent upon the monitor itself.


  • Registered Users Posts: 9,926 ✭✭✭trout


    If you are running through the VGA port on your graphics card, you can set the refresh rate from the card/drivers. I can change my refresh rate anywhere between 60 Hz and 120 Hz. I would expect this to have an impact on the card.

    Granted the monitor has to react to it ... but I believe the card sends the signal at a given rate, and the monitor has to match it.

    It is possible to drive a signal at a higher rate than the monitor will accept ... leading to black screens, or poorly synched images.

    This doesn't hold true if the card is driving the monitor through DVI.

    Running two monitors I would expect to draw more power, but this would be a strain on the PSU and not just the graphics card. If the screens are mirrored, I would not expect to see any great strain on the card.

    Just wondering ... If the screen is extended onto two monitors ... maybe that's more of a strain, requiring more processing and hence more heat from the gfx card ? This is how I run my setup (one VGA connection, one DVI) ... and I've never seen heat related problems.


    OP ... what setup have you got ? Any more details for us ? Did this just happen ? Moar info please.
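    Putting trout's point about resolution and refresh rate in rough numbers: the signal a card sends to each output scales with resolution × refresh rate, so an extended desktop roughly doubles the scan-out work, while mirrored outputs repeat the same image and leave the rendering load unchanged. A minimal sketch; the 1600x1200 @ 60 Hz figures are assumptions for illustration, not the OP's actual setup:

```python
# Rough scan-out arithmetic: pixels per second the card must push to a
# display output. Resolutions and refresh rates here are illustrative.

def scanout_rate(width, height, refresh_hz):
    """Pixels per second sent out of one display output."""
    return width * height * refresh_hz

single = scanout_rate(1600, 1200, 60)             # one monitor at 60 Hz
extended = single + scanout_rate(1600, 1200, 60)  # second, independent image

print(single)    # 115200000 pixels/s
print(extended)  # 230400000 pixels/s -- double the output work
```

    On a mirrored setup both outputs scan the same framebuffer, so the extra rendering work is minimal even though the card is driving two signals.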


  • Closed Accounts Posts: 1,663 ✭✭✭evil-monkey


    Random wrote: »
    I'm having some overheating problems with my graphics card. Will running 2 monitors from it put more of a strain on it than just running one? A noticeable strain in terms of power consumption / heat generation?

    Thanks

    Noticeable strain in terms of power consumption / heat generation? No.


  • Registered Users Posts: 17,371 ✭✭✭✭Zillah


    trout wrote: »
    If you are running through the VGA port on your graphics card, you can set the refresh rate from the card/drivers. I can change my refresh rate anywhere between 60 Hz and 120 Hz. I would expect this to have an impact on the card.

    Granted the monitor has to react to it ... but I believe the card sends the signal at a given rate, and the monitor has to match it.

    It is possible to drive a signal at a higher rate than the monitor will accept ... leading to black screens, or poorly synched images.

    This is true, there is some communication between the GPU and monitor in regard to refresh rate, but I'd imagine the strain would be minimal, orders of magnitude less than a change in, say, display resolution.

    Either way, I'd agree that power draw / split processing are more likely candidates.


  • Registered Users Posts: 13,983 ✭✭✭✭Cuddlesworth


    4 watts more from an x2300 mobile card on a laptop at work. So, no noticeable impact.
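    That 4 W figure can be put in perspective with a back-of-the-envelope calculation; the electricity price below is an assumed placeholder, not a quoted rate:

```python
# Cost of ~4 W of extra draw over a year of 8-hour days.
# The EUR 0.20/kWh price is an assumption for illustration.

extra_watts = 4
hours = 8 * 365                        # one year of screen-on working hours
extra_kwh = extra_watts * hours / 1000
cost_eur = extra_kwh * 0.20            # assumed unit price

print(round(extra_kwh, 2))  # 11.68 kWh
print(round(cost_eur, 2))   # 2.34 -- effectively negligible
```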

