
Much of a difference (VGA v DVI)

  • 30-08-2006 8:44am
    #1
    Closed Accounts Posts: 156 ✭✭daramullally


    Hi

    I recently bought a 19" LCD screen that supports both VGA and DVI. My video card is an old nVidia GeForce2 MX 400 with only VGA output. Would I notice any difference if I bought a graphics card with DVI? I don't play games, if that makes any difference; it's mainly surfing and digital photography.

    What do you think?

    Dara


Comments

  • Registered Users Posts: 1,315 ✭✭✭quazzy


    Dara,
    From Google Answers and Wikipedia:

    The main difference is that VGA is an analog standard for computer
    monitors that was first marketed in 1987, whereas DVI is a newer and
    superior digital technology that has the potential to provide a much
    better picture.

    Here are two relevant excerpts from Wikipedia for your convenience:

    "Existing standards, such as VGA, are analog and designed for CRT
    based devices. As the source transmits each horizontal line of the
    image, it varies its output voltage to represent the desired
    brightness. In a CRT device, this is used to vary the intensity of the
    scanning beam as it moves across the screen. However, in digital
    displays, instead of a scanning beam there is an array of pixels and a
    single brightness value must be chosen for each. The decoder does this
    by sampling the voltage of the input signal at regular intervals. When
    the source is also a digital device (such as a computer), this can
    lead to distortion if the samples are not taken at the centre of each
    pixel, and in general the crosstalk between adjacent pixels is high."
    SOURCE: http://en.wikipedia.org/wiki/Dvi

    "DVI takes a different approach. The desired brightness of the pixels
    is transmitted as a list of binary numbers. When the display is driven
    at its native resolution, all it has to do is read each number and
    apply that brightness to the appropriate pixel. In this way, each
    pixel in the output buffer of the source device corresponds directly
    to one pixel in the display device, whereas with an analog signal the
    appearance of each pixel may be affected by its adjacent pixels as
    well as by electrical noise and other forms of analog distortion."
    SOURCE: http://en.wikipedia.org/wiki/Dvi



    Or, in short: "DVI is better for LCD screens".
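
    Purely as an illustration (my own rough sketch in Python, not anything from the Wikipedia article): the snippet below fakes one scanline being sent both ways. The sampling offset and noise figures are made up; the point is just that the analog path blends neighbouring pixels and picks up noise, while the digital path hands the exact values straight through.

    # Toy model of the difference described above: VGA sends brightness as a
    # voltage waveform that the monitor must re-sample; DVI sends the numbers.
    import numpy as np

    rng = np.random.default_rng(0)
    source_pixels = rng.integers(0, 256, size=16).astype(float)  # one scanline

    # DVI-style path: the brightness values arrive as numbers, unchanged.
    dvi_pixels = source_pixels.copy()

    # VGA-style path: the monitor samples the waveform slightly off the pixel
    # centres (modelled here as linear interpolation between neighbours) and
    # picks up a little electrical noise along the way.
    phase_error = 0.3   # assumed: sampling point 30% off the pixel centre
    noise_level = 2.0   # assumed: small amount of analog noise
    positions = np.arange(len(source_pixels)) + phase_error
    vga_pixels = np.interp(positions, np.arange(len(source_pixels)), source_pixels)
    vga_pixels += rng.normal(0, noise_level, size=len(vga_pixels))

    print("source:", source_pixels.astype(int))
    print("DVI:   ", dvi_pixels.astype(int))
    print("VGA:   ", np.clip(vga_pixels, 0, 255).astype(int))
    print("max VGA error:", float(np.abs(vga_pixels - source_pixels).max()))

    Run it and the DVI line matches the source exactly, while the VGA line is a blend of each pixel with its neighbour, which is exactly the "crosstalk between adjacent pixels" the article mentions.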

    Hope this sheds some light on the subject

    Regards

    Q

    P.S. I don't know much about graphics cards themselves, but if you post back with your PC specs I'm sure there are a few boards members who can advise on a card with DVI.

    Or just check out http://www.komplett.ie/k/kc.asp?bn=10412
    I think all cards now have DVI, and they start at about €50.


  • Closed Accounts Posts: 156 ✭✭daramullally


    Thanks quazzy for the technical explanation.

    My PC is a fairly old AMD 2000+ with a Gigabyte GA-7VM400M motherboard that supports AGP. Can anyone recommend any from here? http://www.komplett.ie/k/kl.asp?wf=X&bn=10413&minPrice=&maxPrice=60&inStockOnly=on


  • Closed Accounts Posts: 16,713 ✭✭✭✭jor el


    Since you're going for the cheap end of things and gaming is not on the cards, it probably doesn't matter which one you choose. I'd probably go for the Radeon 9600 for €59 if it were my choice.


  • Registered Users Posts: 21,499 ✭✭✭✭Alun


    jor el wrote:
    Since you're going for the cheap end of things and gaming is not on the cards, it probably doesn't matter which one you choose. I'd probably go for the Radeon 9600 for €59 if it were my choice.
    I have a Creative Radeon 9600 based card with DVI output to my NEC LCD and it's the bee's knees: very sharp and crisp, and noticeably better than VGA at the same native resolution (1280x1024 in my case).


  • Registered Users Posts: 4,864 ✭✭✭MunsterCycling


    Have a look on www.adverts.ie; you're bound to find something there.


  • Closed Accounts Posts: 156 ✭✭daramullally


    Hi

    Thanks for all your help. I will take a look at the Radeon 9600 so.

    Cheers
    Dara


  • Registered Users Posts: 14,012 ✭✭✭✭Cuddlesworth


    I have two PCs here at my desk, side by side, with the same Dell monitor and two very similar cards. The same basic Windows visual settings are on both, but one is connected by DVI and the other by VGA. The DVI one is brighter and crisper, and the colours look slightly more vibrant.

