
VGA vs HDMI for PC

  • 21-04-2009 3:10pm
    #1
    Closed Accounts Posts: 2


    Hi All,

    I'm looking to buy an LCD TV with a view to hooking it up to my PC. I have a good graphics card, so that's not a problem.

    What I'm looking for advice on is this: which is better, an HDMI cable or a standard VGA cable? I ask because a standard VGA cable will carry resolutions of 2096 x 1600, say, at up to 200 Hz if you have a really good CRT monitor, while HDMI can do up to 100 Hz at Full HD, which is supposed to be 1920 x 1080.

    Everyone is trying to tell me that an HDMI cable is better than the VGA cable, but I'm sceptical, because VGA would seem to me to be the better choice. Yet every shop assistant tells me the HDMI cable is far better no matter what.

    Please, tell me which is better, a VGA cable or an HDMI cable, and why!


    Thanks all, really appreciate it :D


Comments

  • Registered Users, Registered Users 2 Posts: 17,164 ✭✭✭✭astrofool


    HDMI is. It's a completely digital connection, so there's no quality loss. With VGA, the signal starts off digital (in the frame buffer of the graphics card), gets converted to analogue by the RAMDAC (loss occurs), is sent down the cable, and is then converted back to digital in the TV (more loss).

    Full HD is 1920x1200, which is what a single-link DVI cable will give. HDMI is equivalent in bandwidth to dual-link DVI by default (if I remember correctly), which allows you to go to 2560x1600. Refresh rate is also not an issue on LCDs.
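
    As a rough sanity check on the bandwidth side: the pixel clock a mode needs is horizontal total x vertical total x refresh rate, and single-link DVI tops out at a 165 MHz pixel clock (early HDMI matched that; HDMI 1.3 roughly doubled it, and dual-link DVI runs two 165 MHz links). A quick sketch of the arithmetic in Python, using standard CEA-861/CVT reduced-blanking figures rather than anything measured in this thread:

    # Rough pixel-clock check (illustrative; the totals are the standard
    # CEA-861 / CVT reduced-blanking figures for each mode).
    modes = {
        "1920x1080 @ 60 (CEA-861)": (2200, 1125, 60),
        "1920x1200 @ 60 (CVT-RB)":  (2080, 1235, 60),
        "2560x1600 @ 60 (CVT-RB)":  (2720, 1646, 60),
    }

    SINGLE_LINK_MHZ = 165.0   # single-link DVI / early HDMI pixel-clock limit
    DUAL_LINK_MHZ = 330.0     # two TMDS links

    for name, (h_total, v_total, hz) in modes.items():
        clock_mhz = h_total * v_total * hz / 1e6
        link = ("single link" if clock_mhz <= SINGLE_LINK_MHZ
                else "dual link" if clock_mhz <= DUAL_LINK_MHZ
                else "beyond dual link")
        print(f"{name}: {clock_mhz:.1f} MHz -> {link}")

    On those figures, 1080p and 1920x1200 both fit in a single link; 2560x1600 is where the extra bandwidth becomes necessary.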


  • Registered Users, Registered Users 2 Posts: 3,537 ✭✭✭SickBoy


    astrofool wrote: »
    HDMI is. It's a completely digital connection, so there's no quality loss. With VGA, the signal starts off digital (in the frame buffer of the graphics card), gets converted to analogue by the RAMDAC (loss occurs), is sent down the cable, and is then converted back to digital in the TV (more loss).

    Full HD is 1920x1200, which is what a single-link DVI cable will give. HDMI is equivalent in bandwidth to dual-link DVI by default (if I remember correctly), which allows you to go to 2560x1600. Refresh rate is also not an issue on LCDs.

    Full HD for a 16:9 TV is 1920x1080, not 1920x1200 (which is 16:10).
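
    The arithmetic bears that out: reduce each resolution to lowest terms and you get the two different aspect ratios.

    from fractions import Fraction

    # 1920x1080 reduces to 16:9 (the TV standard); 1920x1200 reduces to
    # 8/5, i.e. 16:10 (the PC-monitor standard).
    print(Fraction(1920, 1080))   # 16/9
    print(Fraction(1920, 1200))   # 8/5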


  • Closed Accounts Posts: 2,227 ✭✭✭gamer


    With VGA you have to set the graphics card's VGA-out refresh rate to match your TV; too high and you risk damaging the set. 60 Hz is a safe rate. With HDMI it's just plug in the cable, as far as I know. You also have to set the VGA-out resolution to one the TV accepts; an average TV may only have three supported resolutions on its VGA input, so read the TV manual for more info. Most take 1024x768 if the TV is an HD TV.
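
    The gist of that advice as a quick sketch in Python: only ever select a mode on the card that the TV's manual actually lists. The supported-mode list below is made up for illustration; substitute the modes your own manual gives for PC input.

    # Hypothetical list of modes the TV accepts on its VGA input --
    # take the real list from your TV's manual.
    supported_modes = {
        (640, 480, 60),
        (1024, 768, 60),
        (1280, 720, 50),
    }

    def is_safe(width, height, refresh_hz):
        """True if the TV's manual lists this (resolution, refresh) combo."""
        return (width, height, refresh_hz) in supported_modes

    print(is_safe(1024, 768, 60))   # True  -> fine to select on the card
    print(is_safe(1024, 768, 85))   # False -> don't force this over VGA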


  • Banned (with Prison Access) Posts: 890 ✭✭✭CrinkElite


    I recently bought a 32" LG HDTV and it's a thing of beauty. Until even more recently I had been using a VGA cable with it, which posed some annoying problems: for one thing, there was a slight ghosting of the picture about 0.5 cm to the right. Very slight, but noticeable nonetheless. I also had issues with re-centring the screen at different resolutions; it was not possible to get the screen positioned correctly in some of them.

    I found a 2.4 m HDMI cable in Xtravision (of all places) for €20 and I would highly recommend the investment. Now I have a sharper picture with MUCH blacker blacks and richer colours, the television fits the image to its borders automatically, and there are more options available in the TV's menus. I'm not sure about the screen you're looking at, but if it has HDMI in, I would urge you to invest the pittance it costs for a new cable. Be careful where you buy, though: stay away from GameStop and the Sony shop, which charge €50-60 for the exact same thing.


  • Registered Users, Registered Users 2 Posts: 4,539 ✭✭✭BenEadir


    Hi,

    I have a Dell Studio XPS 1340 with Vista Ultimate and an NVIDIA GeForce 9500M graphics card. It has an HDMI port which I've tried using to connect to my Panasonic Viera plasma (which has three HDMI ports), and I simply cannot get either picture or sound onto the TV, and I know I'm selecting the right HDMI input on the TV (I've tried them all, BTW!).

    I've looked all over and can't find a solution. Can anyone give me some pointers?

    Regards,

    Ben


  • Banned (with Prison Access) Posts: 890 ✭✭✭CrinkElite


    BenEadir wrote: »
    Hi,

    I have a Dell Studio XPS 1340 with Vista Ultimate and an NVIDIA GeForce 9500M graphics card. It has an HDMI port which I've tried using to connect to my Panasonic Viera plasma (which has three HDMI ports), and I simply cannot get either picture or sound onto the TV, and I know I'm selecting the right HDMI input on the TV (I've tried them all, BTW!).

    I've looked all over and can't find a solution. Can anyone give me some pointers?

    Regards,

    Ben

    What is the max/default refresh rate of your TV?

    If your video card is not detecting your TV properly, it may be outputting at a higher or invalid refresh rate. I would suggest you find the exact specs of the display, either online, in your manual or, in some cases, in the TV's own menu.
    I would then connect a regular monitor to the computer and plug the TV into the other DVI/HDMI port (assuming there is one); you should then be able to use the NVIDIA control panel on the monitor to configure the TV's display settings. Make sure you use the correct resolution and refresh rate (1080p60, 1080i30, 720p/i, etc.), as setting these values incorrectly can damage your set. I'm not familiar with the latest NVIDIA control panel, but you should be able to assign the TV as the primary display device once your settings are accurate.
    Needless to say, you should update your video drivers to the latest version.

    Hope this helps. I'm by no means an expert, and I feel your pain at the lack of info available for specific TVs. If this doesn't work, you could maybe try the NVIDIA forums (if you haven't already done so).
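
    On the "not detecting your TV properly" point: detection just means the card managed to read the TV's EDID block over the cable, and if that handshake fails you can get exactly this no-picture behaviour. As a minimal Python sketch of what's in there, here is how the established-timings bytes of an EDID 1.3 block decode; the byte/bit layout follows the EDID 1.3 spec, but the sample bytes are invented, not read from Ben's TV.

    # Established Timings I & II live at bytes 35-36 of an EDID 1.3 block
    # (a subset of the bit assignments is shown).
    ESTABLISHED_TIMINGS = {
        (35, 5): "640x480 @ 60 Hz",
        (35, 0): "800x600 @ 60 Hz",
        (36, 3): "1024x768 @ 60 Hz",
        (36, 0): "1280x1024 @ 75 Hz",
    }

    def decode_established(edid):
        """Yield the established-timing modes a display advertises."""
        for (offset, bit), mode in ESTABLISHED_TIMINGS.items():
            if edid[offset] & (1 << bit):
                yield mode

    # Invented EDID fragment advertising the four modes above.
    fake_edid = bytearray(128)
    fake_edid[35] = 0b00100001   # bit 5 + bit 0
    fake_edid[36] = 0b00001001   # bit 3 + bit 0

    for mode in decode_established(fake_edid):
        print(mode)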


  • Closed Accounts Posts: 2,227 ✭✭✭gamer


    Set the refresh rate on the graphics card to 60 Hz.
    My HDTV's max refresh rate is 60 Hz, or 50 Hz for 720p; I checked my Samsung HDTV's manual.
    Download PowerStrip, a free program that helps you set advanced settings for a PC-to-TV connection, and set the TV out to PAL B.
    See your TV manual for the recommended PC resolution; try 720x576 or 1024x768.
    See www.svideo.com re advanced PC-to-TV settings.
    The PowerStrip program may let you set the graphics card refresh rate to 50 Hz through custom settings.
    Try 720p at a 50 Hz refresh rate.
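
    If you do end up entering a custom mode in PowerStrip, the figure worth double-checking is the pixel clock it implies: horizontal total x vertical total x refresh rate. A short Python sketch of the arithmetic, using the standard CEA-861 totals for 720p (check your own set's limits against its manual before applying anything):

    def pixel_clock_mhz(h_total, v_total, refresh_hz):
        """Pixel clock in MHz for a mode with the given totals and refresh."""
        return h_total * v_total * refresh_hz / 1e6

    # Standard CEA-861 720p timings, not taken from any particular TV.
    print(pixel_clock_mhz(1980, 750, 50))   # 720p50 -> 74.25 MHz
    print(pixel_clock_mhz(1650, 750, 60))   # 720p60 -> 74.25 MHz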

