
hdtv or monitor?

  • 07-02-2008 2:32pm
    #1
    Moderators, Computer Games Moderators Posts: 14,723 Mod ✭✭✭✭


    Hi people, just a little dilemma to run by you tech heads if I may.
    Currently I'm running my gaming PC on a 22" Samsung SyncMaster 225BW 5 ms monitor.
    Sitting beside it is my Xbox 360 hooked up to a 32" Panasonic Viera HDTV @ 1080i.
    I'm looking to save space and use just one display for both systems.
    I reckon the textures look better on the monitor, but the nicer colours and larger screen are attracting me to the HDTV.
    Any thoughts on which to go with?
    I'm looking for a little nudge in the right direction here :)


Comments

  • Registered Users, Registered Users 2 Posts: 37,485 ✭✭✭✭Khannie


    32" is probably a bit big for non-gaming computer work.


  • Moderators, Computer Games Moderators Posts: 14,723 Mod ✭✭✭✭Dcully


    I should've said it's 95% gaming and 5% browsing :)


  • Registered Users, Registered Users 2 Posts: 1,757 ✭✭✭Deliverance XXV


    I have my desktop hooked up to both my 22" LG monitor and my Sony Bravia 40"
    LCD, and I have to say, for gaming and internet in general, there is no comparison.

    I always use the 22". The 40" is just... just too big!


  • Registered Users, Registered Users 2 Posts: 721 ✭✭✭stakey


    i'm using a 42" Sony Bravia for everything, PC, Wii, Laptop... it really depends on what you like yourself, personally I don't mind the 42" TV as a 'monitor' as it's handy for doing graphic design on, gaming and watching films, it's a bit much for browsing sometimes, but overall i'm happy with it.

    A 32" will more than likely be 720p as well so you're looking at around 1366x768 as your resolution on your desktop, you may not enjoy that too much when you compare it to the resolution you'd get on the 22" monitor.


  • Moderators, Computer Games Moderators Posts: 14,723 Mod ✭✭✭✭Dcully


    A 32" will more than likely be 720p

    Its 1080i


  • Registered Users, Registered Users 2 Posts: 37,485 ✭✭✭✭Khannie


    Dcully wrote: »
    I should've said it's 95% gaming and 5% browsing :)

    Go for it.


  • Registered Users, Registered Users 2 Posts: 1,451 ✭✭✭Onikage


    Dcully wrote: »
    It's 1080i

    That just means it accepts a 1080i signal and converts it down to 720p. The panel itself is really 720p.

    edit:
    And as I type this, I'm using a 37" monitor at 1360x768. It's great but I am sitting about 10 feet away. Any closer would be a problem.


  • Moderators, Computer Games Moderators Posts: 14,723 Mod ✭✭✭✭Dcully


    Yeah, the problem is it's only about 2 feet away from me on the desk.
    I've used the HDTV all evening and I find it a strain on the eyes for browsing, but great for gaming.


  • Closed Accounts Posts: 13,126 ✭✭✭✭calex71


    I have a laptop connected to a 26" Samsung HDTV with a VGA cable; great for movies, but for normal PC stuff it's headache-inducing.


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 93,581 Mod ✭✭✭✭Capt'n Midnight


    Could you get a projector and have two screens, so you flip up the one near you for games and surf on the wall?


  • Registered Users, Registered Users 2 Posts: 8,067 ✭✭✭L31mr0d


    Why don't you just try the TV as your PC monitor for a week or so and see if you like it?

    The problem with 1080i is that the horizontal lines are interlaced, and this will give you "interfield twitter" if you run the panel at max resolution (1920x1080). The reason for this is that, in reality, only 540 horizontal lines can be displayed at one time on a 1080i panel. It achieves the 1080 horizontal lines by switching between the odd-numbered and the even-numbered lines about 60 times a second.
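    Just to picture what that means, here's a purely illustrative sketch (not how any panel or driver actually does it, and assuming Python with numpy installed) of splitting a full 1080-line frame into the two 540-line fields a 1080i signal alternates between:

```python
# Purely illustrative: split a full 1920x1080 frame into the two 540-line
# fields that a 1080i signal alternates between (~60 fields per second).
import numpy as np

frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)  # dummy frame

top_field = frame[0::2]      # lines 0, 2, 4, ... -> 540 lines
bottom_field = frame[1::2]   # lines 1, 3, 5, ... -> 540 lines
assert top_field.shape == bottom_field.shape == (540, 1920)

# Only one of these 540-line fields is refreshed at any instant, which is why
# detail that is only 1 pixel high (text, thin window borders) can flicker.
```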

    Now in movies or TV, where the picture is more than likely always moving, this interfield twitter isn't noticeable at all to the average human eye. But for PC desktops and webpages with text I personally (having a 42" 1080i hooked up to my HTPC) found it VERY noticeable: the text would twitch and horizontal lines 1 pixel in height would flicker irritatingly. I ended up leaving the panel at 720p for browsing from the couch and just turning up the resolution for movies and TV programs.

    My advice would be to try out the TV as your main PC monitor and see if you like it and don't mind/notice the flicker.


  • Closed Accounts Posts: 2,174 ✭✭✭mathias


    L31mr0d wrote: »
    The problem with 1080i is that the horizontal lines are interlaced, and this will give you "interfield twitter" if you run the panel at max resolution (1920x1080). The reason for this is that, in reality, only 540 horizontal lines can be displayed at one time on a 1080i panel. It achieves the 1080 horizontal lines by switching between the odd-numbered and the even-numbered lines about 60 times a second.

    That's technically incorrect. All flat panels display a progressive image, and will do this at the native resolution of the screen; regardless of input, it's converted to progressive for display. Only CRTs show an interlaced image.
    (There is only one exception to this, and it's proprietary to Hitachi, called ALiS plasma technology, and generally way below par when it comes to plasmas and black levels... The vast majority of flat panels out there only show progressive.)

    Indeed, when it comes to 1080i v 1080p, on a flat panel with a resolution of 1920x1080 there is no difference in picture quality.

    http://www.hometheatermag.com/gearworks/1106gear/

    Any flickering or artifacts on the screen are therefore down to the scaler/electronics on the TV and have nothing to do with interlaced v progressive. The better flat panels on the market have no such issues.

    Having said that...
    Large TVs are generally not suitable for an office environment where you are sitting with your face right up to the screen, as pixel size and viewing distance can make the picture seem very pixelated or grainy, and this can be intensely uncomfortable.

    It's fine if you have an HTPC and surf from the couch with some sort of Bluetooth/RF keyboard, but no good for the office.

    And finally, when using any flat panel for a computer, if you move away from the native resolution of the panel, the picture suffers terribly. There are no exceptions to this rule. You should never do this.


  • Registered Users, Registered Users 2 Posts: 8,067 ✭✭✭L31mr0d


    mathias wrote: »
    That's technically incorrect. All flat panels display a progressive image, and will do this at the native resolution of the screen; regardless of input, it's converted to progressive for display. Only CRTs show an interlaced image.
    (There is only one exception to this, and it's proprietary to Hitachi, called ALiS plasma technology, and generally way below par when it comes to plasmas and black levels... The vast majority of flat panels out there only show progressive.)

    Indeed, when it comes to 1080i v 1080p, on a flat panel with a resolution of 1920x1080 there is no difference in picture quality.

    http://www.hometheatermag.com/gearworks/1106gear/

    Your link is not relevant. It is referring to 1080i and 1080p being equal when displaying a movie or program that has a refresh rate of 24fps. PCs do not adhere to this refresh rate. I will agree that interfield twitter is not present when watching TV and movies that have a refresh rate of 24fps, but then I already said that in my previous post.

    This link is more relevant as it deals with interlacing and its drawbacks:

    http://en.wikipedia.org/wiki/Interlace

    If you try to run a 1080p PC signal through a panel only capable of 1080i, it's going to be interlaced and you will see flicker, as an interlaced 1080p image will NEVER be able to display 1080 horizontal lines at the same time.

    Also see this article:
    Getting my HDTV to work with my PC

    Refer to the question: "Why does my 1080i native set flicker so much?"

    EDIT: I've attached a picture I just generated that should be a good enough test for the OP. It's a 1920x1080 JPG with 540 black and 540 white alternating lines. If your panel can correctly display this resolution you should see no flicker; if it can't, it will be a flickering mess.
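    For anyone who'd rather generate the pattern themselves than download the attachment, here's a rough sketch of how such a test image could be made (assumes Python with Pillow installed; the filename is arbitrary):

```python
# A rough re-creation of the test pattern described above: 1920x1080 with
# alternating 1-pixel black and white horizontal lines. One interlaced field
# lands entirely on the white rows and the other entirely on the black rows,
# so a display that can't really show all 1080 lines at once will visibly flicker.
# Saving as PNG rather than JPEG avoids compression smearing the 1-pixel lines.
from PIL import Image

WIDTH, HEIGHT = 1920, 1080
img = Image.new("L", (WIDTH, HEIGHT), color=0)   # start fully black

for y in range(0, HEIGHT, 2):                    # every even-numbered row
    img.paste(255, (0, y, WIDTH, y + 1))         # paint that 1-pixel line white

img.save("interlace_test_1080.png")
```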


  • Closed Accounts Posts: 2,174 ✭✭✭mathias


    L31mr0d wrote: »
    This link is more relevant as it deals with interlacing and its drawbacks:

    http://en.wikipedia.org/wiki/Interlace

    If you try to run a 1080p PC signal through a panel only capable of 1080i, it's going to be interlaced and you will see flicker, as an interlaced 1080p image will NEVER be able to display 1080 horizontal lines at the same time.

    Also see this article:
    Getting my HDTV to work with my PC

    Refer to the question: "Why does my 1080i native set flicker so much?"

    The conclusions you've drawn from the above are a very common mistake when people are researching this issue on this side of the pond: they read American sites on HDTV, and in America HDTVs are commonly available in CRT versions as well. CRTs can display interlaced; flat panels can't (with that one exception above).

    It's a fact: all flat panels are inherently progressive. They don't do interlaced; it's changed to progressive in the electronics long before you see it. So the flicker in that link is talking about the CRT HDTVs you can get elsewhere; they are not generally available here.
    http://digitalliving.cnet.co.uk/asktheeditors/0,39030511,49290812-1,00.htm

    And it's complete nonsense that a panel cannot show a full image from an interlaced signal; it simply takes the two frames, joins them and displays the image.

    The 1080i v 1080p link is perfectly valid here, where there are no CRT-based HDTVs generally available (there was a Samsung available at one time but they were an almighty flop), and it's not just related to 24fps; read it again.
    It is explaining that 1080 content, regardless of how it's processed, will produce the same output on a native 1080 TV (flat panel of course). It doesn't matter if it's interlaced or not.
    There are many such sites; here's a sample:

    http://www.google.ie/search?hl=en&q=1080i+vs+1080p&meta=

    Edit: to clarify, here's the same guy, essentially talking about the same subject, but this time read the second paragraph.
    http://blog.hometheatermag.com/geoffreymorrison/0807061080iv1080p/

    When flat panels here refer to max 1080i and so on, they are talking about the maximum signal they will accept; they are not talking about how they display.


  • Registered Users, Registered Users 2 Posts: 6,638 ✭✭✭zilog_jones


    mathias wrote: »
    And it's complete nonsense that a panel cannot show a full image from an interlaced signal; it simply takes the two frames, joins them and displays the image.
    That is quite far from the truth. With 50i or 60i video, the display has to de-interlace the picture into 50 or 60 progressive frames. This is done in several different ways, often adaptively depending on the type of motion occurring - but it's far from a perfect science and the TV can often get it wrong, leading to the picture or some elements of the picture flickering, visible "combing" effects, blurring, etc.

    If it "simply takes two frames and joins them" (I assume you mean two fields) when there is motion in inbetween every field, it will just weave the fields together which looks like a mess (like this), and you will be losing temporal resolution as it will now be 25 or 30 Hz instead of 50/60. This happened on my TV when I tried playing a PS1 game on a PS3 over HDMI - I don't know if it was the PS3's fault or my TV just didn't know how to deinterlace 576i over HDMI, but was unplayable.


  • Closed Accounts Posts: 2,174 ✭✭✭mathias


    It de-interlaces the content and makes a progressive signal. Without going into the technicalities of it, my phrase was an attempt to describe it in a non-technical way, as the details are already linked in the above posts at least twice.

    But yes, you are right, some TVs don't de-interlace properly and leave various artifacts.


  • Registered Users, Registered Users 2 Posts: 37,485 ✭✭✭✭Khannie


    mathias wrote: »
    de-interlace properly

    Those two words do not belong in the same sentence IMO. :) Interlaced signals are all well and good (in fact, pixel for pixel, they would provide a better viewing experience than progressive) until you have some sudden motion. Then their drawbacks become all too clear.


  • Subscribers Posts: 6,408 ✭✭✭conzy


    Buy a decent 24" or 30" PC monitor that supports HDCP and maybe has component inputs..

    Then you have the best of both worlds, small pixel pitch for PC use, and the size and inputs for HD / Console use
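    For a rough sense of the pixel-pitch difference, here's a quick back-of-the-envelope calculation (the resolutions are just typical examples for those sizes, not specs of any particular model):

```python
# Pixels per inch (PPI) for a few example displays, to show why a big 720p TV
# looks coarse up close compared with a desktop monitor.
import math

def ppi(h_pixels: int, v_pixels: int, diagonal_inches: float) -> float:
    """Pixels per inch along the diagonal."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

for name, res, diag in [
    ('22" monitor (1680x1050)', (1680, 1050), 22),
    ('24" monitor (1920x1200)', (1920, 1200), 24),
    ('32" 720p TV (1366x768)', (1366, 768), 32),
]:
    print(f"{name}: {ppi(*res, diag):.0f} PPI")

# Roughly 90, 94 and 49 PPI: the TV's pixels are about twice as coarse,
# which is why text at desk distance looks blocky on it.
```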

