
1080p or 1080i?

  • 27-09-2010 6:30pm
    #1
    Registered Users, Registered Users 2 Posts: 395 ✭✭


    Can anyone tell me what the difference is between 1080p (my TV) and 1080i?
    My DTT box is 1080i when tuned in.


Comments

  • Registered Users, Registered Users 2 Posts: 685 ✭✭✭lgs 4


    jamescc wrote: »
    Can anyone tell me what the difference is between 1080p (my TV) and 1080i?
    My DTT box is 1080i when tuned in.
    1080p is Full HD, 1080i is HD Ready. To put it in simple terms, Full HD has more pixels per square cm.


  • Registered Users, Registered Users 2 Posts: 32,417 ✭✭✭✭watty


    i = interlace
    Broadcast in Europe is either 576i or 1080i: half the lines (the even ones) are sent in one 1/50th-of-a-second field, and the other half, in between, gives the alternate (odd) lines in the next 1/50th of a second. These two fields of 288 or 540 lines give 25 frames per second of 576 or 1080 lines.

    p= progressive
    There is no interlacing. All the lines are drawn sequentially.

    If your TV can't display 1080i natively as well as 1080p, it has to "deinterlace". This works perfectly for content originating on film (inherently not interlaced) but blurs the picture a bit, especially on moving content that comes from an interlaced camera or is standards-converted from US content (480i, 480p, 720p or 1080i), as their frame rates are 30 or 60 fps compared with our 25 or 50.

    Look it up on Wikipedia.
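[Editor's note: the field structure described above can be sketched in a few lines of Python. This is a toy illustration only - the function names are made up, and strings stand in for real scanlines.]

```python
# Toy model of interlaced transmission: each 1/50 s field carries half
# the lines of a frame; two successive fields make one full frame.

def split_into_fields(frame_lines):
    """Split a full frame (a list of scanlines) into its two fields."""
    even_field = frame_lines[0::2]   # lines 0, 2, 4, ...
    odd_field = frame_lines[1::2]    # lines 1, 3, 5, ...
    return even_field, odd_field

def recombine(even_field, odd_field):
    """Interleave two fields back into one full frame (the static-image case)."""
    frame = []
    for e, o in zip(even_field, odd_field):
        frame.extend([e, o])
    return frame

frame = [f"scanline {i}" for i in range(1080)]   # one 1080-line frame
even, odd = split_into_fields(frame)
assert len(even) == len(odd) == 540              # 540 lines per 1080i field
assert recombine(even, odd) == frame             # a static frame survives intact
assert 50 // 2 == 25                             # 50 fields/s -> 25 frames/s
```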


  • Registered Users, Registered Users 2 Posts: 395 ✭✭jamescc


    ok thanks


  • Closed Accounts Posts: 3,683 ✭✭✭Kensington


    lgs 4 wrote: »
    1080p is Full HD, 1080i is HD Ready. To put it in simple terms, Full HD has more pixels per square cm.
    Not correct. 1080i and 1080p both have exactly the same number of pixels - 1920x1080. The difference between them is in the way the image is displayed on your screen - interlaced vs progressive.

    1080i sends all of the odd lines of each frame first and, once all of the odd lines are sent, then sends all of the even lines (or vice versa).
    1080p sends all of the lines in one complete go.

    EDIT: Beaten by Watty!


  • Registered Users, Registered Users 2 Posts: 15,852 ✭✭✭✭The Cush


    Both Irish and UK HD DTT are 1080i. I don't know of any DTT network that uses 1080p.


  • Registered Users, Registered Users 2 Posts: 395 ✭✭jamescc


    The Cush wrote: »
    Both Irish and UK HD DTT are 1080i. I don't know of any DTT network that uses 1080p.

    Sorry - when I set the box up for first use, it set up HDMI with 1080i, but I don't know whether it makes any difference that my TV set uses 1080p (would it work?).


  • Posts: 0 [Deleted User]


    I don't know of any broadcast network (full stop) that uses 1080p; it requires a colossal amount of bandwidth. The only things you'd really see it on would be games consoles or Blu-ray discs.


  • Registered Users, Registered Users 2 Posts: 32,417 ✭✭✭✭watty


    1080p uses twice the bandwidth.

    DVDs and Blu-rays are often stored "progressive", but at 24fps or 25fps*; the 60Hz/30fps version is 3:2 pulled down to interlace, while 50Hz/25fps is a 1:1 mapping.

    [*25fps is done by playing the film at 25fps and pitch-correcting the audio by 24/25ths.]

    Browse Wikipedia.
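[Editor's note: the rate arithmetic behind 3:2 pulldown and the PAL speedup can be sanity-checked in Python. This is just rate bookkeeping, not a claim about any particular encoder.]

```python
from fractions import Fraction

film_fps = 24

# NTSC 3:2 pulldown: 4 film frames map onto 10 fields (pattern 2,3,2,3),
# turning 24 fps film into 60 fields per second.
fields_per_4_frames = 2 + 3 + 2 + 3
ntsc_field_rate = Fraction(film_fps, 4) * fields_per_4_frames
assert ntsc_field_rate == 60

# PAL: the film is simply played 1/24th faster (25 fps), 1:1 with 50Hz fields,
# and the audio is pitch-corrected by 24/25 to compensate.
pal_speedup = Fraction(25, 24)
audio_pitch_correction = 1 / pal_speedup
assert audio_pitch_correction == Fraction(24, 25)

# The bandwidth point: 1080p50 carries twice the lines per second of 1080i25.
lines_per_second_1080i = 540 * 50    # one 540-line field per 1/50 s
lines_per_second_1080p = 1080 * 50   # one full 1080-line frame per 1/50 s
assert lines_per_second_1080p == 2 * lines_per_second_1080i
```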


  • Registered Users, Registered Users 2 Posts: 15,852 ✭✭✭✭The Cush


    Karsini wrote: »
    I don't know of any broadcast network (full stop) that uses 1080p; it requires a colossal amount of bandwidth. The only things you'd really see it on would be games consoles or Blu-ray discs.

    Some of the US satellite networks (e.g. DirecTV) use it, I believe, and Austria's ORF uses it for satellite distribution between regional centres.


  • Registered Users, Registered Users 2 Posts: 32,417 ✭✭✭✭watty


    Ah, but in the USA they have terrible 3:2 pulldown artefacts from film. If it's a film channel on US satellite it might be 30fps. They also use 720p: 1440x720 x 60 = 1920x1080 x 30, approximately - again because of 3:2 pulldown. I think they may also have fake-HD progressive on terrestrial, at about 840 x 640p.


  • Closed Accounts Posts: 3,683 ✭✭✭Kensington


    The Cush wrote: »
    Some of the US satellite networks (e.g. DirecTV) use it, I believe, and Austria's ORF uses it for satellite distribution between regional centres.
    DirecTV do use it all right, but I think only on their on-demand content, not on linear broadcast.
    jamescc wrote:
    Sorry - when I set the box up for first use, it set up HDMI with 1080i, but I don't know whether it makes any difference that my TV set uses 1080p (would it work?).
    It doesn't make much of a difference. All broadcast on DTT at present is interlaced anyway; if your box had a 1080p setting, all it would be doing is deinterlacing the content to 1080p. You would ultimately still be viewing an interlaced broadcast. A 1080p TV just means that it can accept a native 1080p signal.


  • Registered Users, Registered Users 2 Posts: 2,711 ✭✭✭fat-tony


    watty wrote: »

    If your TV can't display 1080i natively as well as 1080p, it has to "deinterlace". This works perfectly for content originating on film (inherently not interlaced) but blurs the picture a bit, especially on moving content that comes from an interlaced camera or ...
    Just want to verify something with you, watty. Don't all home plasma/LCD panel TVs have to deinterlace broadcast signals in any case, as the display is progressive in nature? Obviously Blu-ray or similar signals are (or can be) progressive, so no deinterlacing is being done by the TV there. I thought that only CRT displays used interlacing, and that all conventional panel TVs were progressive. Or are some panels actually using interlacing?


  • Registered Users, Registered Users 2 Posts: 32,417 ✭✭✭✭watty


    CRTs for TV are interlaced.

    CRTs for monitors are progressive (except millions of years ago, when only the highest resolutions were interlaced, e.g. 800x600@60Hz progressive but a horrible 1024x768@43Hz interlaced).

    Plasma, DLP, LCoS, LCD etc. can be natively progressive, natively interlaced (either or both), or refreshed in blocks. The native refresh rate may not even match the video rate, which blurs the image. Even native "progressive" panels rarely update the screen in the actual progressive manner of a CRT, which draws one line at a time sequentially with a moving spot. Larger panels are typically two halves, each with separate row and column controllers, or even four quadrants. Lower-resolution panels may use a single matrix of rows and columns and, depending on the design, have a fixed progressive refresh or a variable refresh that can be progressive, interlaced, or even pairs of lines at a time at half resolution.

    Some HD Ready natively progressive panels don't even deinterlace 1080i! They throw away half the lines and resample the remaining 540 to fit the screen, i.e. they progressively display only alternate fields (all odd or all even).

    Most sub-11" LCDs are 235 to 240 lines, i.e. half of 525-line NTSC's 480 visible lines, so they only show alternate fields (all odd or all even). For PAL 625 (or its digital version) with 576 visible lines, they just display 235 or so of the 288 lines of alternate fields.

    You are now sorry you asked.

    The best solution is a panel that natively supports both progressive and interlaced modes at different frame rates. Many 1366x768 HD Ready sets are 60Hz progressive (really WXGA PC panels) and thus have to convert everything, since here we use 50Hz fields at 25fps: 288 or 540 lines 50 times a second, giving a full frame of 576 or 1080 lines 25 times a second. Since all film source is 24fps, "full" progressive at 50Hz is no advantage.
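[Editor's note: the "throw away a field and resample 540 lines to fit the screen" shortcut described above can be modelled in a few lines. This is a hypothetical nearest-neighbour scaler for illustration, not any real set's algorithm.]

```python
def cheap_hd_ready_display(field_540, panel_lines=768):
    """Toy model of an HD Ready set that ignores one field of a 1080i frame
    and nearest-neighbour-stretches the remaining 540 lines to panel height."""
    out = []
    for row in range(panel_lines):
        src = row * len(field_540) // panel_lines   # nearest source line
        out.append(field_540[src])
    return out

field = list(range(540))              # one 540-line field of a 1080i frame
shown = cheap_hd_ready_display(field)
assert len(shown) == 768              # fills a 1366x768 panel...
assert len(set(shown)) == 540         # ...but from only 540 distinct source lines
```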


  • Registered Users, Registered Users 2 Posts: 2,711 ✭✭✭fat-tony


    Hmmmm - thanks for the extensive reply, I think! Not sure that I am any the wiser, though! I would have thought that a standard - say, a Panasonic 42" Full HD TV with a 1920x1080 panel - would display images progressively. That is, it would take two successive fields, combine them (using various deinterlacing strategies) and display them as one progressive frame at 25 frames per second. I know there are all kinds of refresh-rate gimmicks ("enhancements") used in TVs (my own Panny uses 100Hz), but I wasn't aware that any panel actually used interlacing to display fields - I thought that was only a CRT thing, using phosphors with a decay time suited to the viewer's persistence of vision. Can you point me at a TV model which interlaces its display? It's something which has always intrigued me.


  • Registered Users, Registered Users 2 Posts: 32,417 ✭✭✭✭watty


    Sorry... no

    Since there are two fields with half the lines every 1/50th of a second, progressive display at 25fps would not be hard, but it would reduce the temporal resolution.

    The cleverer and more usual deinterlacers try to fill in the alternate lines from the alternate 50Hz field to give 50fps.

    Read about bob, weave, blend and inverse telecine: http://en.wikipedia.org/wiki/Deinterlace - WARNING: the article contains contentious errors.

    You can ONLY do perfect deinterlacing if the source was progressive at the same frame rate as the interlaced video.

    i.e. 25fps film (24fps film converted to PAL) will perfectly deinterlace to 50fps progressive.
    But material that originated as 50fps progressive, or as true 25fps interlaced (50 fields per second), cannot be deinterlaced from 576i or 1080i to 50fps progressive without either some artefacts or blurring.

    LCD and plasma panels do in fact have inherent persistence. If you use high-speed photography on a supposedly 5ms-response LCD, you will see a trail.

    So for Broadcast TV an interlaced native display is best. For DVD or BD it doesn't really matter. For PC at close distance, progressive is best.

    The USA has a problem: 24fps film doesn't go to 30fps interlace without artefacts. It's not a problem for the 25fps world.
    See http://en.wikipedia.org/wiki/Telecine#2:3_pulldown
    (Only NTSC 30fps world issue, i.e. North America, Japan)

    The eye also has more persistence of vision in the centre of view than at the edges. Flicker, if noticeable at all, is most noticeable at the edge of vision (for detecting predator attacks?).



    So the ideal is a TV that does 1080p for PC, games and consoles (and for US DVD and BD, but here it doesn't matter!) and does NATIVE 1080i on the display for broadcast TV. Only horizontal lines and edges a single line high will have 25Hz flicker; with the natural lag of the screen and the persistence of the eye, it's not much worse than on a CRT unless you catch it at the edge of your vision.

    Note that 1920x1080i and 1920x1080p Full HD are the SAME resolution. 50fps progressive simply has twice the refresh rate for horizontal detail less than two lines high, compared with 25fps / 50-fields-per-second interlace.

    Deinterlacing without annoying artefacts always reduces resolution.

    No manufacturer I've seen explains how the panel is actually refreshed, or advertises native interlace support - partially because of the issue of film content in the USA's interlaced market.

    See also http://en.wikipedia.org/wiki/HDTV_Blur
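[Editor's note: the two basic deinterlacing strategies mentioned above, "bob" and "weave", can be sketched on toy data. The strings stand in for scanlines; this is an illustration, not production video code.]

```python
def bob(field):
    """'Bob': stretch one field to full height by doubling each line.
    Full temporal rate (one output frame per field), but half the
    vertical detail."""
    out = []
    for line in field:
        out.extend([line, line])
    return out

def weave(top_field, bottom_field):
    """'Weave': interleave two successive fields into one frame.
    Full vertical detail, but combing artefacts wherever there was
    motion in the 1/50 s between the two fields."""
    frame = []
    for a, b in zip(top_field, bottom_field):
        frame.extend([a, b])
    return frame

top = ["A0", "A2", "A4"]      # field 1: lines 0, 2, 4
bottom = ["B1", "B3", "B5"]   # field 2 (1/50 s later): lines 1, 3, 5

assert bob(top) == ["A0", "A0", "A2", "A2", "A4", "A4"]
assert weave(top, bottom) == ["A0", "B1", "A2", "B3", "A4", "B5"]
```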


  • Registered Users, Registered Users 2 Posts: 2,711 ✭✭✭fat-tony


    Thanks watty. So in practice, no domestic TV panel displays its output in an interlaced fashion - it's progressive! When you refer to native interlaced panels as being best, you are referring to an ideal world scenario - yes?


  • Registered Users, Registered Users 2 Posts: 32,417 ✭✭✭✭watty


    I didn't say that at all.

    What I implied was that if a TV is marketed as "1080p Full HD", you actually have no idea how it displays 1080i - or indeed how it handles a 1080p input either.

    Some may only deinterlace; some may be able to natively display 1080p50 or 1080i25; and some might have a menu option to switch interlacing on/off, which would suggest the ability to display native interlacing.


  • Registered Users, Registered Users 2 Posts: 2,711 ✭✭✭fat-tony


    I don't really mean to be a "pedantic Pete" on this, but you are inferring - no, make that stating - that "So for Broadcast TV an interlaced native display is best" (post #16). However, there is no easy way to verify whether any particular TV model supports interlaced display natively - so how do you choose the "best"? I've got a Panasonic plasma and a Sony LCD, and neither has any facility in the user/setup menus which refers to interlacing. I'd take a small wager that not too many TVs have.

    Deinterlacing methods are most often mentioned in codec settings on PCs etc., where you are taking TV-originated material and transcoding it to another format, or optimising the display of the data on the PC screen. Obviously, if you can deinterlace in advance, in software or with hardware support, then you can use an optimal strategy to present the data in progressive format to the PC display or TV, which saves on subsequent processing.


  • Registered Users, Registered Users 2 Posts: 32,417 ✭✭✭✭watty


    Choosing the "best" of anything electronic is getting very fraught:
    • Less technical information is published than ever before.
    • Outright lies from marketing.
    • Misrepresentation (e.g. "LED TVs" that are LED-backlit LCDs, stereoscopic sets called "3D").
    • Cost-reduction engineering, meaning later models may be poorer.
    • Fashion (shiny screens are cheaper and worse than matte, but promoted as better).
    • 1080p promoted as higher resolution than 1080i.
    • Equipment widely sold that is unsuitable for the local market.
    • Equipment marked CE /!\ that is actually illegal in Ireland.
    • "Best" can depend on your application.
    • HD Ready TVs that only display 540 lines resampled to fit the screen.
    • HD sets that can only ever be HD monitors (no tuner that does HD anywhere).
    Much more :(

    Go and look at the TV running off a live broadcast, not a demo disc.
    Search the Internet for the model, and note that negative or positive points may be wrong, as most people don't know what they're talking about. Check other posts by the same person on similar subjects.

    I've seen PC software that is much worse than some TV sets at deinterlacing. You are right that a lot of real-time PC deinterlacing is terrible (look at sideways-scrolling text), but good encoding packages can do better. Real time needs dedicated hardware: unless the GPU on the graphics card does it well, real-time SD or HD display on a PC is rubbish. Even with HDMI, few PCs output 1080i; 1080p is the PC output standard. Graphics cards that produce decent 576i (SD video) are few and far between because of the dominance of 60fps progressive.
    Ignore reviews at online sellers.

