
720p vs. 1080i: A Fascinating Argument by ABC's Randall Hoffner


  • #31
    Originally posted by |Mehen|
    Well, I'm not sure what they have for non-digital sources in terms of fps, but I'm sure the higher-end analog cameras go a lot higher than 1080; maybe add a few zeros.

    As for transmitting the signal, DVI works fine at those resolutions and frame rates for computers; no reason it wouldn't for TV. I'd love to be able to play the latest and greatest PC games on an HDTV at 1080p - the computer hardware is out there to do it - with AA/AF too.

    And what about next-gen consoles?

    Every Thursday Stephen Speicher contributes The Clicker, a weekly opinion column on entertainment and technology. The conversation always goes the same way:

    "So 720p is progressive, right?"
    "Correct – that's what the p is for."
    "But 720p has fewer pixels than 1080i, right?"
    "Correct – bigger number and all."
    "But 1080i is only 30 frames per second and is interlaced, compared with 720p's 60 frames of progressive goodness."
    "Correct."
    "So why not just get a 1080p display?"
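
    To put rough numbers on that exchange, here's a quick back-of-the-envelope comparison (a Python sketch; raw pixel counts only, ignoring blanking and chroma subsampling):

        # Raw pixel throughput of the three formats under discussion.
        formats = {
            "720p60":  (1280, 720, 60),   # 60 full progressive frames/s
            "1080i30": (1920, 1080, 30),  # 30 full frames/s (60 interlaced fields)
            "1080p60": (1920, 1080, 60),
        }
        for name, (w, h, fps) in formats.items():
            print(f"{name}: {w*h:>9,} px/frame, {w*h*fps/1e6:6.1f} Mpx/s")

    So 1080i moves more pixels per frame, 720p refreshes whole frames twice as often, and 1080p60 demands roughly double the throughput of either.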


    Jerry Jones

    Comment


    • #32
      Here's someone's comment on that article:

      "To heck with the PS3, I want to hook my 'puter up and play World of Warcraft @ 1080p60!!

      Thru digital please (DVI->HDMI), not analog."

      It seems that RIGHT NOW 1080p might not be all that useful, but almost all of the people who have left comments agree that 1080p is a good future-proof way to go.
      Q9450 + TRUE, G.Skill 2x2GB DDR2, GTX 560, ASUS X48, 1TB WD Black, Windows 7 64-bit, LG M2762D-PM 27" + 17" LG 1752TX, Corsair HX620, Antec P182, Logitech G5 (Blue)
      Laptop: MSI Wind - Black

      Comment


      • #33
        1920x1080 is more pixels than most people have on their desktop computer. I've a pair of 1600x1200 Dell LCD panels, but I'd probably be satisfied with a single 1920x1080 display for most things.

        Good HDTV displays at 1080p could bring the true convergence of computers and TV; right now, any computer stuff displayed on a TV set is a serious compromise in computer usability.

        --wally.

        Comment


        • #34
          Originally posted by wkulecz
          1920x1080 is more pixels than most people have on their desktop computer. I've a pair of 1600x1200 Dell LCD panels, but I'd probably be satisfied with a single 1920x1080 display for most things.

          Good HDTV displays at 1080p could bring the true convergence of computers and TV; right now, any computer stuff displayed on a TV set is a serious compromise in computer usability.

          --wally.

          The problem may be precisely that... the lack of good HDTV displays at 1080p.

          And the reason, according to Peter Putman, CTS (Certified Technology Specialist), is the ***refresh rate*** of consumer HDTVs.

          He makes a compelling point, in my view.

          Why?

          Because 1080/24p content is, in reality, ***down-converted*** to 1080/30i when it is broadcast.

          He acknowledges that it is theoretically possible to broadcast 1080/24p content, ***however,*** current HDTV technology makes that unrealistic.

          Why?

          Because "none of the consumer HDTV sets out there would support the non-standard horizontal scan rate required."

          "And you sure wouldn’t want to watch 24Hz video for any length of time; the flicker would drive you crazy after a few seconds."



          What about 1080/60p?

          Putman makes another compelling point...

          "To cut manufacturing costs, most HDTV sets run their horizontal scan at a constant 33.8 kHz, which is what’s needed for 1080i (or 540p)."

          "1080p scans pictures twice as fast at 67.6 kHz."

          "But most of today’s HDTVs don’t even support external 720p signal sources, which requires a 44.9 kHz higher scan rate."

          Here's another ripper:

          "The leading manufacturer of LCD TVs does not support the playback of 1080p content on its own 1920x1080 products, whether the signal is in the YPbPr component or RGB format."

          "Only the INDUSTRIAL monitor version of this same LCD HDTV can accept a 1920x1080p RGB signal."

          More bad news:

          "To show a 1080i signal, many consumer HDTVs do the conversion from interlaced to progressive scan using an economical, “quickie” approach that throws away half the vertical resolution in the 1080i image."

          "The resulting 540p image is fine for CRT HDTV sets, which can’t show all that much detail to begin with."

          "And 540p is not too difficult to scale up to 720p."

          "But a 540p signal played back on a 1080p display doesn’t cut the mustard."

          "You will quickly see the loss in resolution, not to mention motion and interline picture artifacts."

          (And I've seen these artifacts, myself, in HDTV demonstrations.)

          "Add to that other garbage such as mosquito noise and macroblocking, and you’ve got a pretty sorry-looking signal on your new big screen 1080p TV."

          What about DLP (digital light processing) sets?

          According to Putman:

          Your "1080p TV may not have full horizontal pixel resolution if it uses 1080p DLP technology."

          "The digital micromirror devices used in these TVs have 960x1080 native resolution, using a technique known as “wobbulation” to refresh two sets of 960 horizontal pixels at high speed, providing the 1920x1080 image."

          "To summarize: There are no fast refresh (30Hz or 60Hz) 1080p production or transmission formats in use, nor are there any looming in the near future – even on the new HD-DVD and Blu-ray formats."

          "The bandwidth is barely there for 1080i channels, and it’s probably just as well, because most TVs wouldn’t support 1080p/60 anyway – they’d just convert those signals to 1080i or 540p before you saw them."

          What about 720p?

          More bad news:

          "The 1280x720 progressive-scan HDTV format, which can be captured at full resolution using existing broadcast cameras and survives MPEG-2 compression better than 1080i, doesn’t make it to most HDTV screens without first being altered to 1080i or 540p in a set-top box or in the HDTV set itself!"

          "So what chance would a 1080p signal have?"

          It does make one think.

          Jerry Jones

          Comment


          • #35
            Interesting discussion, guys. Last year at NAB I was a speaker at a post-production conference where we debated the merits of 720p vs. 1080i. I was on the side of 720p. You can get very technical with the arguments, but in the end it comes down to how it looks. Both standards can look very good when done right. The trick, of course, is doing it right.

            From what I have seen, given both standards done well, unless you are sitting very close to a very large, high-quality screen with great source material, it's very hard to tell the difference. When I could tell the difference, I preferred 720p when there was motion and 1080i with static images. Obviously this is because 1080i is effectively progressive with a static or nearly static image.

            I don't like interlaced formats. I don't like dealing with field order, flicker, deinterlacing, and jaggies. But truth be told, the jaggies are less pronounced with 1080i simply because the resolution of the format is so much higher than that of SD interlaced formats.

            The thing about this is the delicate balancing act of resolution and bandwidth. For example, take an SD stream. Assume moderate motion and a high-quality source. Compress it at 1 Mbps and it will look, well, bad. Now compress it at 2 Mbps and it will look dramatically better; by 3 Mbps it's starting to look like decent video. The improvement at 4 Mbps will still be significant, but less than from 2 to 3. By 5 or 6 Mbps you've pretty much topped out the quality, assuming you are using a good compressor and don't have high-motion scenes like water waves.

            So now, if you have 8 Mbps of bandwidth available, would it be better to use it for your SD stream, or to slightly increase the resolution and slightly "starve" the higher-res stream? Of course this is an academic exercise, since we can't broadcast this custom SD+ stream, but I would say the optimum use of the bandwidth would be the slightly higher-resolution format at slightly less bandwidth than it really wants.

            As you have probably noticed, as the resolution of the format gets higher, the artifacts due to insufficient bandwidth become less apparent. A 320x240 stream at a bandwidth 25% below the level that shows no artifacts will look "better" than an SD stream 25% below its required bandwidth for no artifacts. Of course the SD stream will have a higher bandwidth, but the point is that at a certain threshold it's better to move from a lower-definition format with excess bandwidth to a higher-definition one that is slightly starved for bandwidth. More bandwidth is always better, but at a certain point it is wasted unless the source video is encoded at a higher resolution.
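
            To put rough numbers on that, here's a bits-per-pixel comparison (a Python sketch; the 960x540 "SD+" frame size is just a stand-in for a slightly higher-resolution format):

                # Bits per pixel: a crude gauge of how starved a stream is.
                def bits_per_pixel(mbps, width, height, fps):
                    return mbps * 1e6 / (width * height * fps)

                for mbps in (2, 4, 6, 8):
                    print(f"{mbps} Mbps SD (720x480i30): {bits_per_pixel(mbps, 720, 480, 30):.2f} bpp")
                print(f"8 Mbps SD+ (960x540p30):  {bits_per_pixel(8, 960, 540, 30):.2f} bpp")

            At 8 Mbps the SD+ stream gets about 0.51 bits per pixel versus 0.77 for SD at the same bitrate -- slightly starved, but starting from a sharper source.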

            Back to reality: 720p is within the capability of most of the HDTV displays on the market right now. Sure, there are 1080p displays popping up, but most plasmas are right around 720p resolution. When you factor in that 1080i formats show interlacing artifacts on the inherently progressive HD displays, you have a pretty good argument for 720p.

            If you also have a look (no pun intended) at the resolving limits of the eye for various display sizes and viewing distances, you find that you need quite a large monitor, and to be sitting very close to it, to exceed what the eye can resolve. I personally feel that SD looks very sharp up to a 27" screen size and a little soft at 32". I'm talking 4:3 aspect ratio here, so it will hold up even better for widescreen. That said, most of the larger screens being sold today are 45" LCDs and 50" plasmas. A good set with a good 720p signal can look fantastic. I think 1080i might start to show an advantage above 65" or so. Again, this assumes a great set and signal.
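
            The arithmetic behind that, using the common one-arcminute rule of thumb for 20/20 acuity (a sketch; the screen sizes and the acuity figure are assumptions, not measurements):

                import math

                # Distance at which one pixel subtends 1 arcminute (~20/20 acuity
                # limit); sit farther away than this and extra pixels are wasted.
                def max_useful_distance_ft(diagonal_in, h_pixels, aspect=16/9):
                    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
                    pitch_in = width_in / h_pixels
                    return pitch_in / math.tan(math.radians(1 / 60)) / 12

                for diag, hpix, label in [(50, 1280, '50" 720p'),
                                          (50, 1920, '50" 1080p'),
                                          (65, 1920, '65" 1080p')]:
                    print(f'{label}: ~{max_useful_distance_ft(diag, hpix):.1f} ft')

            That works out to roughly 9.8 ft for a 50" 720p set, 6.5 ft for a 50" 1080p set, and 8.5 ft for a 65" 1080p set -- closer than most people actually sit, which is why the extra resolution is so hard to see.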

            It has been my experience that with MPEG-2 compression you can get a good, relatively artifact-free picture with 720p and 1080i (anamorphic HDV format) at about 16 Mbps. This assumes a good multipass encoder and good source material. The problem is that for live events we are at the mercy of the realtime MPEG-2 encoders the broadcasters are using. I saw some very impressive realtime MPEG-2 encoders at NAB last year from JVC.

            Right now I'm concerned less with the format that is streaming into my living room and more with what is done with that signal when it leaves the camera block. If it is shot, encoded, and transmitted properly I believe I will have a great picture in my living room.

            - Mark
            Last edited by Hulk; 10 February 2006, 23:21.

            Core 2 Duo E6400 o/c 3.2GHz - Asus P5B Deluxe - 2048MB Corsair Twinx 6400C4 - ATI AIW X1900 - Seagate 7200.10 SATA 320GB primary - Western Digital SE16 SATA 320GB secondary - Samsung SATA Lightscribe DVD/CDRW- Midiland 4100 Speakers - Presonus Firepod - Dell FP2001 20" LCD - Windows XP Home

            Comment


            • #36
              Here's another great article about the differences between LCDs and PLASMAs:

              (Link: hdtvexpert.com)


              Jerry Jones

              Comment


              • #37
                For me, the PLASMAs seem to be less risky than LCDs.

                Why?

                Because LCDs can develop defective pixels.

                Imagine spending two grand on a new HDTV LCD screen only to have an annoying dead pixel (or multiple dead pixels) appear within a few months of purchase.

                Samsung was rumored to have adopted a "zero defective pixel policy," but it was *only* applicable in South Korea.

                Samsung's USA policy appears to be as follows:



                For a 15" Monitor - 7 or more bad pixels required before replacement is granted;
                For 17" and 19" Monitors -10 or more bad pixels required before replacement is granted;
                For 21" - 24" Monitors - 17 or more bad pixels required before replacement is granted.

                Jerry Jones
                http://www.jonesgroup.net

                Comment


                • #38
                  It appears PLASMAs can also suffer from defective pixels, according to this recent PC WORLD article:



                  "Both types (PLASMAs & DLPs) of TVs are subject to defective pixels."

                  "Five manufacturers in this group provided a pixel-defect policy."

                  "The best was Panasonic's, which provides coverage if more than three pixels are defective."

                  "But Mitsubishi's guarantee that 99.99 percent of its pixels will function would allow up to 104 defects on the 50-inch TV's 1365-by-768-pixel screen."

                  "Most warranties provide coverage for only a year."

                  Jerry Jones

                  Comment


                  • #39
                    "ZBD (0 bright dot) policy for LCD panel
                    The panels of ASUS [PM17TU] LCD Monitors are the best in the business. Even if one bright dot is found, users can exchange for a new panel within one year of purchase. This is a demonstration of ASUS confidence in its quality management and dedication to providing unrivaled products."
                    Q9450 + TRUE, G.Skill 2x2GB DDR2, GTX 560, ASUS X48, 1TB WD Black, Windows 7 64-bit, LG M2762D-PM 27" + 17" LG 1752TX, Corsair HX620, Antec P182, Logitech G5 (Blue)
                    Laptop: MSI Wind - Black

                    Comment


                    • #40
                      Originally posted by |Mehen|
                      "ZBD (0 bright dot) policy for LCD panel
                      The panels of ASUS [PM17TU] LCD Monitors are the best in the business. Even if one bright dot is found, users can exchange for a new panel within one year of purchase. This is a demonstration of ASUS confidence in its quality management and dedication to providing unrivaled products."
                      Here's the link:

                      Asus has introduced the PW191, the company's first 19-inch LCD display, which features a 1440 x 900 native resolution, an 8ms response time, and a 600:1 contrast ratio. It also has the same "zero bright-dot" guarantee as some of Asus's other products, which means it can be returned for a refund if there's even a single "bright" (read: dead) pixel on the display.


                      I hope the other manufacturers do the same.

                      Good for ASUS!

                      Jerry Jones

                      Comment


                      • #41
                        Originally posted by wkulecz
                        Whatever the technical details, ABC got it fixed for the Super Bowl. HDTV quality was as good as anything I've seen yet. If they use the same system for Monday Night Football, I may start watching it again.

                        It did seem the audio had a poor mix -- ambient noise often made the commentary unintelligible, not that I really paid much attention to what they were saying.

                        --wally.

                        OTOH, ESPN-HD's (owned by ABC) broadcast of Sunday's Pro Bowl was horrible. Not widescreen, and poor quality -- even worse than FOX's NFL coverage has often been. They should be embarrassed to have overlaid the "HD" logo on the broadcast.

                        --wally.

                        Comment


                        • #42
                          Wow:



                          "Japan's Sony said on Tuesday that about 400,000 Bravia brand LCD and rear-projection TVs launched last year may malfunction due to faulty software."

                          Jerry Jones

                          Comment
