
Quality: YUY2 vs. HuffYUV 2.1.1 vs. Matrox MJPeg


  • Quality: YUY2 vs. HuffYUV 2.1.1 vs. Matrox MJPeg

    There has been some discussion in the forum recently concerning the relative sharpness of various capture formats. After much back and forth on this I decided that some actual tests were in order.

    How to do it? Here's what I came up with:

    1. Send a live video feed of a test image to the capture card. I used a U.S. Air Force lens test target and a Sony Hi8 cam. The target was 3 meters from the cam and the lens was adjusted so the target filled the frame.

    2. Capture 1000 video frames to 704x480 *.avi's using each of the formats under discussion:

    YUY2 raw

    HuffYUV 2.1.1 @ "best"

    Matrox MJPeg @6.6:1 compression

    3. Import these clips into Premiere 6.0 and export a 704x480 *.bmp from the middle of each *.avi.

    4. Crop the images to allow for faster downloading, then join them into a single *.bmp in Photoshop and label them.

    5. Convert this image to a 100% quality first-generation JPEG to make it small enough for downloading over a 56k connection (150k).

    The resulting images show the actual image quality of the video capture, not what shows up on the monitor during playback or in PC-VCR etc. Guess which is most important?

    You can judge their quality and sharpness for yourselves. The page may be a tad slow on a 56k connection due to the image being 150k in size.

    http://hometown.aol.com/videoschool/...age/index.html
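
    For anyone who wants to repeat the comparison, the frame extraction and a crude sharpness measurement can be scripted. Below is a rough sketch in Python using OpenCV; the file names are just placeholders for your own captures, and the variance-of-Laplacian figure is my own pick of sharpness metric, not something used in the test above.

        # Sketch: pull the middle frame from each capture and score its
        # sharpness. File names are hypothetical placeholders.
        import cv2

        CLIPS = {
            "YUY2_raw": "yuy2.avi",
            "HuffYUV_best": "huffyuv.avi",
            "Matrox_MJPeg_6.6to1": "mjpeg.avi",
        }

        def middle_frame(path):
            cap = cv2.VideoCapture(path)
            count = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
            cap.set(cv2.CAP_PROP_POS_FRAMES, count // 2)  # seek to mid-clip
            ok, frame = cap.read()
            cap.release()
            if not ok:
                raise IOError("could not read a frame from " + path)
            return frame

        def sharpness(frame):
            # Variance of the Laplacian: more high-frequency detail
            # (edges) gives a higher number.
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            return cv2.Laplacian(gray, cv2.CV_64F).var()

        for label, path in CLIPS.items():
            frame = middle_frame(path)
            cv2.imwrite(label + ".bmp", frame)  # step 3: export a still
            print(label, "sharpness =", round(sharpness(frame), 1))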

    Yes, HuffYUV does optimize the YUY2 frame a bit. IMHO this makes it even more desirable to use.

    IMHO MJPeg is NOT sharper by any stretch. The Gibbs artifacts and quantization used in DCT codecs prevent that.

    Dr. Mordrid


    [This message has been edited by Dr Mordrid (edited 16 January 2001).]

  • #2
    OK Doc, you have proven that the Marvel (or RR-G) works properly in NTSC. But look at the two pictures published by MGU2, which were captured from a PAL card.
    URL's:

    http://www.geocities.com/mgu222/yuy2.jpg

    http://www.geocities.com/mgu222/mjpeg.jpg

    Look more specifically at the words BBC and WORLD in the upper left corner. The deformation you see on the B's and R in the YUY2 sample is typical of a badly implemented deinterlacing algorithm or a badly implemented resizing algo. Also, see how the word Washington, big and bold enough that the horizontal lines of the letters are two scan lines wide, does not show the same problems. Now the problem is that these deinterlacing errors definitely blur an everyday picture (I mean not a picture made of text, but a common scene) far too much for us PAL users.
    The picture when just watching TV in PC-VCR is of the same bad quality. And what is even worse, even if you capture at 352x288 (only one PAL field), the same "blurring" is present, despite the fact that no deinterlacing should be necessary at half res.
    Sorry Doc, but for us Europeans, the Marvel is useless. And this has nothing to do with the phase alternation in the colour coding.
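    To make the blurring mechanism concrete, here is a toy illustration (my own sketch; I am only assuming the driver does something like a simple line blend, which is a guess, not confirmed behaviour):

        # A one-scanline-high stroke, like the thin bars of the letters
        # in MGU2's BBC WORLD capture.
        import numpy as np

        frame = np.zeros((8, 8))
        frame[3, :] = 1.0

        # Blend each line with its neighbours, as a crude deinterlacer
        # might. Detail that is only one line tall gets smeared over
        # three lines; two-line-tall strokes (like "Washington") survive
        # far better.
        blended = frame.copy()
        blended[1:-1] = (frame[:-2] + 2 * frame[1:-1] + frame[2:]) / 4.0

        print(frame[2:5, 0])    # [0. 1. 0.]  -> crisp
        print(blended[2:5, 0])  # [0.25 0.5 0.25] -> smeared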
    Michka


    [This message has been edited by Michel Carleer (edited 16 January 2001).]
    I am watching the TV and it's worthless.
    If I switch it on it is even worse.



    • #3
      I'd be very interested in seeing images done as above using a PAL camera. What would be even more interesting would be to see a set done with the color off (saturation=0 in the capture proggie) and at normal settings with color.

      I still think this effect has something to do with the phase shift in alternating PAL fields, among other things.

      This article from the U of Surrey's electronic engineering school website seems to indicate why NTSC may appear sharper, and gives at least two reasons for loss of sharpness in PAL:

      NTSC/525 Advantages

      Higher Frame Rate - Use of 30 frames per second (really 29.97) reduces visible flicker.

      Atomic Colour Edits - With NTSC it is possible to edit at any 4 field boundary point without disturbing the colour signal.

      Less inherent picture noise - Almost all pieces of video equipment achieve better signal to noise characteristics in their NTSC/525 form than in their PAL/625.


      PAL/625 Disadvantages

      More Flicker - Due to the lower frame rate, flicker is more noticeable on PAL/625 transmissions; particularly so for people used to viewing NTSC/525 signals.

      Lower Signal to Noise Ratio - The higher bandwidth requirements cause PAL/625 equipment to have slightly worse signal to noise performance than its equivalent NTSC/525 version.

      Not a good start at all.

      Loss of Colour Editing Accuracy - Due to the alternation of the phase of the colour signal, the phase and the colour signal only reach a common point once every 8 fields/4 frames. This means that edits can only be performed to an accuracy of +/- 4 frames (8 fields).

      That is not good....

      Variable Colour Saturation - Since PAL achieves accurate colour through cancelling out phase differences between the two signals, the act of cancelling out errors can reduce the colour saturation while holding the hue stable.
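
      To put numbers on that editing point, a quick sketch (mine, not from the Surrey article; the frame count is just for illustration):

          # Legal composite cut points: the colour sequence repeats every
          # 4 fields (2 frames) in NTSC but every 8 fields (4 frames) in PAL.
          def legal_cuts(total_frames, fields_per_cycle):
              step = fields_per_cycle // 2  # frames per colour cycle
              return [f for f in range(total_frames) if f % step == 0]

          print(legal_cuts(12, 4))  # NTSC: [0, 2, 4, 6, 8, 10]
          print(legal_cuts(12, 8))  # PAL:  [0, 4, 8]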

      Anyone want a *.pdf of the test pattern? You just print it out to an 8x11 page of bright white paper and start shooting.

      Dr. Mordrid


      [This message has been edited by Dr Mordrid (edited 17 January 2001).]



      • #4
        Doc,
        You said:
        >
        I still think this effect has something to do with the phase shift in alternating PAL fields. As to if the problem is the phase shift itself or Matrox's treatment of same....
        <
        Being used to watching PAL TV, I do not think the effects you see in MGU2's pictures are due to phase alternation. That would produce hue errors, not luminance errors. Unless of course the Matrox implementation of PAL is so weird that the colour information leaks quite badly into the luminance processing. But even so, it would produce diagonal moving lines of successively brighter and darker luminance.
        Another point is that when watching TV, and this is exactly what MGU2 also found, the picture quality on the monitor suffers from the same artefacts. Now the strange thing is that if you start recording in MJPEG, the picture on the monitor suddenly becomes better, stays so as long as you record, and goes back to the fuzzy state when you stop recording. If it were a phase alternation problem, capturing or not would not change anything, would it?
        The difference in picture treatment (though without saying that it is deinterlacing) was confirmed to me some time ago by Haig of Matrox tech support. He told me that they cannot treat the picture the same way during capture, because the Marvel captures without merging the two fields, as it should. Obviously he did not understand me correctly and supposed I wanted the picture to be deinterlaced during capture as well, whereas I wanted exactly the contrary: no deinterlacing at all, even when not capturing.

        Unfortunately, I won't be able to capture your test pattern because I removed all traces of the Marvel drivers on my PC, and I don't want to reinstall them because they will replace the VfW registry entries for the drivers of my Pinnacle PCTV, which would then stop working.
        What I can do, however, is try to dig up pictures that I captured some 6 months ago that will show you the difference in quality between the Marvel and the PCTV. They are (or were) snapshots taken by the Marvel and the PCTV software, and they exactly reproduce what is seen on screen during normal TV watching. I don't think I captured B/W pictures though; I'll check. In addition, I don't have a web page to show them on. That's why I never published them before.
        By the way, Doc, are you a real true doctor (I mean a doctor in Medicine)? Or a false doctor like me (PhD in Physical-Chemistry)?
        Michka

        While I was writing my piece of *put the word you like here*, Doc added some quotes. There are things in the quotes which I don't understand. Particularly: the higher bandwidth required by PAL/625. In my view, 25 x 625 = 15625 and 30 x 525 = 15750 scan lines per second will give you about the same bandwidth, unless the horizontal resolution is also far lower for NTSC than for PAL. Is this the case?
        Also, the paragraphs about colour editing accuracy talk about cutting and inserting in a clip WORKING ON THE COMPOSITE SIGNAL, with the colour encoded as phase and amplitude of the subcarrier. That does NOT pertain AT ALL to editing a YUV or RGB or whatever digitized picture, where the notion of phase does not exist. Once you get to the Y, U and V components (or R, G, B), you can't talk about phase alternation, because there is no phase at all anymore. The phase is only a way of encoding the hue on the subcarrier; it is not part of YUV or RGB. Otherwise, you would not be able to cut your clips at arbitrary picture boundaries in Premiere or MSP6, not even with NTSC (it takes 4 fields to get to the same phase/colour point in NTSC, says Doc's quote).

        [This message has been edited by Michel Carleer (edited 17 January 2001).]




        • #5
          Actually, the PAL phase alternation prevents hue errors. It's saturation errors that PAL is prone to because of the phase shift, and those can occur on several consecutive lines. The result, according to the tech literature:

          vertical luma resolution

          NTSC: 480
          PAL: 576

          So far, so good. We know you PAL guys have more scanlines and luma is resolved at face value in both systems.

          vertical chroma resolution

          NTSC: 480
          PAL: 200

          This is where it looks like PAL is losing it. The phase shift and its tendency to spread saturation errors vertically take their toll on vertical color resolution. NTSC doesn't spread color errors from one scanline to another.

          An interesting way to confirm this would be for a PAL user to take the USAF pattern and test it with both mono (saturation = 0 in the capture proggie) and color video captures and frame extractions.

          The bandwidth issue refers to PAL's higher luma bandwidth (subcarrier at 4.43 MHz vs. NTSC's 3.58 MHz). More room for luma = less room for chroma = worse color s/n ratio.

          Dr. Mordrid


          [This message has been edited by Dr Mordrid (edited 17 January 2001).]



          • #6
            Agreed, the PAL system lowers the chroma vertical resolution more than NTSC. Agreed too, if the phase alternation enables more accurate hues to be recovered, it may well lead to an underestimation of the saturation. That's because the length of the sum of two vectors is equal to the sum of their lengths only if the two vectors are aligned (the angle between them is zero). The higher the phase error, the smaller the recalculated saturation. At the limit, if the angle between the two is 180°, the length of the sum vector is the difference of the lengths of the two vectors. But there are ways to deal with this. After all, you can calculate the mean value of the length and the mean value of the phase angle independently, and not perform a vector sum.
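
            A quick numeric check of that vector argument (my own sketch; the saturation of 1.0 and the 20-degree error are arbitrary illustrative values):

                # Two chroma vectors with equal saturation but opposite
                # phase errors, as PAL's line alternation produces.
                import cmath, math

                S = 1.0
                err = math.radians(20)
                v1 = S * cmath.exp(1j * err)    # line n
                v2 = S * cmath.exp(-1j * err)   # line n+1, phase alternated
                avg = (v1 + v2) / 2

                print(abs(avg))                        # 0.9397 = cos(20 deg): saturation shrinks
                print(math.degrees(cmath.phase(avg)))  # 0.0 -> hue fully corrected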

            Now Doc, did you ever wonder why the frequency of the colour subcarrier is so weird (4.43 MHz for PAL and 3.58 MHz for NTSC)?
            If you don't know, here is the explanation:

            Remember, to start with, TV only transmitted B/W pictures plus sound on a sound subcarrier frequency (5.5 MHz for PAL; I don't remember the figure for NTSC). When a means of transmitting colour was devised, the engineers wanted the old B/W TV sets to still be able to receive the TV signals and show them properly, albeit in B/W only. They also did not want to increase the total signal bandwidth. The colour signal had to be forward compatible.
            Then came a bright idea. The TV picture is decomposed into successive scan lines for transmission. In a real-world picture, the luma signal on successive scan lines does not vary considerably. The result is that if you look at the frequency spectrum of the luma signal, almost all the energy is concentrated around the line scanning frequency (15625 Hz for PAL and 15750 Hz for NTSC) and its harmonics. So, looking at the luma frequency spectrum, you will see sharp peaks centered around (for PAL) 15625, 31250, 46875, 62500, ... Hz. Moreover, a real-world picture shows far more slow variations of the luma than fast ones, so the intensity of these peaks decreases as you climb the harmonics ladder. Of course, the chroma signal shows exactly the same behaviour. Both the luma and chroma spectra look like a comb.
            So the idea was to choose the frequency of the colour subcarrier to be a multiple of the line frequency plus 0.5. Then all the chroma peaks fall exactly in-between the luma peaks, and it is possible to sort of "interleave" the colour information into the luma bandwidth and still be able to separate the two at reception. You must certainly have read in recent TV, VCR or capture card specs, or the publicity for them, about the use of comb filters.
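            The actual figures can be checked in a couple of lines (note the real offsets are slightly messier than a plain +0.5: colour NTSC moved the line rate to about 15734 Hz, and PAL in fact uses a quarter-line offset plus 25 Hz, but the interleaving principle is exactly as described):

                # Standard colour subcarrier frequencies.
                LINE_NTSC = 4500000 / 286      # colour NTSC line rate, ~15734.27 Hz
                LINE_PAL = 15625.0

                f_sc_ntsc = 227.5 * LINE_NTSC              # half-line offset
                f_sc_pal = (283 + 3 / 4) * LINE_PAL + 25   # quarter-line offset + 25 Hz

                print(round(f_sc_ntsc, 2))  # 3579545.45 Hz, between luma harmonics
                print(round(f_sc_pal, 2))   # 4433618.75 Hz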
            What does this mean in the end? That the lower frequency of the NTSC colour subcarrier does not, by itself, imply that the NTSC luma bandwidth is smaller.
            OK, this does not mean it is the same either, only that the luma bandwidth is not tightly linked to the frequency of the chroma subcarrier and can be chosen almost independently. I don't have at hand the actual values that were chosen for each system.
            On the subject of chroma resolution I also want to add that, if I remember correctly, for both NTSC and PAL the frequency excursion of the colour subcarrier (resulting from the phase and amplitude modulation) is limited to 1 MHz. So, if I am not mistaken, the chroma bandwidth is roughly the same for both transmission systems. Which in turn means that if the chroma vertical resolution is higher for NTSC, it must be lower horizontally.

            Doc, I feel that the lecture we have been giving to the other members of this forum (if some of them are still reading) about TV picture coding and transmission is a bit scrambled indeed.

            Michka



            • #7
              Originally posted by Michel Carleer:
              Doc, I feel that the lecture we have been giving to the other members of this forum (if some of them are still reading) about TV picture coding and transmission is a bit scrambled indeed.

              Michka
              I'm still reading.....
              but my brain hurts!

              Now. Who's on first?




              • #8
                "Now. Who's on first?"
                Yes, that's exactly how I feel too
                Michka
                (self-criticism, don't get me wrong)



                • #9
                  I live in NTSC land. I've been to Europe twice: England once, Hungary once. I don't watch a lot of TV, but to me the picture quality of PAL looked noticeably better than the NTSC I'm used to at home.

                  So while NTSC might "be better" in theory, in my limited experience PAL appears better implemented in practice -- at least for over-the-air broadcasts vs. my local Time-Warner cable.

                  --wally.



                  • #10
                    Wally,
                    As was mentioned in another post, NTSC chose to put the emphasis on frame rate at the expense of picture resolution. The NTSC choice is supposed to lower the amount of flickering and hence lower eye strain. PAL did exactly the opposite:
                    NTSC: 525 scan lines, 30fps
                    PAL: 625 scan lines, 25 fps
                    Note that if the number of pixels per scan line is the same, then the bandwidth (number of pixels per second) is roughly the same.
                    However, the lower frame rate and supposed increase in flickering can be compensated for by using longer-lived phosphors. Also, the flickering problem completely disappears when using modern LCD or plasma screens.
                    Anyway, PAL has a higher picture resolution.
                    When I travel to the USA or Canada, I am happy I do not have to watch your low res TV for more than a week or so.
                    Michka



                    • #11
                      Someone mailed me last night and is going to run a series of tests using the USAF pattern. It should be interesting to see how they turn out.

                      As for NTSC vs. PAL and who's better....it's one of those deals where the answer depends on what aspect of the beast you're discussing. PAL has better color resolution in the horizontal but worse in the vertical. NTSC has a faster frame rate but falls apart with dot crawl and hue instabilities. Then there is signal to noise ratio and other dissimilarities.

                      As for how each appears to the individual, that too can (and does) vary. I find PAL uncomfortable to watch because of the flicker.

                      The reasons: my vision is pretty far above normal in terms of acuity (20/10) and my visual recovery times are also faster than normal. As a result I can perceive 25 fps flickering very easily. This makes life interesting in theaters, where the effect is even worse at 24 fps. Having this kind of vision at 52 drives my ophthalmologist crazy, but gives him easy access to a research subject.

                      As usual it's a more complex topic than might be expected, right Michka?

                      Dr. Mordrid



                      [This message has been edited by Dr Mordrid (edited 17 January 2001).]



                      • #12
                        Wally, I've had the same experience. While I don't know the exact tech details, I really didn't like the NTSC picture when I first saw it. The extra 5 fps NTSC offers are much too few to actually eliminate flicker; besides, better TVs here in Europe use circuitry to double the refresh rate - and hell, you do have difficulty telling a "100Hz" model from a 50Hz one if you don't have the direct comparison.
                        To me the main disadvantage of NTSC is not the resolution, it's the colors! They always look somewhat "artificial" (a bit like when watching TV on computer monitors); you can change between a picture that is a bit more green or a bit more blue, but you don't get NATURAL colors as with PAL. This was the first time I really understood the Never The Same Colour (NTSC) joke...

                        Joachim

                        Oh, btw, full PAL overscan is 768x576@25fps (as an old-time Amiga user you do know that), NTSC is 720x480@30fps.
                        In 32 bit, this gives:
                        44236800 Bytes/s for PAL and
                        41472000 Bytes/s for NTSC.
                        So PAL does indeed require slightly higher bandwidth than NTSC.
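                        The same arithmetic as a two-line check in Python (assuming the 32-bit, full-overscan figures above):

                            pal = 768 * 576 * 4 * 25    # bytes/s
                            ntsc = 720 * 480 * 4 * 30   # bytes/s
                            print(pal, ntsc, round(pal / ntsc, 3))  # 44236800 41472000 1.067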
                        But we named the *dog* Indiana...
                        My System
                        2nd System (not for Windows lovers )
                        German ATI-forum



                        • #13
                          From what I have seen of NTSC, PAL seems to have the edge as far as transmitted TV goes. My system at home is a true PAL/NTSC system and quite frankly I can't tell the difference between PAL and NTSC DVDs using S-video cables.

                          My 2 cents worth
                          paulw



                          • #14
                            Originally posted by paulw:
                            From what I have seen of NTSC, PAL seems to have the edge as far as transmitted TV goes. My system at home is a true PAL/NTSC system and quite frankly I can't tell the difference between PAL and NTSC DVDs using S-video cables.

                            My 2 cents worth

                            That's right. In such a near-ideal setup NTSC may actually look better because of the higher frame rate and higher SNR. But under real-life conditions (transmitted TV) it's just too prone to those disturbing hue errors.

                            [This message has been edited by Indiana (edited 17 January 2001).]



                            • #15
                              Well that also explains a lot. Hardly anyone in my area uses broadcast TV anymore. Our whole area is on fiberoptic digital TV.

                              Our NTSC is flat gorgeous and clean as a whistle because of the perfect signal from the digital box. It'll be even better next month when our tax refund comes back and we pick up our Sony Vega

                              Dr. Mordrid

