Answer To Doc Mordrid's 1080 Production Question

  • #31
    Originally Posted by Apulo
    That must be quite an old article (or a very behind author). Current 24p screens can display at 24p multiples. There's no such thing as a native display rate, only native resolution. Flatscreens have for a long time been able to do at least 50 and 60Hz. There is motion stutter at 24p, because that's what's present in the original picture itself (watch next time you go to the movies). It's preferable to the irregular stuttering that comes from the conversion of 24Hz to 60Hz. There's no flicker however, because there's no screen refresh like with CRTs due to the technology.
    I think you missed an important word in his post
    We have enough youth - What we need is a fountain of smart!


    i7-920, 6GB DDR3-1600, HD4870X2, Dell 27" LCD

    • #32
      Jerry is currently taking a 2-day forum vacation - enforced.

      I suggest he reflects on his walls of text and evangelising.

      I suggest all reflect on pointless arguing.

      And no, I am not fair; MURC has never been a democracy, not when Ant ran it, and not now.
      Juu nin to iro


      English doesn't borrow from other languages. It follows them down dark alleys, knocks them over, and goes through their pockets for loose grammar.

      • #33
        Originally posted by Jerry Jones
        That makes no sense whatsoever.

        I was the first to post the link back in the old thread about the native pixel matrix of consumer high definition camcorders often being far less than a true 1920 x 1080 pixels.

        If anything, your late-to-the-game acknowledgement of that fact argues against your pushing 1080p TVs on everybody because pixel shifting means your favorite type of HDTV is -- in fact -- overkill for the display of consumer high definition video.

        Thanks for making my point.



        Jerry


        There are consumer cameras, prosumer cameras, and professional cameras. Of course the ideal solution is to have a large sensor with natively high resolution, but that's just not feasible for consumers. They can't spend $50k or more on a video camera!

        There are plenty of video cameras coming to market (and on the market) that can produce a good 1080p picture. Perhaps they don't all have the image quality of a Panavision Genesis, but they are getting better and better with each passing generation. The early samples were pretty bad, as we discussed.

        Like any silicon-based technology, the video market is moving forward at a pretty amazing rate. I think most people would agree with that. If someone wishes to purchase a 1080p display to watch Blu-Rays and HD-DVDs, I don't think they should be demonized for that viewing decision. And I think it's reasonable to assume that as the technology progresses we'll be seeing more and more 1080p content.
        - Mark

        Core 2 Duo E6400 o/c 3.2GHz - Asus P5B Deluxe - 2048MB Corsair Twinx 6400C4 - ATI AIW X1900 - Seagate 7200.10 SATA 320GB primary - Western Digital SE16 SATA 320GB secondary - Samsung SATA Lightscribe DVD/CDRW- Midiland 4100 Speakers - Presonus Firepod - Dell FP2001 20" LCD - Windows XP Home

        • #34
          OK, ignorant as I am, I have a Q on the 24Hz flicker issue.
          I have a broad notion of what causes flickering with CRT TVs and monitors (to be sure: electrons are fired at fluorescent things which light up with a decaying intensity, and it is not possible (or usual) to fire them all at the same time and keep them lit up).

          How can that occur with an LCD? Or is this a plasma thing? I can see how motion could be screwed up with low refresh (or, I guess, especially replacement) rates, but flicker?
          Join MURCs Distributed Computing effort for Rosetta@Home and help fight Alzheimers, Cancer, Mad Cow disease and rising oil prices.
          [...]the pervading principle and abiding test of good breeding is the requirement of a substantial and patent waste of time. - Veblen

          • #35
            An LCD screen consists of a fluorescent backlight with a flicker rate of 5-10 kHz, and its phosphors have a persistence exceeding several cycles, so the light is essentially "DC". In front of that is the LCD device itself: a sandwich of two sheets of glass with a viscous liquid in between. This liquid, slightly similar to cholesterol in physical characteristics, has the property (called nematic) of aligning its molecules in an electric field, so that regions which are normally transparent become opaque when a field is applied. So, if the two glasses carry horizontal and vertical conductive but transparent lines respectively, each pixel can be individually addressed. The colour is added by a trichrome filter in a dot pattern in front of the electrodes on the front glass.

            Like that, it would be like a CRT and display a real-time scanned image. So they improved it by making the back electrode an array of about 2 million transistors which form a memory. The transistors are addressed simultaneously, typically 75 times/sec, and take their on/off status from a RAM which holds the current video frame (not field).

            Of course, this is a simplified description, but the basics are there, showing that an LCD (or plasma) has all its pixels refreshed simultaneously, while a CRT has to scan in real time, by fields (not frames), because it works from a single beam of electrons, which is rather like a beam of light illuminating each letter on a page in turn.
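            To make that last contrast concrete, here is a tiny Python sketch of my own (purely illustrative; the decay constant and refresh figure are assumptions, not real panel specs). It models a CRT pixel whose phosphor output decays between beam passes against a hold-type LCD/plasma pixel that simply keeps its latched value until the next frame rewrites it:

            import math

            def crt_pixel_brightness(time_since_refresh, decay_s=0.005):
                """Phosphor output decays between electron-beam passes, so the pixel pulses."""
                return math.exp(-time_since_refresh / decay_s)

            def lcd_pixel_brightness(held_value):
                """A hold-type pixel keeps its latched value until it is rewritten."""
                return held_value

            refresh_hz = 75            # assumed CRT refresh rate
            period = 1.0 / refresh_hz

            # Sample one pixel a few times within a single refresh interval.
            for i in range(5):
                t = i * period / 5
                print(f"t = {t * 1000:5.2f} ms   CRT = {crt_pixel_brightness(t):.3f}   LCD = {lcd_pixel_brightness(1.0):.3f}")

            The CRT column falls off within each refresh interval (that pulsing is the flicker), while the LCD column stays constant.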

            Analogously, the same holds true in cameras. The old iconoscope, image orthicon, plumbicon etc. tube cameras scanned the electric charge with an electron beam. Modern cams, with CCD or MOSFET sensors, can have the image of a frame stored in a memory which is scanned to provide a time-dependent signal, but the image is averaged over a frame and is not read out point-by-point in time as in some of the old tubes (other camera tubes relied on an averaging discharge of a capacitor over a frame, though).
            Brian (the devil incarnate)

            • #36
              So, uhm, would an LCD which refreshes its on/off status in RAM 24 times a second have flicker? What am I missing?
              Join MURCs Distributed Computing effort for Rosetta@Home and help fight Alzheimers, Cancer, Mad Cow disease and rising oil prices.
              [...]the pervading principle and abiding test of good breeding is the requirement of a substantial and patent waste of time. - Veblen

              • #37
                No, it wouldn't. I understand where the confusion comes from, though. The word 'refresh' in CRT technology means that, with the decaying intensity of the phosphor, the pixel needs to get a new burst of electrons to maintain its intensity: a "refresh". Basically, pixels on a CRT are pulsing, though hardly visibly, all the time, even when the picture stays the same. It's the phosphor that causes it. This is the origin of flicker.

                With LCDs (or similarly with plasma screens) there is no decay.

                Let's say you have pixel #100. In frame 1 it has a value of R10 G20 B30. That actually means there are 3 subpixels on an LCD screen: 3 separate "light valves", one for each primary color. Looking at one color would be enough for this example, but let's keep it at the 1-pixel level for clarity's sake.

                In frame 2, pixel #100 still has the same R10 G20 B30 value, and this gets written to the light valves of that pixel. Since there's no change, they maintain their position. There's no technical need for them to shut, go to R0 G0 B0, and then come back to R10 G20 B30 again. So there's no flicker; the stream of light passing through them remains absolutely constant.

                Refresh on the LCD screen means the value table for each subpixel on the screen gets refreshed at a certain fixed interval. But only when the value is different from the previous frame does a pixel (consisting of 3 light valves) on the screen actually change.

                This means that even if you were to have a refresh rate of 1Hz on an LCD screen, there would be no flicker. Movement on screen would be extremely jerky, but no flicker.

                So 24p, which is the same as in the cinema, has no flicker on LCD.
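                As a purely illustrative sketch (my own, not from any real driver code; the pixel name and RGB values are made up), here is that 'hold' behaviour in a few lines of Python: the panel only drives a light valve when the incoming frame value differs, so a static image gives a perfectly constant light output, whatever the refresh rate:

                frames = [
                    {"pixel_100": (10, 20, 30)},   # frame 1
                    {"pixel_100": (10, 20, 30)},   # frame 2: unchanged, valves hold their position
                    {"pixel_100": (40, 20, 30)},   # frame 3: red subpixel changes, so only it is driven
                ]

                panel = {}                          # current light-valve state per pixel

                for n, frame in enumerate(frames, start=1):
                    for pixel, rgb in frame.items():
                        if panel.get(pixel) != rgb:
                            panel[pixel] = rgb      # drive the valves to their new position
                            print(f"frame {n}: {pixel} updated to {rgb}")
                        else:
                            print(f"frame {n}: {pixel} unchanged, light output stays constant")

                No value change means no light change, which is why even a 1Hz refresh would give jerky motion but no flicker.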

                In a cinema projector (old analog 35/70mm projectors) this is different. Since a movie consists of a series of still pictures on a strip, and this strip moves past a lens to create the illusion of movement, the light needs to be shut off each time the strip moves the next still into place to be projected. Otherwise we would see the transport of the strip instead of a series of still pictures. This is done with a turning wheel with parts cut out to let light through. This shutter is what determines the frequency with which a picture gets projected. Because 24Hz would flicker a lot, each still actually gets projected 3 times in a row, briefly interrupted by the shutter each time, while the still itself does not move in the meantime. This means there's actually more flicker, but at a higher frequency, which is much less annoying to the human eye.

                But the amount of judder when the camera pans or tilts stays the same! Remember, it's the same picture that gets projected 3 times; no intermediate steps are being created. So a camera movement of 1 second is still divided into 24 steps.

                An LCD screen that shows 24p will also show this judder. But it is the way it was shown in the cinema as well, and it is an important factor in the 'cinema feel' of a picture. If you were to create in-between pictures to increase the frequency, it would actually start to look more like video/TV footage than a cinema movie.

                Some screens do not support 24p, and they will show movies at 60Hz. To do this, they convert the frequency by repeating frames/fields in an uneven cadence (3:2 pulldown). Pans and tilts will look smoother, but a few times per second there will be a very annoying abrupt jerk in the motion.

                While the 24p movement isn't as smooth, it is regular, and therefore much less annoying.
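                Here is a rough Python sketch of the two cadences (my own illustration, nothing more): triple-flashing at 72Hz keeps every film frame on screen for the same 1/24 s, while forcing 24p into 60Hz repeats frames in a 3-2-3-2 pattern, so alternate frames stay up for different lengths of time, which is exactly the irregular jerk described above:

                def frame_durations(repeats, output_hz, n_frames=6):
                    """On-screen time of each source frame, given how often each one is repeated."""
                    return [repeats[i % len(repeats)] / output_hz for i in range(n_frames)]

                print("24p triple-flashed at 72 Hz:", frame_durations([3], 72))     # 41.7 ms for every frame
                print("24p pulled down to 60 Hz:  ", frame_durations([3, 2], 60))   # 50 ms, 33 ms, 50 ms, 33 ms, ...

                The 72Hz case steps evenly; the 60Hz case alternates between long and short frames, which is the uneven motion people complain about.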

                Edit: for plasma, this works the same way. But while (sub)pixels on an LCD are light valves letting the light from the backlight through or blocking it, each (sub)pixel on a plasma screen is actually a little 'lightbulb' by itself. There's no backlight; the (sub)pixel itself generates the light. Again, there's no need to switch this 'lightbulb' off and back on; the value can be changed continuously without going to zero in between.
                Last edited by Apulo; 17 March 2008, 11:48.
                Apulo

                • #38
                  OK, I think I get what you are saying: showing 24p as 72p (which is still 24p content) yields the same distortion as simply showing 24p. When objects are moving 'fast' relative to the POV (so a moving camera is the same), either the object is not sharp in a single 'picture', or (with fast shutter times and 'long' times between 'pictures') the movement between two consecutive pictures is large; in that respect it does not matter whether it's 24p or 24x3p. But moving to 60p introduces additional distortions. I can imagine all that, but is that what some mean by flicker of 24p??
                  Join MURCs Distributed Computing effort for Rosetta@Home and help fight Alzheimers, Cancer, Mad Cow disease and rising oil prices.
                  [...]the pervading principle and abiding test of good breeding is the requirement of a substantial and patent waste of time. - Veblen

                  • #39
                    I don't think so. Some people might just think "24 frames per second? Wow, that must flicker!"

                    Try it out. Go to a good electronics shop and ask specifically for a demo of 1080p Bluray playback at 24p on an LCD that supports it.
                    Apulo
