Interlaced capture for film shows

  • Interlaced capture for film shows

    I was wondering about the benefits of capturing films (from a TV tuner or VHS) at half or full frame.
    I'm talking about movies originally shot on film (so non-interlaced).
    Do the two fields of a frame show any differences?
    If not, would capturing at 1/4 frame yield the same results (when output back to TV) as capturing at 1/2 frame?
    BTW, I'm using PAL, so it's closer to film fps than NTSC. Does it matter?
    Thanks

  • #2
    If you're just editing "natural" video (no special F/X, detailed graphics, titles with small fonts etc.) then half frame (352x480) is just fine. You lose a bit of horizontal rez but no color data is lost. Vertically you still have the full height so it works out.

    The main advantage, of course, is that your capture files will be half the size of full frame captures.
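
    To put a rough number on the size savings, here's a back-of-envelope Python sketch (my own illustration, not from the post: uncompressed 4:2:2 rates with PAL geometry, since the original poster captures PAL; real file sizes depend on the capture codec used).

        # Illustrative only: uncompressed YUY2 (4:2:2) capture data rates, PAL geometry.
        BYTES_PER_PIXEL = 2   # 4:2:2 averages 2 bytes per pixel
        FPS = 25              # PAL frame rate

        def mb_per_minute(width, height):
            return width * height * BYTES_PER_PIXEL * FPS * 60 / (1024 ** 2)

        print(round(mb_per_minute(704, 576)))  # full frame: ~1160 MB/min
        print(round(mb_per_minute(352, 576)))  # half frame: ~580 MB/min, half the data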

    The one thing you'll notice is that it'll look stretched vertically on the computer monitor, but it should play back fine to the TV output.

    As for quarter frame, this is problematic. Since 352x240 is half width AND height you lose a whole video field. This causes a significant loss of detail and quality. Just ask anyone who has seen a VideoCD on a TV set.
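
    To make the field arithmetic concrete, here's a small NumPy sketch (my own illustration, using PAL line counts; the 352x240 figure above is the NTSC equivalent of 352x288):

        import numpy as np

        # A half-frame capture keeps the full line count, so both interlaced
        # fields survive, stored line-interleaved (field order varies by driver).
        frame = np.zeros((576, 352), dtype=np.uint8)   # half-width PAL capture

        field_1 = frame[0::2, :]   # 288 lines
        field_2 = frame[1::2, :]   # 288 lines, sampled ~20 ms later in PAL

        # A quarter-frame capture (352x288 in PAL) effectively keeps only one
        # of these fields, so half the vertical detail is simply thrown away.
        quarter = field_1
        print(quarter.shape)       # (288, 352)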

    Dr. Mordrid





    • #3
      DrMordrid:
      How come does a film get interlaced when it's broadcast?
      I mean:
      Suppose that I want to capture one episode of "Charlie's Angels" from the Sony channel.
      The original material was on film, so no interlacing was done when converting to video. (Am I missing something?)
      If the above is right, what's the point of interlaced captures?
      I think I'm treating the terms "interlacing" and "high vertical resolution" (576 in PAL) as the same thing.
      Please enlighten me as usual.



      • #4
        Both the NTSC and PAL video standards use interlace as part of their specs, so it's unavoidable. The reasons for its use go back to the old days of TV.

        One is the slowness of the original electronics gear.

        Another is the narrow bandwidth of the broadcast signals.

        A third reason, and the one that explains why we're still using a 50+ year old system, is that no one wanted to make the installed base of B&W TV's obsolete when color came out. Pure econo-politics, logic be damned.

        Basically, getting the same number of vertical scan lines with the old/slow gear would have required twice the bandwidth that was allocated. Interlace allows the slow phosphors used on the TV's display to act as a "memory", storing one field while the next one is being drawn.

        This low-tech solution to the analog bandwidth problem causes some headaches, as you have likely noticed. One is artifacting. Another is that the first lines drawn in the first field can start to fade before the last lines of the second field are drawn. This is the source of the familiar horizontal stripes of a TV display: one set of scan lines is dimmer than the adjacent set.
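
        If it helps to see the mechanics, here's a tiny NumPy sketch of two fields weaving into one frame (my own illustration, PAL numbers assumed):

            import numpy as np

            # Each transmitted field carries only every other scan line; two fields
            # (50 per second in PAL) weave together into one 25 fps full-height frame.
            LINES, WIDTH = 576, 704

            field_a = np.random.randint(0, 256, (LINES // 2, WIDTH), dtype=np.uint8)
            field_b = np.random.randint(0, 256, (LINES // 2, WIDTH), dtype=np.uint8)

            frame = np.empty((LINES, WIDTH), dtype=np.uint8)
            frame[0::2, :] = field_a   # lines 0, 2, 4, ... from the first pass
            frame[1::2, :] = field_b   # lines 1, 3, 5, ... drawn 1/50 s later

            # Sending half the lines per pass is what let interlace halve the
            # broadcast bandwidth needed for a given vertical resolution.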

        This will all be moot in the States soon, as there is to be a switch to Progressive Scan video over the next 7 years. Progressive Scan uses fully drawn frames with no interlacing, much like the video on a computer screen. In fact these PS TVs will be more computer monitor than traditional TV set.

        One of the first of these new sets is the new Sony WEGA. It supports both 480 interlaced and 480 progressive scan. The image quality of 480p is drop-dead-gorgeous. I can't wait to see a 1080p set.

        Here's a more detailed article on this from the EETimes.com website:

        http://www.eet.com/story/OEG19991207S0049

        Dr. Mordrid


