I have looked high and low for discussion on this, which I thought would be common but is non-existent, so I hope someone out there may have travelled this path before.
The only real solution to de-interlacing genuine interlaced sources, whether captured through the G400 or ripped from DVDs whose original source was video (not film), is to split the fields and double the frame rate (whether or not you then interpolate the line count back up to full height to maintain per-frame quality and aspect ratio).
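For anyone unfamiliar with the idea, here is a minimal sketch of that field-split-and-double approach, assuming numpy-style frame arrays and top-field-first material; the function names are my own invention, not from any particular tool:

    import numpy as np

    def split_fields(frame):
        """Split one interlaced frame into its two fields,
        each at half the vertical resolution (top field first assumed)."""
        top_field = frame[0::2]      # even lines (field 1)
        bottom_field = frame[1::2]   # odd lines (field 2)
        return top_field, bottom_field

    def bob_deinterlace(frames):
        """Turn an interlaced stream (e.g. 25fps, 576 lines) into a
        progressive stream at double the rate (50fps, 288 lines)."""
        out = []
        for frame in frames:
            top, bottom = split_fields(frame)
            out.append(top)      # field 1 becomes its own frame
            out.append(bottom)   # field 2 follows, 1/50th sec later
        return out

    def interpolate_up(field):
        """Optionally double the line count back to full height
        (naive line doubling here) to keep the aspect ratio."""
        return np.repeat(field, 2, axis=0)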
BUT although that can give you beautifully smooth hi-res playback on your monitor (providing your hardware has the oomph), if, like most of us, you want to watch films on the TV via the overlay, there seems to be another problem. So far as I can work out, the overlay just won't buy it.
Whereas an interlaced 25fps (30fps if you're NTSC), 576-line (480 if you're NTSC) clip plays back beautifully smoothly on the TV-out, the identical video stream as a progressive 50fps (60fps ditto), 288-line (240 ditto) one does not - indeed the stuttering is worse than in the version decimated back to 25fps.
This is infuriating and seems totally unnecessary! After all, an interlaced frame is just two fields displayed one after the other, so the original version IS being played de facto on the TV at 288 lines, 50 fields per second - so why can't I play field-height, double-frame-rate AVIs just the same?
I hope someone has an answer, and a solution. If the hardware designers overlooked this requirement, though, any ideas how it might be 'fooled' into working? For instance, a mini-app that takes a progressive half-height double-rate AVI and re-interlaces each pair of fields 'on the fly' before passing the result to the overlay (something like the sketch below).
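To make the mini-app idea concrete, here is a rough sketch of the re-interlacing (weave) step, again assuming numpy-style frame arrays and top-field-first order; these names are hypothetical, not an existing utility:

    import numpy as np

    def weave_fields(top_field, bottom_field):
        """Interleave two half-height fields (e.g. 288 lines each)
        into one full-height interlaced frame (576 lines)."""
        h = top_field.shape[0]
        frame = np.empty((h * 2,) + top_field.shape[1:],
                         dtype=top_field.dtype)
        frame[0::2] = top_field      # even lines from field 1
        frame[1::2] = bottom_field   # odd lines from field 2
        return frame

    def reinterlace(progressive_frames):
        """Collapse a 50fps half-height stream back to a 25fps
        full-height interlaced stream, pairing consecutive frames.
        (An odd trailing frame would simply be dropped here.)"""
        it = iter(progressive_frames)
        return [weave_fields(a, b) for a, b in zip(it, it)]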
P.S.
Note to the MPEG committee: how difficult would it have been to include interlacing as part of the MPEG-4 spec, hmmm, hmmm?