AVID Announces Support for 24p & Panasonic's AG-DVX100
-
I know there is almost a reverent respect for 24fps, but I would rather have the 30fps progressive that my Optura Pi does. If we're moving ahead in resolution, why can't we move ahead in fps as well? I know, I know, it makes transfer to film easier and better, but I'd rather have the OLD standard changed than make our NEW gear adhere to old standards.
I like to think that eventually we'll be at some hi-def standard that has higher resolution, progressive frames, and greater than 24fps. If people want that film look they can actually use film, or software can emulate it.
- Mark
Core 2 Duo E6400 o/c 3.2GHz - Asus P5B Deluxe - 2048MB Corsair Twinx 6400C4 - ATI AIW X1900 - Seagate 7200.10 SATA 320GB primary - Western Digital SE16 SATA 320GB secondary - Samsung SATA Lightscribe DVD/CDRW- Midiland 4100 Speakers - Presonus Firepod - Dell FP2001 20" LCD - Windows XP Home
-
Because the move is on to make video more like film:
1. HD very often uses 24p.
2. 24p eliminates interlace artifacting because it's, well, non-interlaced. This produces a rather large increase in perceived quality on the display.
3. Movie production in general is going digital, and these productions are largely being shot at 24p, which is now filtering downstream in the market.
4. Editing in progressive mode allows for higher quality compositing of graphics, keyed effects, rotoscoping and titles.
5. Storing the data on the DVD at 24p saves space vs. 29.97 fps by allowing a lower bitrate/second while still using the same number of bytes/frame.
20% fewer frames = fewer bytes used/second of playback = more video on the same size disc.
Or use a somewhat higher number of bytes/frame and you get another quality bump instead. (A rough back-of-the-envelope calculation follows this list.)
6. Shooting at 24p to start with eliminates having to do an inverse telecine (removing the 3:2 pulldown to get back to 24 fps) in the MPEG encoding process for DVD, which saves time. Time = money.
7. A lot of DVDs are already encoded at 24p, with the telecine (3:2 pulldown) back out to an interlaced display being done by the player itself. If the capability is there, might as well use it....
8. Cable is also interested in video at 24 fps for the reasons given in #5: it saves them bandwidth, letting them offer more channels on the same pipe.
Just about every major player is starting to come out with 24p features in their products, so it's time to get used to it.
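To put some rough numbers on #5 and #7, here's a quick back-of-the-envelope sketch in Python. The single-layer disc size and the 2-hour runtime are just illustrative assumptions, not figures from any spec:

# Rough sketch: how 24p stretches a fixed DVD bit budget vs. 29.97 fps.
# The 4.7 GB capacity and 2-hour runtime are illustrative assumptions.
DISC_BITS = 4.7e9 * 8          # single-layer DVD, in bits (approx.)
RUNTIME_S = 2 * 60 * 60        # assume a 2-hour feature

for fps in (23.976, 29.97):
    frames = RUNTIME_S * fps
    bits_per_frame = DISC_BITS / frames
    print(f"{fps:6.3f} fps: {frames:,.0f} frames, "
          f"~{bits_per_frame / 8 / 1024:.0f} KB available per frame")

# Same disc, same runtime: ~20% fewer frames at 24p means ~25% more
# bits available for each frame, or the same per-frame quality at a
# lower average bitrate.

# For #7, the player rebuilds 59.94 fields/s from the 24p stream by
# repeating fields in the standard 3:2 cadence:
#   film frames:  A    B    C    D        (4 film frames)
#   video fields: AA   BBB  CC   DDD      (10 fields = 5 interlaced frames)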
Last edited by Dr Mordrid; 18 January 2003, 02:24.
Dr. Mordrid
----------------------------
An elephant is a mouse built to government specifications.
I carry a gun because I can't throw a rock 1,250 fps
-
The reason in #5 is enough for me to use it. I was into ripping DVDs to other formats about 2 years ago, just to see what kind of stuff was involved and to learn about video compression. I found out quite early that by using 24fps you would actually increase the quality, because with the limited storage space on a CD you could allocate more bits to the video. Instead of using a lower bitrate for more frames, you could use a higher bitrate for fewer frames, and it was always better. You could not see the fps difference either. And the fact that it was originally shot at 24fps shows it is not a bad idea. I like 24fps a lot, and I am glad to see that it is going to be a more widely used standard.
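Here's the napkin math I was doing back then, in Python. The 700 MB disc and the 100-minute runtime are just numbers I'm picking for illustration:

# Fixed disc size and fixed runtime: the average bitrate is the same
# no matter the frame rate, but the bits available per frame are not.
CD_BYTES = 700 * 1024 * 1024    # assume a 700 MB CD-R
RUNTIME_S = 100 * 60            # assume a 100-minute movie

avg_kbps = CD_BYTES * 8 / RUNTIME_S / 1000
print(f"average video bitrate is fixed at ~{avg_kbps:.0f} kbit/s")
for fps in (24, 30):
    print(f"  at {fps} fps that's ~{avg_kbps / fps:.1f} kbit per frame")

Every frame you don't have to store is bits you can spend on the frames you keep.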
The satellite area is a nice idea. We have 4 channels right now getting pumped to us overseas, and the actual bandwidth we have is limited. There were some major problems with panning and action scenes being jerky or badly pixelated. They have fixed a few, but if they switched to 24fps I am sure they could get a lot better quality. I never thought about it. I will send them an email on Tuesday and take all of the credit. HAHAHAHAHA.
Thanks for the idea, Doc.
WinXP Pro SP2 ABIT IC7 Intel P4 3.0E 1024M Corsair PC3200 DCDDR ATI AIW x800XT 2 Samsung SV1204H 120G HDs AudioTrak Prodigy 7.1 3Com NIC Cendyne DVR-105 DVD burner LG DVD/CD-RW burner Fortron FSP-300-60ATV PSU Cooled by Zalman Altec Lansing MX-5021
-
I am NOT endorsing any interlaced format. All I'm saying is that there really is no need to adhere to 24fps instead of something higher like 30fps (progressive, not interlaced) except for backward compatibility. There is no magic in that number; in fact, pans at 30fps would obviously look smoother.
We are going to be moving to a completely new television standard in a few years. TVs are going to be more like computer monitors, accepting a variety of digital input sources at different resolutions and frame rates via scaling. I know the old standards will stay around for some time, but eventually they will go away in the mainstream. Just like records have gone away in the mainstream (please, no LP lectures; I'm an audiophile, yes analog sounds good, I know, I've owned a recording studio).
With the latest Star Wars movie it was also shown that non-film shooting is possible with good results. For me the "film look" doesn't have so much to do with flickering and stuttering during pans as with the increased contrast and warm tones film produces. The frame rate ain't going to help that! 24fps is a step backwards from 30fps progressive. That's my only point.
I guess if you really have to have the flickering you can go 24fps.
Canon XL1 and GL2 can both do 30fps progressive.
So, as I initially said, 24fps is a step backward and only good for backward compatibility.
Doc - I am not arguing your points; you know A LOT more about the standards than I do. But all of your points deal with backward compatibility, interlacing (which I never endorsed), and saving bandwidth.
As for bandwidth, yes, there is a bandwidth limitation and a decision to be made about how best to use it: resolution vs. frame rate. I think in the future better compression schemes and delivery systems will allow us to have both.
It's kind of like the current CD situation. 20 or so years ago, when the standard was agreed upon, there was a lot of debate regarding sampling frequency and word length. Now we find that 20 or 24 bits at 96kHz or so would have been a lot closer to the "perfect sound" we were assured we were getting at 16-bit/44.1kHz. Granted, the technology really wasn't around to do it then, but now I would rather shoot for standards that have room to grow.
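Just to put numbers on that comparison, here is the raw PCM arithmetic in Python (uncompressed stereo rates only; it says nothing about what any particular disc format actually stores):

# Uncompressed stereo PCM data rates: the CD spec vs. a higher-res alternative.
def pcm_rate_bps(sample_rate_hz, bits_per_sample, channels=2):
    return sample_rate_hz * bits_per_sample * channels  # bits per second

for name, sr, bits in [("16-bit/44.1kHz (CD)", 44_100, 16),
                       ("24-bit/96kHz", 96_000, 24)]:
    print(f"{name}: ~{pcm_rate_bps(sr, bits) / 1000:.0f} kbit/s")

Roughly 3x the raw data rate, which lines up with the point that the technology of the day couldn't handle it.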
The failures of DVD-Audio and Sony's Super Audio CD show how difficult it is to change a standard once it has been "mainstreamified."
I would hate to see backward compatibility be the prime component for defining our video standard for the next 50 years.
Last edited by Hulk; 18 January 2003, 13:52.
- Mark
Core 2 Duo E6400 o/c 3.2GHz - Asus P5B Deluxe - 2048MB Corsair Twinx 6400C4 - ATI AIW X1900 - Seagate 7200.10 SATA 320GB primary - Western Digital SE16 SATA 320GB secondary - Samsung SATA Lightscribe DVD/CDRW- Midiland 4100 Speakers - Presonus Firepod - Dell FP2001 20" LCD - Windows XP Home
-
Actually it's forward compatibility, but that's semantics.
The basic reason is that they've finally realized that high frame rates aren't everything. Quality counts too, and as our PAL friends know, 24-25 fps is enough for TV.
Dr. Mordrid
----------------------------
An elephant is a mouse built to government specifications.
I carry a gun because I can't throw a rock 1,250 fps
-
Fair enough.
I was recently transferring my wedding footage (it's taken me 5 years to get around to it) and after checking some of it on a DVD I burned, I was really surprised at how good the footage looked.
The camera used was a Sony 8mm, not Hi8, just regular 8mm. As far as I know, the max resolution for this format should be around 240 or so lines. Waaay below DV.
Anyway, I captured using HuffYUV, edited, and then compressed to MPEG-2 using TMPGEnc with as little messing with the source as possible. No cropping or resizing.
Under bright lighting conditions the footage looks much better than the low resolution of the format would suggest. Not as good as footage from my dv cam, but way better than it ought to look.
This got me thinking. Resolution is important, but one should not overlook optics and proper lighting. If 240 lines can look that good, maybe we have some distance to go with today's DV standard before it "tops out."
Last edited by Hulk; 18 January 2003, 19:05.
- Mark
Core 2 Duo E6400 o/c 3.2GHz - Asus P5B Deluxe - 2048MB Corsair Twinx 6400C4 - ATI AIW X1900 - Seagate 7200.10 SATA 320GB primary - Western Digital SE16 SATA 320GB secondary - Samsung SATA Lightscribe DVD/CDRW- Midiland 4100 Speakers - Presonus Firepod - Dell FP2001 20" LCD - Windows XP Home
-
The main reason DV looks good is its high proportion of luminance to chroma resolution. This disproportion in NTSC DV is caused by its reduced colorspace of 4:1:1 vs. analog's 4:2:2 (luma being the 4 and the two chroma channels being the 1's & 2's), with higher numbers being better.
This means that in a 720-pixel-wide frame there is a horizontal color resolution of 360 two-pixel-wide samples in analog, while NTSC DV only has 180 four-pixel-wide samples. Since the human eye is more sensitive to luma than to chroma, such a mix appears to have more contrast and looks brighter to the eye.
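To put numbers on that, here's the subsampling arithmetic in Python for a 720-pixel-wide line (just the math; nothing DV-specific beyond the 4:1:1 figure):

# Chroma samples per line for different subsampling schemes on a 720-pixel line.
WIDTH = 720
schemes = {"4:2:2": 2, "4:1:1 (NTSC DV)": 4}  # chroma sampled every Nth pixel

for name, step in schemes.items():
    print(f"{name}: {WIDTH} luma samples, {WIDTH // step} chroma samples per line")

That's where the 360 vs. 180 figures come from.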
Consumer DV is pretty much "topped out" already because of its low color resolution and 25 Mbit/s bitrate.
DVCPRO50 (a pro format) is a notch up because it uses twice the bitrate (50 Mbit/s) and a 4:2:2 colorspace like analog. Gorgeous, but pretty expensive.
The HDCAM digital format takes a major step up by using a bitrate of up to 100 Mbit/s in HD formats. Over $65,000/camera.
BTW: there was an article on 24p and TV production in the New York Times a while back;
See....you've probably already seen 24p and didn't even know it
Last edited by Dr Mordrid; 19 January 2003, 03:24.
Dr. Mordrid
----------------------------
An elephant is a mouse built to government specifications.
I carry a gun because I can't throw a rock 1,250 fps