I was explaining the difference between interlaced video and progressive scan to a friend of mine the other day, and he asked me why we started out with the interlaced format. I believe it was for some technical reasons, but I really don't know. Can someone please explain this to me?
One more time, why do we have interlaced video in the United States?
- Mark
Core 2 Duo E6400 o/c 3.2GHz - Asus P5B Deluxe - 2048MB Corsair Twinx 6400C4 - ATI AIW X1900 - Seagate 7200.10 SATA 320GB primary - Western Digital SE16 SATA 320GB secondary - Samsung SATA Lightscribe DVD/CDRW - Midiland 4100 Speakers - Presonus Firepod - Dell FP2001 20" LCD - Windows XP Home
-
Interlaced video is used everywhere, not just in the US.
The reason was to double the perceived frame rate in order to reduce flicker, by sending two fields (half-pictures) in succession instead of one full frame. This trick was used because, at the time, electronic circuits were not fast enough to raise the frame rate itself. Doubling the true frame rate would also have doubled the necessary bandwidth of each channel, which the interlace trick avoids.
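The flicker-vs-bandwidth tradeoff above can be put in rough numbers. This is an illustrative sketch, not a real bandwidth calculation (analog bandwidth also depends on horizontal resolution and blanking intervals); the line counts are hypothetical round numbers:

```python
# Compare three scanning schemes for a hypothetical 480-line picture.
# The line rate (lines drawn per second) is a rough stand-in for bandwidth.

def lines_per_second(lines_per_pass, passes_per_second):
    return lines_per_pass * passes_per_second

progressive_30 = lines_per_second(480, 30)  # 30 full frames/s: flickery
progressive_60 = lines_per_second(480, 60)  # 60 full frames/s: no flicker, double bandwidth
interlaced_60 = lines_per_second(240, 60)   # 60 fields/s (half-pictures)

print(progressive_30)  # 14400
print(progressive_60)  # 28800 -- twice the line rate, so roughly twice the bandwidth
print(interlaced_60)   # 14400 -- same line rate as 30p, but 60 passes/s, so less flicker
```

Interlacing gets the 60-passes-per-second refresh of the expensive scheme at the line rate of the cheap one.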
Michka
I am watching the TV and it's worthless.
If I switch it on it is even worse.
-
Hulk
Michka is right, but the real reasons were that
a) the original space available for TV transmissions in the VHF band was limited. By interlacing, the bandwidth required was nearly halved, allowing more channels to be inserted into the available space.
b) you will remember that electronic TV was first introduced by the BBC, England, for the London district, with transmissions from Alexandra Palace (Ally-Pally) in 1936. In them-thar days, high-bandwidth amplifiers used pentodes, especially of the Mullard EF-37 types, with carbon anode-load resistors. This combination gave a relatively poor signal-to-noise ratio. By limiting the bandwidth to 2.5 - 3 MHz (or Mc/s, as it was then known), it was possible to extend the acceptable transmission range to cover the whole of the Greater London Area, with an acceptable noise level.
You may be interested to know that the original transmission characteristics were 405 lines, 25 fps, 2:1 interlacing, 5:4 aspect ratio, at 45 MHz upper vestigial sideband. This standard remained up to about 1960, when it was gradually replaced by the 625-line standard.
In the mid-50s, I lived in Cambridge, about 60 miles N of Ally-Pally, and was able to receive the transmissions with a just-acceptable (at the time) noise level. I was working on portable microwave Tx/Rx systems for outside broadcasts and we were able to transmit the horse racing from Newmarket to Ally-Pally (for Brits, with Raymond Glendenning commentating!) with a just-acceptable quality! The cameras we were using at the time were massive brutes using a 4" Zworykin Iconoscope tube.
How did we achieve synch at the time? We used the power frequency! As Ally-Pally and everywhere within reach of it were on the National Grid, the frequency, which varied considerably at the time according to the instantaneous load, was the same everywhere. This also had the advantage that, if the smoothing in the receiver was not perfect, the hum bars remained stationary and were therefore less visible.
At the time, I was working for the telecomms side of the Pye Group in a little village called Fen Ditton, just outside Cambridge.
A wee bittie history for you all.
------------------
Brian (the terrible)
Brian (the devil incarnate)
Comment
-
A truly great post Brian! I enjoyed it very much.
Have you ever considered being the 1st man (m/f) to publish his/her memoirs online on a forum?
Ghydda - looking forward to hear more
------------------
2+2=5 - but only for extremely large values of 2.
As I always say: You can get more with a kind word and a 2-by-4 than you can with just a kind word.
My beloved Parhelia was two-timing with Dan Wood - now she's gone forever and all I got is this lousy T-shirt
|Stolen Rig|RetroGames Rig|Workstation Rig|Server Rig|
Comment
-
Brian,
Thanks for the detailed explanation. One thing still confuses me about point (a).
Why would the bandwidth for 60 frames per second consisting of 240 vertical lines each be less than the bandwidth of 30 frames per second consisting of 480 lines each?
I think I am missing something important.
- Mark
Core 2 Duo E6400 o/c 3.2GHz - Asus P5B Deluxe - 2048MB Corsair Twinx 6400C4 - ATI AIW X1900 - Seagate 7200.10 SATA 320GB primary - Western Digital SE16 SATA 320GB secondary - Samsung SATA Lightscribe DVD/CDRW- Midiland 4100 Speakers - Presonus Firepod - Dell FP2001 20" LCD - Windows XP Home
Comment
-
You can think of interlacing as a form of analog compression (before digital codecs). Keep in mind, they are still using interlacing in some High Definition implementations (anything with an "i" after it) to help with the resolution vs. bandwidth tradeoff. You can have higher interlaced resolution at the same bandwidth as lower-resolution progressive video.
Please visit http://spincycle.n3.net
My System: Celeron 300a(@450/2v), Abit BH6, 128mb RAM, Win98SE, Marvel G200TV, Diamond MX300, Maxtor DiamondMax Plus 20g system drive, DiamondMax Plus 40 capture drive, IBM 8g Deskstar program drive, Adaptec 2940UW SCSI, 9gb Barracuda UWSCSI video drive, Hitachi GD-2500 DVD-Rom, UltraPlex CD-Rom, Plexwriter CD-recorder, Viewsonic PT775, Soundworks 4.1 speakers
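The HD resolution-vs-bandwidth tradeoff can be sketched with the two common broadcast formats. Raw pixel rate here is a stand-in for bandwidth (real broadcast figures include blanking and compression overhead, so treat this as an approximation):

```python
# Rough pixel-rate comparison of an interlaced vs a progressive HD format.

def pixel_rate(width, height, pictures_per_second, interlaced):
    # An interlaced field carries only half the lines of a full frame.
    lines = height // 2 if interlaced else height
    return width * lines * pictures_per_second

rate_1080i = pixel_rate(1920, 1080, 60, interlaced=True)   # 60 fields/s
rate_720p = pixel_rate(1280, 720, 60, interlaced=False)    # 60 frames/s

print(rate_1080i)  # 62208000
print(rate_720p)   # 55296000
```

The two rates come out in the same ballpark, which is why 1080i can offer more resolution per frame than 720p at a comparable channel bandwidth.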
Comment
-
Transmitting 60 fields instead of 30 frames does not change the bandwidth, but it increases the perceived frame rate. Or better said, it lowers the apparent flickering, without increasing the bandwidth.
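Mark's question, in numbers: the total line rate is identical either way, which is why the bandwidth does not change:

```python
# 60 half-pictures/s vs 30 full pictures/s: same lines per second.
fields = 240 * 60   # 60 fields/s, 240 lines each
frames = 480 * 30   # 30 frames/s, 480 lines each

print(fields)  # 14400
print(frames)  # 14400 -- identical line rate, hence identical bandwidth
```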
Thanks for your nice and informative extension on my own reply, Brian.
I would like to point out two things:
The total frequency range used to transmit TV channels has increased considerably since early TV, with the addition of UHF and other bands, but the bandwidth constraints have not eased because of the quasi-explosive increase in the number of channels, even though current electronic circuits could handle far more. That's why electronic engineers are struggling to fit digital and HDTV into, as much as possible, the same bandwidth as our plain old analog TV.
At the same time as UK 405-line TV started to gain public interest in the early 50's, France had developed and implemented a completely different standard with 1219 scan lines, which was used in several countries around Europe. The bandwidth per channel was of course far higher, at about 8 MHz. Anyway, this was a kind of HDTV before its time. As in the UK, colour TV at 625 lines gradually replaced the original standard in the 60's. So the UK went up the resolution scale while, at about the same time, France went down.
Michka
[This message has been edited by Michel Carleer (edited 19 April 2001).]
I am watching the TV and it's worthless.
If I switch it on it is even worse.
Comment
-
(I read this either in the back of the Premiere 4 readme or in a book I borrowed from a library?)
I read somewhere that the main problem at the time was the phosphor inside the TV screen.
I will try to explain:
To display a picture on the screen, the gun must shine a beam of light onto the inside of the screen, which is coated with a layer of phosphor. The beam of light scans across the screen.
The gun must repeat this all the way down the screen before it can start the next frame.
The engineers found that they couldn't get the electronics and the gun to scan fast enough to get back to line 1 before the line 1 phosphor started to lose its glow. This made the picture flicker with uneven brightness.
So the engineers set the gun up so that it would skip every other line; the gun then reached the bottom of the screen in half the time, and then started on all the even lines. This gave a picture with less flicker and more even brightness.
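The skip-a-line scan order described above can be sketched for a hypothetical 10-line screen: the first pass (field) hits the odd lines, the second pass fills in the even ones.

```python
# Generate the interlaced scan order for a screen with the given line count:
# one field of odd-numbered lines, then one field of even-numbered lines.

def interlaced_scan_order(total_lines):
    odd_field = list(range(1, total_lines + 1, 2))
    even_field = list(range(2, total_lines + 1, 2))
    return odd_field, even_field

odd, even = interlaced_scan_order(10)
print(odd)   # [1, 3, 5, 7, 9]
print(even)  # [2, 4, 6, 8, 10]
```

Each field reaches the bottom of the screen in half the time of a full frame, so every part of the phosphor is refreshed twice as often.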
Michel Carleer, by the "1219 scan lines" are you referring to HD-MAC (D2-MAC/D-MAC/B/C-MAC)?
Maybe it was the first version of MAC? I watched the Euro test feeds (wow) 5/6 years ago.
I've not seen HD-MPEG2 yet, you lucky Americans.
zeb.
[This message has been edited by zeb7 (edited 21 April 2001).]
My PC: Matrox G400TV AMD Duron750mhz@850mhz, 256Mb, Abit KT7133raid, 10gb ibm, 10gb seagete, 20gb 7.2k-rpm fujitsu, LG CDWR 40x16x10
win98se
Entertainment : P150mhz@160mhz,16mb,VX MBoad,PCI-TNT with TV/out,H+ dvd,Creative x5 dvd
Comment
-
"Michel Carleer , the "1219 scan lines" are you referring to HD-mac (d2mac/Dmac/B/C/mac)
maybe it was the first version of MAC? "
No this was the French TV standard in the 50's and 60's. And in Belgium too. Analog signal.
So, 50 years ago, we already had HDTV in some countries in Europe. But this was B/W only, and it died with the advent of colour TV at 625 lines.
Michka
I am watching the TV and it's worthless.
If I switch it on it is even worse.
Comment
-
Zeb
Phosphor technology is such that, even in the early days, the persistence could be optimised. If the persistence is too long, you get streaking of moving images, but long-persistence phosphors were used for many applications, such as radar screens. In fact, the shorter the persistence, the better it is visually. It is the persistence of the eye, not of the phosphors, that renders TV possible. Modern phosphors have quite a short persistence, as you can judge by taking a photo of a screen using a still film camera with a between-lens shutter at 1/1000 sec: you will see only a few lines, implying the rest are already dark. (This also causes the strobing effect seen when filming TV screens asynchronously with a movie camera.)
For the anecdote, in early b/w TV sets, the CRT had a low-to-medium-persistence white phosphor which was, in reality, similar to that used in fluorescent lights. This was actually a mixture of various phosphors of different colour and persistence characteristics. If you rotated a spoked wheel on the image at a speed where it strobed slowly, this often resulted in apparent colour banding, because the orangish phosphor had a longer persistence than the bluish one. The BBC did conduct some experiments, about 1955, where they tried to exploit this phenomenon to produce pseudo-colour. Of course, they failed miserably!
Just to correct what seems to be a misapprehension, the CRT does not produce a light beam, but an electron beam. When an electron strikes a phosphor molecule with a certain energy, it causes an electron within the molecule to change its energy state for a very short time. This change of energy state causes it to "jump" from one position to another and, in so doing, it releases, in the form of radiation, a "packet" of visible light. This phenomenon is called electron-excited fluorescence.
------------------
Brian (the terrible)
Brian (the devil incarnate)
Comment
-
Thanks, Michka and Brian (the terrible)
My PC: Matrox G400TV AMD Duron750mhz@850mhz, 256Mb, Abit KT7133raid, 10gb ibm, 10gb seagete, 20gb 7.2k-rpm fujitsu, LG CDWR 40x16x10
win98se
Entertainment : P150mhz@160mhz,16mb,VX MBoad,PCI-TNT with TV/out,H+ dvd,Creative x5 dvd
Comment
-
Chris
Maybe it would help if I say that I was professionally involved, as an engineer, in TV development up to about 1957. I did my degree dissertation on colour television in 1951, believe it or not. We were a team of five students working on the project. We built a camera with 3 iconoscopes and dichroic mirrors/filters but, because of the size of the iconoscopes, it was a monster and we had to use an f4.5 lens with 15" focal length to get it all in. It worked only in bright sunlight!
For the receiver, we used 3" white projector tubes, running at 35 kV, with 3 lenses and filters, producing an image about a foot across in a blacked-out room. We had 6 coax cables connecting the camera to the receiver: 3 MHz luminance, and 0.5 MHz each for RGB, line synch and frame synch. It took us a full academic year to set this up (most of the major components were slightly faulty ones donated by EMI). We had worked out the reduced bandwidth for the 3 chrominance signals and we had suggested in the written paper a few ways they could possibly be multiplexed together, but we did not have time to actually reduce the six signals into a single one. We used the then-current standard of 405 lines/25 fps.
Our prof said it was the most ambitious dissertation project he had come across, but the written papers we did, each for our part, were rather weak, because of the lack of time. Unfortunately, I have lost them in the intervening 50 years (my mother had an attic clear-out, without my knowledge, and I lost enormous amounts of papers, books, and some valuable photographic and electronics material, when she suddenly decided to move house).
Also for the anecdote, I saw a very early EMI shadow-mask tube about 2 or 3 years later, at a professional exhibition in London. It was probably about 12" diameter and had quite a coarse phosphor dot pattern (possibly about 1.5 mm per triad). It could be viewed only in a dark room at about 2 or 3 metres distance to get a good visual impression of the image, but it was exciting. The camera they were using was much smaller (but still big) than ours, with 3 3" image orthicons, and they had only two coaxes (luminance plus synch, and multiplexed chrominance with a 3 MHz sampling rate).
Going back even earlier, a team of students had done the first colour TV at the same college (Heriot-Watt, Edinburgh) a year earlier than us, using the technique developed by Baird in the late 1930s, with rotating colour filters in front of the single-tube camera and receiver. They used a 3:1 interlacing with a field frequency of 75 Hz, so that each line within a group of three was a different colour. The visual effect was not very good and the sound of the flying saucers (about a yard across for the 5" monitor tube receiver rotating at about 1000 rpm with multiple filters) was horrendous. It worked though, once they had achieved synch between the camera and monitor disks and with the image.
When I see a camcorder with an LCD screen which can be slipped into a pocket, you will not be surprised at the wonder I feel about the progress in 50 years of my career.
I did make another very brief incursion into pro video in 1964. At the time I was working for Kudelski, the maker of the Nagra tape recorders (I was with them for 10 years). For those who don't know it, the Nagra was the prime portable tape recorder used for the sound on both film news reporting and on-location feature film sound recording, with pilot synch to the cameras. They were about the size of a notebook computer case and were superb, robust analogue recorders, with virtually no wow or flutter, even when being moved quite violently. At that time, VTR was in its infancy (Ampex recorders using 2" tape, with a console weighing about a tonne). I suggested to Stefan Kudelski that it would be great if we could shrink an Ampex down to a transportable size. A year later, Sony brought out one with a 1/2" tape and we took it apart. Its image quality was quite mediocre. SK then pronounced that video would never take over from chemical film and the idea was dropped. He actually revived it about 10 years later and he did develop a Nagravision pro portable video recorder (after I had left the company), but it was never a great commercial success, like his Nagra audio recorders were (and still are, although they are no longer the key product of the Kudelski group).
Chris, I don't think "awesome" is the mot juste. Oxford defines it as "inspiring awe; dreaded". No, maybe you're right, my rambling reminiscences are possibly dreaded. Oh, and by the way, 20 years ago I was using the HP8x series computers for data capture and number-crunching on some instruments my company made!
------------------
Brian (the terrible)
Brian (the devil incarnate)
Comment