Thought I might post this here for you guys who think you "need" to have an HDMI connection for a really good picture. I hear this sort of thing all day from tech-fad-chaser types, who insist upon HDMI. They are just being played for suckers.
Article on DVI/HDMI vs Component Cables
-
Very good read. This is why my local AV shop carries RCA DLP TVs. You have the option to disable all the internal video converters and whatnot and use an external video processor and component video to get the best possible signal.
Though those new JVC HD-ILA displays are looking pretty tempting these days.
"Inside every sane person there's a madman struggling to get out"
–The Light Fantastic, Terry Pratchett
-
Originally posted by Jammrock
... This is why my local AV shop carries RCA DLP TVs. You have the option to disable all the internal video converters and whatnot and use an external video processor and component video to get the best possible signal. ...
The world just changed, Sep. 11, 2001
-
HDMI isn't typically aimed at the A/V-phile, because they want high-quality audio and you're not going to get that from a TV (at least no one's bothered building such a thing yet). HDMI is nicer than DVI for routing because it supports longer cable lengths and has a smaller connector head (easier to fish through walls). Longer runs (like 1000 feet) are possible with fiber-optic DVI or HDMI cables, though they get quite expensive. You can also use repeaters to extend the lengths, at a cost. Then there's the whole copy-protection issue to consider.
One of the nice things about digital cables is that they're relatively inexpensive compared to component cables and are less bulky. My most critical viewing is straight digital anyway, so I don't want any extra D/A conversions, and I can handle any digital-to-digital transformation myself in my HTPC. I use a WS LCD (with a plethora of video inputs) for my TV viewing right now. Most of my analog viewing has already been digitized anyway, either by my DVDR or by the content provider (via BUD satellite TVRO or DVD and such).
-
There's no real point in having a cable technology that runs HD picture and sound if it won't run 100-200 feet. Typically you want that in whole-house distribution systems, but in this case FireWire does a much better job (on the video side at least). The only problem is that no manufacturers support FireWire, because you can record the signal that comes down that wire. This is why, back in the day, movie studios could not own theaters: conflict of interest. The damned government is not doing its job nowadays, though. Companies like Sony should not be able to control content AND decide the industry standard by which that content is distributed.
-
Originally posted by KvHagedorn
There's no real point in having a cable technology that runs HD picture and sound if it won't run 100-200 feet. ...
I won't be able to debate this much at the moment due to the new site move. See my post in SF.
-
Well, I'm suffering through the JS script prompts for now so I can pick this thread up again if you want to discuss this further.
Originally posted by KvHagedorn
There's no real point in having a cable technology that runs HD picture and sound if it won't run 100-200 feet. ...
-
HD, ED, and SD TV will need to coexist for some time to come. There's still an overabundance of SD video (both old and new content) that we will want to watch alongside the higher-res formats. SD video is problematic on HD displays due to resolution and aspect-ratio mismatches, so it's often preferable to watch it on an SD display. I have a collection of SD LDs that I will still want to enjoy, and DVDs are only ED. The new 1080p displays (such as the Qualia 006) can handle SD very well, but they are quite expensive and have very high-end processors for scaling and de-interlacing. One can always down-res new HD content and route it through traditional SD channels, or simply capture the SD version of the same content (the major U.S. networks broadcast both SD and HD of some shows). HD seems appropriate only in the HT room for now, mainly because HD isn't of much value unless you have a display large enough to let you resolve the definition at the appropriate viewing distance.
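To put a number on the aspect-ratio mismatch above, here's a quick back-of-the-envelope sketch (my own toy calculation, using square-pixel approximations; real SD uses non-square pixels, so treat it as illustrative only):

```python
# Fitting 4:3 SD video onto a 16:9 1080p panel without distortion
# means pillarboxing: scale to full height, pad the sides with bars.

HD_W, HD_H = 1920, 1080

def pillarbox(src_aspect_w, src_aspect_h):
    """Scale a source aspect ratio to fill 1080p height; return the
    displayed width and the fraction of the panel left as black bars."""
    disp_w = HD_H * src_aspect_w / src_aspect_h   # width at full height
    bar_fraction = 1 - disp_w / HD_W
    return disp_w, bar_fraction

w, bars = pillarbox(4, 3)
print(f"4:3 SD fills {w:.0f} of {HD_W} px; {bars:.0%} of the panel is bars")
# -> 4:3 SD fills 1440 of 1920 px; 25% of the panel is bars
```

A quarter of the panel sits unused, and the 480-line source still has to be scaled more than 2x vertically, which is why SD so often looks worse on an HD display than on a native SD set.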
I have no compelling reason to route anything other than SD around the house right now, since my HD viewing requires my HTPC and my 23” LCD, which are in my HT/living room (new large HD display system coming soon™). I've routed the SD video through coax all these years but still used speaker wire for my more critical listening in the remotely located master bedroom (both headphones and quality speakers). The less critical listening elsewhere, inside and outside the house, uses the coax for both modulated video and audio. I wanted to route A/V digitally back in '88, but the market wasn't ready for me.
Error correction can always be layered onto these new interfaces, and DTV already utilizes FEC in the TS data for ATSC, and QPSK via satellite. I would think there are FEC-capable devices that can be utilized for longer runs of DTV across DVI and/or HDMI.
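For anyone curious what FEC actually buys you: ATSC uses Reed-Solomon plus trellis coding, which is far heavier machinery than I can post here, but a toy Hamming(7,4) code (my own sketch, not anything from the ATSC spec) shows the basic idea of correcting a bit flipped in transit:

```python
# Toy FEC: Hamming(7,4) corrects any single flipped bit in a 7-bit block.

def encode(d):                      # d = [d1, d2, d3, d4], bits 0/1
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def decode(c):                      # returns the corrected data bits
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]  # parity check over positions 4,5,6,7
    err = s1 + 2 * s2 + 4 * s3      # syndrome = 1-based error position
    if err:
        c[err - 1] ^= 1             # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]

word = encode([1, 0, 1, 1])
word[5] ^= 1                        # simulate one bit corrupted on the wire
print(decode(word))                 # -> [1, 0, 1, 1], the original data
```

The cost is overhead (7 bits on the wire for 4 bits of data here), which is exactly the trade a long-run DVI/HDMI extender with FEC would be making.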
-
I'm not impressed with this article. This paragraph here is quite misleading:
That might be true, were it not for the fact that digital signals are encoded in different ways and have to be converted, and that these signals have to be scaled and processed to be displayed. Consequently, there are always conversions going on, and these conversions aren't always easy going. "Digital to digital" conversion is no more a guarantee of signal quality than "digital to analog," and in practice may be substantially worse. Whether it's better or worse will depend upon the circuitry involved--and that is something which isn't usually practical to figure out. As a general rule, with consumer equipment, one simply doesn't know how signals are processed, and one doesn't know how that processing varies by input. Analog and digital inputs must either be scaled through separate circuits, or one must be converted to the other to use the same scaler. How is that done? In general, you won't find an answer to that anywhere in your instruction manual, and even if you did, it'd be hard to judge which is the better scaler without viewing the actual video output. It's fair to say, in general, that even in very high-end consumer gear, the quality of circuits for signal processing and scaling is quite variable.
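To be fair to the article on one point: "digital to digital" quality really does depend on the algorithm in the box. A toy 1-D sketch of my own (made-up function names, nothing from any real scaler chip) shows two upscalers giving visibly different results on the same samples:

```python
# Two "digital to digital" upscalers on the same 1-D signal: the result
# depends entirely on the algorithm the scaler chip implements.

def nearest(src, n):
    """Nearest-neighbor: cheap and blocky, repeats source samples."""
    return [src[int(i * len(src) / n)] for i in range(n)]

def linear(src, n):
    """Linear interpolation: smoother, blends neighboring samples."""
    out = []
    for i in range(n):
        x = i * (len(src) - 1) / (n - 1)   # fractional source position
        lo = int(x)
        hi = min(lo + 1, len(src) - 1)
        out.append(src[lo] + (src[hi] - src[lo]) * (x - lo))
    return out

ramp = [0, 10, 20, 30]              # a smooth gradient, 4 samples
print(nearest(ramp, 8))             # -> [0, 0, 10, 10, 20, 20, 30, 30]
print(linear(ramp, 8))              # smooth intermediate values instead
```

Same input, same "all digital" path, two different pictures: the stair-stepped output is what a cheap scaler does to a gradient. That's the sense in which the article has a point, even if the surrounding argument is muddled.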
Also, they talk about analog surviving long cord runs without signal boosting. Who cares if you boost a digital signal? No harm done, unlike analog.
Last edited by Wombat; 7 March 2005, 14:16.
Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.