explanation of DVI and how it relates to the parhelia


  • explanation of DVI and how it relates to the parhelia

    What is the difference between DVI-I, DVI-A and DVI-D (pros and cons), and how do these relate to the Parhelia and the banding problem? My next upgrade will probably be a TFT monitor, and I want to make the right choice.

    I'm not sure if this should be in General Hardware or Matrox Hardware; feel free to move it if I posted in the wrong forum.
    Last edited by TdB; 6 December 2002, 07:35.
    This sig is a shameless attempt to make my post look bigger.

  • #2
    Well, here is an overview...
    All About DVI is an in-depth resource for answers to all types of DVI questions, including the difference between single-link and dual-link, how digital and analog DVI are unique, and explanations and pictures of the various DVI connectors.



    Jörg
    pixar
    Dream as if you'll live forever. Live as if you'll die tomorrow. (James Dean)



    • #3
      thanks!

      I guess the Parhelia supports them all, and that there is no banding when using DVI-I and DVI-D.

      BTW, does DVI dual link have anything to do with Matrox dual DVI, as seen here:


      AFAICS the dual link is for really high resolutions like 2048*1536,
      while the Matrox dual DVI is for DualHead. However, some high-end TFT monitors support two DVI inputs; can this be used to fake DVI dual link with Matrox DualHead?
      I assume the Parhelia only supports single-link DVI, because it has a max res of 1600*1200, or did Matrox raise that to 1920*1080?

      Another question: is there a refresh rate limit for DVI, like there is for analog connections, apart from the limit on the monitor? In other words, does the Parhelia have any additional limitations?
      Last edited by TdB; 6 December 2002, 08:20.
      This sig is a shameless attempt to make my post look bigger.



      • #4
        EIZO product information: computer monitors and other imaging equipment, business solutions, news, support, the EIZO online shop, a glossary, and company information.


        According to the manual, the LCD panel behaves as two different monitors, so I don't see why it wouldn't be possible to use the two outputs of any dual-DVI card...


        Jörg
        pixar
        Dream as if you'll live forever. Live as if you'll die tomorrow. (James Dean)



        • #5
          Well, there is no way I can afford that monitor, but I can still dream.

          Do you have experience with TFT monitors? Are they usable for gaming, and how is the color reproduction?
          This sig is a shameless attempt to make my post look bigger.



          • #6
            well... the answers would be
            1. yes (but not with the recent crop, so view my other answers in this perspective)
            2. gaming depends (some are better than others - perhaps current LCD users can comment)
            3. for pure photo editing, it is not sufficient (IMO). Most LCDs tend to be brighter in the center than at the corners (especially noticeable when displaying a uniform dark colour full screen), and very few are able to provide colours as accurate as CRTs: the number of colours on an LCD is usually less than what can be shown with 16-bit colour depth. I know that physically we cannot distinguish that many colours, but the computer needs such a large palette. There actually is a visible difference between 16-bit and 32-bit in e.g. Photoshop (due to the colour management); of course, you need to get source files that have huge colour depths. (A small sketch of the banding effect follows below.)
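
            As a rough illustration of that gradient point, here is a minimal Python sketch (the names and the 1920-pixel width are just assumptions for the example): it quantises a smooth ramp to 6 and to 8 bits per channel, and the 6-bit version keeps far fewer distinct shades, which is what shows up as banding on cheaper panels.

            import numpy as np

            # One horizontal line of a full-screen greyscale ramp, 0.0 .. 1.0
            gradient = np.linspace(0.0, 1.0, 1920)

            def quantize(values, bits):
                levels = 2 ** bits - 1          # 63 steps at 6 bits, 255 at 8 bits
                return np.round(values * levels) / levels

            six_bit = quantize(gradient, 6)
            eight_bit = quantize(gradient, 8)

            # Fewer surviving shades means wider, more visible bands on screen.
            print(len(np.unique(six_bit)), "distinct levels at 6 bits/channel")    # 64
            print(len(np.unique(eight_bit)), "distinct levels at 8 bits/channel")  # 256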


            Jörg
            pixar
            Dream as if you'll live forever. Live as if you'll die tomorrow. (James Dean)



            • #7
              Well, the newer TFT monitors can (according to the spec sheets) show 16.7 million colors; I guess that means 8 bits per channel, i.e. a 24-bit framebuffer. So GigaColor will probably be impossible, but traditional 32-bit color should be supported (the alpha channel is never shown on the monitor anyway).
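
              Quick back-of-the-envelope check of those spec-sheet numbers (a sketch, nothing Parhelia-specific):

              levels_per_channel = 2 ** 8               # 8 bits per channel -> 256 levels
              total_colors = levels_per_channel ** 3    # 16,777,216 ~ the "16.7 million" in the spec sheets
              color_bits = 3 * 8                        # 24 bits actually describe the color; the other
                                                        # 8 bits of a 32-bit pixel are alpha/padding and
                                                        # never reach the monitor
              print(total_colors, color_bits)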

              I'm not doing any photo editing, mostly gaming and programming; however, I want my games to look good.

              Which leads to another point: some newer games might be too demanding to run at the monitor's native resolution, so how good are TFTs at "emulating" other resolutions?
              If I buy a monitor with a native resolution of 1600*1200, then I guess it will be okay at 800*600, but how good will it look at 1024*768? (See the sketch below.)
              If I buy a cheaper monitor with a native resolution of 1024*768, how will lower resolutions look on such a monitor?
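
              For what it's worth, the scaling question largely comes down to whether the native resolution is an integer multiple of the mode you run. A small sketch (the listed modes are only examples):

              native = (1600, 1200)
              for mode in [(800, 600), (1024, 768), (1280, 960)]:
                  sx, sy = native[0] / mode[0], native[1] / mode[1]
                  clean = sx.is_integer() and sy.is_integer()
                  # 800x600 maps each pixel onto an exact 2x2 block; the others must be interpolated
                  print(mode, "scale", (sx, sy), "exact block scaling" if clean else "interpolated")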

              I hope someone with experience in newer TFT monitors can give me that answer.

              Can anyone with a Parhelia/TFT combo comment on compatibility concerning the banding issue? Is there a difference between DVI-I and DVI-D in that regard?
              Last edited by TdB; 6 December 2002, 09:33.
              This sig is a shameless attempt to make my post look bigger.



              • #8
                DVI-D single channel = 165 MHz bandwidth, or a max res of 1600x1200. Dual channel = 330 MHz bandwidth, or a max res of 2048x1536.
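
                As a sanity check of the single-channel figure, using the standard VESA timing totals for 1600x1200 at 60 Hz (active area plus blanking):

                h_total, v_total = 2160, 1250     # 1600x1200 active plus blanking intervals (VESA timing)
                refresh_hz = 60
                pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
                print(pixel_clock_mhz, "MHz")     # 162.0 -- just under the 165 MHz single-link ceiling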

                DVI-A is not part of the DVI spec and is only there to allow analog output from a RAMDAC, i.e. pass-through.

                DVI-I is both DVI-D (single or dual channel) + DVI-A

                Each of Parhelia's outputs is single-channel DVI-I.

                All DVI specs are established by the Digital Display Working Group (www.ddwg.org), headed by Intel.
                "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

                "Always do good. It will gratify some and astonish the rest." ~Mark Twain



                • #9
                  Originally posted by Greebe
                  <snip>Each of Parhelia's output's are single channel DVI-I
                  <snip>
                  For TripleHead they must use some sort of dual single-channel trick on the special cable they include (one single channel to one monitor, the other to the second), so the second DVI connector must have all the pins active, even if not as a real dual-channel DVI plug (?)



                  • #10
                    No need to waste those extra pins that would normally be used for dual channel DVI
                    "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

                    "Always do good. It will gratify some and astonish the rest." ~Mark Twain



                    • #11
                      How do they do it then? You can't just replicate the same signal to both screens; you'd end up with the same picture on each (?)

                      Moreover, you're losing signal quality if you're using one set of pins for two screens instead of one (unless you pump twice as much current through the wires).

                      Correct me if I'm wrong.



                      • #12
                        Greebe just said how. Dual-channel DVI would use pins that are unused in a single-channel setup, like Parhelia's. So, they have some free pins.
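
                        For reference, the pins in question are the second TMDS link's data pairs in the standard DVI pin-out; a single-channel card leaves them idle, so a cable can repurpose them. (How Matrox actually wires the TripleHead cable isn't documented here, so treat this as background only.)

                        # Second-link TMDS data pairs in the DVI connector (idle on single-channel cards)
                        second_link_pairs = {
                            "TMDS data 3": (12, 13),   # pins 12 (-) and 13 (+)
                            "TMDS data 4": (4, 5),     # pins 4 (-) and 5 (+)
                            "TMDS data 5": (20, 21),   # pins 20 (-) and 21 (+)
                        }
                        for name, pins in second_link_pairs.items():
                            print(name, "on connector pins", pins)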
                        Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                        • #13
                          So any monitor that says DVI-D will get me a banding-free digital image, while a DVI-I monitor might be digital or analog; if it is an analog monitor, it will use an analog DVI-A connection (with banding) to the Parhelia.
                          But if it is a digital monitor with a DVI-I connection, then it will work as DVI-D, meaning no banding.
                          This sig is a shameless attempt to make my post look bigger.



                          • #14
                            Originally posted by Wombat
                            Greebe just said how. Dual-channel DVI would use pins that are unused in a single-channel setup, like Parhelia's. So, they have some free pins.
                            No. Dual channel uses twice the pins of single channel (even if there are unconnected ones on each channel).

                            He didn't explicitly say so, but I guess you're right in saying they must be using unused pins of the single-channel connector; that might be (part of) the reason for the banding issue.



                            • #15
                              Originally posted by VJ
                              I know that physically we cannot distinguish that many colours, but the computer needs such a large palette. There actually is a visible difference between 16-bit and 32-bit in e.g. Photoshop (due to the colour management); of course, you need to get source files that have huge colour depths.
                              Actually, we can see the difference between 16-bit (5 or 6 bits per channel) and 24/32-bit (8 bits per channel). You can easily see it when you create a smooth gradient (without dithering). This has nothing to do with colour management or the application that is used, just with the sensitivity of our eyes. This is why <8-bit/channel LCDs aren't good for serious colour evaluation, with any software.

                              It becomes a different story when you look at the difference between 8 and 16 bits per channel. This difference can hardly be seen by the human eye, but performing multiple operations in 8-bit precision can easily lead to noticeable quantization effects. This includes the multiple transformations that occur in colour-managed workflows.

                              This last problem can be solved by working with high-precision 16-bit/channel files and has nothing to do with the colour resolution of your output device. However, nothing can help you if your output device can only display 6 bits/channel...
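
                              To make the quantization point concrete, here is a minimal sketch (an illustration only, not any particular application's pipeline): a darken-then-restore round trip done entirely in 8-bit loses levels, while keeping the intermediate result at higher precision does not.

                              import numpy as np

                              ramp = np.arange(256, dtype=np.uint8)   # a full 8-bit greyscale ramp

                              # 8-bit workflow: every intermediate result is rounded back to 0..255
                              darker = np.round(ramp.astype(np.float64) * 0.5).astype(np.uint8)
                              restored_8bit = np.clip(np.round(darker.astype(np.float64) * 2.0), 0, 255).astype(np.uint8)

                              # High-precision workflow: keep floats in between, round only once at the end
                              restored_hp = np.round(ramp.astype(np.float64) * 0.5 * 2.0).astype(np.uint8)

                              print(len(np.unique(restored_8bit)), "levels after the 8-bit round trip")        # ~129
                              print(len(np.unique(restored_hp)), "levels with high-precision intermediates")   # 256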

