
DVI vs VGA port


  • DVI vs VGA port

    Hi,

    I just got my parhelia, and while hooking it up to my NEC MultiSync FP955, I noticed that it has both a DVI port and a VGA port. Is there a difference in quality if I use the DVI port?

    Thanks,

    -V-
    ASUS P2B-DS REV 1.06 D03 w/ DUAL 1.4GHZ Tualatins; Matrox Parhelia; M-Audio Delta 410

    Apple Powerbook G4 - 1.33GHZ

  • #2
I'm certain that there are a LOT of experts in that field around here...

    But they might not be out before darkness creeps over the lands...



    I have never seen a CRT with a DVI port tho'. Strange

    ~~DukeP~~



    • #3
      I think Greebe had a DVI-D CRT when he was BB'ing the P.

      If it's a DVI-D interface, then I guess it's your choice as to whether the quality is better with one or the other - all you're doing is choosing between the DAC on the graphics card and the DAC in the monitor.

      If it's a DVI-A interface, then it's still analogue from the card to the screen, so you're just choosing between the quality of the cables AFAIK.
      DM says: Crunch with Matrox Users@ClimatePrediction.net



      • #4
It's DVI-A.



        • #5
          Might still be worth experimenting with, in reference to the infamous banding issues.
          P4b@2.7, AOpen ax4spe max II, 4X Parhelia 128 with Zalman zm80c and fan -or- ATI Radeon X800GTO, 1024mb.



          • #6
            Originally posted by Marshmallowman
It's DVI-A.
            Since it's analog, it will still band.



            • #7
Yeah, but it's digital till it gets to the monitor. From there, it uses the DAC in the monitor, right?
              ASUS P2B-DS REV 1.06 D03 w/ DUAL 1.4GHZ Tualatins; Matrox Parhelia; M-Audio Delta 410

              Apple Powerbook G4 - 1.33GHZ



              • #8
No. The A in DVI-A stands for "Analogue", and it's basically the same as the old-fashioned signal you get from the normal VGA port, just using different plugs and sockets at the ends, AFAIK. So the DAC is still part of the graphics card, and the CRT just displays the signal as-is; no conversion needed.
                DM says: Crunch with Matrox Users@ClimatePrediction.net



                • #9
                  Originally posted by GNEP
No. The A in DVI-A stands for "Analogue", and it's basically the same as the old-fashioned signal you get from the normal VGA port, just using different plugs and sockets at the ends, AFAIK. So the DAC is still part of the graphics card, and the CRT just displays the signal as-is; no conversion needed.
                  sounds right
                  P4b@2.7, AOpen ax4spe max II, 4X Parhelia 128 with Zalman zm80c and fan -or- ATI Radeon X800GTO, 1024mb.



                  • #10
                    still, I'd try it if it's free or cheap!
                    P4b@2.7, AOpen ax4spe max II, 4X Parhelia 128 with Zalman zm80c and fan -or- ATI Radeon X800GTO, 1024mb.



                    • #11
I think DVI-A may be a better connector than the normal 15-pin, but the difference may be hard to notice. Still, it may be worth a shot.

I am not sure it's worth buying an expensive new cable for it, though...

