X800 Pro pre-order


  • #16
    Originally posted by bsdgeek
    Doesn't Matrox use external chips for DVI? Chrontel or some such?
    You're correct... http://www.chrontel.com/products/7301.htm
    "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

    "Always do good. It will gratify some and astonish the rest." ~Mark Twain



    • #17
      looks like something may have changed at allstarshop:

      P4b@2.7, AOpen ax4spe max II, 4X Parhelia 128 with Zalman zm80c and fan -or- ATI Radeon X800GTO, 1024mb.



      • #18
        They took out all of the specs.

        Also, first leaked benches here
        Ladies and gentlemen, take my advice, pull down your pants and slide on the ice.



        • #19
          Hmm, I didn't get very far before I stopped reading the flamewar over there.

          However, so far nobody seems to be mentioning that the THG review probably has the nv40 using the PS1.1 shader path. So the X800 is *really* trouncing it.
          Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



          • #20
            Originally posted by Wombat
            Hmm, I didn't get very far before I stopped reading the flamewar over there.

            However, so far nobody seems to be mentioning that the THG review probably has the nv40 using the PS1.1 shader path. So the X800 is *really* trouncing it.

            If you continue reading, you will find that all the benches were posted.

            For those who don't want to look: the X800 Pro is in line with the 6800U in most benchmarks, while the X800 XT puts a hurt on the 6800U. The X800 loses horribly in the CoD benchmark (seems like a driver tweak, or maybe Nvidia's OpenGL ICD is just that much better).

            As for PS 1.1 vs. PS 2.0, I think Nvidia has to fall back to PS 1.1 more often, but it still uses PS 2.0. That is how I understood it after reading several forum posts at B3D.

            Last edited by Helevitia; 3 May 2004, 16:13.
            Ladies and gentlemen, take my advice, pull down your pants and slide on the ice.



            • #21
              I'd like to see the numbers for Parhelia on the same setup...



              • #22
                Originally posted by Wombat
                Hmm, I didn't get very far before I stopped reading the flamewar over there.
                Beyond3D's signal-to-noise ratio has certainly gotten much worse recently.

                KvH: I can save you the trouble of waiting to find one; the score is ATi: 1 Gillion, Matrox: 1.



                • #23
                  Originally posted by Helevitia

                  As for PS 1.1 vs. PS 2.0, I think Nvidia has to fall back to PS 1.1 more often, but it still uses PS 2.0. That is how I understood it after reading several forum posts at B3D.
                  The 1.1 shaders cause some pretty crappy IQ problems. If you force the 6800 to use PS2.0 (by telling the game it's an R300), then these IQ problems go away, BUT the 6800's score drops by *20%*.

                  Makes the X800 kick its ass even harder.
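
A quick back-of-the-envelope sketch of why the 1.1 path hurts IQ: PS 1.1 hardware did its pixel arithmetic in low-precision fixed point, while PS 2.0 mandates floating point, so the 1.1 path rounds every intermediate result. The 8-bit fixed-point step and the 16th-power specular term below are assumptions purely for illustration, not the 6800's actual register format:

```python
# Quantizing every intermediate result (PS1.1-style fixed point)
# collapses a smooth highlight into a few coarse bands, while a
# float path (PS2.0-style) quantizes only once at the end.

def fx8(x):
    """Round to unsigned 8-bit fixed point in [0, 1]."""
    return round(max(0.0, min(1.0, x)) * 255) / 255

def shade_ps11_style(n_dot_h):
    s = fx8(n_dot_h)
    for _ in range(4):          # x^16 via repeated squaring,
        s = fx8(s * s)          # rounding after every operation
    return s

def shade_ps20_style(n_dot_h):
    return fx8(n_dot_h ** 16)   # full precision, one final round

# sweep across a specular highlight and count distinct output levels
edge = [i / 1000 for i in range(900, 1001)]
print(len({shade_ps11_style(x) for x in edge}))  # coarse: few bands
print(len({shade_ps20_style(x) for x in edge}))  # smooth: many levels
```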
                  Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                  • #24
                    Originally posted by KvHagedorn
                    I'd like to see the numbers for Parhelia on the same setup..
                    You could put a graph up and insert another 25 video card scores and you'd have to scroll a long way down to find the Parhelia.
                    The X800 still doesn't do triple-head.
                    Alcohol and Drugs make life tolerable.



                    • #25
                      Originally posted by Wombat
                      The 1.1 shaders cause some pretty crappy IQ problems. If you force the 6800 to use PS2.0 (by telling the game it's an R300), then these IQ problems go away, BUT the 6800's score drops by *20%*.

                      Makes the X800 kick its ass even harder.
                      It still wouldn't be an equal comparison, since the 6800 would be doing FP32 vs. FP24 (unless it's different for the X800 now).
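
For rough numbers on that gap: ATI's FP24 is 1 sign / 7 exponent / 16 mantissa bits against IEEE FP32's 1 / 8 / 23, so FP24's worst-case relative rounding step is about 2^-16 vs. 2^-23. A quick sketch that fakes FP24 by masking off the 7 low mantissa bits of a float32 (it ignores the narrower exponent range, which is a simplification):

```python
# Emulate FP24 rounding by truncating a float32's mantissa from
# 23 bits down to 16. Exponent-range differences are ignored.
import struct

def to_fp24(x):
    bits = struct.unpack('<I', struct.pack('<f', x))[0]
    bits &= ~0x7F                      # drop the 7 low mantissa bits
    return struct.unpack('<f', struct.pack('<I', bits))[0]

x = 1.2345678
print(f"fp32: {struct.unpack('<f', struct.pack('<f', x))[0]:.9f}")
print(f"fp24: {to_fp24(x):.9f}")
# relative precision: fp32 ~ 2**-23 (1.2e-7), fp24 ~ 2**-16 (1.5e-5)
```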



                      • #26
                        I'm not sure about the FP precision. Still, it's equal enough for me, since it's actually making the 6800 run as a DX9 part, not a DX8.1 one.
                        Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                        • #27
                          Taken from the B3D review:

                          The precision of the shader core remains at FP24 per component.
                          Referring to the X800.
                          Ladies and gentlemen, take my advice, pull down your pants and slide on the ice.



                          • #28
                            What's the big deal when the DACs are still at 8 bits per channel?
                            We have enough youth - What we need is a fountain of smart!


                            i7-920, 6GB DDR3-1600, HD4870X2, Dell 27" LCD



                            • #29
                              Originally posted by tjalfe
                              what's the big deal when the DAC's are still at 8 bits per channel?
                              Higher precision reduces rounding errors, which results in fewer visual artifacts.
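
A quick sketch of that point: even with an 8-bit DAC at the very end, doing the intermediate math at 8 bits rounds on every pass and the errors pile up, while a high-precision pipeline quantizes only once. The darken-then-brighten round trip below is a made-up workload, purely for illustration:

```python
# Run a gray ramp through 8 darken passes and 8 brighten passes,
# once with 8-bit rounding after every pass and once in full float,
# then quantize both to 8 bits for the "DAC". Count surviving levels.

def q8(x):
    return round(max(0.0, min(1.0, x)) * 255) / 255

def pipeline(x, low_precision):
    for _ in range(8):                 # darken passes
        x = x * 0.5
        if low_precision:
            x = q8(x)
    for _ in range(8):                 # brighten back up
        x = x * 2.0
        if low_precision:
            x = q8(x)
    return q8(x)                       # final 8-bit DAC output

ramp = [i / 255 for i in range(256)]
print(len({pipeline(g, low_precision=True)  for g in ramp}))  # few levels left
print(len({pipeline(g, low_precision=False) for g in ramp}))  # all 256 survive
```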
                              Main: Dual Xeon LV2.4Ghz@3.1Ghz | 3X21" | NVidia 6800 | 2Gb DDR | SCSI
                              Second: Dual PIII 1GHz | 21" Monitor | G200MMS + Quadro 2 Pro | 512MB ECC SDRAM | SCSI
                              Third: Apple G4 450Mhz | 21" Monitor | Radeon 8500 | 1,5Gb SDRAM | SCSI



                              • #30
                                I still doubt any human can really tell the difference, except perhaps in extreme cases.
                                We have enough youth - What we need is a fountain of smart!


                                i7-920, 6GB DDR3-1600, HD4870X2, Dell 27" LCD
