First X800 review

  • #16
    I don't know about the affordability... let's see, for those of us with Parhelias and three monitors... since no new cards support three screens, maybe I'll just have to sell my Parhelia and one of my 21-inch monitors... then maybe I can afford one of the flashy ones... Bah, who am I kidding, I don't think I could give up the three screens....

    Leech
    Wah! Wah!

    In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.



    • #17
      In my opinion, the biggest advantage ATI has is form factor. I don't want my video card dictating the power supply I have to get, and I don't want it taking up more than 1 slot. Even if the ATI were slightly slower, I'd still take it over that nvidia beast.

      However, for my personal use, I'm just glad this new card is out so that I can get a better price on a 9600XT.
      Lady, people aren't chocolates. Do you know what they are mostly? Bastards. Bastard coated bastards with bastard filling. But I don't find them half as annoying as I find naive, bubble-headed optimists who walk around vomiting sunshine. -- Dr. Perry Cox



      • #18
        Don't go for the 9600; you can find good deals on the 9800, a much nicer card, for not much more money now.
        Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



        • #19
          Just a quick question, that will probably take a long answer...

          What exactly are the differences in a 9800 Pro and a 9800 XT? (I know there is more than just clock speed, just can't remember what it is..)

          Leech
          Wah! Wah!

          In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.



          • #20
            9800 Pros are available in 128MB and 256MB versions.

            9800 XTs are only supposed to be 256MB and have a higher clock speed. Also, *some* XTs have a thermal sensor which is used to monitor the temperature of the GPU; this is used for the "Overdrive" feature (automatic overclocking).

            But there are a lot of 9800 XTs without thermal sensors as well, and those cannot use Overdrive.
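
            To make the Overdrive bit concrete, here is a minimal sketch of what thermal-feedback auto-overclocking boils down to. This is not ATI's actual driver code; read_gpu_temp() and set_core_clock() are hypothetical helpers, and the boost ceiling and temperature limit are just illustrative numbers:

            # Rough sketch of Overdrive-style automatic overclocking.
            # read_gpu_temp()/set_core_clock() are hypothetical helpers; the boost
            # ceiling and temperature limit are illustrative, not ATI's values.
            import time

            STOCK_MHZ = 412       # 9800 XT stock core clock
            BOOST_MHZ = 440       # illustrative upper limit
            TEMP_LIMIT_C = 75     # illustrative throttle point
            STEP_MHZ = 4

            def overdrive_loop(read_gpu_temp, set_core_clock):
                clock = STOCK_MHZ
                while True:
                    temp = read_gpu_temp()        # needs the on-die thermal sensor
                    if temp < TEMP_LIMIT_C and clock < BOOST_MHZ:
                        clock += STEP_MHZ         # cool enough: nudge the clock up
                    elif temp >= TEMP_LIMIT_C and clock > STOCK_MHZ:
                        clock -= STEP_MHZ         # too hot: step back toward stock
                    set_core_clock(clock)
                    time.sleep(1)                 # re-check roughly once a second

            Cards without the sensor have nothing to feed that loop, which is why they can't use Overdrive.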



            • #21
              OK, I am just thinking out loud here.

              My initial impression is that the X800 is a better-quality product and that its pros outweigh the cons compared to the 6800.

              First of all, people are not looking at the overall picture. All of the reviews suck ass! Notice that almost all reviews don't test the 6800U in anything higher than 4xAA/8xAF. Why? Because the X800 owns the 6800U in those tests. Don't believe me? Go read the Hexus review. Even the 9800XT beats out the 6800U at those settings! And the games are still playable at 1280x1024 or 1600x1200 with those settings turned on as well!

              Example: [benchmark graph]


              Now think of the massive power required for the 6800U. Nvidia recommends a 480W power supply! Holy smokes! This alone would prevent me from buying this card.

              What about the "optimizations" people are still doing for Nvidia so they can keep up with ATI. If you change farcry.exe to fartcry.exe the 6800U runs slower. Imagine that

              Gabe said today that HL2 runs 40% faster on the X800. Damn!

              PS3.0 - a pipe dream for now.

              OK, I know it's not all a bed of roses in the ATI camp. For instance, the R420 is just a supercharged R300. This means the drivers are already mature. This is why the card is so fast out of the gate. The 6800U is a new breed. The drivers will mature and performance will increase much more than the X800's. It will be interesting to see these two cards 6 months from now. X800 performance will improve marginally at best.

              I don't run Linux so I can't comment on the drivers for that. The general consensus is that Nvidia's are much better. Dual-head support is supposed to be better for Nvidia as well.

              The other thing that bothers me is that no one ever reviews the non-gaming features. I really want to know more about those features, but oh well.

              OK, so that's it for now.

              Dave
              Ladies and gentlemen, take my advice, pull down your pants and slide on the ice.



              • #22
                Yeah, the ATI card looks good, but with OpenGL/Linux in mind the NV card is higher on my list; the 6800 will trounce the X800 in Linux, and that is the nub of my next buying decision.

                There is some talk that ATI is in the process of totally rewriting their OpenGL drivers (in Windows), which if true will probably filter down to Linux (OpenGL = Linux).

                But ATI has burnt me with the Linux drivers so far; they did a bit of work a year ago... but if anything, performance has been going backwards since then.

                So for me a 6800 GT is in the right price/performance/COMPATIBILITY ballpark, and failing a Linux miracle from ATI, it will in all likelihood be my next major video card upgrade.

                Pity about Matrox; they really have become a dead loss with regard to consumer video cards.



                • #23
                  The HTPC folks are giving the X800 the thumbs up because of its low power requirements. There are conflicting reports on its WMP9 decoding relative to the 6800 Ultra and I'd like to see more info on this feature and benchmarks.
                  The world just changed, Sep. 11, 2001



                  • #24
                    @Marshmallowman: why would you spend $500 on a graphics card and buy a new PSU to play Tux Racer?? Beats me...

                    @Helevitia: according to your graph, the R420 completely owns the NV40. NV has always been very fast when no IQ settings are used. As usual, ATi owns them when the eye candy is turned on.

                    Also, the NV40 needs a 50% improvement in performance to reach the R420. I think ATi will have a new revision before that happens...

                    If I had to choose a new card, that'd definitely be a R420. In the meantime, I'll stick to my 9800Pro - I don't think there's a game out that pushes it to its limit yet...
                    Last edited by Kurt; 6 May 2004, 04:12.



                    • #25
                      Guess it's a good thing that my PSU pumps out 550W.... Still not sure if I'll upgrade to an nVidia card or not... Though it's quite annoying to have X kill the screen with the 2.6.5 Linux kernel on my Parhelia....

                      Leech
                      Wah! Wah!

                      In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.



                      • #26
                        You have to remember, though, that when ATi is owning Nvidia at 6x vs. 8x, Nvidia is doing supersampling (along with multisampling at the same time: 4xSS + 2xMS, and another mode at 4xMS + 3xSS). IMHO, the image quality is better there, and it will work on older games (the only place where it's useful), where ATi's AA cannot.
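
                        Roughly what that means for cost and quality, as a conceptual sketch (not either vendor's real pipeline): the supersampled part shades every sub-sample, so it also smooths textures and shader aliasing, while the multisampled part only adds coverage/depth samples at polygon edges:

                        # Conceptual cost model for mixed AA modes (not either
                        # vendor's real pipeline): supersampling (SS) shades each
                        # SS sample; multisampling (MS) reuses one shaded colour
                        # across its coverage samples.
                        def aa_cost(ss, ms):
                            shader_runs = ss          # fully shaded samples per pixel
                            stored_samples = ss * ms  # colour/Z samples to resolve
                            return shader_runs, stored_samples

                        for label, ss, ms in [("6xMS (ATi max)", 1, 6),
                                              ("4xSS + 2xMS",    4, 2)]:
                            runs, samples = aa_cost(ss, ms)
                            print(label, "-> shader runs:", runs,
                                  "samples:", samples)
                        # Pure MSAA only smooths polygon edges; the SS component
                        # also smooths texture/shader aliasing inside polygons,
                        # which is why it mainly helps in older games.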

                        Having the ability to run Linux is nice, and Nvidia is the only company to explicitly support FreeBSD.

                        Also, I think I read one review somewhere that did some power testing, and IIRC you really won't need a new power supply.

                        With dual DVI on the reference cards, better desktop management, and increased manufacturing quality since around the FX era (most Nvidia boards are "BBN" reference boards through Flextronics), along with the aforementioned pros, things are looking better for Nvidia's future with me.

                        Kurt: Remember that Nvidia will also have a new revision, NV45, coming soon.



                        • #27
                          Because of the scaling involved in that graph, I am willing to bet that there are quite a number of optimizations that NVidia still has to do with their drivers. Because of the nature of both the GFFX and the 6800U series, they are extremely sensitive to optimization, and a poorly optimized driver can cause massive performance losses.

                          Oh yeah, I wouldn't believe too much of what Hexus puts out. They are sloppy with their reviews...

                          The X800 XT gets to scrap with 6800 Ultra and 9800 XT using BETA CATALYST drivers (which I think will be validated and WHQL'd as CATALYST 4.5 in the very near future).

                          Time constraints prevent me from looking at display driver quality too much, so everything was tested at the maximum possible image quality settings provided by the driver control panel (High Quality). That means full trilinear texture filtering throughout and lack of any Compressonator analysis between High Quality, Quality and Performance driver settings.

                          Basically it was made to work as hard as possible at all times, at each tested display setting.

                          Tested resolutions and settings are identical to the NV40 article.

                          1024x768, no aniso, no anti-aliasing, full trilinear filtering
                          1024x768, 8X aniso, 4X anti-aliasing, full trilinear filtering
                          1024x768, 16X aniso, 8X(NV40)/6X(R420 & R360) anti-aliasing, full trilinear
                          1280x1024, 16X aniso, 8X(NV40)/6X(R420 & R360) anti-aliasing, full trilinear

                          1600x1200 with 16X aniso and 4X anti-aliasing numbers were recorded, ready for when NVIDIA can get a 6800 Ultra back to us for further testing. I'll update this article with updated graphs as soon as that happens.

                          For people that'll moan about the 8X AA mode choice for the NV40 yet again, it's the highest AA mode the card supports and people buying the cards will, quite rightly, assume that they can just whack everything up to maximum and still have playable framerates. That's the 8X mode on NV40 and it's slow, there's no escaping it. A "fairer" mode comparison will have to wait for NVIDIA to supply another NV40.
                          "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



                          • #28
                            Originally posted by Kurt
                            @Marshmallowman: why would you spend $500 on a graphics card and buy a new PSU to play Tux Racer?? Beats me...
                            There's this one game you might have heard of, UT2004, or something.

                            Also, you can get a lot of real work done on Linux machines too, with the various rendering and CAD programs.
                            Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                            • #29
                              The difference is like that between the GF3 and R8500, not that between the Ti4600 and R9700. It's by no means dramatic, so we can't talk about owning.

                              Furthermore, the 6800 is a new architecture while the X800 is based on the R300, so nV has much more room for driver improvements, while ATi already has a tweaked, mature driver.

                              Other things:
                              - 6800 has dual DVI
                              - 6800 has better OpenGL support
                              - 6800 has better 'nix support
                              - 6800 has PS3.0
                              - nVidia has, based on perception from forums, generally had better drivers (I mean stability, features, multi-monitor implementations; not cheats)
                              - nVidia has cheated recently, Ati has cheated in the past

                              I think the differences are not huge and that decisions will be made based on features. I currently don't need a gaming card of that class, but if I were buying now, I'd probably lean towards nVidia.



                              • #30
                                I'm definitely leaning towards an nVidia card, my only question is... how good is the image quality on the normal 2D display? In the past, nVidia hasn't exactly been anywhere NEAR Matrox's quality (which is why I still have a Matrox card; well, that and Surround Gaming just simply ROCKS!!)

                                Leech
                                Wah! Wah!

                                In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.

