Radeon HD 2900 XT Review


  • #16
    Mehen - the 65nm process is mature enough that there really was no reason for them not to ship their flagship product on it. Seriously. With all the delays they had, and with the architecture they were aiming at, they should have just skipped 80nm and gone for broke with 65nm. The 8800 architecture is sufficiently advanced that even at its die size it poses a serious threat.

    As far as drivers... no. ATI's whole driver model is broken and has been broken for a really, really long time. The fact that they keep releasing drivers with horribly obvious problems (was it the 7.2 or the 7.3 release that people here couldn't even install?) is a pretty good indication that their driver development process is badly broken. Back when I was doing OGL programming and dealing with the 8500/9700 shader paths (under OpenGL 1.3, using card-specific render paths and GL extensions, no less), they were horrid. For every bug they fixed in a driver release, it wasn't uncommon for them to introduce at least one regression. A good driver release had only one regression.

    And as far as high performance at high resolutions being proof of a specific driver issue that magically cures itself at higher resolutions... bullshit. Check out AnandTech's Performance Info. Specifically, look at the graphs that show FPS versus resolution, and check each card. ATI's falls off in a fairly uniform pattern - consistent with the X1950. The only cards showing abnormal changes in performance at 2560x1600 are the 8800 series... which suggests that either 1) the GTS parts are running into the upper limits of their architecture, or 2) they have a driver problem (either a bug or by design) that leads to cumulative performance degradation at higher resolutions.

    If it were a driver problem causing low performance at lower resolutions that went away at higher resolutions, the graphs would be a lot flatter across resolutions and would show much less of a drop-off for the 2900 compared to the 8800. That's how I knew for a fact that Matrox's Parhelia drivers had a really ****ed up OpenGL stack - the Parhelia would hold a certain plateau of performance at lower resolutions and show little to no drop as the resolution increased.
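    To make that concrete, here's a rough back-of-the-envelope check anyone can run against a review's numbers (a Python sketch; every FPS figure below is invented for illustration, not taken from any actual review):

    # Sketch: tell a fill-rate-bound card apart from a driver/CPU-capped one
    # by converting FPS at each resolution into effective throughput (Mpix/s).
    # All FPS numbers here are made up for illustration.

    RESOLUTIONS = [(1280, 1024), (1600, 1200), (1920, 1200), (2560, 1600)]

    def throughput(fps_by_res):
        """Pair each resolution with its FPS and megapixels/second."""
        return [(w, h, fps, w * h * fps / 1e6)
                for (w, h), fps in zip(RESOLUTIONS, fps_by_res)]

    # Fill-rate bound: FPS falls roughly in proportion to pixel count,
    # so throughput stays nearly flat -- the uniform drop-off pattern.
    fill_rate_bound = [60.0, 41.0, 34.0, 19.0]

    # Driver/CPU capped: FPS plateaus at low resolutions (overhead bound),
    # then only drops once the GPU itself finally becomes the bottleneck.
    driver_capped = [45.0, 45.0, 44.0, 30.0]

    for name, fps in (("fill-rate bound", fill_rate_bound),
                      ("driver capped", driver_capped)):
        print(name)
        for w, h, f, mpix in throughput(fps):
            print(f"  {w}x{h}: {f:5.1f} fps = {mpix:6.1f} Mpix/s")

    A genuinely GPU-bound card holds roughly constant pixels/second across resolutions; a driver- or CPU-capped one holds roughly constant FPS until the GPU finally becomes the limit. The 2900's curve looks like the former.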

    Another thing that tends to happen with driver bugs is that one program will perform great and another will perform horribly. That's simply not the case here. The 2900 performs predictably and uniformly across pretty much all benchmarks (3DMark is the only exception). It's troubling, especially when you consider that the 2600 does fairly well given its resources. My bet is that it's a design problem with their compiler/scheduler that keeps it from scaling with the additional processing capability of the 2900. That should be fixable in software, given a lot of time, but it won't be easy.
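    As a sketch of the kind of sanity check I mean (Python again; the ratios are made up purely for illustration, not measured anywhere): normalize each benchmark against a reference card. A tight spread points at something systemic like the compiler/scheduler; big per-app swings are the classic signature of driver bugs.

    # Sketch: per-benchmark performance relative to a reference card.
    # A tight spread suggests a systemic (compiler/scheduler) limit;
    # wild per-app variance is the usual signature of driver bugs.
    # Every ratio here is invented for illustration.
    from statistics import mean, stdev

    ratios = {              # card FPS / reference card FPS, per benchmark
        "Oblivion": 0.78,
        "Prey":     0.74,
        "FEAR":     0.77,
        "HL2":      0.76,
        "3DMark06": 0.98,   # the lone outlier, as noted above
    }

    games = [v for k, v in ratios.items() if k != "3DMark06"]
    print(f"games only: mean {mean(games):.2f}, stdev {stdev(games):.2f}")
    print(f"all tests:  mean {mean(ratios.values()):.2f}, "
          f"stdev {stdev(ratios.values()):.2f}")
    # A near-zero stdev across the games = a uniform deficit, i.e. the
    # compiler/scheduler isn't extracting the chip's theoretical throughput.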

    Or maybe not. I don't have a card here to poke at.

    The best comparison I've heard so far is to the GeForce FX line. Yes, later respins and driver updates fixed a lot of the criticisms people had of that product line, but it pretty much took a whole new generation to fix most of the problems. Right now, both companies are on roughly an 18-month gap between major product lines, with respins somewhere in the middle. That puts ATI's respin a few months before NVidia's next major release.

    About the *only* thing ATI has going for it right now is the same thing NVidia had with the GeForce FX - it gave them a much better understanding of how things worked and challenged a lot of old presumptions about what is and is not necessary in their chips. It's a new architecture, and derivatives should perform a lot better as the basic design gets tweaked. NVidia has been building their GPUs on a more modular architecture since the GeForce FX, and even the 8800 series draws very heavily on the lessons learned from it. ATI will, in time, learn how to better balance their chips and software.
    "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



    • #17
      Originally posted by Helevitia
      Mehen, are you telling me that you'd rather buy the 2900 XT over the 8800 GTS because of equal or better performance at high resolutions?

      Let me point out a few things to you.

      1. BFG 8800 GTS for $329 after rebate vs. $399 and up for the 2900 XT
      2. The 8800 GTS has better IQ. It's a sad day when ATI has to play catch-up to Nvidia in the IQ category
      3. The 8800 GTS runs way cooler than the really hot 2900 XT
      4. In almost all benchmarks that 90% of the market will run, the 2900 XT loses
      5. The 2900 XT is waaaaay late to the game.

      I really wanted to like the R600. I don't like Nvidia, but Nvidia's DX10 cards are superior in every way that I can think of except for the one you pointed out about high resolutions.
      A couple of points on the higher resolutions...

      1) 2560x1600 is pretty rare among output devices.
      2) Displays that support that resolution cost more than I've spent on an entire PC in a long time. If I were dumping that much into a display, I'd put a bit more money into my graphics cards as well.
      3) The performance difference between the 8800 GTS and the 2900 XT is, well, not that much. When the 2900 XT wins, it wins by less than 10fps, and its framerate is still below 30fps.

      It's really not even a point I'd consider basing any decisions on...
      "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



      • #18
        Originally posted by Helevitia
        Mehen, are you telling me that you'd rather buy the 2900 XT over the 8800 GTS because of equal or better performance at high resolutions?

        Let me point out a few things to you.

        1. BFG 8800 GTS for $329 after rebate vs. $399 and up for the 2900 XT
        2. The 8800 GTS has better IQ. It's a sad day when ATI has to play catch-up to Nvidia in the IQ category
        3. The 8800 GTS runs way cooler than the really hot 2900 XT
        4. In almost all benchmarks that 90% of the market will run, the 2900 XT loses
        5. The 2900 XT is waaaaay late to the game.

        I really wanted to like the R600. I don't like Nvidia, but Nvidia's DX10 cards are superior in every way that I can think of except for the one you pointed out about high resolutions.
        Did you even read my post? I said the R600 was a bust. I'm just saying that its performance at higher resolutions shows DAAMIT is CAPABLE. I wouldn't make any decisions either way right now.
        Q9450 + TRUE, G.Skill 2x2GB DDR2, GTX 560, ASUS X48, 1TB WD Black, Windows 7 64-bit, LG M2762D-PM 27" + 17" LG 1752TX, Corsair HX620, Antec P182, Logitech G5 (Blue)
        Laptop: MSI Wind - Black



        • #19
          Interesting... I guess the 2900XT does not actually support full HD video decoding...

          TechReport article.

          Definitely another blow against AMD right now...
          "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



          • #20
            I suspect, what with all the rumored power leakage, that they may have deactivated that block to get the card to run at the required speeds without pulling even more power...
            PC-1 Fractal Design Arc Mini R2, 3800X, Asus B450M-PRO mATX, 2x8GB B-die@3800C16, AMD Vega64, Seasonic 850W Gold, Black Ice Nemesis/Laing DDC/EKWB 240 Loop (VRM>CPU>GPU), Noctua Fans.
            Nas : i3/itx/2x4GB/8x4TB BTRFS/Raid6 (7 + Hotspare) Xpenology
            +++ : FSP Nano 800VA (Pi's+switch) + 1600VA (PC-1+Nas)



            • #21
              Originally posted by DGhost
              Interesting... I guess the 2900XT does not actually support full HD video decoding...

              TechReport article.

              Definitely another blow against AMD right now...

              Nope, it doesn't, but since ATI's drivers are crap at playback anyway, I wouldn't want to play video back through one of their cards.

              I've got a feeling that once they can shrink the die, it will be enabled on the 2900; the lesser cards do support it. Either way, the above still applies.
              Chief Lemon Buyer no more Linux sucks but not as much
              Weather nut and sad git.

              My Weather Page

