HL2 Perf. Comments


  • #31
    AquaMark confirms the Half-Life 2 numbers:


    Pretty solid here - the current Nvidia GPU is dead when it comes to DirectX 9.

    Nvidia must have known this for some time - how come they haven't replaced it with a newer model? I mean, it's been fine and all up till now, since not that many applications use DirectX 9 yet. But they should have been ready for the fall releases - no secrets there...



    ~~DukeP~~



    • #32
      Head on over to Beyond3D and read their preview of it. They always provide the best articles.



      • #33
        Now John Carmack agrees with the Half-Life 2 findings...


        Yet another of those "ouch" thingies...


        ~~DukeP~~



        • #34
          Yeah, I was just about to post that!



          • #35
            Here's a question on this whole fiasco with Nvidia... when Microsoft was developing DX9, didn't they use the R8500 as the development card? And if they did, was there any reason in particular? I'm just wondering if ATI might have had the upper hand in developing a DX9 card, since MS did their development work with one of ATI's products and Nvidia didn't have anything solid to work with.
            Why is it called tourist season, if we can't shoot at them?



            • #36
              Here's a question on this whole fiasco with Nvidia... when Microsoft was developing DX9, didn't they use the R8500 as the development card? And if they did, was there any reason in particular? I'm just wondering if ATI might have had the upper hand in developing a DX9 card, since MS did their development work with one of ATI's products and Nvidia didn't have anything solid to work with.
              As I said in another thread, the problem for Nvidia was that they never could have predicted how advanced ATi's R3xx architecture would be. Since the beginning, Nvidia's cards have been poor performers on the DirectX version they were targeting. This was mostly a commercial strategy: an improved/next-generation card would be launched six months later, get rave reviews, and there you go, time to upgrade.
              If the R3xx didn't exist, there would be nothing to compare the NV3x with, and everybody would think how great the FX5800 was. But Nvidia was caught with its pants down, and its whole DX9 line just couldn't compete. You can't change a whole chip architecture design in a couple of months; it usually takes years to complete. So the only thing Nvidia could do was release an insanely overclocked NV30, and then the NV35 just after, way ahead of schedule. ATi, having a much better design, just needed to improve yields/clock speeds and slowly manage the advantage they had.



              • #37
                Yeah, I think it was supposed to be the "introductory" Cg card that would lead the industry down the Cg garden path.

                But it didn't.
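
                (For context: Cg was Nvidia's C-like shading language, compiled and handed to the driver through Nvidia's own runtime library. A minimal sketch, in C, of what loading a Cg fragment program looked like - the "shader.cg" file name and "main" entry point are placeholders, error handling is omitted, and an OpenGL context is assumed to already be current:)

                #include <Cg/cg.h>
                #include <Cg/cgGL.h>

                void load_cg_fragment_program(void)
                {
                    /* One Cg context manages all Cg programs. */
                    CGcontext ctx = cgCreateContext();

                    /* Ask the runtime for the best fragment profile the
                       current GPU/driver pair supports. */
                    CGprofile profile = cgGLGetLatestProfile(CG_GL_FRAGMENT);

                    /* Compile the Cg source file ("shader.cg" is a
                       placeholder) using "main" as the entry point. */
                    CGprogram prog = cgCreateProgramFromFile(
                        ctx, CG_SOURCE, "shader.cg", profile, "main", NULL);

                    /* Hand the compiled code to the driver and enable it. */
                    cgGLLoadProgram(prog);
                    cgGLEnableProfile(profile);
                    cgGLBindProgram(prog);

                    /* ... draw ...; clean up with cgDestroyContext(ctx). */
                }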



                • #38
                  LOL, that reminds me of Cg!

                  Their investment in Cg turned out to be a total waste of money. They should have spent that money on developing better hardware instead!

                  It's just like the AMD64 situation. If the Athlon 64 can't perform well (which I doubt), then AMD is done for, because they probably spent big bucks developing the x86-64 tech.

                  The only good thing nVIDIA has is their drivers' flexibility ***not the image quality aspect*** e.g. nView is more powerful than HydraVision. I use them both, and I prefer nView over HydraVision. Does anybody here actually use it? (I use transparent window dragging)



                  • #39
                    Chrono_Wanderer: Try installing UltraMon and you won't miss nView anymore.

