More GeForce FX Details (Benchmarks)!

  • #31
my assumptions are based on architectural information that ATI has not made readily available. NVidia, on the other hand, has made equivalent information available.

    i am forced to rely on other people for this... while I am fairly positive my information is accurate, it cannot be confirmed or denied...

as far as NVidia actually completing one demo or benchmark without BSODing... so what if it crashes two or three times in a demonstration? It was running fine for quite a while at Comdex, and that level of stability is at least on par with what ATI offers... at least if the people posting in Rage3D are to be believed... in addition, they have not shipped the card yet... if it were unstable when they shipped the card, that would be a different story... but....
    "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



    • #32
While I believe that the GeForce FX has more inherent flexibility in terms of pixel shading, the fact is we'll have to wait the better part of two years to see that in action as far as games are concerned...

And even then, that's assuming that developers actually use the extra capabilities, since I believe most will stick to the basic DX9 feature specs in the first place (VS 2.0 and PS 2.0)....


Not to mention that at perhaps $500 for the top-end model, it isn't exactly an easy sell when its direct competitor is currently selling for under $350, and in most cases, at least in the near and medium term, they'll both perform roughly the same....
      note to self...

      Assumption is the mother of all f***ups....

      Primary system :
P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum, and SCSI all the way...



      • #33
What's probably more important than benchmarks is this:
While we couldn’t see it, the fan cooling the heat pipes was very loud – we are talking almost Delta-like volume levels. Possibly, as we get closer to seeing these cards in retail, nVidia may tweak the cooling system to a more tolerable noise level – at least I hope so.

When quizzed by a gamer about the sound levels coming from the back of the card, an nVidia rep was quick to suggest that it wouldn’t matter much because gamers would be using headphones while gaming. Unless the cooling technology has thermal throttling (which it very well may, mind you), I would have to disagree with this notion.

Say you are listening to music or fragging away with desktop speakers: the hum of the cooling fan will still be audible, since we do not all use headphones.
        no matrox, no matroxusers.



        • #34
I don't know if the cooling is really that loud, but c't wrote that the cooler makes two PCI slots unusable because of its size.
          Specs:
          MSI 745 Ultra :: AMD Athlon XP 2000+ :: 1024 MB PC-266 DDR-RAM :: HIS Radeon 9700 (Catalyst 3.1) :: Creative Soundblaster Live! 1024 :: Pioneer DVD-106S :: Western Digital WD800BB :: IBM IC35L040AVVN07
