Doom3 benchmarks
  • #46
    I remember us all talking about the Parhelia before it arrived and wondering if we should purchase it to play Doom 3. It was obvious it was too early to buy a Doom 3 card then and that was for a predicted summer '03 release of the game. Sure glad I didn't bother.
    The world just changed, Sep. 11, 2001



    • #47
      Originally posted by Helevitia
      The leaked beta ran on it. Apparently it was horribly slow.
      yes, it ran worse than any other card out there. but that was also before Carmack decided to pull legacy render paths except for the NV10 one.

      Doom has dropped support for vendor-specific vertex programs
      (NV_vertex_program and EXT_vertex_shader), in favor of using
      ARB_vertex_program for all rendering paths. This has been a pleasant thing to
      do, and both ATI and Nvidia supported the move. The standardization process
      for ARB_vertex_program was pretty drawn out and arduous, but in the end, it is
      a just-plain-better API than either of the vendor specific ones that it
      replaced. I fretted for a while over whether I should leave in support for
      the older APIs for broader driver compatibility, but the final decision was
      that we are going to require a modern driver for the game to run in the
      advanced modes. Older drivers can still fall back to either the ARB or NV10
      paths.

      The newly-ratified ARB_vertex_buffer_object extension will probably let me do
      the same thing for NV_vertex_array_range and ATI_vertex_array_object.
      this basically means that since the Parhelia cannot support the ARB extensions and is unlikely to support the extensions required to run the NV10 render path, it probably will not work. I am not too positive about the NV10 path and what extensions it uses, but odds are that it will not run.
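      The extension checks being discussed can be sketched in C. This is a hedged illustration, not id's actual code: the helper shows how an engine would test a space-separated `GL_EXTENSIONS` string for an exact extension name (naive substring matching would wrongly match one extension name inside a longer one), and the extension list in `main` is a made-up sample; a real program would obtain the string from `glGetString(GL_EXTENSIONS)` with a live GL context.

      ```c
      #include <stdio.h>
      #include <string.h>

      /* Return 1 if `name` appears as a whole token in the space-separated
       * extension string `exts`, 0 otherwise.  Plain strstr() alone would
       * match "GL_ARB_vertex_program" inside a longer extension name. */
      static int has_extension(const char *exts, const char *name)
      {
          size_t len = strlen(name);
          const char *p = exts;
          while ((p = strstr(p, name)) != NULL) {
              int starts = (p == exts) || (p[-1] == ' ');
              int ends   = (p[len] == '\0') || (p[len] == ' ');
              if (starts && ends)
                  return 1;
              p += len;
          }
          return 0;
      }

      int main(void)
      {
          /* Hypothetical extension string; a real engine would query
           * glGetString(GL_EXTENSIONS) after creating a GL context. */
          const char *exts = "GL_ARB_multitexture GL_EXT_vertex_shader "
                             "GL_ARB_texture_env_combine";
          printf("ARB_vertex_program: %s\n",
                 has_extension(exts, "GL_ARB_vertex_program") ? "yes" : "no");
          printf("EXT_vertex_shader:  %s\n",
                 has_extension(exts, "GL_EXT_vertex_shader") ? "yes" : "no");
          return 0;
      }
      ```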

      no, it will not run on par with an 8500. i know the capabilities of the OpenGL fragment shader extensions, and unless they fixed the rather serious design problems that were in it, they do not perform as well as the 8500 counterparts.
      "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



      • #48
        Why can't it use the ARB path? He (as in John Carmack, in the email I received in April 2004 and have quoted a few times) said it did, even though it was slow.

        edited for clarity (a day later).
        Last edited by bsdgeek; 1 August 2004, 10:17.



        • #49
          the leaked E3 2002 demo was before he killed legacy support. IIRC, the Parhelia only has EXT_vertex_shader support (implemented per ATI's extension specs) and does not have the extensions required for the NV10 paths.



          • #50
            But I'm not talking about the NV10 path, I'm talking about the ARB path. In the quote you used above, he said that older drivers can fall back to either ARB or NV10.



            • #51
              agreed, I am not too positive about what he was referring to (perhaps the preferred path for vertex shaders would be to use some of the newer extensions instead)... I was, however, able to dig up a few threads from back in the day here on MURC....

              the first is about this very same .plan update...

              the second is a list of what extensions are supported by the Parhelia. checking out the list, you can see that the ARB extensions it does support deal only with texture management, "light points", and matrix math.

              so... it cannot "fall back" to either the ARB or NV10 path, as it (at present) does not support them...
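              The fallback logic above can be sketched as a tiny path selector. Everything here is illustrative: `GL_NV_register_combiners` stands in for whatever the NV10 path actually requires (an assumption, not a confirmed list of Doom 3's checks), and the sample Parhelia-style list contains only texture, point-parameter, and matrix extensions, per the thread's description.

              ```c
              #include <stdio.h>
              #include <string.h>

              /* Exact-token lookup in a space-separated extension string. */
              static int has_extension(const char *exts, const char *name)
              {
                  size_t len = strlen(name);
                  const char *p = exts;
                  while ((p = strstr(p, name)) != NULL) {
                      int starts = (p == exts) || (p[-1] == ' ');
                      int ends   = (p[len] == '\0') || (p[len] == ' ');
                      if (starts && ends)
                          return 1;
                      p += len;
                  }
                  return 0;
              }

              /* Pick a render path the way the thread describes it: prefer
               * the ARB path, fall back to NV10, else the card is out of
               * luck.  The NV10 requirement shown (register combiners) is a
               * stand-in, not Doom 3's actual check. */
              static const char *select_path(const char *exts)
              {
                  if (has_extension(exts, "GL_ARB_vertex_program"))
                      return "ARB";
                  if (has_extension(exts, "GL_NV_register_combiners"))
                      return "NV10";
                  return "none";
              }

              int main(void)
              {
                  /* Made-up Parhelia-style list: texture, point-parameter,
                   * and matrix extensions only. */
                  const char *parhelia =
                      "GL_ARB_texture_compression GL_ARB_point_parameters "
                      "GL_ARB_transpose_matrix";
                  printf("Parhelia path: %s\n", select_path(parhelia));
                  return 0;
              }
              ```

              With only those ARB extensions present, the selector falls through both checks, which is exactly the "cannot fall back" conclusion above.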



              • #52
                Time to buy my X800!



                • #53
                  Originally posted by Elie
                  Time to buy my X800!
                  Just don't bother with the 12-pipeline Pro.

                  I should be picking up my D3 early next week or even Monday (preordered from EB). I have the BFG 6800 GT OC with a 2.8 GHz P4 (overclocked to 3.5 GHz, I believe). I should get a really good idea of what this game is capable of doing. The latest investigations suggest that ATI may always have trouble with their Hierarchical-Z implementation in games like D3. The NVIDIA 6800 GPU is reportedly a much smoother renderer than the X800 as well because of its Hierarchical-Z implementation. I've yet to do much 3D with my 6800, but I'm quite interested in comparing the other features of these cards, such as their HD and video processing support. Of course, you only hear about D3 benches right now.



                  • #54
                    I got an X800 VIVO (12-pipeline) Pro.

                    that's bad...

                    But I softmodded it to a 16-pipeline XT PE.

                    that's good!

                    But it runs a little hot.

                    that's bad...

                    but I'm getting a nice cooler for it on the 10th

                    that's good

                    but I don't have Doom 3 yet...

                    ...

                    ...

                    ...

                    that's bad.

                    Oh...

                    (The Simpsons)
                    |CPU|Intel P4 2.8GHz(800MHz) @ 3.4GHz(980MHz)|CPU Cooling| CoolerMaster Hyper 6 CPU Heatsink & Fan, Arctic Silver 5 |Mobo|MSI 865PE Neo2-FIS2R (Bios 2.4)|RAM|2 x 256MB Geil PC3200 RAM, Dual Channel, 5:4, 2-3-3-5|HDD|2 x WD 36G 10000RPM SATA (8MB Buffer), RAID 0|Video Card|256MB Sapphire x800pro VIVO 475/450@540/520, 16 pipelines enabled, 3D Connect bios|Video Card Drivers|Omega v2.5.90|CD-RW|LiteOn CD-RW|LCD|17" ViewSonic VP171s|PSU|Enhance 400W|K&M|Logitech MX Duo|OS|Windows XP, SP2, DirectX 9c
