HL2 Perf. Comments

  • #16
    Nvidia is already out doing PR bull.

    While Valve states: Do NOT use beta drivers for testing (and has had to code specifically for Nvidia just to get the game running on their hardware), Nvidia states this:

    "The Optimiziations for Half-Life 2 shaders are in the 50 series of drivers which we made available to reviewers on Monday [Sept. 8, 2003]. Any Half-Life 2 comparison based on the 45 series driver are invalid. NVIDIA 50 series of drivers will be available well before the release of Half-Life 2".

    So basically we have a race: does Valve get HL2 out the door before Nvidia gets its "optimised" Det 50 out?
    The fun part is that only Nvidia is in this race; Valve just releases "when they're ready".

    Oh well. More bull.

    ~~DukeP~~



    • #17
      Ouch. Now THERE'S cheating in drivers:
      (From TechReport)
      "As you can tell from looking at the list in the slide above, Newell was concerned particularly with some of the techniques NVIDIA has used in recent driver releases, although he didn't exempt other graphics hardware makers from his complaints. He said they had seen cases where fog was completely removed from a level in one of Valve's games, by the graphics driver software, in order to improve performance. I asked him to clarify which game, and he said it was Half-Life 2. Apparently, this activity has gone on while the game is still in development. He also mentioned that he's seen drivers detect screen capture attempt and output higher quality data than what's actually shown in-game. "


      Oh man. Optimising for recognising screen grabbers...
      How low can one go?
      The GFX PR Limbo.
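
      A minimal sketch to make the fog claim above concrete: in Direct3D's fixed-function pipeline, fog is enabled through a single render state that the game sets, so anything sitting between the game and the hardware can quietly drop that request and skip the fog math. The types below are made up purely for illustration; this is not the real IDirect3DDevice interface and not anyone's actual driver code.

      #include <cstdio>

      // Made-up stand-ins, not the real Direct3D API.
      enum RenderState { RS_FOGENABLE, RS_ZENABLE };

      struct Device {
          virtual void SetRenderState(RenderState state, unsigned value) = 0;
          virtual ~Device() = default;
      };

      // What an honest driver does: apply whatever the game asks for.
      struct HonestDevice : Device {
          void SetRenderState(RenderState state, unsigned value) override {
              std::printf("driver: render state %d set to %u\n", state, value);
          }
      };

      // The "optimization": forward everything except the fog toggle.
      struct FogStrippingShim : Device {
          Device& inner;
          explicit FogStrippingShim(Device& d) : inner(d) {}
          void SetRenderState(RenderState state, unsigned value) override {
              if (state == RS_FOGENABLE && value != 0)
                  return;                            // pretend the game never asked for fog
              inner.SetRenderState(state, value);
          }
      };

      int main() {
          HonestDevice real;
          FogStrippingShim shim(real);
          Device& game_view = shim;                  // the game only sees the Device interface
          game_view.SetRenderState(RS_FOGENABLE, 1); // silently dropped
          game_view.SetRenderState(RS_ZENABLE, 1);   // passed through unchanged
      }

      The screen-grab detection Newell mentioned would be the same idea in reverse: notice the capture and temporarily render at full quality.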

      ~~DukeP~~



      • #18
        nVidia screwed up big time, again. The miracle Det50s will probably be the biggest cheat pack ever, and the performance will probably still be way below ATI's.
        no matrox, no matroxusers.



        • #19
          Now, I have nothing against Nvidia; I even respect them for choosing a different architectural path than ATi (someone has to).
          But I am sooooo damn happy to have bought an ATi card.
          HL2 will be great (and Doom ]I[, of course).
          PC-1 Fractal Design Arc Mini R2, 3800X, Asus B450M-PRO mATX, 2x8GB B-die@3800C16, AMD Vega64, Seasonic 850W Gold, Black Ice Nemesis/Laing DDC/EKWB 240 Loop (VRM>CPU>GPU), Noctua Fans.
          Nas : i3/itx/2x4GB/8x4TB BTRFS/Raid6 (7 + Hotspare) Xpenology
          +++ : FSP Nano 800VA (Pi's+switch) + 1600VA (PC-1+Nas)



          • #20
            Originally posted by GT98
            The thing I wonder about is that Valve stated you could run it on a GeForce3 with a 1GHz machine... from these charts it looks like it puts a hurting on top-end equipment at 1024x768.
            I think that's because of all the eye candy turned on. Valve also stated that it'll be playable on an 800MHz machine with a TNT2-class card, with all the fancy graphics turned off (and that Source scales down very well... and up, of course). So while I'll get "only" the physics and gameplay of HL2 on my G400, I should be able to enjoy it...



            • #21
              Not my kind of game anyway
              D3/\/7YCR4CK3R
              Ryzen: Asrock B450M Pro4, Ryzen 5 2600, 16GB G-Skill Ripjaws V Series DDR4 PC4-25600 RAM, 1TB Seagate SATA HD, 256GB myDigital PCIEx4 M.2 SSD, Samsung LI24T350FHNXZA 24" HDMI LED monitor, Klipsch Promedia 4.2 400, Win11
              Home: M1 Mac Mini 8GB 256GB
              Surgery: HP Stream 200-010 Mini Desktop,Intel Celeron 2957U Processor, 6 GB RAM, ADATA 128 GB SSD, Win 10 home ver 22H2
              Frontdesk: Beelink T4 8GB



              • #22
                Yea, NV3x itself is flawed. They should "abandon" the NV30 design; it's just inefficient... even a 5600 gets beaten by a GF4 Ti, and that's just sad. And IMO the NV35 is just a heater, maxing out the chip itself...



                • #23
                  These guys took a shot at studying the patents to figure out how NV30 works: http://www.3dcenter.org/artikel/cinefx/index_e.php
                  Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                  • #24
                    I have to agree that my 300€ Radeon 9700 was a nice investment...



                    • #25
                      I don't regret getting the 9700Pro when the P didn't turn out the way I expected it to. It's the best investment I have ever made in a graphics card.



                      • #26
                        Now that the game isn't hampered by the restrictive authentication policy anymore, I'm starting to get concerned about performance with the Parhelia. (Though if it keeps up the way it did in previous games, and considering the 4600 is faster than the 5900 Ultra in some benches, we may see the Parhelia beating NV's current flagship in AA benchmarks!)



                        • #27
                          nVidia responds:

                          Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATI's Shader Day.

                          During the entire development of Half-Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

                          We're confused as to why Valve chose to use Release 45 (Rel. 45), because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

                          Regarding the Half-Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half-Life 2 and other new games are included in our Rel. 50 drivers, of which reviewers currently have a beta version. Rel. 50 is the best driver we've ever built. It includes significant optimizations for the highly programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

                          Pending detailed information from Valve, we are unaware of any issues with Rel. 50 and the drop of Half-Life 2 that we have. It is not a cheat or an over-optimization. Our current drop of Half-Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

                          The optimal code path for ATI and NVIDIA GPUs is different, so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

                          In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of, the first wave of shipping DirectX 9 titles, such as Half-Life 2.

                          We are committed to working with Gabe to fully understand his concerns.
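
                          The fp16/fp32 point in the statement above comes down to mantissa size: a half float keeps 10 mantissa bits against fp32's 23, so its relative error is roughly 1/1,000 rather than 1/10,000,000. For colour math that ends up in an 8-bit framebuffer that loss is invisible, which is why partial precision can be "free"; for large intermediate values it is not. The rough, self-contained sketch below only simulates the shorter mantissa (ignoring fp16's smaller exponent range, denormals and rounding mode) and is not taken from Valve's or NVIDIA's code.

                          #include <cstdio>
                          #include <cstring>
                          #include <cstdint>

                          // Simulate demoting an fp32 value to fp16 by truncating the mantissa
                          // from 23 bits to 10 bits; this is only meant to show the precision loss.
                          float to_half_precision(float x) {
                              uint32_t bits;
                              std::memcpy(&bits, &x, sizeof bits);   // bit-level copy, no aliasing issues
                              bits &= 0xFFFFE000u;                   // keep sign, exponent, top 10 mantissa bits
                              std::memcpy(&x, &bits, sizeof bits);
                              return x;
                          }

                          int main() {
                              // A colour channel in [0,1]: the error stays far below one 8-bit step
                              // (1/255), so the final image looks identical.
                              float c = 0.7312345f;
                              std::printf("color %.7f -> %.7f (error %.1e)\n",
                                          c, to_half_precision(c), c - to_half_precision(c));

                              // A big intermediate value (say, a world-space distance): the same
                              // relative error is now a visible absolute error, which is why
                              // blanket fp16 can hurt image quality.
                              float d = 5123.77f;
                              std::printf("dist  %.2f -> %.2f (error %.2f)\n",
                                          d, to_half_precision(d), d - to_half_precision(d));
                          }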


                          Jammrock
                          “Inside every sane person there’s a madman struggling to get out”
                          –The Light Fantastic, Terry Pratchett



                          • #28
                            Lol.
                            The job of writing press releases for Nvidia must, by now, be comparable to the job the Iraqi Information Minister had.


                            There are NO cheats in our drivers! Our strength is, eh, strong!

                            The perceived flaws must be a misunderstanding!

                            The (tech) journalists all lie!

                            Our codepaths are superior to Microsoft's!

                            DirectX 9 is ATI's; ours is better!

                            They will all burn once our new drivers come out!!




                            ~~DukeP~~



                            • #29
                              The Welsh support two teams when it comes to rugby: Wales, of course, and anyone else playing England.



                              • #30
                                Damn, so Nvidia is going to have to optimize its drivers for every DX9 title that comes out? What a bunch of BS. Nice to see them seriously stepping on their dicks as of late... just hope it translates into more sales for ATI.
                                Why is it called tourist season, if we can't shoot at them?

