Nvidia uncaps another 15% power in latest driver - MATROX WHERE ARE YOU?


  • #31
    Another funny thing with these Nvidia drivers is that they seem to tax the card so much that they actually *decrease* its maximum overclock. I was able to run my Creative Ti4400 at 310/630 without any artifacts, and now I get random white spots and texture flashing that go away if I lower the clocks to 300/610. So much for the performance increases.

    I'm not alone in this; I've seen several similar reports, but curiously enough people are reporting them as "driver bugs".

    You really don't need a third-party utility to force aniso in D3D, but you still need to hack the registry to disable V-sync and to get the overclocking tab, plus some magic to force refresh rates over 60 Hz.
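
    For reference, the overclocking tab is just the usual CoolBits tweak. Something like this should set it (key path and value name are from memory, so treat it as a sketch and double-check against a tweak guide; it also needs admin rights):

    ```python
    import winreg

    # Commonly cited CoolBits tweak to unhide the clock-frequency tab in the
    # Detonator control panel. Path, value name and data are from memory,
    # not from any official nVidia documentation; verify before relying on it.
    KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

    with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)
    ```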

    And yes, the only real speed bump is in the 3DMark Nature test. There are some improvements here and there, but nothing fancy, and they broke the usual number of games as well.

    Comment


    • #32
      Bah... anyone with half a brain will realise that no matter what Nvidia decides to do driver-wise, it'll never come close to the performance the 9700 delivers, even with early drivers...



      Even if we take Nvidia's explanation at face value and believe that they've actually improved the pixel and vertex shader parts of the driver, in actual gaming at high resolutions the card will be mostly starved for bandwidth anyhow...



      Just for kicks, did anyone with a GF4 run 3DMark using the latest drivers but at a higher resolution, say 1280*1024 32-bit? I'm betting that the improvement in pixel and vertex shaders within the latest driver set will go pretty much unnoticed by then.
      note to self...

      Assumption is the mother of all f***ups....

      Primary system :
      P4 2.8 ghz,1 gig DDR pc 2700(kingston),Radeon 9700(stock clock),audigy platinum and scsi all the way...

      Comment


      • #33
        That's true, Superfly. But who really cares? I'd sure buy a 9700 right now if I were the type that could get every top card that is released. Since I can't afford that, I, like many other owners of GF4 cards, will be happy just to get some performance boosts wherever they're really needed. That, btw, isn't in most current games; it's next-generation stuff that pushes this card, or ultra-high resolutions.

        Even though nV can't really overcome the bandwidth issues, they can certainly address functionality and algorithmic points to fine-tune the card, helping make newer games faster, smoother, and better looking. Now, I'm not saying this has happened with the 40.xx drivers, but the idea certainly isn't out of the question. (Hey, I'm in a minority because I can usually afford once-a-year upgrades, and most people with GF4s are probably in the same situation, which means they'll take whatever improvements they can get for their current cards until they can actually afford, or even need, the Next Big Thang.)
        "..so much for subtlety.."

        System specs:
        Gainward Ti4600
        AMD Athlon XP2100+ (o.c. to 1845MHz)

        Comment


        • #34
          Don't get me wrong here, driver updates are always a good thing, and if there's a performance boost that'll allow users to keep any given video card that much longer as a viable gaming card, even better...


          It's just the marketing hype and tactics surrounding driver releases that bother me. If the company that actually brings a card to market has known for some time that there was still untapped potential in said card, even though it's been out for quite a while and is built on a core architecture released 18 months ago, and then decides to release said drivers at the same time a competitor's card hits retail, that is, to say the least, a somewhat questionable practice...
          note to self...

          Assumption is the mother of all f***ups....

          Primary system :
          P4 2.8 ghz,1 gig DDR pc 2700(kingston),Radeon 9700(stock clock),audigy platinum and scsi all the way...

          Comment


          • #35
            Hey, if you want to talk about driver releases as PR, you sure as hell can't stand up for *ATI* of all companies. Quack3 anyone?
            Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

            Comment


            • #36
              What Nvidia did around the time the Detonator 3 driver set was released was way worse: it horribly mangled image quality in Quake 3, since that was the first driver set to implement S3's texture compression modes and the GF2 series of cards didn't support all the modes properly...


              Right around the time the original Radeon cards were released, actually...



              The only way around the problem was to use a third-party utility that instructed the driver to use a different compression mode than the one it defaulted to when running Quake 3, which improved the visual quality somewhat at the expense of performance...


              At the time I had a 64 MB GF2 GTS, and I never bothered with texture compression in OpenGL; it was mostly useful for the 32 MB version in any case...
              note to self...

              Assumption is the mother of all f***ups....

              Primary system :
              P4 2.8 ghz,1 gig DDR pc 2700(kingston),Radeon 9700(stock clock),audigy platinum and scsi all the way...

              Comment


              • #37
                LOL. Superfly, the GF2/3 arguably implemented S3TC the right way. Look at the acronym: S3TC. Now, if you have one handy, go put an S3 Savage 2000 (I think that was the first card supporting it) in your system. Run Q3. Look at the textures. The problem with nVidia cards seems to have been that they actually used the original S3 algorithms and color depths for their implementation in GF cards.

                Not that this looks good or anything, because it looks like a__ (fill in the blanks). But it wasn't really wrong. The other cards that seemed to do it 'right' were actually using workarounds of one sort or another, because Q3 used certain textures of a known size in a compressed format (especially alpha textures, which were guaranteed to end up blended) when it really shouldn't have, at least not in that particular mode, since it should have been known that it would cause such issues. The reason nVidia has taken such a beating for it is basically that the S3 cards went *KERPLOOEY* before anyone really had a chance to see them running Quake3, while the rest of the competition managed to work around the issue without any real heartache. So among the then-current hardware, nV was the only company with chips producing such nasty-looking output.
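
                If it helps to see where the color-depth argument actually lives, here's a rough illustrative decode of a single DXT1 (S3TC) color block in Python. It's not anyone's driver code, just the textbook scheme: two RGB565 endpoints plus two interpolated colors, and the open question is at what precision a given chip does that interpolation and keeps the result.

                ```python
                def rgb565_to_rgb888(c):
                    """Expand a 16-bit RGB565 endpoint to 8 bits per channel."""
                    r = (c >> 11) & 0x1F
                    g = (c >> 5) & 0x3F
                    b = c & 0x1F
                    return (r << 3 | r >> 2, g << 2 | g >> 4, b << 3 | b >> 2)

                def dxt1_palette(c0, c1):
                    """Four-color palette for one 4x4 block (opaque mode, c0 > c1)."""
                    p0, p1 = rgb565_to_rgb888(c0), rgb565_to_rgb888(c1)
                    p2 = tuple((2 * a + b) // 3 for a, b in zip(p0, p1))  # 2/3 c0 + 1/3 c1
                    p3 = tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))  # 1/3 c0 + 2/3 c1
                    return [p0, p1, p2, p3]

                # Doing the interpolation (and keeping the output) in 16 bits instead of
                # expanding to 8 bits per channel first quantizes p2/p3 much more coarsely,
                # which is exactly the kind of banding people saw in the Q3 sky.
                print(dxt1_palette(0xFFFF, 0x0000))
                ```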

                Hope I didn't burst your bubble or anything, and I really would have liked to see nVidia use more bits for all S3TC modes. But I can't say that they screwed anything up, or even that this particular issue is cheating, unlike the Quake3/Quack3 issue with ATI. All the people saying that wasn't cheating need to think about it a bit longer. (ATI didn't have their drivers as optimized as they'd have liked at the point the card was released, and this was an easy way to hit the targeted numbers with the 8500s until they could get the drivers up to snuff. To give them credit, they had viable drivers showing similar numbers without the 'optimizations' available soon after the story blew up in their faces, but it was still cheating, imo.)

                BTW, what does the Parhelia do with the S3TC textures in Quake3, just out of curiosity? I'm wondering whether Matrox took the bit depth / alpha textures into account for compressed textures, or if it even matters, given the higher internal rendering bit depth usually used with it. (Hint: screenshots would be nice, since I don't think I've seen anyone specifically state that their Parhelia shots used texture compression; though if I'm wrong, links to them would be fine.)
                "..so much for subtlety.."

                System specs:
                Gainward Ti4600
                AMD Athlon XP2100+ (o.c. to 1845MHz)

                Comment


                • #38
                  Thanks for the info, Snake-Eyes... I wasn't really aware that there was something inherently wrong with the way S3 implemented the tech itself...



                  Though that still doesn't change my view: even if the GF2 cards did indeed implement the tech according to S3's specifications, the staff in charge of driver development at Nvidia must have caught the problem well before the drivers were actually available for download, at least as far as Quake 3 goes, yet they still released the drivers knowing full well they produced the visual quality problems they did (to put it lightly), regardless of any and all speed increases they provided...



                  And the curious part is that this happened right around the time the original Radeon was released, which, prior to the release of the Detonator 3 drivers, was actually slightly faster than the GF2 GTS in 32-bit color, at least as far as Quake 3 goes...
                  note to self...

                  Assumption is the mother of all f***ups....

                  Primary system :
                  P4 2.8 ghz,1 gig DDR pc 2700(kingston),Radeon 9700(stock clock),audigy platinum and scsi all the way...

                  Comment


                  • #39
                    The thing is, Superfly, the problem wasn't with the drivers.
                    I'm sure the hardware was already far along in its design, so changing something like that may have taken too long. That's why the fix in the drivers involved changing DXTC (S3TC) modes instead of simply increasing the number of color bits used for that mode.
                    "..so much for subtlety.."

                    System specs:
                    Gainward Ti4600
                    AMD Athlon XP2100+ (o.c. to 1845MHz)

                    Comment


                    • #40
                      NVidia has done things similar to the "Quack" cheat (oh, it WAS a cheat, and nobody can tell me it wasn't done deliberately by ATI) many times, e.g. by lowering the default LOD settings in the drivers - thus getting higher fps, but worse texture quality. This is apparently also done in the newest "25% faster" drivers. Then there's Quincunx antialiasing, which looks like crap and only blurs everything, but it's fast, so NVidia can say: hey look, we've sped up AA. Another NVidia "cheat" can be seen in the 3DMark title-screen issue.
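
                      To make it obvious what the LOD trick buys them, here's a rough illustration (plain Python, made-up numbers, not how any driver is actually written) of how biasing mipmap selection toward lower-resolution levels trades texture quality for bandwidth:

                      ```python
                      import math

                      def mip_level(texels_per_pixel, lod_bias=0.0, max_level=10):
                          """Pick a mipmap level from the screen-space texel footprint plus a bias."""
                          level = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
                          return min(max(level, 0.0), max_level)

                      print(mip_level(4.0))                # default: level 2.0
                      print(mip_level(4.0, lod_bias=1.0))  # biased: level 3.0, a half-resolution mip
                      # The biased case samples smaller textures: less memory traffic (faster),
                      # but blurrier output; exactly the fps-for-quality trade described above.
                      ```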

                      Most companies have one or another "special driver setting" in their drivers that could easily be called "cheating": think of disabling 32-bit Z-buffers, or varying LOD settings and internal texture rendering accuracy.

                      ATI's non-traditional way of anisotropic filtering is kind of a cheat as well, but it's one that is actually good for most users, because the disadvantages (while demonstrable in specially set-up scenes) are not really apparent in actual gaming. So whether you call something a clever technique to speed things up or a cheat sometimes depends more on your point of view.
                      Last edited by Indiana; 3 September 2002, 09:18.
                      But we named the *dog* Indiana...
                      My System
                      2nd System (not for Windows lovers )
                      German ATI-forum

                      Comment


                      • #41
                        Indiana:
                        As far as the aniso setting being a cheat with these new drivers..
                        Nope, and nope. Simply changing the slider setting under the D3D properties in the driver control panel restores the default (read: not tweaked with RivaTuner or some other tool) aniso / filtering under Direct3D. This also results in zero performance difference from the mistaken 0 setting, so I clearly classify it as a bug. (I might call it a cheat if my framerates differed between the poorer-quality 0 mode and the [for previous drivers] default quality 1 mode, but since they don't..)

                        I'd agree that there are 'cheats' in use everywhere by practically every video card chipset manufacturer's drivers. However, whether I call something a cheat or not is based more on a few different things:
                        1) Does it result in a different level of output quality?
                        2) Does it result in a performance differential between the 'default / alleged cheat' settings and the proper settings?
                        and
                        3) Can the user override the settings to get the right quality / performance without registry tweaks? (In some cases even that's not possible.)


                        As far as the 3DMark2K1SE title screens go, you have to look at it from a programmer's point of view (I am one, so I do). Clearing the buffers during a null-processing stage (i.e. while NOT rendering a scene) is just good common sense. Otherwise those buffers, which could be full of unused crud (as far as any upcoming scene is concerned), would artificially pull scores down. If used properly, the individual tests in 3DMark2K1SE can be evaluated independently on a given person's machine, in order to gauge their hardware's strengths and weaknesses in different areas of 3D rendering and performance. I hardly think that having the results of, say, Game 1 High Quality degraded because the video buffers are still full of crap from the Game 1 Low Quality pass gives me the proper information.

                        It'd be interesting to see if the original 'cheat' drivers return similar results for each separate test as running only that test with the newer drivers (meaning: instead of running the full benchmark with the new drivers, run one test, log the results, compare to the results of the earlier drivers, then run all the tests and compare again). If all nVidia was doing was clearing buffers during the title screens in the old drivers, the results should be similar, since running only one test shouldn't have clogged the buffers to begin with. If, however, they were doing something else that IS sneaky, I'd expect the results to differ, since the original drivers were messing with something that the new drivers don't. Er, hope what I'm saying makes sense? (BTW, I'm a bit tired of driver swapping lately, as I tested the 40.41s, but if anyone else is still playing around, would you care to test this theory out?)
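
                        Something along these lines is what I mean; the test names and numbers are just placeholders to show the comparison, not real results:

                        ```python
                        # Hypothetical comparison: each 3DMark test run in isolation and as part of
                        # the full suite, on the old and new driver sets. All figures are made up.
                        old_full   = {"Game 1 High": 95.2, "Nature": 30.1}  # old drivers, full run
                        old_single = {"Game 1 High": 96.0, "Nature": 30.4}  # old drivers, single-test runs
                        new_full   = {"Game 1 High": 95.8, "Nature": 35.5}  # new drivers, full run

                        for test in old_full:
                            isolation_gain = old_single[test] - old_full[test]
                            driver_gain = new_full[test] - old_full[test]
                            print(f"{test}: isolation gain {isolation_gain:+.1f}, driver gain {driver_gain:+.1f}")

                        # If the old drivers were only clearing buffers between tests, the isolated
                        # old-driver numbers should already sit close to the new-driver numbers;
                        # a big leftover gap would point to something else going on.
                        ```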
                        "..so much for subtlety.."

                        System specs:
                        Gainward Ti4600
                        AMD Athlon XP2100+ (o.c. to 1845MHz)

                        Comment


                        • #42
                          Just out of curiosity, has Matrox ever cheated in benchmarks?

                          I've always had the impression that they didn't cheat, because they haven't focused that much on the benchmark race, but more on image quality.
                          This sig is a shameless attempt to make my post look bigger.

                          Comment


                          • #43
                            If ATI's aniso filtering method is a cheat, then so is Matrox's FAA.

                            Indiana: about the quack cheat... could you tell a difference in the Quake3 picture quality when it was doing the "cheating" and when it wasn't? and why was it that after removing this "cheat", benchmark scores stayed the same?
                            "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

                            Comment


                            • #44
                              Originally posted by DGhost
                              Indiana: about the quack cheat... could you tell a difference in the Quake3 picture quality when it was doing the "cheating" and when it wasn't? and why was it that after removing this "cheat", benchmark scores stayed the same?
                              Oh, hell yes. Didn't you read the reviews and see the screenshots? It was about the same as running Q3 on "Low Quality Textures" even though High was set. And the benchmarks didn't stay the same. If you ran the quack-ifier on Q3, you got the original image quality back, and the FPS plummeted to their previous levels.
                              Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

                              Comment


                              • #45
                                Originally posted by DGhost
                                if ATI's Ansio filtering method is a cheat, then so is Matrox's FAA.

                                Indiana: about the quack cheat... could you tell a difference in the Quake3 picture quality when it was doing the "cheating" and when it wasn't? and why was it that after removing this "cheat", benchmark scores stayed the same?
                                The benchmark scores with those drivers didn't stay the same. That's why it was a cheat. The drivers that removed the Quake3 detection (coming not long after this blew up in ATI's face) fixed the performance issues though. That's why my earlier post mentioned that the drivers weren't yet optimized well enough at the release to meet ATI's goals. They cheated to cover the interval until they had them optimized properly.
                                "..so much for subtlety.."

                                System specs:
                                Gainward Ti4600
                                AMD Athlon XP2100+ (o.c. to 1845MHz)

                                Comment
