
Nvidia uncaps another 15% power in latest driver - MATROX WHERE ARE YOU?


  • #46
    Sorry, I left out a couple of words, so to clarify: how come the driver sets that "fixed" the problem brought performance back up to where it was when the cheat was enabled? Driver problems, yes. Performance issues, yes. Snake-Eyes here is the only one who has been able to see past the fact that it was just a cheat and see where it fit into the big picture. The question does remain, though: did they fix their cheating, or did they just hide it better?

    Wombat... did you actually try an 8500 with those driver sets? Were you able to tell the difference without it being pointed out to you in screenshots or in the reviews? And if I understand correctly, it wasn't applying "low texture" settings to the whole picture, just certain spots... it also played with the filtering, IIRC. There is a difference between being able to see the difference in the game and seeing it in screenshots. So... did you see it in the game? (A rough sketch of the kind of application check involved is at the end of this post.)

    ATI's driver design decisions aside (and yes, they do suck; don't think I am sticking up for ATI at all here. They have the most screwed-up driver development on the face of this planet next to Creative), the issue with that cheat is moot now... the performance is back to where it is supposed to be, the quality is back, and we don't have to bitch about it until forever and a day.
    "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



    • #47
      Originally posted by Snake-Eyes
      The thing is, Superfly, the problem wasn't with the drivers.
      I'm sure the hardware was already far along in the design, so changing something like that may have taken too long. That's why the fix in the drivers involved changing DXTC (S3TC) modes instead of simply increasing the number of color bits used for that mode.


      Snake... Regardless of whether the issue was driver or hardware related, the fact is that the use of S3's tech was ill-suited when certain effects were used, and Nvidia's driver team certainly saw the visual quality issues in Q3 with those drivers well in advance, yet still decided to leave the feature enabled knowing full well that it produced a fairly substantial speed boost in Q3... just what they needed to beat the original Radeon's Q3 performance in 32-bit color...


      Let us not forget that once texture compression wasn't used in Q3, the Detonator 3 driver set was no faster than their previous driver sets, at least as far as OpenGL and Q3 go... (the sketch at the end of this post shows the compression-mode trade-off involved).
      note to self...

      Assumption is the mother of all f***ups....

      Primary system:
      P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum, and SCSI all the way...
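
      Since "changing DXTC (S3TC) modes" keeps coming up, here is a minimal OpenGL sketch, not Quake 3 source, of what that swap amounts to. It assumes a GL context exposing GL_EXT_texture_compression_s3tc; the texture data and sizes are placeholders. DXT1 stores its color endpoints as RGB565, so hardware that also interpolates them at 16-bit precision bands badly on smooth gradients like the Q3 sky; requesting DXT3 instead keeps the same color encoding but was reportedly decompressed at higher precision on that generation of hardware, at the cost of twice the texture memory.

      /* Minimal sketch: upload one texture and let the driver compress it.
       * 'pixels', 'w', 'h' are placeholders, not from Quake 3. */
      #include <GL/gl.h>
      #include <GL/glext.h>

      GLuint upload_sky_texture(const unsigned char *pixels, int w, int h)
      {
          GLuint tex;
          glGenTextures(1, &tex);
          glBindTexture(GL_TEXTURE_2D, tex);
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

          /* DXT1: 4 bits/pixel, color endpoints stored as RGB565.  Hardware
           * that also interpolates those endpoints at 16-bit precision bands
           * badly on smooth gradients (the famous Q3 sky). */
          /* glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
           *              w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels); */

          /* DXT3: 8 bits/pixel, same color encoding, but decompressed at
           * higher precision on that hardware -- the mode swap the driver
           * made instead of adding color bits to DXT1. */
          glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT3_EXT,
                       w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);

          return tex;
      }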



      • #48
        On a different note, I find it most annoying that these kinds of whining thread-starters never come back.

        Despite my nickname causing confusion, I am not female ...

        ASRock Fatal1ty X79 Professional
        Intel Core i7-3930K@4.3GHz
        be quiet! Dark Rock Pro 2
        4x 8GB G.Skill TridentX PC3-19200U@CR1
        2x MSI N670GTX PE OC (SLI)
        OCZ Vertex 4 256GB
        4x2TB Seagate Barracuda Green 5900.3 (2x4TB RAID0)
        Super Flower Golden Green Modular 800W
        Nanoxia Deep Silence 1
        LG BH10LS38
        LG DM2752D 27" 3D



        • #49


          Why is it called tourist season, if we can't shoot at them?



          • #50
            But, at the same time, we all yearn for new Parhelia drivers...
            Let us return to the moon, to stay!!!



            • #51
              And people wonder why I decided to buy a LeadTek GF4 Ti4200 64MB VIVO.
              I can't think of one person who gives a rat's ass about what card you choose to use. It's your money, your PC, and your eyes.


              It's cheaper than the Parhelia, and it's faster, and the image quality is acceptable.

              I am also amused at some of the MURCs calling some people nVidiots or ATIfanatics or whatnot; some of the guys here are the biggest Matrox "FANBOYs" there are. Pimp, pimp, pimp is all you guys do with the Matrox stuff, even though Matrox has screwed up big time with the Parhelia.
              The funny thing is I don't see a whole lot of pimping going on around here. You must be on some very potent drugs. Sure there are a few that are happy to share their experiences with their hardware on a MATROX user forum, but I would hardly call that zealotry.


              Matrox has lost my support unless they start to act real.
              BooHoo

              I am no big heavy gamer, but I can spot value or an overpriced product a mile away.
              I am no big psychologist, but I can spot a troll a mile away.


              nVidia builds a lot of free value into their cards: extra features, wonderful drivers, etc., etc.
              Free value? The only things they have going over the Parhelia are raw speed and price. Wonderful drivers? HA! Extra features? Oh, I get it, the 2D issues in their drivers are the extra features; now I begin to follow your thoughts...

              Matrox cuts back on everything and makes you pay for extra stuff, put up with buggy drivers, or just wait.
              Pay for extra stuff? Explain. Buggy drivers? This is a BRAND NEW CARD with a completely new code base. The drivers are coming along very nicely, and have been very timely. As far as making you wait goes, you will be waiting a long time before you get nVidia's extra 2D bugs and half-implemented features to match.


              Matrox should have delayed the Parhelia release until the drivers were mature enough.
              How so?


              Also, I am amused at the statements some of the BBs kept making (and telling me) before the Parhelia was released: that the Parhelia would beat the GF4, and that even the GF5 would be threatened. Where are your big words now?

              Mark
              I am amused that you like to make shit up. Maybe you should promptly pull your head out of your ass and take a breath; the lack of oxygen is killing your logic skills.

              Rags
              Last edited by Rags; 4 September 2002, 23:15.



              • #52
                This is a BRAND NEW CARD with a ground up code base.
                Yeah but it's that nasty gristle I can't stand

                (yeah yeah yeah go ahead and edit
                "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

                "Always do good. It will gratify some and astonish the rest." ~Mark Twain



                • #53
                  Originally posted by Greebe


                  Yeah but it's that nasty gristle I can't stand

                  (yeah yeah yeah go ahead and edit
                  LOL. He did.
                  "..so much for subtlety.."

                  System specs:
                  Gainward Ti4600
                  AMD Athlon XP2100+ (o.c. to 1845MHz)



                  • #54
                    odditory is just a lame troll...

                    "NV uncaps 15%"... yeah, right.

                    How about: NV panics and releases half-assed drivers with advanced FUD features.

                    ATI has handed NV their ass on a platter... A large number of manufacturers have switched to ATI, even Creative, with their own chips in the pipeline.

                    Add to that, Matrox has released their "Rolls-Royce" of video cards: not the fastest, but with excellent quality and features for the discerning buyer.

                    If you don't know why you should buy a Parhelia, then you probably shouldn't have one anyway...

                    odditory: just another lame one-post wonder.

