Nvidia uncaps another 15% power in latest driver - MATROX WHERE ARE YOU?
You do know that the image quality is lower in the latest Dets, right? The folks over on the IGN PC Hardware board posted some pics and it is VERY noticeable. Not only that, but in-game scores have actually decreased. I'm glad that Matrox doesn't eke out 15% performance by degrading the image... THAT, my friend, is what I paid 400 dollars for.
My "Baby": Shuttle SS51G, P4@2.26 GHz 533 FSB, 80 GB Western Digital Caviar "Special Edition" Hard Drive 7,200 RPM w/8 MB Cache, 512 MB Corsair PC2700 with Heat Spreaders, Pioneer DVD Drive (w/sexy slot load), and of course a Matrox Parhelia Retail Vid Card
The funny thing is, game scores do not increase with this decrease in visual quality, so it's most likely a glitch - after all, these are "beta" drivers.
Definitely because they're betas. Also, it IS possible to fix the IQ problems (they mainly have to do with the screwy anisotropic setting controls for D3D) without losing any performance. In UT, I noticed that the one map I've ever played that seemed to significantly decrease my fps (as in dropping into the 30s on a 2250+ [aka an overclocked 2100+] with a Ti4600) no longer had the massive drops. However, the return of a problem I'd seen with an older driver revision on this card offsets the usefulness of this improvement: with the new drivers I got a weird, persistent stutter in the game that wasn't based on a drop in framerate.
For now I've returned to the last official release, which is butter smooth except for one area in that one UT map. Hopefully nVidia will address the stuttering issues as well as the screwy aniso settings control, among other things, before the final release of the 40.xx drivers. The formal inclusion (finally) of aniso settings under D3D is something most nV users are glad to see, as is the addition of a tab that allows refresh rates to be set per resolution for D3D titles (meaning no more need for refresh-fix tools, at least for D3D games). The latter is especially important when you consider it's looking more and more likely that MS isn't going to address the XP refresh-rate issues in SP1 (much as they didn't in W2K SP3).
"..so much for subtlety.."
System specs:
Gainward Ti4600
AMD Athlon XP2100+ (o.c. to 1845MHz)
And people wonder why I decided to buy a LeadTek GF4Ti4200 64MB VIVO.
It's cheaper than the Parhelia, it's faster, and the image quality is acceptable.
I am also amused at some of the MURCers calling people nVidiots or ATIfanatics or whatnot; some of the guys here are the biggest Matrox fanboys there are. Pimp, pimp, pimp is all you guys do with the Matrox stuff, even though Matrox has screwed up big time with the Parhelia.
Matrox has lost my support unless they start to get real.
I am no heavy gamer, but I can spot value, or an overpriced product, a mile away.
nVidia builds a lot of free value into their cards: extra features, wonderful drivers, etc.
Matrox cuts back on everything and makes you pay for extra stuff, put up with buggy drivers, or just wait.
Matrox should have delayed the Parhelia release until the drivers were mature enough.
I am also amused at the statements some of the BBs kept making (and telling me) that the Parhelia would beat the GF4 (before the Parhelia was released), and that even the GF5 would be threatened. Where are your big words now?
...one man's potion, another man's poison. Really like my Parhelia so far. Read all I could find on it before buying it, positive and negative. Years of using Matrox builds a certain trust and experience, especially after trying nVidia and ATI here and there.
It's more a personal question, but the proof of the pudding is in the eating.
Originally posted by Geek And people wonder why I decided to buy a LeadTek GF4Ti4200 64MB VIVO.
Its cheaper than the Parhelia, and its faster, and the image quality is acceptable.
The key word here is "acceptable." If you truly find the GeForce4 Ti4200 to have "acceptable" IQ, then good luck, and well done on your purchase!
I don't.
There really isn't any more to say. I purchased a Parhelia to replace a GeForce4 Ti4200 that OC'ed to 300/650, i.e. GeForce4 Ti4600 clock/memory settings! Guess which card is sitting in my drawer?
Life is a ride
Like days on a train
Cities rush by
Like ghosts in the night
Originally posted by Geek Well, on my 15" CRT I won't notice any improvement.
ROFL. That's as if he'd said: why buy a hi-fi receiver, when I don't notice any difference with my 80W $9.99 computer speakers?
Also, for anyone who uses a DVI flat panel, all this VGA IQ stuff that people waffle on about is a moot point.
Again: ROFL
What about: "Also, for anyone who uses a flat panel, all this frames-per-second stuff that people waffle on about is a moot point."
Since a flat panel, with its 25 ms response time at best, isn't really able to display higher fps anyway...
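As a rough sanity check of that claim (a sketch only; real panel behavior is more complicated, since quoted response times usually measure a single gray-to-gray transition rather than a full refresh):

```python
# Rough upper bound on the frames per second a panel can fully resolve,
# under the simplifying assumption that each new frame requires one
# complete pixel transition of the quoted response time.
def max_resolvable_fps(response_time_ms: float) -> float:
    return 1000.0 / response_time_ms

print(max_resolvable_fps(25))  # 40.0 -> a 25 ms panel tops out around 40 fps
```

So by this back-of-the-envelope figure, anything much above ~40 fps would smear together on a 25 ms panel of that era, which is the poster's point.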
P.S.: I do agree that the Parhelia did not deliver the expected performance, but it IS a giant speed-up from Matrox's next-fastest card (the G400 MAX) for those who want a Matrox because of its superior 2D.
And chiming in on a thread like this with the 25% speed-improvement lie, when in fact only the Nature test in 3DMark gains, with unchanged or even decreased speed in real games combined with a major drop in image quality, can only be called fanboyish behaviour. Just look around the net where those drivers are discussed.
Those drivers are even back to using point-sample filtering; that's pre-Voodoo1 territory. No, thank you, I don't need such drivers.
Actually, Indiana, if it weren't for the stuttering in UT, I'd probably have stuck with the 40.41 drivers. The other stuff is mainly driver bugs. IQ: make sure to change the D3D aniso setting to 1x or higher, and that's a moot point. Performance: they're equal or better in everything I've tried.
Basically, even if these drivers don't improve performance or IQ, I'm eager for the final release (with bug fixes taken care of), since as long as they're even with the previous officially released drivers (performance and IQ), they add the abilities I've wanted in my drivers so far: adjustable aniso in D3D plus refresh-rate overrides in the drivers themselves. The other stuff will be important in the future but doesn't matter so much right now (the improvement in 3DMark2001 SE is in the DX8.1 shader utilization, i.e. the Nature demo, and also shows in other tests that make heavy use of shaders, such as Aquanox).
Anyhow, to each his own. The Parhelia is an excellent piece of hardware for some uses, though not an outstanding gaming card (especially if that's all you really plan to justify the $$ with; look elsewhere).
"..so much for subtlety.."
System specs:
Gainward Ti4600
AMD Athlon XP2100+ (o.c. to 1845MHz)
One thing I wanted to chime in on was the apparent belief that this driver only increased 3DMark 2001 scores.
Granted, the numbers do indicate as much... But let's do something different. According to Brian Burke, what this driver addressed was pixel shading performance, and this can be demonstrated in two known pieces of software: 3DMark 2001 and Aquanox.
And here's the problem... If one were to take this statement at face value, there's no real way to disprove it! How many non-CPU-limited games employ pixel shaders? Here we are, approaching two years since DX8 was introduced, and this is all we have to play with!
Anyhow, I'm not trying to make excuses for this driver, nor the 25% claim...But, the lack of software makes it difficult to analyze. From the numbers that have been provided by various websites/individuals, these 2 pieces of software are given a boost. If we actually had more games to benchmark, it would make this a very simple thing to test.
You might see some difference in Morrowind or Neverwinter Nights. They both use pixel shaders to generate water effects. Morrowind in particular seems pretty taxing.