Matrox is lying low and deliberately not exploiting the true performance of the Parhelia yet. I think Matrox might have a snowball of a video card rolling down a hillside, and it's getting bigger as it rolls! MATROX ROCKS!
Well, that is about the only POSITIVE explanation I can think of.
...unfortunately I don't think this is the case, since the initial publicity usually pretty much determines what people will think of a product.
Edit: Which isn't to say that I wouldn't find it extremely satisfying if Matrox waited until the R300 is officially launched and then thought, "hmm, maybe it's time to enable those other 64/128/256 bits"... But Haig told us to expect an increase of 10 per cent, no more.
That 10 per cent is due to driver maturation; it may be that the figure does not include turning on additional sections/features of the GPU in future driver releases.
Just a straw to hang on to though...
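Purely to illustrate that straw: "turning on sections of the GPU" in a driver update would, conceptually, be the driver flipping on capability bits it shipped with disabled. Everything in the sketch below is invented for illustration; none of it reflects Matrox's actual driver code.

    /* Invented capability bits: a launch driver might expose only part
     * of the silicon, and a later update could flip more bits on. */
    #include <stdint.h>
    #include <stdio.h>

    #define CAP_QUAD_TEXTURING    (1u << 0)
    #define CAP_DISPLACEMENT_MAP  (1u << 1)
    #define CAP_WIDE_BUS_MODE     (1u << 2)  /* the hoped-for extra bits */

    static uint32_t enabled_caps = CAP_QUAD_TEXTURING;  /* launch driver */

    /* a later driver release enables features already present in hardware */
    static void driver_update_enable(uint32_t new_caps)
    {
        enabled_caps |= new_caps;
    }

    int main(void)
    {
        printf("launch caps:  0x%02x\n", enabled_caps);
        driver_update_enable(CAP_DISPLACEMENT_MAP | CAP_WIDE_BUS_MODE);
        printf("updated caps: 0x%02x\n", enabled_caps);
        return 0;
    }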
Join MURC's Distributed Computing effort for Rosetta@Home and help fight Alzheimer's, Cancer, Mad Cow disease and rising oil prices.
[...] the pervading principle and abiding test of good breeding is the requirement of a substantial and patent waste of time. - Veblen
To what extent could specific game patches improve things? I remember that back in the days of my Mystique (the original, not the 220), several games ran terribly until they were patched (I believe Tomb Raider was one of those). Another example was the supplied MechWarrior 2 demo: it had been patched/adapted for the Mystique, and compared to a "regular" MechWarrior 2 the difference was huge (though I may be mixing this last example up with my G200).
Is there any chance game patches could now also be used to optimise the games' use of the Parhelia instruction set?
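For what it's worth, those per-card patches often amounted to detecting the card at startup and branching to a tuned rendering path. Here is a minimal sketch of the idea for an OpenGL title; the two path functions and the exact renderer string to match are assumptions made up for illustration, and only glGetString(GL_RENDERER) is a real OpenGL call (it requires a current GL context).

    /* Hypothetical sketch: pick a card-specific rendering path at startup. */
    #include <stdio.h>
    #include <string.h>
    #include <GL/gl.h>

    static void use_generic_path(void)
    {
        printf("generic multi-pass rendering path\n");
    }

    static void use_parhelia_path(void)
    {
        /* e.g. collapse texture passes onto Parhelia's four texture units */
        printf("Parhelia-optimised rendering path\n");
    }

    void select_render_path(void)
    {
        /* glGetString(GL_RENDERER) returns the driver's renderer string;
         * matching on "Parhelia" here is an assumption for the sketch */
        const char *renderer = (const char *)glGetString(GL_RENDERER);

        if (renderer != NULL && strstr(renderer, "Parhelia") != NULL)
            use_parhelia_path();
        else
            use_generic_path();
    }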
Jörg
Dream as if you'll live forever. Live as if you'll die tomorrow. (James Dean)
I remember the G400 drivers at one point taking a huge leap in stability and performance. I recall wishing those had been the drivers released with the card from the get-go; it would have made the G400 take the world by storm. Or so I thought at the time.
Since then the G400 drivers have not really changed, and it seems to me it was fairly late in the game that they hit that sweet spot. But as others have mentioned, the 'half-life' of Matrox cards seems to be a bit longer than others'. Not that I would really know, as I have been solely a Matrox user for quite some time now.
Performance did improve as time went on, but there wasn't one release that was a huge leap; it was more a gradual improvement until G400 performance peaked. After that Matrox continued to provide fixes, mainly for gaming titles whose developers made little effort to support anything other than nVidia/ATI (which I cannot really blame them for, as that's the market majority).
Also, back then not everyone was so hell-bent on benchmarking every title they had. It's completely mad these days: some friends I know spend almost as much time downloading 'unofficial' drivers and benchmarking with a multitude of tools and titles as they do actually playing.
Anyone hoping for >10% in every title is seriously optimistic. Certain titles/engines can be optimised, and I'm sure that if Matrox do anything they will choose to concentrate on the latest Unreal and Q3 engines. It's all a cost-benefit thing.
The TurboGL drivers weren't 'that' good - well, it depends on whether you ran a Quake-engine game or not. Besides, as I moved to Win2K when it was released, they soon found their way into the history bin.