Matrox is laying low and trying not to exploit the true performance of the Parhelia. I think Matrox might have a snowball of a video card rolling down a hillside, and it's getting bigger as it rolls! MATROX ROCKS!
I have a feeling
-
AMD Athlon™ 64 processor 3200+
Microsoft® Windows® XP Professional Edition
MicroStar K8T Neo-FIS2R MS-6702 System Board
1GB composed of 2- 512MB DDR400 SDRAM 184-pin DIMMs
3.5" 1.44MB Floppy Disk Drive
160GB 7200RPM Ultra ATA/100
16x DVD-ROM Drive
4x DVD±R/±RW Drive
e-GeForce FX 5950 Ultra 256MB DDR VIVO Graphics Card
Integrated 6 Channel AC'97 Audio CODEC
56K V.92 PCI Internal Modem
Realtek Integrated 10/100/1000 Ethernet Controller
IIM IEEE 1394 Host Controller - 2 Ports
-
Well, that is about the only POSITIVE explanation I can think of
<small>...unfortunately I don't think this is the case, since the type of initial publicity usually pretty much determines what people will think of a product.</small>
Edit: Which isn't to say that I wouldn't find it extremely satisfying if Matrox waited till the R300 is officially launched and then thought "hmm, maybe it's time to enable those other 64/128/256 bits"... But Haig told us to expect an increase of 10 percent, no more.
Last edited by Tempest; 4 July 2002, 17:02.
-
10 per cent due to driver maturation. It may be that this does not include turning on additional sections/features of the GPU in future driver releases.
Just a straw to hang on to though...
-
Just curious...
To what extent could specific game patches improve things? I remember that back in the days of my Mystique (the original, not the 220), several games ran terribly until they were patched (I believe Tomb Raider was one of those). Another example was the supplied Mechwarrior 2 demo (it had also been patched/adapted for the Mystique, and compared to a "regular" Mechwarrior 2 version, the difference was huge - though it could be that I'm mixing this last example up with my G200).
Is there any chance game patches could now also be used to optimise the games' use of the Parhelia instruction set?
Jörg
-
I remember thinking that the G400 drivers at one point took a huge leap in stability and performance. I recall wishing that those had been the drivers shipped with the card at the get-go; it would have made the G400 take the world by storm. Or so I thought at the time.
The drivers have not really changed for the G400 since then, and it seems to me it was fairly late in the game that they hit that sweet spot. But as others have mentioned, the 'half-life' of Matrox cards seems to be a bit longer than others'. Not that I would really know, as I have been solely a Matrox user for quite some time now.
-
Performance did improve as time went on, but there wasn't one release that was a huge leap; it was more a gradual improvement until G400 performance peaked. After that, Matrox continued to provide fixes - mainly for gaming titles where the developers made little effort to support anything other than nVidia/ATI (which I cannot really blame them for, as that's the market majority).
Also, back then not everyone was so hell-bent on benchmarking every title they had. It's completely mad these days, where some friends I know spend almost as much time downloading 'unofficial' drivers and benchmarking with a multitude of tools and titles.
Anyone hoping for > 10% in every title is seriously optimistic. Certain titles/engines can be optimised, and I'm sure if Matrox do anything they will choose to 'concentrate' on the latest Unreal and Q3 engines. It's all a cost-benefit thing.
Cheers, Reckless