Yeah... G400, still have one (as you can see from the sig). Can't say I'm that happy now, though.
Matrox @ Wiki
I've put together the pages for the G200 and G400. I've tried to get as much info as I remember from experience and as much as I could dig up from articles online. Please feel free to add, revise, comment, etc.
The Parhelia article is pretty bare, and I don't have first-hand experience with that card, so if we could get some people in the know on that, it would be cool. I'll probably continue working on the older cards as I feel like it. I had a Mystique 220 back in the day, so maybe I'll get around to working on the Mystique.
Originally posted by Nowhere: What about a section for Matrox networking products? I don't know anything about them, don't even know how good or bad they were, but the logo itself (the fish) is perhaps worth it?
It would be nice if the release dates of the cards were added after the card names.

cu/2 magog - Germany - flying with OS/2 Warp speed... in a vehicle named eComStation (eCS)
---
Author of the Java Movie Database - http://www.jmdb.de
JMDB v1.35 FINAL is available (2007-09-20)
Homepage: http://www.juergen-ulbts.de/
According to these websites, Matrox's market share is less than one percent of the total graphics card market nowadays...
Originally posted by Mikko: According to these websites, Matrox's market share is less than one percent of the total graphics card market nowadays...
http://www.xbitlabs.com/news/video/d...229150853.html
No, define a market where they have a share of 100% and say that. As in, "After years of innovation, Matrox succeeded in reaching a 65% market share in the non-mainstream video card market, to much acclaim of Matrox fans."

Join MURC's Distributed Computing effort for Rosetta@Home and help fight Alzheimer's, Cancer, Mad Cow disease and rising oil prices.
[...]the pervading principle and abiding test of good breeding is the requirement of a substantial and patent waste of time. - Veblen
Things to note about the Parhelia: its architecture was not a bad one and would have done great things were it not for the swing toward more granular architectures. It probably would have done well in the DX9 world, as most of the games that came out after it regularly address 8+ textures per pixel nowadays. Even the new games coming out around that time were using more than two textures per pixel. Doom 3, for instance, generally renders 7-9 textures per pixel.
That being said, the Parhelia flopped because Matrox failed to make DX9 and PS 2.0 its target architecture. Addressing four textures at once required fairly hefty modification to an engine's render code, and among established DX8 cards there was only one other architecture that allowed more than two textures to be addressed at once (ATI's 8500 series, which allowed three). This required a different code path for each target video card and made development more complicated. Most vendors simply decided to target DX9/PS 2.0 hardware instead and used lowest-common-denominator render paths aimed at two-texture architectures.
The thing that hurt the most was ATI's R300 which, while only able to address one texture per pixel at a time, exposed itself as being able to address 16 textures at once and did multiple passes internally to achieve the desired result. This allowed programmers writing PS 2.0 shaders to write their code as they intended (in a single pass, instead of having to do multiple passes and juggle textures between texturing units).
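To make the pass-counting arithmetic concrete, here's a minimal sketch (in Python, since the thread has no code of its own). The per-pass texture limits are the ones cited above; the passes_needed helper is purely illustrative, not any real API:

```python
import math

# Maximum textures addressable in a single pass, per the posts above:
# typical DX8 hardware exposed 2, ATI's 8500 exposed 3, the Parhelia 4,
# and the R300 exposed 16 (handling any extra passes internally).
MAX_TEXTURES_PER_PASS = {
    "typical DX8 card": 2,
    "Radeon 8500": 3,
    "Parhelia": 4,
    "R300 (as exposed)": 16,
}

def passes_needed(textures: int, per_pass: int) -> int:
    """Passes the engine itself has to schedule for one surface."""
    return math.ceil(textures / per_pass)

# A Doom 3-style surface touching ~8 textures per pixel:
for card, per_pass in MAX_TEXTURES_PER_PASS.items():
    print(f"{card:18} -> {passes_needed(8, per_pass)} pass(es)")
```

Every distinct pass count here means a distinct code path for the engine to write and test, which is why most developers fell back to the two-texture lowest common denominator, and why the R300's "just write it as one pass" approach was so attractive.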
In turn, the increased granularity and simplicity of each pixel pipe made the chip easier to produce and allowed it to hit higher clock speeds. It also meant it was a lot more adaptive to the actual demands of a graphics program: whereas a Parhelia would be just as fast regardless of whether it was rendering pixels with 1, 2, 3 or 4 textures, the R300 would render pixels with fewer textures faster than more complicated ones. This helped its legacy performance while still enabling developers to write better code.
Couple all that with really poorly performing drivers (e.g., dropping from 1280x1024 to 640x480 for rendering should have resulted in a massive performance improvement, but didn't) and a general lack of support, and it pretty much wrote its own death certificate.
Edit: I also wanted to point out that most competing cards at the time only had, at most, 4 pixel pipelines. Clock for clock, the Parhelia should have been able to keep up with a GeForce4 without a problem. It also would have been able to stay neck and neck with a 9700 at the same clock speed when rendering two textures, and when it moved to four textures it would actually have outperformed the 9700 clock for clock. Again, there were horrendous performance problems in the Parhelia which really have no good explanation.
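To put rough numbers on the clock-for-clock claim, here's a back-of-envelope model. The pipe/TMU layouts are the commonly cited ones (an assumption of this sketch, not something stated in the thread), and the simple loop-over-TMUs throughput model is hypothetical, ignoring bandwidth, drivers and everything else:

```python
import math

# Commonly cited pipeline layouts (assumed, not from the thread):
# Parhelia 4 pipes x 4 TMUs, GeForce4 Ti 4 pipes x 2 TMUs,
# Radeon 9700 8 pipes x 1 TMU.
CARDS = {
    "Parhelia":    (4, 4),   # (pixel pipes, texture units per pipe)
    "GeForce4 Ti": (4, 2),
    "Radeon 9700": (8, 1),
}

def pixels_per_clock(pipes: int, tmus: int, textures: int) -> float:
    """Pixels retired per clock if each pipe loops over its TMUs."""
    return pipes / math.ceil(textures / tmus)

for textures in (1, 2, 4):
    rates = ", ".join(f"{name} {pixels_per_clock(p, t, textures):.0f}"
                      for name, (p, t) in CARDS.items())
    print(f"{textures} texture(s)/pixel -> pixels/clock: {rates}")
```

Under this toy model the Parhelia matches the GeForce4 at one or two textures, ties the 9700 at two and doubles it at four, while the 9700 pulls ahead on single-texture pixels, which is exactly the granularity trade-off described above.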
Of course, all that doesn't help if no one programs anything for it...

"And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz
Originally posted by Umfriend: No, define a market where they have a share of 100% and say that. As in, "After years of innovation, Matrox succeeded in reaching a 65% market share in the non-mainstream video card market, to much acclaim of Matrox fans."