And in case you didn't realize, that means that a big chunk of the 3D games out there currently runs better on a G400/450 than on a member of the GeForce family (provided you use the latest drivers). I can't believe nVidia seems to get away with it...
Comments, Himself?
System 1:
AMD 1.4 AYJHA-Y factory unlocked @ 1656MHz with Thermalright SK6 and 7k Delta fan
Epox 8K7A
2x256MB Micron PC-2100 DDR
an AGP port all warmed up and ready to be stuffed full of Parhelia II+
SB Live! 5.1
Maxtor 40GB 7,200rpm @ ATA-100
IBM 40GB 7,200rpm @ ATA-100
Pinnacle DV Plus FireWire
3Com Hardware Modem
Teac 20/10/40 burner
Antec 350W power supply in a Colorcase 303usb Stainless
I just replaced my g200 with a Hercules GeForce2 MX and I am quite happy in general. Here are some thoughts:
1- When I got the card, I was extremely disappointed by the 2D image quality. I really thought my eyes would suffer before long. Thirty minutes later I upgraded to the latest drivers -- presumably the so-called Detonator 3 drivers; I'm not familiar with the names yet -- and now I find the 2D quality reasonable. I wonder if all the drivers did was change the contrast and things like that.
2- UT: I used to play Quake3 every once in a while, but since I bought UT I don't play it much anymore. I was never able to get UT to work with my g200, so I was playing in software rendering mode; it was playable thanks to my PIII-933. The MX runs it nicely and the game looks just beautiful, especially with the textures from the second CD. There are many levels I didn't even recognize!
3- Quake3: It's faster than the g200, even at higher settings, but for some reason I really don't like what the MX does. Maybe it's just the faster frame rates or something, but it seems like I get that t-buffer effect... Anyway, I discovered some graphics effects I had never seen before in both UT and Q3A, and that was pretty cool.
4- Linux: stay with Matrox!
Overall, I would recommend the MX to anyone with a g200 wanting to upgrade. I don't have TwinView or whatever, but I am sure it's crap. I had been waiting a long time to see the g800, or at least some specs, but once I realized it couldn't possibly be out before Christmas, I bought the MX the day after the g450 was released. I read recently that Creative Labs will release a DDR version next month, so those wanting to upgrade may do better to wait until then.
By the way, there is nothing wrong with buying stuff from nVidia if that's what it takes to wake up the competition. Look at what ATI did. When I hear about a computer with an ATI card at work, I know it's an inferior computer, and now ATI is finally producing something I wish I had. TwinView is maybe not very good now, but it will be before long, and the 2D is already not bad. With the MX needing only 4W and a heatsink, I think nVidia is gearing up for the OEM market.
One last note: I never really had anything to complain about with Matrox drivers, except for the installation part. nVidia is not much different in that area.
The article actually does very little to save the G450's honor. First, it's extremely unconvincing: it's based on pure speculation, on results achieved in the absence of the cards discussed. Moreover, the 16-bit/S3TC move seems a terribly simplistic solution (for one thing, it fails to test the G450 where it hurts most: in 32-bit color).
Finally, it is somewhat beside the point. The idea is actually simple. Need a gaming card? Go for nVidia. Need a card that delivers great 2D on two screens as well as 3D adequate for casual gaming? Go for the G450. If you want to game a bit more seriously, go for the G400MAX.
I really fail to see the point of saving Matrox's honor with a lame article.
Asus A7V, Duron 600@900, 192MB PC133@100, G200, Guillemot MUSE, etc.
I don't see that it matters that the G400 is running in 16-bit. It still looks better than the MX in 32-bit. The point of running at a higher colour depth is to make things look nicer, not just to eat all the bandwidth on your card.
I do agree that the comparison is hardly scientific; it would be good if someone could do the same thing with the actual cards.
Interesting that in the display comparison two of us couldn't tell any difference, while one could tell the difference and went for the Matrox each time.
Pity Dzeus didn't have an MX for the performance comparison, rather than scaling down the results.
Chief Lemon Buyer no more. Linux sucks, but not as much.
Weather nut and sad git.
From what I've read (on 3dchipset, I believe), the nVidia drivers don't enable S3TC properly, which causes the crappy look in Q3A. On other cards S3TC does show its worth (so they say).
As I had no GeForce MX at hand, I benchmarked a GeForce 2 GTS and then reduced the fps results by the same percentage by which the MX trailed a GeForce DDR in an MX review posted on Tom's Hardware Guide.
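Just to spell out what that scaling amounts to, it's a single ratio applied to each measured result. A rough sketch in Python, with made-up placeholder numbers (not the actual THG figures):

# Estimate an MX score by scaling a measured GTS score with the
# MX-vs-GeForce-DDR ratio taken from someone else's review.
# All numbers here are made-up placeholders for illustration.
def estimate_mx_fps(gts_fps, mx_ref_fps, ddr_ref_fps):
    # Assume the MX/DDR performance ratio carries over to our setup.
    return gts_fps * (mx_ref_fps / ddr_ref_fps)

measured_gts = 90.0                 # hypothetical Q3A result on our GTS
review_mx, review_ddr = 60.0, 80.0  # hypothetical scores from the review
print(estimate_mx_fps(measured_gts, review_mx, review_ddr))  # -> 67.5

That also makes the objection below concrete: the estimate assumes the MX/DDR ratio from another reviewer's test system carries over unchanged to this one.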
You propose this as Honor? This is pure trash!
"I just wanted to show you that a cheapo card from nVidia isn't necessarily better in any way, just because it carries the word "GeForce" in its name. While a GeForce 2 GTS is definitely one of the very fastest 3D cards on the market now, the GeForce 2 MX is a severely crippled version. "
Why is this garbage posted? Do the readers here need reassurance that Matrox cards are OK to own? This is the worst article I have ever seen!
And for my next piece, I will review something else I do not have. And slam it!
My English is really bad. I don't understand 'starbucks'. But one thing is sure: you don't understand what I mean.
I don't care about the score he gives. It's the first time I've seen someone take a different approach to benchmarking a card. The approach I'm talking about is to include quality in the benchmark score. That is very interesting:
compare the quality first, fix the settings accordingly, and then launch the benchmark.