The closest in functionality to the G450 that has been announced using the GeForce2 MX is the Hercules 3D Prophet II MX Dual-Display, but it uses three chips to get almost the same job done, and has an MSRP of $199.
<---------->
There are already GeForce MX cards with 2 CRT heads + 1 TV head for $139. Too bad that Nvidia screwed up the TwinView functionality. (BTW, I think new drivers can cure all those TwinView problems; what's your opinion, is it a software or hardware thing?)
Originally posted by SwAmPlAdY: All this crap in a few years, it's a wonder they can sell anything in the retail market.
Anyone else?
SwAmPy
Hi Swampy,
you failed to mention one major feature that was already *sort of supported* when the G200 was introduced ... it has to do with a certain kind of texture filtering ...
Know what I mean?
Despite my nickname causing confusion, I am not female ...
"And just because I am a wonderin. is there any game out there that the G400 doesn't do a decent job running?"
Yes.
To name a few: Mercedes Benz Truck Race
Rallye Masters
STCC
MS Flightsim 98
all of them D3D, and that's why I recently sold my one-week-old G400 DH 32 (clocked at MAX speeds, of course) for the equivalent of $100.
Good riddance.
Now I see what's happening! It's not a gamer's card. It's not a business card. It's not even an OEM card.
IT'S AN EXPERIMENTAL CARD!!
To quote from the Gamers Depot review above:
"Matrox was able to shrink the main core of the G400 chip enough so that they could squash a TMDS transmitter for digital flat panels, a TV encoder, and the primary and secondary RAMDACs for the separate video outs all on one piece of silicon."
To do all that took a lot of engineering genius. It is almost amazing that Matrox could keep the performance up to G400 standards. It is an accomplishment they can be proud of.
However, the market is not always kind to engineering miracles unless they show some improved performance. Like the G250, I doubt many of you will see a G450. Let's just hope they are using the G450 as an engineering "dry run" for their real masterpiece - the G800.
Originally posted by Maggi: you failed to mention one major feature that was already *sort of supported* when the G200 was introduced ... it has to do with a certain kind of texture filtering ...
Maggi, are you trying to say that my G200 supports the "render subjects nude" feature? After all, I think you could call "render subjects nude" a kind of texture filtering!
The only question is, how do you only sort of support "render subjects nude"? Does it only render half of the subject nude? And if so, which half?
Nope, HedsSpaz, that's not it ... keep guessing, or try to remember what kind of filtering is supported but fails to run in each benchmark that tests it.
Despite my nickname causing confusion, I am not female ...
Originally posted by rubank: If I remember correctly, the G200 had the same Anisotropic filtering support as the G400:
none.
Is that what's on your mind, Magster?
Direct hit, Rubank ...
According to all spec sheets, the G200 & G400 support that kind of filtering, but all tests simply fail, because there is more than one kind of anisotropic filtering.
If you launch Ziff Davis' 3D WinBench, you'll find that both cards do support anisotropic minification, but not anisotropic magnification. Thus this feature can only be used under special circumstances, like setting up a User Scene in 3D WinBench where you can differentiate between min and mag filtering. All other tests (e.g. 3DMark) seem to require both filters and hence only report failures ...
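To make that min-vs-mag distinction concrete, here is a rough caps-check sketch in C++. It uses the later Direct3D 9 API purely for brevity; the DX6/7-era caps structures expose equivalent MINFANISOTROPIC / MAGFANISOTROPIC bits, so treat this as an illustration of the idea rather than what 3D WinBench actually does:

    // Query the texture filter caps and report anisotropic support separately
    // for minification and magnification. Link against d3d9.lib.
    #include <d3d9.h>
    #include <cstdio>

    int main()
    {
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d)
            return 1;

        D3DCAPS9 caps;
        if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        {
            // A card can set one of these bits without the other.
            bool minAniso = (caps.TextureFilterCaps & D3DPTFILTERCAPS_MINFANISOTROPIC) != 0;
            bool magAniso = (caps.TextureFilterCaps & D3DPTFILTERCAPS_MAGFANISOTROPIC) != 0;
            std::printf("anisotropic minification:  %s\n", minAniso ? "yes" : "no");
            std::printf("anisotropic magnification: %s\n", magAniso ? "yes" : "no");
        }
        d3d->Release();
        return 0;
    }

A card that reports minification-only anisotropy matches the pattern described above: any test that demands both filters at once will simply report the feature as unsupported.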
Cheers,
Maggi
Despite my nickname causing confusion, I am not female ...
That's a little odd, Maggi, what version of 3D WinBench are you running?
I downloaded the latest version of said benchmark, including some patch, and it reported "not supported in hardware" for my G400 (anisotropic, that is). Nothing about min/mag.
The G850 will be a die-shrunk, cheaper version of the G800, due for review 1.5 years after the spec. release of the G800.
It will sport hardcoded DX10 support (well, most of it), hence no trouble with drivers. For OpenGL it will use the same full ICD as the G400.