It's hot down here... (played Diablo yesterday evening, some of it still lingers)
Ok, let's step in:
1) FSAA: many people, and especially every NVidia-bought so-called journalist (you know, the ones WE MURCers complain about whenever there is a G400 vs. GeFarce comparison), downplay it: huge performance hit, no need to use it...
But like it or not, FSAA is here to stay. It IS a major new feature, not something just thrown in to hide the "fact" that the Voodoo has the "same old architecture". The G400 has FSAA too? Then tell me where I can download FSAA-enabled G400 drivers (working in both OpenGL and Direct3D, mind you) with playable framerates, and I would be grateful.
I thought we MURCers paid attention to image quality and knew better than the FPS-crazy NVidiots. FSAA (and especially 3DFx's implementation) brings image quality to a new level, period.
The fact is, the Voodoo5 has enough fillrate to provide playable framerates at resolutions up to 1024x768 with 2x FSAA (and I'm only talking Q3 here; chances are many games will be playable with 4x FSAA).
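Here's my quick back-of-the-envelope check, assuming the announced Voodoo5 5500 specs (two VSA-100 chips at around 166 MHz, two pixel pipes each) and an overdraw of roughly 3; these are my own estimates, not 3DFx figures:

  2 chips x 166 MHz x 2 pixels/clock  = ~667 Mpixels/s raw fillrate
  2x FSAA (2 samples per pixel)       = ~333 Mpixels/s effective
  1024x768 x overdraw ~3              = ~2.4 Mpixels drawn per frame
  333 / 2.4                           = ~140 fps fillrate ceiling

Even if the real-world number lands at half of that ceiling once bandwidth and CPU limits kick in, 1024x768 with 2x FSAA stays comfortably playable.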
I sincerely hope the G800^H^H^H^H "next chip" will have a nice hardware FSAA implementation, and I'm pretty sure it will.
2) The "it's just SLIed old Voodoo techno" argument is a joke again, and gratuitous bashing. There are 2 main arguments in favor of VSA technology :
- scalability (one chip can fill everything between entry-level and very,very high end)
- memory bandwidth, for god sake ! NVidia ignored this for years, and now they run into a wall, when 30% at least of their new monster's fillrate can't be used because of bandwidth. And they already use the fastest DDR memory for a consumer card.
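Rough math on that, assuming the kind of numbers being quoted for NVidia's new chip (say 200 MHz core, 4 pixel pipelines, 166 MHz DDR on a 128-bit bus) and around 12 bytes of framebuffer traffic per pixel (32-bit colour write plus Z read and write), texture fetches not even counted; these are my estimates, not NVidia's:

  4 pipes x 200 MHz                = 800 Mpixels/s raw fillrate
  128-bit bus x 166 MHz DDR        = ~5.3 GB/s memory bandwidth
  800 Mpixels/s x 12 bytes/pixel   = ~9.6 GB/s needed for colour + Z alone
  5.3 / 9.6                        = barely over half the fillrate usable

If anything, "at least 30% lost" is being generous.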
If you think a multi-chip design is a weak solution, please explain that to Silicon Graphics Inc.; I'm sure they will be pleased to hear it.
3) Lack of features: so on the one hand we laugh at NVidia for throwing in useless features (T&L, pixel shaders), and on the other hand we blame 3DFx for not including them? FSAA is IMHO a must-have feature for new cards, and you can't deny it is actually usable, since it works out of the box with every 3D game you own, from GLQuake to Battlezone 2! As for the T-Buffer effects, they have the advantage over T&L that they can be enabled via a patch (much like EMBM), instead of requiring the entire 3D engine to be redesigned.
I'm talking 3D features here, not 2D/3D (no DualHead...).
4) External power supply or internal power connector: shall I say gratuitous bashing again? They do what they have to do to make their cards work; isn't that normal?
- External power supply for the V5 6k: there is a way to get that amount of power from inside the computer, it's called AGP Pro, and an AGP Pro motherboard costs nearly as much as the Voodoo5 6k itself.
- Internal power connector: again, remember how we laughed at stories of users unable to get their GeFarce working on old or poorly designed motherboards? If 3DFx learns from other companies' mistakes, isn't that a good thing?
5) Image quality: I'll wait for the final product before saying anything. It's true that previous Voodoo products were not good in that regard, but I don't think JPG screenshots do the card justice. Besides, I've seen TGA screenshots (Homeworld with 4x FSAA, for example), and they were pretty good. Still, there's no way I can be sure about image quality one way or the other before seeing the card.
6) Price: $300 is certainly a lot of money, but that's the list price; the odds are the Voodoo5 street price will be lower. The 6k at $600 is a very clever marketing move: that's what an SLI setup cost when the Voodoo2 came out, and plenty of people bought one. Besides, I'm sure the same number of people (the extreme hardcore gamers) would buy it if it were "only" $500.
Don't get me wrong, I'm not some blind 3DFx troll. I love my G400Max, the extremely rich feature set and the top-notch image quality of Matrox cards, but that doesn't mean I won't give credit where it's due. And I think 3DFx deserves credit for the Voodoo5 (very nice architecture and a MAJOR new 3D feature), even though the card is indeed late.
------------------
Corwin the Brute