Ton of Voodoo 5 5500 AGP Previews up!


  • #16
    It's hot down here... (played Diablo yesterday evening, still some remains)

    Ok, let's step in:

    1) FSAA: many people, and especially every NVidia-bought so-called journalist (you know, the ones WE MURCers complain about when there is a G400-GeFarce comparison), downplay it, talking about the huge performance hit and how there's no need to use it...
    But like it or not, FSAA is here to stay. It IS a major new feature, and not something just thrown in to hide the "fact" that the Voodoo has the "same old architecture". The G400 has FSAA? Tell me where I can download FSAA-enabled G400 drivers (working in both OpenGL and Direct3D, mind you) that allow playable framerates, and I will be grateful.
    I thought we MURCers paid attention to image quality, knowing better than FPS-crazy NVidiots. FSAA (and especially 3DFx's) brings a new level to image quality, period.
    The fact is, the Voodoo5 has enough fillrate to provide playable framerates at resolutions up to 1024x768 with 2x FSAA (and I'm only talking Q3 here; chances are that many games will be playable with 4x FSAA).
    I sincerely hope the G800^H^H^H^H "next chip" will have a nice hardware FSAA implementation, and I'm pretty sure it will.

    2) The "it's just SLIed old Voodoo techno" argument is a joke again, and gratuitous bashing. There are 2 main arguments in favor of VSA technology :
    - scalability (one chip can fill everything between entry-level and very,very high end)
    - memory bandwidth, for god sake ! NVidia ignored this for years, and now they run into a wall, when 30% at least of their new monster's fillrate can't be used because of bandwidth. And they already use the fastest DDR memory for a consumer card.
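    To put some very rough numbers on that bandwidth wall (the per-pixel costs, fill rate and memory clock below are my own illustrative assumptions, not official NVidia or 3DFx figures):

```python
# Back-of-the-envelope memory traffic for a 32-bit colour pixel with Z.
# Assumed costs: 1 colour write + 1 Z read + 1 Z write + 1 texel fetch = 16 bytes/pixel.
BYTES_PER_PIXEL = 4 + 4 + 4 + 4

peak_fill_rate = 800e6                   # pixels/s, an assumed GeForce2-class peak
needed_bandwidth = peak_fill_rate * BYTES_PER_PIXEL

available_bandwidth = 166e6 * 2 * 16     # 166 MHz DDR on a 128-bit bus, in bytes/s

print(f"need {needed_bandwidth / 1e9:.1f} GB/s, have {available_bandwidth / 1e9:.1f} GB/s "
      f"-> only ~{available_bandwidth / needed_bandwidth:.0%} of the paper fill rate is sustainable")
```

    With any numbers in that ballpark, a good chunk of the advertised fillrate just sits there waiting for memory.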

    If you think multichip design is a weak solution, could you please explain that to Silicon Graphics Inc.? I'm sure they will be pleased to hear from you.

    3) Lack of features: so, on the one hand we laugh at NVidia for throwing in useless features (T&L, pixel shader), and on the other hand we blame 3DFx for not including them? FSAA is IMHO a must-have feature for new cards, and you can't deny it can be used, being enabled out of the box for every 3D game you own, from GLQuake to Battlezone 2! As for the T-Buffer effects, they have the advantage over T&L that they can be enabled via a patch (much like EMBM), instead of needing to redesign the entire 3D engine.

    I'm talking 3D features here, not 2D/3D (no DualHead...).

    4) External power supply or internal connector: shall I say gratuitous bashing again? They do what they have to in order to make their cards work; isn't that normal?
    - External power supply for the V5 6k: there is a way to get the amount of power needed inside the computer: it is called AGPPro, and the motherboard price is nearly a match for the Voodoo5 6k alone.
    - Internal power connector: again, remember how we laughed at stories about users not being able to get their GeFarce working with old/poorly designed motherboards? If 3DFx learns from other companies' mistakes, isn't that a good thing?

    5) Image quality: I'll wait for the final product before saying anything. It's true that the previous Voodoo products were not good in that regard, but I don't think JPG screenshots do the card justice. Besides, I've seen TGA screenshots (Homeworld with 4x FSAA, for example), and they were pretty good. But there's no way I'm going to be sure about image quality one way or the other before seeing the card.

    6) Price: $300 is sure a lot of money. But that's the official list price; the odds are we will see a less expensive Voodoo5 street price. The 6k at $600 is a very clever marketing move: that is what an SLI setup cost when the Voodoo2 came out, and plenty of people bought one. Besides, I'm sure the same number of people (extreme hardcore gamers) would buy it if it were "only" $500.

    Don't get me wrong, I'm not some blind 3DFx troll. I love my G400Max, the extremely rich feature set and the top-notch image quality of Matrox cards, but that doesn't mean I won't give credit where it's due. And I think that 3DFx deserves credit for the Voodoo5 (very nice architecture and a MAJOR new 3D feature), although the card is indeed late.

    ------------------
    Corwin the Brute




    • #17
      FSAA: many people, and especially every NVidia-bought so-called journalist (you know, the ones WE MURCers complain about when there is a G400-GeFarce comparison), downplay it, talking about the huge performance hit and how there's no need to use it...
      But like it or not, FSAA is here to stay. It IS a major new feature, and not something just thrown in to hide the "fact" that the Voodoo has the "same old architecture".
      Wrong. FSAA on the V5 is EXACTLY the same as FSAA on every other chip. It's been around for years. The only difference is they're partially offloading it to the chip. Notice I said PARTIALLY. If the improvements were big enough, some of it could have been offloaded to the G400's WARP engine... but Matrox chose not to.

      FSAA is performed by rendering the entire scene at double the resolution, and then downsampling it. That's all. No special tricks, and 3dfx can claim otherwise all day long, but it looks exactly the same as everyone else's implementation.
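      For anyone who wants that spelled out, here's a toy sketch of the render-big-then-downsample idea; it's mine, purely for illustration (real hardware obviously doesn't chew on Python lists):

```python
def downsample_2x2(big):
    # 'big' is a list of rows of (r, g, b) tuples rendered at twice the target
    # width and height; each output pixel is the plain average of a 2x2 block.
    out = []
    for y in range(0, len(big) - 1, 2):
        row = []
        for x in range(0, len(big[0]) - 1, 2):
            block = (big[y][x], big[y][x + 1], big[y + 1][x], big[y + 1][x + 1])
            row.append(tuple(sum(channel) // 4 for channel in zip(*block)))
        out.append(row)
    return out

# A hard edge in the oversized 4x4 "render" comes out blended in the 2x2 result.
oversized = [[(255, 255, 255)] * 4, [(255, 0, 0)] * 4, [(0, 0, 0)] * 4, [(0, 0, 0)] * 4]
print(downsample_2x2(oversized))
```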

      And it DOES cause a major performance hit. They're not being real specific yet, but let's wait until the boards are out. I'm willing to bet you don't get playable FSAA rates at any resolution over 640x480. I'm also willing to bet that even if you DO get 800x600 with FSAA, that I'd like my 1280x1024 without FSAA, as rendered by the G400, better.

      The G400 has FSAA? Tell me where I can download FSAA-enabled G400 drivers
      Download the G400 Tweaker. Check the little "Full Screen Anti-Aliasing" box. Reboot. BIG OL' HONKIN' PERFORMANCE HIT, but it works.

      (working in both OpenGL and Direct3D mind you),
      Oh pshaw. Does 3dfx's work in OpenGL? We don't know yet, do we?

      allowing playable framerates
      At what resolution? 640x480, sure thing. If the V5 can do FSAA at 1024x768 at playable rates I'd be very surprised.

      FSAA (and especially 3DFx's) brings a new level to image quality, period.
      No, it really doesn't. Honest. I thought everyone here had read the useful posts by Kruzin and others talking about this and debunking it (Kruzin, if I'm putting words in your mouth, sorry; I honestly recall you writing that post, though I could be wrong...).

      The fact is, the Voodoo5 has enough fillrate to provide playable framerates at resolutions up to 1024x768 with 2x FSAA (and I'm only talking Q3 here; chances are that many games will be playable with 4x FSAA).
      I don't believe it. I don't think Fill Rate has ANYTHING to do with it. And I don't think the V5 can run reasonable framerates at 2048x1536, which is what is required to do FSAA at 1024x768... and if you wanted 4x FSAA you'd need to render at 4096x3072. If 3dfx claims the card can do that, they're just plain ol' lying.

      And NOBODY has addressed the fact yet that 3dfx's "PCI disguised as AGP" interface is utterly incapable of transferring that kind of data to the card anyway! I could be wrong here, maybe they have some nifty preloading trick with local memory... or maybe not.

      I sincerely hope the G800^H^H^H^H "next chip" will have a nice hardware FSAA implementation, and I'm pretty sure it will.
      And it won't do any more good than current implementations, 3dfx's included.

      The "it's just SLIed old Voodoo techno" argument is a joke again, and gratuitous bashing.
      Bull and s**t. We've been over this a dozen times, and nobody has EVER shown any major difference besides fillrate and raw speed between the V1 and the V2, or between the V2 and the V3. Period. All the new features could very easily have been done in drivers, and those that couldn't were really minor tweaks. Name me a new feature of the V5... BESIDES so-called FSAA and T-buffer (which I still think is hideous). There aren't any!

      There are 2 main arguments in favor of VSA technology :
      Oh, ok. Now we're talking memory interface technology. I see. I was talking 3D core. There have been LOTS of architectural improvements, sure. Memory is faster, yadda yadda yadda. Who cares? I'm talking FEATURES. Every single 3dfx product has been a faster Voodoo1. Period.

      Lack of features : so, on the one hand we laugh at NVidia for throwing in useless features (T&L, pixel shader)
      No, we laugh at NVidia for including them in a manner which renders them practically useless. They're not DX8 compatible, which means they will work for EXACTLY one product cycle, and they aren't particularly efficient, either. No games support them. They require an entire engine rewrite. THAT is why we laugh at NVidia for these features.

      and on the other hand we blame 3DFx for not including them?
      Yes, because they didn't even try. They never have. They utterly ignore industry innovation time and time again in favor of MORE FRAMES PER SECOND WOOHOO!

      FSAA is IMHO a must-have feature for new cards,
      No it isn't. Cards have had it, and it was useless because it was TOO DAMN SLOW. And it still is.

      and you can't deny it can be used, being enabled out of the box for every 3D game you own, from GLQuake to Battlezone 2 !
      Sure I can! I deny it can be used because the frame rates wouldn't be fast enough.

      As for the T-Buffer effects, they have the advantage over T&L that they can be enabled via a patch (much like EMBM), instead of needing to redesign the entire 3D engine.
      I give you that. However, that doesn't change the fact that they DON'T IMPROVE VISUAL QUALITY. AT ALL. They just look plain ol' ugly, and that's all there is to it. You can't "simulate" motion blur, unless you have designed some incredible "eyeball motion tracking system" involving a camera and split-second adjustments. We've been over this as well.
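      Just so we're all arguing about the same thing, accumulation-style motion blur boils down to averaging several renders taken at slightly different moments within one frame, roughly like this toy sketch (mine, purely illustrative, nothing to do with 3dfx's actual hardware):

```python
def blurred_frame(render_at, t, shutter=1.0 / 60, samples=4):
    # 'render_at' maps a point in time to a full frame (a flat list of (r, g, b)
    # pixels); the "blurred" frame is just the per-pixel average of those samples.
    frames = [render_at(t + i * shutter / samples) for i in range(samples)]
    return [tuple(sum(channel) // samples for channel in zip(*pixel_samples))
            for pixel_samples in zip(*frames)]

# Toy scene: a single pixel whose red value changes over time gets smeared out.
print(blurred_frame(lambda tt: [(int(255 * tt) % 256, 0, 0)], t=0.5))
```

      Whether that averaged smear looks like real motion blur to an eye that is itself tracking the object is exactly the argument here.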

      I'm talking 3D features here, not 2D/3D (no DualHead...).
      It sounded like you were talking about memory architecture a minute ago. Hmm...

      External power supply or internal connector : shall I say gratuitous bashing again ? They do what they have to in order to make their cards work, isn't that normal ?
      NO. IT IS NOT NORMAL. You don't need that bloody much power for a well-designed chip. Period. SGI machines running off a teeny weeny 200W power supply generate killer graphics that 3dfx can't even touch, yet 3dfx can't make do with the power coming from a 300W PC supply? What is WRONG with their engineers?

      The GeFarce2 is just as fast as the V5, and it runs just fine without overheating the case or draining the local power grid.

      - External power supply for the V5 6k : there is a way to get the amount of power needed inside the computer : it is called AGPPro, and the motherboard price is nearly a match for the Voodoo5 6k alone.
      Bull and s**t. You DON'T NEED THAT MUCH POWER. You don't need anywhere NEAR that much power, they just haven't engineered the chips well enough. Period. End of story.

      - Internal power connector: again, remember how we laughed at stories about users not being able to get their GeFarce working with old/poorly designed motherboards? If 3DFx learns from other companies' mistakes, isn't that a good thing?
      The GeFarce was designed to AGP spec. Some motherboards didn't provide AGP spec. voltage. If the GeFarce had been engineered with the lower voltage in mind there wouldn't have been a problem. The engineers went "oh, we have X volts/watts to play with" and designed accordingly. Thus, when given Y volts/watts instead it didn't work right. The 3dfx engineers went "oh, we have X volts/watts - let's make something that uses X+100 Volts/watts!" instead of just rethinking their design.

      Image quality: I'll wait for the final product before saying anything. It's true that the previous Voodoo products were not good in that regard, but I don't think JPG screenshots do the card justice. Besides, I've seen TGA screenshots (Homeworld with 4x FSAA, for example), and they were pretty good. But there's no way I'm going to be sure about image quality one way or the other before seeing the card.
      Yeah, and there are also people that claim that the V3 "wasn't washed out". (See above!)

      But I'm not basing my opinion of the V3's rendering on screenshots, I'm basing it on OWNING SEVERAL OF THEM. Their image quality is "ok". Not good, not great. Just "ok". No matter HOW you set the settings. Maybe 3dfx got it right with the V5, but that's NOT apparent from the screenshots, which look pretty blah.

      $300 is sure a lot of money. But that's the official list price; the odds are we will see a less expensive Voodoo5 street price. The 6k at $600 is a very clever marketing move: that is what an SLI setup cost when the Voodoo2 came out, and plenty of people bought one. Besides, I'm sure the same number of people (extreme hardcore gamers) would buy it if it were "only" $500.
      Or, they might buy a GeForce2 for substantially less, because it's just as fast and has more/better features.

      <SNIP> that doesn't mean I won't give credit where it's due. And I think that 3DFx deserves credit for the Voodoo5 (very nice architecture and a MAJOR new 3D feature),
      Umm... no, lousy engineering and no useful features. No credit, sorry.

      although the card is indeed late.
      Now here I WON'T bash 3dfx, since EVERY card is always late.

      - Gurm


      ------------------
      Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.
      The Internet - where men are men, women are men, and teenage girls are FBI agents!

      I'm the least you could do
      If only life were as easy as you
      I'm the least you could do, oh yeah
      If only life were as easy as you
      I would still get screwed



      • #18
        FSAA does work with any API out there. It has also been shown that at 800x600 32-bit in UT, you get 2-sample FSAA almost for free! However, as pointed out in many reviews, FSAA will find its home not in FPS games but in flight and racing sims, where fill rate is not being taxed to its limits and the beauty of 4-sample AA can be used... In addition, 3DFX's way of doing FSAA is far superior to that of Nvidia, which uses edge AA, which gives nowhere near as clean an image and carries a much greater performance hit.

        [This message has been edited by Maniac (edited 27 April 2000).]

        Celeron 566@877 1.8V, 256meg generic PC-100 RAM (running at CAS2) Abit BH6, G400 16meg DH@150/200, Western Digital Expert 18gig, Ricoh mp7040A(morphed to mp7060A) Pioneer 6X DVD slot load, Motorola Cable Modem w/DEC ethernet card, Soundblaster Live Value Ver. 2, Viewsonic GT 775



        • #19
          Won't engage in a 3DFx/NVidia flame war on a Matrox forum... But a few things need to be addressed.
          Wrong. FSAA on the V5 is EXACTLY the same as FSAA on every other chip. It's been around for years. The only difference is they're partially offloading it to the chip. Notice I said PARTIALLY. If the improvements were big enough, some of it could have been offloaded to the G400's WARP engine... but Matrox chose not to.

          FSAA is performed by rendering the entire scene at double the resolution, and then downsampling it. That's all. No special tricks, and 3dfx can claim otherwise all day long, but it looks exactly the same as everyone else's implementation.

          And it DOES cause a major performance hit. They're not being real specific yet, but let's wait until the boards are out. I'm willing to bet you don't get playable FSAA rates at any resolution over 640x480. I'm also willing to bet that even if you DO get 800x600 with FSAA, that I'd like my 1280x1024 without FSAA, as rendered by the G400, better.
          3DFx's FSAA is not the same as on any other board. The fact is that their accumulation buffer method (jittered subsamples) gives better results for a given number of subsamples (and so a given performance hit). Please do a bit of reading before bashing, ok? Then, you're "willing to bet I don't get playable FSAA rates at any resolution over 640x480". Look at Reverend's benchmarks (http://www.voodooextreme.com/reverend/Reviews/3dfx_5500/7.html): does more than 45 FPS in Quake 3 (probably the most fillrate-intensive game out there) at 800x600 32-bit look playable to you? It sure does to me.
          And don't forget about CPU-limited games (racing sims, flight sims, RPGs...) for which you'll be able to get 2x (or even 4x) FSAA apparently for free... BTW, 2x FSAA requires about twice the fillrate, and 4x four times; with a Voodoo5 5.5k you can have 2x FSAA and still get G400Max-like fillrates. And besides, I don't know where you got that the AA job is only partially offloaded to the chip. Would you care to give a link to your source?
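          Some rough numbers behind the "twice the fillrate" point; the depth complexity, target framerate and V5 peak below are my own assumptions, so take this as a sketch, not a benchmark:

```python
def required_fill_rate(width, height, fps, samples=1, depth_complexity=3):
    # Pixels drawn per second: screen pixels x FSAA samples x average overdraw x target fps.
    return width * height * samples * depth_complexity * fps

V5_5500_PEAK = 667e6  # pixels/s, roughly 2 x VSA-100 at 166 MHz (assumed)

for samples in (1, 2, 4):
    need = required_fill_rate(1024, 768, fps=45, samples=samples)
    print(f"{samples}x FSAA at 1024x768, 45 fps: {need / 1e6:.0f} Mpixels/s "
          f"({need / V5_5500_PEAK:.0%} of the card)")
```

          With those made-up numbers, 2x FSAA fits comfortably and 4x starts eating most of the chip, before memory bandwidth even enters the picture.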
          And I don't think the V5 can run reasonable framerates at 2048x1536, which is what is required to do FSAA at 1024x768... and if you wanted 4x FSAA you'd need to render at 4096x3072.
          Let's do the maths, shall we?
          2x FSAA@1024x768 is more like 1448x1086 (weird resolution, I know, but it is sqrt(2)x1024).
          4x FSAA@1024x768 is indeed 2048x1536.
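          Or, to check the arithmetic: n-sample FSAA renders n times as many pixels, so each axis only scales by the square root of n. A quick sanity check (nothing more than that):

```python
from math import sqrt

def supersampled_resolution(width, height, samples):
    # n-sample FSAA renders n times as many pixels, so each axis grows by sqrt(n).
    return round(width * sqrt(samples)), round(height * sqrt(samples))

print(supersampled_resolution(1024, 768, 2))  # (1448, 1086)
print(supersampled_resolution(1024, 768, 4))  # (2048, 1536)
```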

          BTW, I don't quite see where AGP texturing (or 3DFx's inability to use it, which I don't deny) fits in here.

          Bull and s**t. We've been over this a dozen times, and nobody has EVER shown any major difference besides fillrate and raw speed between the V1 and the V2, or between the V2 and the V3. Period. All the new features could very easily have been done in drivers, and those that couldn't were really minor tweaks. Name me a new feature of the V5... BESIDES so-called FSAA and T-buffer (which I still think is hideous). There aren't any!
          Where did I mention the Voodoo 1, 2 or 3 in my posts? Just because they did not innovate with the Voodoo2 and 3 doesn't mean there is some natural law which prevents them from doing it with the Voodoo4 and 5. It is true that besides the T-Buffer (FSAA being part of the T-Buffer, BTW), I can't name a new feature of the V5 over competing cards.
          But what would happen if some troll came here saying "I have no second monitor so DualHead on the Matrox plain sucks, and EMBM has too much of a performance hit, so I won't use it. So, what are the new features on the G400?" The troll would look stupid. Compare that with what you said... Just because you think FSAA and the T-Buffer are useless doesn't mean everybody has to think like you.

          About the power supply: I think you are comparing SGI architecture and PC architecture here. BTW, SGI makes graphics hardware that neither 3DFx nor any other current PC graphics chip maker can touch, and I strongly doubt SGI could reach that graphics quality within the current PC architecture (AGP).

          There is in fact an engineering decision behind it: 3DFx decided to go with a .25 micron process in order to ensure high yields. Hence the power requirements.

          As for image quality, the fact that it was not very good on the Voodoo3 does not imply it will be the same on the V5.

          As for the GeForce 2 being as fast as the Voodoo5 6k, let's wait for benchmarks... But I wouldn't bet on it. The GeForce 2 is so bandwidth-limited it's almost funny...

          I think it's my last post in this thread, because it looks like I won't be able to change your mind.

          ------------------
          Corwin the Brute




          • #20
            Ok, let's be realistic here. The Voodoo5 takes a MASSIVE performance hit when doing FSAA. Enough to make it almost unplayable at reasonable resolutions. So does the G400, which has had the option to enable AA in the tweakers (not exposed in the drivers due to the tremendous performance hit) for most of its lifespan.

            Nvidia has now enabled the feature for all of their cards as well.

            So 3dfx takes LESS of a hit. It's still too large of a hit to make the games playable. Their mystical 100+fps turn into 30fps when it's enabled, and I'd rather have 1280x1024 at 40fps than 640x480 at 30, even if the 640x480 is FSAA.

            As for T-bluffer (or t-bugger, or pee-butter, however you like to look at it), don't even get me started on how useless this feature is.

            1. Nobody will support it in their games. It's worse than T&A, which at least is partially built into OpenGL.

            2. It just looks icky. Makes my eyes hurt. Nevermind the poor legions of folk who can't play Doom or Quake because it gives them motion sickness. Now it's BLURRY motion sickness. Oh joy.

            Combine this with the unreasonable price of the card, and the absurd power and heat characteristics, and you have a big ol' loser. My case temperature is ALREADY too high, I'm not putting in anything that will bump it up by another 30 degrees. It's just not gonna happen.

            And for those of you who think the 22-bit color was "good" or "fine" or "better" in any way, PLEASE go look at some screenshots. If you can even begin to compare the G400's output to the Voodoo3's output, you are quite obviously completely blind and should just immediately go buy whichever "bitchinfast livin' la video loco" card comes out next. Please. Waste your money, but not my time.

            - Gurm

            ------------------
            Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.
            The Internet - where men are men, women are men, and teenage girls are FBI agents!

            I'm the least you could do
            If only life were as easy as you
            I'm the least you could do, oh yeah
            If only life were as easy as you
            I would still get screwed



            • #21
              And NOBODY has addressed the fact yet that 3dfx's "PCI disguised as AGP" interface is utterly incapable of transferring that kind of data to the card anyway! I could be wrong here, maybe they have some nifty preloading trick with local memory... or maybe not.
              Just to let everyone here know, the Voodoo 5 does NOT support AGP texturing at ALL. Here it is, straight from 3dfx's mouth:

              "The VSA-100 chip does not support AGP execute mode (or what is called "AGP texturing"). Period. Here's what Scott Sellers said : "AGP texturing is dead -- there are zero games that matter that use it and even Microsoft and Intel are completely downplaying it at this point. 3dfx has always been against AGP texturing, and it now appears our philosophy on this has been upheld in the market."
              I'm not taking sides with either 3dfx or Nvidia, as they both have faults, but I don't think FSAA is the best thing out there as 3dfx would have us believe. I'll wait until Matrox and ATI implement this before I see just how special it's gonna be.

              The Rock
              Bart
