Looks like someone's already decided that the new card sucks


  • #31
    Cheesemaker
    At its simplest level, 250 fps denotes power, with bandwidth and fill rate aplenty. It also means the card should be able to turn that excess frame rate into better-quality imagery, or put it to use with more advanced game engines.
    Luckily for us, Parhelia will give us quality, but not at the expense of fps.
    Interests include:
    Computing, Reading, Pubs, Restaurants, Pubs, Curries, More Pubs and more Curries

    • #32
      Originally posted by mdhome
      Cheesemaker
      At its simplest level, 250 fps denotes power, with bandwidth and fill rate aplenty. It also means the card should be able to turn that excess frame rate into better-quality imagery, or put it to use with more advanced game engines.
      Not necessarily. Although it's a safe bet that a card capable of 250 FPS with everything turned off should get decent frame rates when rendering high-quality video, there is no guarantee that this will be the case. You'll note that the people lusting after ridiculously high frame rates rarely talk about what rates they get when FSAA or high-quality filtering is turned on.

      My main video card at present is a Geforce 3 (I got sick of waiting for Matrox when the G550 came out with no gaming benefits). I like the Geforce 3. It has much better video quality than previous Nvidia offerings, and it gets pretty impressive frame rates in most games. However, turn on some serious rendering features, and the frame rates plummet. I find it interesting that my G400MAX can still hold its own surprisingly well once the stakes increase.

      The other part of the problem is that just because a card can turn out massive numbers of FPS doesn't mean the game will run more smoothly. When I first tested my G400MAX against an original Geforce, not surprisingly, the Geforce won. Its fill rate was obscenely high (for the time, a few years back), and it consistently turned out better frame rates than the Matrox could muster. After I got over the fact that the video quality was AWFUL (at the time it hadn't occurred to me that it would be any different - I spent 2 hours trying to figure out what was faulty), I was left with the realisation that the G400 was smoother. It may not have been able to produce the same frame rates as the Geforce, but turning was smooth and fluid, while on the Geforce it was jerky.

      And I guess this is where my question comes from. Why do people insist on believing that a card able to turn out a frame rate that their monitor can't display, and that they couldn't see even if the monitor could, will give them a better gaming experience? Sure, FPS can be measured, and any discussion of quality is highly subjective, but the only real indication of how good a card is comes when you turn the frame rate counter off. In my original test, if it weren't for the frame rate counter in the game, I would have sworn that the G400MAX was turning out much higher frame rates than the Geforce...
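
      To put the monitor point in numbers, here's a throwaway sketch; the 85 Hz refresh rate is an assumed typical CRT figure, not anyone's measured setup:

      ```python
      # Assumed numbers: how many rendered frames can ever reach your eyes.
      render_fps = 250   # what the benchmark counter brags about
      refresh_hz = 85    # assumed typical CRT refresh rate

      visible_fps = min(render_fps, refresh_hz)
      print(visible_fps)               # 85 -- the monitor's hard ceiling
      print(render_fps - visible_fps)  # 165 frames/s rendered but never displayed
      ```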
      Last edited by Cheesekeeper; 18 May 2002, 03:56.

      • #33
        measuring in frames per second is a bad idea IMO. why don't we measure the number of milliseconds a frame is displayed instead? the higher, the more jerky.

        with a frame rate of 100 fps, you can have 1 frame in the first half of the second and 99 in the second half.

        my point: fps doesn't say anything about how smooth an animation is.
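
        a quick sketch (Python, using the hypothetical numbers above) of how the average hides the stall:

        ```python
        # One second of "100 fps": a single frame shown for the first 500 ms,
        # then 99 frames crammed into the last 500 ms (hypothetical numbers).
        frame_times_ms = [500.0] + [500.0 / 99] * 99

        avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
        print(avg_fps)              # 100.0 -- the counter says all is well
        print(max(frame_times_ms))  # 500.0 -- a visible half-second stall
        ```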
        This sig is a shameless attempt to make my post look bigger.

        • #34
          Measuring frame rate in one environment isn't that bad an idea, *if* there is reason to believe that you can extrapolate a card's performance based on other knowledge you have. For example, all of these people judging the GF4 on its FPS: well, they've been using the same architecture for *years*, with various tweaks and speed bumps. Trying to compare the FPS of Parhelia to the FPS of a GF4 at only one data point is kind of silly. It makes as much sense as deciding that a P4 must always be faster than an Athlon because its clock runs so much faster.
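
          To put numbers on the clock-speed fallacy, a toy sketch; both IPC figures below are assumptions for illustration, not measurements:

          ```python
          # Toy model: work per second ~ instructions per clock * clock rate.
          # Both IPC values are assumed purely for illustration.
          def relative_perf(ipc, clock_ghz):
              return ipc * clock_ghz

          p4 = relative_perf(ipc=0.9, clock_ghz=2.4)       # higher clock, lower IPC
          athlon = relative_perf(ipc=1.4, clock_ghz=1.73)  # lower clock, higher IPC
          print(p4, athlon)  # 2.16 vs ~2.42 -- the "slower" chip comes out ahead
          ```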
          Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.

          • #35
            Originally posted by mdhome
            Cheesemaker
            At its simplest level, 250 fps denotes power, with bandwidth and fill rate aplenty. It also means the card should be able to turn that excess frame rate into better-quality imagery, or put it to use with more advanced game engines.
            Luckily for us, Parhelia will give us quality, but not at the expense of fps.
            so... basically... you are saying that because a card can get 250 FPS in Q3, it will have plenty for, say, Doom 3? despite the fact that they are coded to use the video card in completely different ways?

            or that, you know, because it is getting such high frame rates, it can enable features such as FSAA or high-quality aniso filtering without performance dropping to pitiful levels?

            what if the chipset was basically the G400 chipset clocked 3x as high with slightly more efficient internal workings? it would have enough power to drive Q3 bloody fast, but what happens when you try to enable FSAA or any other "advanced" feature?
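
            a rough back-of-the-envelope sketch of that scenario (assumed numbers, and it pretends the card is purely fill-rate bound):

            ```python
            # Assumed numbers: 4x supersampled FSAA fills 4x the pixels per frame,
            # so a purely fill-rate-bound card loses ~4x its frame rate.
            base_fps = 250  # hypothetical Q3 score with everything turned off
            ss_factor = 4   # 4x supersampling

            print(base_fps / ss_factor)  # 62.5 -- the headline number evaporates fast
            ```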

            anyways...
            "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

            • #36
              Yeah, Hellbinder is definitely worried, LOL. Hmm, well, considering that Matrox has always been able to provide better driver support than either Nvidia or ATI, IMHO, and that now Matrox has the hardware to back up the software... hehe, I see many soiled pairs of briefs over at ATI and Nvidia corporate headquarters.
              AMD Athlon XP 1800+ @ 1680MHz (only the best) on an Epox 8K7A, 512 megs PC2100 DDR, Matrox Parhelia 128 AGP, Turtle Beach Santa Cruz, DSL BABY, 1250 down/220 up, XP 2600 Pro

              • #37
                Originally posted by DGhost


                so... basically... you are saying that because a card can get 250 FPS in Q3, it will have plenty for, say, Doom 3? despite the fact that they are coded to use the video card in completely different ways?

                or that, you know, because it is getting such high frame rates, it can enable features such as FSAA or high-quality aniso filtering without performance dropping to pitiful levels?

                what if the chipset was basically the G400 chipset clocked 3x as high with slightly more efficient internal workings? it would have enough power to drive Q3 bloody fast, but what happens when you try to enable FSAA or any other "advanced" feature?

                anyways...
                As I said... at its simplest.

                Regards MD
                Interests include:
                Computing, Reading, Pubs, Restaurants, Pubs, Curries, More Pubs and more Curries

                • #38
                  Hmm... don't knock nVidia's driver support. Matrox's drivers have been good to me, no doubt. But nVidia's flavor-of-the-week (as many of you like to say) has had many advantages for me so far. For example, since we don't all have the same games, nor every game available, it's possible to find a set of drivers that works well in all the games we do have, even if there may be bugs in other games (which we don't own anyway). On top of that, nVidia does show incremental improvement in performance without a loss of quality when you look at their driver releases overall.

                  Matrox's biggest ace-in-the-hole from my viewpoint hasn't been outstanding overall drivers, but drivers that at least enabled all the features in some sort of usable manner, while still having good overall performance. Matrox cards tend to be better overall, when taking drivers and hardware both into account. They've just been a bit slow to update their technology, and fall behind when the other companies release newer technology. (One of the reasons I ended up with nVidia cards for my last 3 boards wasn't new features, it was playability using the existing features, which was only really made possible by boosting the card's performance levels.)
                  "..so much for subtlety.."

                  System specs:
                  Gainward Ti4600
                  AMD Athlon XP2100+ (o.c. to 1845MHz)
