
A little discussion on Radeon 9700 and Parhelia


  • A little discussion on Radeon 9700 and Parhelia

    Hi everyone,
    I used to have another account on this board, but I've forgotten the password, so I created this one.

First off, let me say that I'm a big Matrox fan. I run a G400 at work, a G200 at home, and I've got a Millennium II somewhere.

    That said, I thought that I would mention something that might cheer up the rest of the Matrox fans.

Clock for clock, the Parhelia has the same texture throughput as the R300.

Now, the R300 has some advantages due to its fully floating-point pipeline and DX9 compatibility. Also, it can pump out twice as many pixels per clock.

But theoretically, in a next-generation game that made use of displacement mapping and hardware tessellation (as I understand it, these could almost eliminate overdraw, if used properly?), the Parhelia could match the R300 on a clock-for-clock basis.
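Here's a rough back-of-the-envelope sketch of what "clock for clock" means here. The pipeline counts and core clocks are the commonly quoted figures for these two chips, so treat them as assumptions rather than verified specs:

```python
# Per-clock throughput sketch. Pipeline counts and core clocks are the
# commonly quoted figures for these chips (assumed, not verified):
#   Parhelia-512: 4 pixel pipes x 4 texture units each, 220 MHz core
#   Radeon 9700 (R300): 8 pixel pipes x 1 texture unit each, 325 MHz core
cards = {
    "Parhelia-512": {"pipes": 4, "tmus_per_pipe": 4, "mhz": 220},
    "Radeon 9700":  {"pipes": 8, "tmus_per_pipe": 1, "mhz": 325},
}

for name, c in cards.items():
    pixels_per_clock = c["pipes"]
    texels_per_clock = c["pipes"] * c["tmus_per_pipe"]
    gtexels = texels_per_clock * c["mhz"] / 1000  # peak Gtexels/s
    print(f"{name}: {pixels_per_clock} pixels/clk, "
          f"{texels_per_clock} texels/clk, ~{gtexels:.2f} Gtexels/s peak")
```

On these assumed figures, the Parhelia actually holds its own on raw texels per clock; the R300's lead comes from twice the pixels per clock plus a roughly 50% higher clock speed.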

It's an interesting possibility, but the tremendous amount of driver optimization that would need to be done makes me wonder if it will ever happen.

The Parhelia definitely still has some things going for it, but considering the R300 does 10-bit color as well, that could diminish the appeal if they remain at the same price.

I know it can be kind of discouraging to look at graphs that show the Parhelia being outdone 2-5x, but with proper drivers, that could change. As well, Matrox is still the only one to do anti-aliasing right (assuming they can get it bug-free).

    I want a Parhelia, but with the current drivers I think I will wait to see how the competition heats up.

The funniest part is that it's us Canadians providing the innovation in the graphics industry. Great time to be a Canuck, eh?

  • #2
The problem is that they don't run at the same clock, so in the real world it's not a clock-for-clock comparison.
    (1 loss for Parhelia)

Waiting for the Parhelia to get better drivers might help, but during that same time ATI will be making theirs better too.
    (Tie for both)

The 9700 has shown that it can outperform with all of the eye candy turned on, even 4x FSAA and AF. What about 16x AA? Well, my eyes might not be the best, but I'm sure 4x AA at 1600x1200 probably looks better than 1024x768 with 16x AA, not to mention the 9700 would still be faster. (A quick sample-count sketch follows after these points.)
    (1 loss for Parhelia)

Triple Head gaming - well, I don't have 3 monitors, but it's still cool.
I don't think the 9700 has that support.
    (1 loss for 9700)
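On the AA point above, here's the raw sample math. Note the caveat: this naive arithmetic treats every pixel as getting the full sample count, which is true for supersampling but not for Parhelia's edge-only 16x FAA, so its real cost is much lower than this makes it look:

```python
# Raw AA sample counts per frame. Naive math: assumes every pixel gets
# the full sample count (true for supersampling, NOT for Parhelia's
# edge-only 16x FAA, which only samples fragment/edge pixels).
modes = {
    "1600x1200 @ 4x AA": (1600, 1200, 4),
    "1024x768 @ 16x AA": (1024, 768, 16),
}
for name, (w, h, samples) in modes.items():
    base = w * h
    total = base * samples
    print(f"{name}: {base/1e6:.2f}M base pixels, {total/1e6:.1f}M samples")
```

The higher-resolution mode starts from about 2.4x as many real pixels, which is what the "probably better looking" call above comes down to.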


At least for me, I like fast frame rates with lots of eye candy, so in my case I'll be getting the 9700 from ATI.
    Last edited by Blackphoenix; 18 July 2002, 10:28.



    • #3
You're probably wrong about that, Blackphoenix. I've tried all the FSAA options on my GF3 Ti, and none of them looked very good to me. I'm playing all of my games on Parhelia at 1024x768x32 with 16x FAA, and it's *unbelievable*. 1600x1200 doesn't look as good as this; all the polys look so smooth I can't guess at the resolution at all.
      Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



      • #4
And to add to what Wombat said: the Parhelia's 16x FAA doesn't touch the textures. When I was using a Radeon 8500 128MB with 4x FSAA or higher, my eyes would get really strained by the blurry screen. I couldn't stand playing with it on. With my Parhelia, 16x FAA is spectacular. Playing the bug-ridden, Nvidia-optimised America's Army with 16x FAA at 800x600, you couldn't tell what resolution the game was really running at.

And what's the real need to play at higher resolutions? To get rid of jaggies, right? So I asked myself this: given the choice between playing a game at respectable speeds on the Parhelia at 800x600 (or even 2400x600) with 16x FAA on and no blurry textures, and playing on the Radeon at 1600x1200 with a blurry screen and textures, I myself would still prefer the Parhelia.

As for the Radeon 9700: it's a wondrous gaming card, but that's all it will really be used for. I remember having an argument on the IGN boards that the next ATI card would support Triple Head and all the features the Parhelia has... That never happened.
Now, previews said the card has a 10-bit colour mode; does that include 10-bit alpha? With John Carmack knocking the Parhelia for only having 2 bits of alpha, did the Radeon 9700 do the same?

Edit: And what happened to the glorified rumours of 128-bit internal colour precision? I didn't read about that in the white papers...
        Last edited by Sinistral; 18 July 2002, 11:12.



        • #5
Well, with the new SmoothVision it might look just fine.
We'll have to wait and see screenshots.

I agree the 16x FAA is nice, but that really seems to be the only difference between the two cards (except for the 9700's higher speed and the Parhelia's triple-head gaming).



          • #6
            Don't forget the quality of Matrox's output, and the oh-my-god-vivid colors.

            Has anybody even caught up to DVD Max yet?

            Too bad neither one of them has Linux drivers.
            Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



            • #7
It would be interesting to see if people can underclock a 9700 down to 220/275 to see how well an R300 performs against a Parhelia-512.



              • #8
I really get tired of going over it again and again, but clock speed means *nothing*, people. They're different architectures, and they are *designed* to run at different speeds.

You guys would think somebody was nuts if they said "let's remove 2 pistons from this V8 to see how it stacks up to a V6."
                Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                • #9
                  But we're comparing the V6 Parhelia (at 4000rpm) to a V6/8 Radeon (at 5000rpm). If we run them both at 4000rpm, how do they perform?

It's more about comparing to see which is better. But I don't care about an underclocked card - I wanna see a 5000rpm Parhelia

                  P.
                  Meet Jasmine.
                  flickr.com/photos/pace3000



                  • #10
                    Still just as useless. If my torque curve is optimized for 5000rpms, and yours for 2000 rpms, then why run them both at 2000rpms?
                    Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                    • #11
                      Originally posted by Wombat
                      Still just as useless. If my torque curve is optimized for 5000rpms, and yours for 2000 rpms, then why run them both at 2000rpms?
The point is to see which is fastest at 2000 rpm and then extrapolate the performance of the 2000 rpm one to guesstimate its performance if it were able to run at 5000 rpm.

To see whether the underclocking results are useless or not, you would need to test both chips at various clock speeds and check whether the performance change is linear.
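A minimal sketch of that check, with made-up fps numbers standing in for real benchmark runs (purely illustrative, not measured data):

```python
# Check how performance scales with core clock. If fps-per-MHz stays
# constant across steps, scaling is linear and extrapolation is fair;
# if the gain shrinks, the card is hitting another limit (e.g. memory
# bandwidth) and a straight-line extrapolation overestimates.
clocks = [220, 250, 275, 300, 325]        # MHz, hypothetical test points
fps    = [41.0, 45.5, 49.0, 51.5, 53.0]   # invented placeholder results

for (c0, f0), (c1, f1) in zip(zip(clocks, fps), zip(clocks[1:], fps[1:])):
    print(f"{c0} -> {c1} MHz: {(f1 - f0) / (c1 - c0):.3f} fps/MHz")
```

With the placeholder numbers above, the gain per MHz shrinks toward the top end - exactly the kind of non-linearity that would make a naive extrapolation to higher clocks too optimistic.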



                      • #12
                        Seeing which is fastest at 2000rpm is meaningless.

                        Now, if you wanted to do a linear extrapolation of Parhelia's performance at higher clock speeds, you don't need the ATi card for that.
                        Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                        • #13
You would all think the V10 Viper engine is completely awesome... until you strapped yourself into an F1 racer with a V6.



                          • #14
You can easily scale the Parhelia benchmarks up yourselves to an equal clock speed. And though the increase in performance won't match the increase in clock speed, the Parhelia would still lose by a large amount against the R300 in a clock-for-clock comparison. I guess Matrox really forgot to use a "crossbar"-like advanced memory controller (considering that Matrox had some quirks in earlier memory controllers too, e.g. in the G400...).
And the R300 WILL put the Parhelia to shame in DVD / desktop video playback - even the R7500 does (check the AVS forum or other similar forums) - and those new capabilities of the R300 are just wonderful.

I guess there will only be two fields where the R300 could be lacking: 2D image quality and drivers (although ATI's drivers have come a long way, from "simply terrible" to "acceptable" by now).
                            Last edited by Indiana; 18 July 2002, 15:43.
                            But we named the *dog* Indiana...
                            My System
                            2nd System (not for Windows lovers )
                            German ATI-forum



                            • #15
...and Matrox's have gone to "nonexistent" in some OSes now, and "seriously lacking" in others.

                              AZ
                              There's an Opera in my macbook.

