the big parhelia review conclusions thread

  • the big parhelia review conclusions thread

    I try not to be disappointed, but I am.

    The Parhelia is about 20-50% slower than the 4600 in every benchmark and about as fast as the 8500. Now the Parhelia is the fastest with AA enabled, but honestly most framerates with AA still aren't playable.

    I am especially disappointed with the UT2003 results. I hoped the Parhelia would really shine here, but it can only keep up with a $100 8500LE. I simply cannot imagine how some Epic guy said they have the Parhelia running UT2003 in Surround Gaming, and even pretty fast. Going by the Anandtech benchmarks, that works out to about 20 FPS average at 3x1024x768x32. Unplayable.
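
    The rough math behind that 20 FPS figure (a back-of-envelope sketch, assuming frame rate scales inversely with pixels drawn and using an illustrative 60 FPS single-screen average rather than Anandtech's exact number):

      # Surround Gaming at 3x1024x768 draws three times the pixels of one 1024x768 screen.
      single_screen_fps = 60.0            # illustrative UT2003 average at 1024x768, not a measured number
      pixels_single = 1024 * 768
      pixels_surround = 3 * 1024 * 768
      surround_fps = single_screen_fps * pixels_single / pixels_surround
      print(round(surround_fps, 1))       # -> 20.0 FPS average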

    But I still think there is hope...

    Drivers: Since the Parhelia drivers were written from scratch, I have high hopes there is still a lot of room for optimization. Probably not 100%, but maybe 20-30%. nVidia and ATI cards usually get tweaked to the max with the various tools available for them; maybe this will also give the Parhelia a little boost.

    Overclocking: Like Anand said, there is not much sense in overclocking the RAM (that's also the reason it is underclocked by Matrox), because the RAM isn't holding it back at all. It is all about the GPU. Since the heatsink/fan on the Parhelia is rather small, I hope slapping a big heatsink on it will squeeze a few more MHz out of the core.

    Displacement Mapping: Anand and others have criticized the Parhelia for the lack of an occlusion culling feature, but I think DM is exactly what they were looking for. It just isn't used by games at all as of now; this will hopefully change in the future and give the Parhelia another boost.

    0.13: The Parhelia MAX or Parhelia 2 or whatever they will call it will probably be what the Parhelia should have been. But when the P2 finally comes out, the R300 and nVidia's next card will probably already be playing in a different league.

    In the end it all sums up to: no Parhelia for me. I'd love to experience all the new features, but that's not enough to convince me to spend $400 instead of $100 for an (original) ATI Radeon 8500LE with about the same speed and decent 2D as well.
    Last edited by thop; 25 June 2002, 09:59.
    no matrox, no matroxusers.

  • #2
    The card easily meets my expectations on the quality and features side of the Parhelia. The performance is another story. I hope that the GPU fill rate is what's holding the card back. Possibly with some good overclocks it might be able to get a 25-30% boost across the board... (wishful thinking, I know)

    I hope someone releases an overclock utility soon so that we can see what it's able to do.



    • #3
      I'm interested in seeing OC performance as well, but I'm not buying a $400 card that I'll have to overclock just to get it to perform.
      "That's right fool! Now I'm a flying talking donkey!"

      P4 2.66, 512 MB PC2700, ATI Radeon 9000, Seagate Barracuda IV 80 GB, Acer AL732 17" TFT



      • #4
        I say: wait and see.

        For the following reasons:
        - how drivers mature in the coming 1 to 3 months
        - how well people can push up the core clock when more cooling is applied
        - perhaps for the 256 MB version or others from Matrox
        - because I am going on holiday for 4 weeks next week

        gnep
        DM says: Crunch with Matrox Users@ClimatePrediction.net



        • #5
          Well, we got what we asked for (better than a GF2), just not at a price point that would be reasonable.

          Pricewatch tells me 4600s are going for just over $300, and 8500s for just over $100. There is a question as to what clock speed ATI you get, but for a quarter of the price, overclocking to retail speeds is not unreasonable. We'll have to wait and see what the Parhelia retails at. With the beating it's taking from the reviewers, I don't think Matrox is going to be able to make that price fly for very long. My guess is they will end up releasing the 0.13 and 64 MB versions earlier than they had planned to offset the backlash that will be rampant in the next few weeks.
          K63+550/192M/SCSI/G400Max



          • #6
            Hopefully Matrox will read the comments from around the web and try to bump up the 256 MB version... hopefully.
            Fenrir(AVA)
            "Fearlessness is better then a faint-heart for any man who puts his nose out of doors.
            The length of my life and the day of my death were fated long ago"
            Anonymous lines from For Scirnis



            • #7
              I am disappointed in the performance that seems to be showing up in reviews. I have been waiting a long time for Matrox to come out with a competitive card. I just can't pay $400 for something that is the same speed as other $100 cards, even if the image quality is better.
              Workstation Specs:
              Pentium 4 2 GHz, ASUSTek P4T-E i850, 1024 MB PC800 RDRAM, ATi Radeon 8500 64m, Sound Blaster Audigy Gamer, 3Com 3C905TX-C NIC, Western Digital 80g ATA100 HD, Sony 16x/40x DVD-ROM, Sony CD-RW 175S/C, 19" Sony 420GS, and Windows XP Pro.



              • #8
                Hmm, after reading a lot about it on matrox.com and other sites, ATI's R300 had better be sweet!!! I mean, 20 gigs of bandwidth... Oh, and yeah, I don't give a crap about beta tests. Don't forget, all of you (like me) who bought a Radeon 64 DDR ViVo in July 2000: a heck of a lot of games were SLIDESHOWS. I'm still running my original Radeon (skipping the 8500), but I think I'll wait till September to decide between the Parhelia and the next ATI card...
                Cool review


                Hmm, but does this card expect you to have a fast processor??? Hmm... I have yet to see anything on its T&L engine, or wait, does it even USE one... hmm...

                Maybe I'm blind, but I don't think so...



                • #9
                  Sigh,

                  $400US is far too rich for me for what the Parhelia is. I would have been able to justify it to myself if it were a fast and quality-focused video card. Unfortunately, it just isn't that fast. It is seriously slower in most areas than video cards sporting 128-bit buses. This simply isn't acceptable for a $400US video card.

                  In fact, I think I might just regurgitate my hat if the performance situation does not change drastically within the next few months with improved drivers.

                  Other than that, I guess I have to wait for the refresh.
                  80% of people think I should be in a Mental Institute



                  • #10
                    If the R300 & NV30 are even 20% faster than the Ti 4600, Parhelia's performance won't be a pretty picture.



                    • #11
                      Now the Parhelia is the fastest with AA enabled, but honestly most framerates with AA still aren't playable.
                      I trust that you were being facetious with this statement? If you are so far gone that 93.7 FPS in QIII with 16x FAA at 1024x768 (via hot hardware, page 8, top graph) is "unplayable", then you should never have been looking at the Parhelia in the first place. Now I just happened to choose that benchmark because it was at the top of the page that I currently had open in another window, but the rest of the benchmarks reflect similar relative performance levels. Apparently you, like many others, define "unplayable" as anything under 100 FPS. And I, like many other Murcers, think that this belief is the result of crack-addled fps whores who have no life outside of the almighty shoot-em-up.

                      As you'll notice from my system specs in my sig, I have a Kyro II and a G400 currently. I like both of them, they perform well enough for what I ask them to do, but I wouldn't mind a bit more oomph at times. What I don't want is more oomph than I know what to do with. What worldly difference does it make whether QIII is running at 100 fps or 500 fps? I'm not going to make the argument about the human eye only being able to register so many fps, cause I'm not sure I buy the argument anyway. This is not to say that there isn't a playability barrier, it's just a lot lower than a lot of people these days would like to admit.

                      I define "playable" to be 40+ maintained fps in most games. As such, the fact that I will be able to play my favorite games (currently Everquest, UT, Heroes IV, and a few others in case it matters), at 1024x768x32 with 16x FAA at 60+ FPS in every case, to me is simply outstanding performance. In this circumstance, the GF4, the Radeon 8500, and the Parhelia, can all achieve the performance that I want. So once I've reached that performance barrier, wasting time and effort going beyond it means nothing to me. At this point I start looking at the other features of the card. Geuss who comes out on top for me? Thats right, and thats why I have no intention whatsoever of cancelling my preorder.

                      Ian

                      PS. Thop, I'm sorry you were the inadvertent target of my ranting, but this is the type of thinking that most Murcers have seen time and again and are sick to death of. I'm not trying to target you specifically but rather the line of thought that you seem to be taking.
                      Primary System:
                      MSI 745 Ultra, AMD 2400+ XP, 1024 MB Crucial PC2100 DDR SDRAM, Sapphire Radeon 9800 Pro, 3Com 3c905C NIC,
                      120GB Seagate UDMA 100 HD, 60 GB Seagate UDMA 100 HD, Pioneer DVD 105S, BenQ 12x24x40 CDRW, SB Audigy OEM,
                      Win XP, MS Intellimouse Optical, 17" Mag 720v2
                      Secondary System:
                      Epox 7KXA BIOS 5/22, Athlon 650, 512 MB Crucial 7E PC133 SDRAM, Hercules Prophet 4500 Kyro II, SBLive Value,
                      3Com 3c905B-TX NIC, 40 GB IBM UDMA 100 HD, 45X Acer CD-ROM,
                      Win XP, MS Wheel Mouse Optical, 15" POS Monitor
                      Tertiary system
                      Offbrand PII Mobo, PII 350, 256MB PC100 SDRAM, 15GB UDMA66 7200RPM Maxtor HD, USRobotics 10/100 NIC, RedHat Linux 8.0
                      Camera: Canon 10D DSLR, Canon 100-400L f4.5-5.6 IS USM, Canon 100 Macro USM Canon 28-135 f3.5-5.6 IS USM, Canon Speedlite 200E, tripod, bag, etc.

                      "Any sufficiently advanced technology will be indistinguishable from magic." --Arthur C. Clarke



                      • #12
                        Playable, IMHO, is anything above 30 FPS. Any hardcore first-person-shooter player wants 85+ FPS. The problem is that anyone who can afford a $400 video card already has a 19" or 21" monitor on his desk. These monitors are meant to run at 1280x960 and 1600x1200. The Parhelia is clearly not capable of providing optimal speeds at either of those resolutions, and occasionally not even playable speeds. (32-bit, of course)
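
                        For scale, here is roughly how much more work those resolutions are than 1024x768 in raw pixels per frame (a minimal sketch; real performance also depends on CPU and memory-bandwidth limits, so treat the ratios as an upper bound on the slowdown for fill-limited scenes):

                          # Relative pixel counts versus 1024x768.
                          base = 1024 * 768
                          for w, h in ((1280, 960), (1600, 1200)):
                              print(f"{w}x{h}: {w * h / base:.2f}x the pixels of 1024x768")
                          # -> 1280x960: 1.56x, 1600x1200: 2.44x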



                        • #13
                          I don't really mind either the cost or the performance of the Parhelia, but if Tech-Report quoted them right, then Matrox opted to disable anisotropic filtering settings higher than 2x for performance reasons. That won't do for a card that's supposed to be the be-all, end-all of image quality.

                          Hope somebody from Matrox can clarify the AF issues.

                          ta,
                          .rb
                          Visit www.3dcenter.de

                          www.nggalai.com — it's not so much bad as it is an experience.



                          • #14
                            So buy the card and play Q3 engines....
                            This is from http://www.gamepc.com/labs/view_content.asp?id=parhelia128&page=10
                            Lucasarts' Jedi Knight II is one of the most graphically intensive games based on the Quake III engine, and really takes a high-end graphics card combined with a fairly fast CPU to play at reasonable frame rates. For the "Max Quality" settings, we tested the nVidia and ATI cards with 4x FSAA and 8-tap anisotropic filtering, while the Matrox Parhelia card was tested with 16x Fragment Anti-Aliasing and anisotropic filtering enabled.

                            Jedi Knight II brings back the "curse" of Matrox's OpenGL drivers. While image quality was great and the stability is rock solid (which hasn't been the case for Matrox's G400-class OpenGL drivers), performance is certainly not up to par. In both the 1024 x 768 and 1600 x 1200 tests, the Parhelia-128 shows performance sub-par to all of the other nVidia and ATI products in testing. Not good.. not good at all.
                            How are those UT2003 scores looking...



                            • #15
                              Originally posted by HedsSpaz
                              And I, like many other Murcers, think that this belief is the result of crack-addled fps whores who have no life outside of the almighty shoot-em-up.
                              I know that wasn't directed at me, but I have to say something about this. If you look at my older posts you will see that I am always saying all I need is 30 FPS even in the worst-case scenario and about 60 FPS average, and everything above that is a waste and actually a penis-size thing. So that shoe doesn't fit me.

                              I don't play Q3A anymore, nor do I know anyone who still does, nor am I interested in playing it in the future.

                              If I look at more recent games, the performance at 1280x1024x32 with 16x FAA (and with a $400 card I don't want to be limited to 1024x768x32; I'm not even talking about 1600x1200x32) isn't playable. When I look at future games like UT2003, the Parhelia will have problems getting decent framerates at 1280x1024x32 even without AA.

                              I guess in the end it all sums up to: nice card, but far too expensive.
                              Last edited by thop; 25 June 2002, 11:23.
                              no matrox, no matroxusers.

