What effect does the new ATI card have on Parhelia?


  • #16
    Elie said,
    Wait till applications, à la games, start to take advantage of its awesome architecture, and then we'll see some serious benchmarks.
    Very doubtful. By the time games start to take advantage of its architecture, Parhelia will be at least one full generation behind; my guess is two. Hardware is always developed faster than software can take advantage of it.

    And aside from triple-head, the proprietary features of Parhelia will remain unused unless integrated into a later revision of DirectX.

    Alas, boys and girls, Parhelia's limelight was shorter-lived than even I predicted.

    -[Ch]ams



    • #17
      from: http://www.anandtech.com/video/showdoc.html?i=1656&p=7
      The R300 also supports the Vertex Shader 2.0 specification, which is a part of Microsoft’s DX9 spec. Along with this, the Hardware Displacement Mapping technology from Matrox’s Parhelia is also supported by the R300, though it will go unused for a while until developers begin taking advantage of it.
      This will help Matrox get support for displacement maps.



      It seems like Parhelia supports 2/3 of the DX9 spec, and it might have an advantage because of depth-adaptive tessellation (DAT).
      This could actually mean that it WILL be fast enough with advanced higher-order surfaces (HOS) in future games.
      Its quad TMUs/pipe could also prove to be an advantage.
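      Just to illustrate the depth-adaptive tessellation idea, here is a minimal Python sketch of the general principle only; it is not Matrox's actual hardware algorithm, and the function name, levels, and distances are made up. The point is that the subdivision level of a higher-order surface patch can fall off with its distance from the camera, so nearby geometry gets lots of triangles while distant geometry stays cheap.
      Code:
      def tessellation_level(depth, max_level=16, min_level=1, near=1.0, far=100.0):
          # Clamp the patch's view-space depth into [near, far], then normalise to [0, 1].
          d = min(max(depth, near), far)
          t = (d - near) / (far - near)
          # Subdivision level falls off linearly with distance from the camera.
          level = max_level - t * (max_level - min_level)
          return max(min_level, int(round(level)))

      # A patch 2 units away gets finely subdivided; one 80 units away stays coarse.
      print(tessellation_level(2.0))   # -> 16
      print(tessellation_level(80.0))  # -> 4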
      Last edited by TdB; 18 July 2002, 14:14.
      This sig is a shameless attempt to make my post look bigger.



      • #18
        Originally posted by Montanan
        As a long-time Matrox 2D fan, I migrated to Nvidia when the overpriced and slow Parhelia came out.
        As a long-time Matrox fan, it's impossible to migrate to Nvidia; you'd beat yourself in the head with a tractor before subjecting yourself to that sort of horror. ATi is bearable. If you switched to Nvidia, then you weren't a true Matrox fan in the first place.


        We seldom attribute common sense except to those who agree with us.
        __________________
        The Darkside: OCN HellSpawn - Having our way with your daughter's genes



        • #19
          It depends. I finally gave up on my MAX and got a Gainward GF3 Ti. It was more than bearable. I'd seen plenty of GF2s and knew I couldn't possibly deal with that crap, but the Gainward was pretty decent. Could I tell the difference from a Matrox? Yes. Did it hurt my eyes? No. Still, the Parhelia was a very noticeable improvement. When I installed the cards, I actually went GF3 -> G450 -> Parhelia, and could notice the improvement at each step.
          Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



          • #20
            Ok, I believe Montanan when he says he's a long-time Matrox 2D fan. Considering he just joined MURC in June!!! (Sarcasm is great, isn't it?) So if you're so keen on Matrox 2D, how could you go nVidia? Just as Wombat pointed out, maybe not eye-jarring pain, but you can definitely tell the difference. The Radeon 64MB DDR VIVO I had before my Parhelia even had a VERY noticeable difference in image quality. Plus it's just plain cool to watch South Park full screen on my TV with DVDMax. Anyhow, I've ranted long enough. It's good to be back in the Matrox game.

            In my perhaps not so humble opinion, ATI's cards stink because of their drivers. nVidia cards suck because there really isn't a way to say "this card is good/bad": they just sell their reference design to other companies, and those companies don't always do a good job with it. If you were Joe Schmoe who needed a video card and just happened to see one in a Software Etc. store, you wouldn't know which ones had the crisper 2D display.
            I'm reminded of my friend who bought a 1000-dollar monitor yet was driving it with an nVidia card, so he couldn't go higher than 1024x768 without killing his eyes. Of course, I had my Marvel G400 at that time and laughed at him.

            Leech
            Wah! Wah!

            In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.



            • #21
              Seems as though reality is slowly but surely setting in. The folly of releasing but one true high-end card in the past three years is finally catching up with Matrox. The R300 is performing like the P should have. Not only is it besting the GeForce4 Ti 4600 by a fairly large margin in every single high-res benchmark, it is absolutely clobbering the P, with more than twice the performance, even at Matrox's own niche HQ settings.

              To stay in this game, Matrox is now going to have to do a complete 180 in marketing and philosophy, and release new revisions and faster products even quicker than ATI/Nvidia. If not, I see this as their eventual demise. I really do...
              Last edited by Maniac; 18 July 2002, 21:58.
              Celeron 566@877 1.8V, 256meg generic PC-100 RAM (running at CAS2) Abit BH6, G400 16meg DH@150/200, Western Digital Expert 18gig, Ricoh mp7040A(morphed to mp7060A) Pioneer 6X DVD slot load, Motorola Cable Modem w/DEC ethernet card, Soundblaster Live Value Ver. 2, Viewsonic GT 775



              • #22
                - Matrox isn't solely dependent on the gaming market... I doubt the release of a newer card (which is just plain evolution) will put any company under. Maybe if Matrox were a pure gaming-first company, then yeah, that might be possible. But if they were, they would have joined the high-FPS and 3DMark-score race a long time ago. -
                - ? -



                • #23
                  Isn't this what happened to 3dfx? The Voodoo 5 was late and slower than the competition in most benchmarks. And then they went out of business.... But as stated, Matrox is not too dependent on the games market. However, no Parhelia for me. I guess an AIW Radeon 9700 will fit me much better, and not be substantially more expensive.
                  -Slougi



                  • #24
                    I really wish Matrox had released a "gamer" edition of the Parhelia clocked at 300MHz with that bigass heatsink. Performing at Ti4600 levels with surround gaming, I would've been able to justify it over a 9700...

                    right now I'm not so sure. :/



                    • #25
                      Even though the Parhelia doesn't rely on gamers, it doesn't rely on business either.
                      Any serious businessman won't shell out the dosh for a Parhelia when he can give out high-quality desktop cards like the G450 and G550 at a fraction of the price.
                      Why is the G450 still sold!?
                      The P has really been a cock-up. It just hasn't lived up to the hype.

                      I loved Matrox's quality and stuff, but there is no way on earth I'm gonna shell out that sort of cash for a stillborn card.
                      Anyway, all the others are catching up in the quality game.
                      They have to for DX9, and the image quality won't be very different on any of the major video cards.
                      The P has nothing going for it. It's a damned shame, but that's the way it goes.

                      I saw above some guy flaming a poor helpless Matrox fan who decided to exile himself to Nvidia because of the P's low performance.
                      He actually used the guy's MURC registration date!!!
                      Crap! I've lost my login here a few times and had to recreate it.
                      I've been a Matrox fan since the Mystique.

                      Anyway, please don't flame those who think Parhelia is a disappointment. It is.

                      Suggestion to Matrox: get your arse in gear and attempt a rapid 0.13µ migration for 2003, adding the missing DX9 components. If it's possible, it should be done.
                      If it isn't, then aw, boo.

                      Whichever way, I won't be getting a P (unless they suddenly become cheaper)(a lot).

                      PC-1 Fractal Design Arc Mini R2, 3800X, Asus B450M-PRO mATX, 2x8GB B-die@3800C16, AMD Vega64, Seasonic 850W Gold, Black Ice Nemesis/Laing DDC/EKWB 240 Loop (VRM>CPU>GPU), Noctua Fans.
                      Nas : i3/itx/2x4GB/8x4TB BTRFS/Raid6 (7 + Hotspare) Xpenology
                      +++ : FSP Nano 800VA (Pi's+switch) + 1600VA (PC-1+Nas)



                      • #26
                        Originally posted by Evildead666

                        Any serious businessman won't shell out the dosh for a Parhelia when he can give out high-quality desktop cards like the G450 and G550 at a fraction of the price.
                        Why is the G450 still sold!?
                        The P has really been a cock-up. It just hasn't lived up to the hype.
                        Well, they always have the 64MB version of the Parhelia for OEM deals... I dunno where you're getting the hype from... maybe it was hyped up by people here, but Matrox never really hyped it up that much. I'm slightly disappointed in the P when you compare benchmarks of the card with the other cards out there, but benchmarks aren't the end-all for video cards... they're just a marketing tool so companies can say their latest and greatest product gets bigger scores than the other company's product. If the P had come out at the start of the year, I bet everything would have been much different from what it is now.....
                        Why is it called tourist season, if we can't shoot at them?



                        • #27
                          Why does everybody assume that Matrox is going for 0.13µ for their refresh?

                          The rumor is that the fab they are using is jumping straight to 0.09µ.
                          This sig is a shameless attempt to make my post look bigger.



                          • #28
                            I'm not talking about the benchmarks, I'm talking about price/performance.
                            It's one of the lowest out there.
                            And the P is also rather late......

                            There was a LOT of hype for the Parhelia, and maybe everyone was just expecting/hoping for a little too much.

                            There's a lot of new hardware being touted for Xmas.... I would rather see what's new in January.

                            Don't forget there are 2 sides to every story.
                            I already have a good enough quality 2D card (the G550), and now I need a gaming card. But not a GF4 or a Radeon. One of the next, next ......zzzzzzzzzzzzzzzzzz... uh, cards.

                            P is a bit of a big pill to swallow.
                            PC-1 Fractal Design Arc Mini R2, 3800X, Asus B450M-PRO mATX, 2x8GB B-die@3800C16, AMD Vega64, Seasonic 850W Gold, Black Ice Nemesis/Laing DDC/EKWB 240 Loop (VRM>CPU>GPU), Noctua Fans.
                            Nas : i3/itx/2x4GB/8x4TB BTRFS/Raid6 (7 + Hotspare) Xpenology
                            +++ : FSP Nano 800VA (Pi's+switch) + 1600VA (PC-1+Nas)



                            • #29
                              Originally posted by TDB
                              Why does everybody assume that Matrox is going for 0.13µ for their refresh?

                              The rumor is that the fab they are using is jumping straight to 0.09µ.
                              Yeah, and by the time that happens it's going to be 2005.
                              Why is it called tourist season, if we can't shoot at them?



                              • #30
                                0.09µ, but when?
                                The fabs are still having difficulty doing 0.13µ correctly, and 0.15µ is only just giving optimal results.

                                0.09µ is a long jump away, and the next-but-one cards will be out by then.

                                A shrink would be good, but won't win them any races.
                                Then again, they have never tried to win any races.

                                They just do what they do.
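                                (Rough context on what a shrink buys, assuming ideal scaling and a clean port with no redesign: die area goes with the square of the feature-size ratio, so moving from 0.15µ to 0.13µ leaves about (0.13/0.15)² ≈ 0.75 of the area, while 0.15µ to 0.09µ leaves (0.09/0.15)² = 0.36, roughly a third. That freed-up area and power is the headroom that normally gets spent on higher clocks or a wider design.)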
                                PC-1 Fractal Design Arc Mini R2, 3800X, Asus B450M-PRO mATX, 2x8GB B-die@3800C16, AMD Vega64, Seasonic 850W Gold, Black Ice Nemesis/Laing DDC/EKWB 240 Loop (VRM>CPU>GPU), Noctua Fans.
                                Nas : i3/itx/2x4GB/8x4TB BTRFS/Raid6 (7 + Hotspare) Xpenology
                                +++ : FSP Nano 800VA (Pi's+switch) + 1600VA (PC-1+Nas)

