About current situation

  • #16
    Malte:
    It's true that Matrox never made a benchmark-killer card and never wanted to, but... the G400 wasn't as far behind ATI or Nvidia as the Parhelia is. The Parhelia has about 10% of the power of the newest ATI or NV cards. If you swap a gamer's graphics card from a GeForce to a Parhelia, he will ask... why are my new games running so slowly... or why don't they start at all...
    A CRAY is the only computer that runs an endless loop in just 4 hours...

    • #17
      Originally posted by DGhost
      Image quality on the Parhelia has been thought of as worse than the previous generation of cards. The last card they had that produced "legendary" TV output quality was the G400/MAX.
      Whose idea was it to cripple the G450 with 64-bit VRAM? They should've called it the G500 and given it 128-bit DDR. Cost is one thing, but performance is meant to improve with new revisions... the Gxxx cards were very sensitive to VRAM bandwidth.

      Originally posted by Nicram
      Malte:
      It's true that Matrox never made a benchmark-killer card and never wanted to, but... the G400 wasn't as far behind ATI or Nvidia as the Parhelia is. The Parhelia has about 10% of the power of the newest ATI or NV cards. If you swap a gamer's graphics card from a GeForce to a Parhelia, he will ask... why are my new games running so slowly... or why don't they start at all...
      Hmm, the G400 wasn't really that far behind. Remember the atrocious 2D & 3D visual quality their cards had at the time? The G400 was often doing far more work per frame.

      Originally posted by SpiralDragon
      ay... same here... but what if Matrox does go belly up, like it's already been suggested?
      That would depend on how it was handled. If another company with stacks of money were to buy Matrox and run it well, it may do well again. Then again, Nvidia might buy it...
      Last edited by G400SG16mb; 16 July 2004, 08:27.
      Matrox G4x0 32mb SG RAM DVI

      • #18
        It's a shame how far Matrox has fallen since the G400MAX. That card was right up there in performance with everything else at the time, and had far superior display quality and the best TV out on the market. Actually, it still has the best TV out.

        If they had released the revised 8x core of the Parhelia instead of the slow one with the numerous bugs, I think we wouldn't feel so disenchanted with Matrox. It was never like them to release broken hardware. They used to be the best; now it's broken hardware, buggy drivers and a lack of OS support.

        • #19
          Twilight - the Parhelia had twice the theoretical memory bandwidth of the competition. It also appears to have some fairly severe core or driver issues, because it does not perform anywhere near where it should in synthetic tests (not things like 3DMark - little apps that are designed to stress one subsystem only). Of course, how well engineered the memory controller is can impact performance a lot... and Matrox has a history of not having well-performing ones...

          Data compression and a few of the other bandwidth-saving techniques would have helped, but the major problem with the P is that the memory controller (or possibly the drivers) just plain sucks.

          G400SG16mb - graphics cards have always been sensitive to VRAM bandwidth, and the 16-bit to 32-bit conversion that took place effectively doubled the amount of bandwidth required to do the same work. The G450 in itself was not a bad idea - it brought a lot more onto the main GPU and decreased power consumption and heat output, both good things. That said, the move from a 128-bit to a 64-bit bus was not as crippling as you make it sound, since they also moved from SG/SDRAM to DDR SDRAM. What they *should* have done, however, was clock it at least as high as the G400MAX and push the memory bandwidth a little past that to compensate for the loss of efficiency from the DDR transition. Of course, that would not have helped it compete with the GeForce2, but it would at least have drawn more interest to the card (especially from existing Matrox customers). They should also have grabbed their balls and released a "high end" card... but that would have been the proposed G800, iirc... and that is a whole new can of worms... A rough bandwidth comparison is sketched below.
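
          (A back-of-the-envelope sketch, in Python, of what "theoretical bandwidth" means here. The bus widths are the ones discussed in this thread; the clock speeds are assumptions picked for illustration, not exact board specs.)

          def peak_bandwidth_gb_s(bus_width_bits, clock_mhz, ddr=False):
              # bytes per transfer * transfers per second; DDR moves data twice per clock
              return (bus_width_bits / 8) * (2 if ddr else 1) * clock_mhz * 1e6 / 1e9

          # Clock figures are assumed round numbers, for illustration only.
          cards = [
              ("G400 (128-bit SDR, ~166 MHz)",      128, 166, False),
              ("G450 (64-bit DDR, ~166 MHz)",        64, 166, True),
              ("Parhelia (256-bit DDR, ~275 MHz)",  256, 275, True),
              ("128-bit DDR competitor (~310 MHz)", 128, 310, True),
          ]
          for name, width, clock, ddr in cards:
              print(f"{name:36s} ~{peak_bandwidth_gb_s(width, clock, ddr):4.1f} GB/s")

          On paper, the 64-bit DDR G450 lands roughly where the 128-bit SDR G400 does, and a 256-bit DDR bus works out to about twice the bandwidth of a 128-bit DDR board. How much of that reaches real workloads depends on the memory controller and drivers, which is exactly the point being argued above.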
          "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

          • #20
            Originally posted by Malte
            The Parhelia may be fine and more than enough for many people, but you really have to pay a lot for what it offers. And as a Linux user I'm angry that with each card released after the G400, Matrox has decreased Linux/open source support step by step.
            Now I may have to change to NVidia, not because they make much better cards, but because they have better support for the OS I use and they have cards at a reasonable price.
            That is the very reason why I am going to get an nVidia card next... though I'm going to miss the Triple-Head/Surround Gaming features of the Parhelia..

            Linux support is quite important to me, and it pisses me off to no end that Matrox in essence lied about the Parhelia having Linux support. What it has is very half-assed Linux support, considering every distro is switching or has already switched to the 2.6.x kernel and the mtx drivers work like ass with it... and the 2.5/2.6 kernels have been out forever now! So I've now sold one of my three monitors and I'm waiting for a Leadtek 6800 GT to come out... then it's bye bye Matrox for me... which is sad, since I've previously owned a Millennium 2 (of which I actually still have two... my original one, which I bought retail, and one in my server), a G200, which I bought when it came out (also retail), a Marvel G400-TV (now in my mother's PC) and the Parhelia.

            I'd get an ATI, but their Linux support sucks about as much as Matrox's. I'm just hoping that the 2D quality of nVidia's cards has improved greatly...

            Leech
            Wah! Wah!

            In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.

            • #21
              Ahh yes - I love reading your remarks on that topic, DGhost!

              But I like that card... speed isn't a big issue for me today, since my whole rig is a bit too slow for games anyway. I'm sure it wouldn't perform better even with an ATI 9800 Pro.

              However the next machine may have an X800 in it...

              Cheers, Hannes

              • #22
                Matrox, wake up, please....

                • #23
                  There are other ways to create impressive graphics solutions

                  • #24
                    Definitely - ever seen the Bitchin' Fast 2000?

                    • #26
                      Maybe I should just pull AM's posting privileges in the crystal ball and here.
                      Juu nin to iro


                      English doesn't borrow from other languages. It follows them down dark alleys, knocks them over, and goes through their pockets for loose grammar.

                      • #27
                        grumpy
                        P.S. You've been Spanked!

                        • #28
                          Hmm. I'm not a Matrox groupie, but really... I don't see why we whine so much about Matrox performance. Well, actually I do - it's the gamers in us that whine, apart from specific complaints like Linux support, banding, etc.

                          I went straight from a Radeon 9700 Pro to this P650 and I am still delighted. I am mostly a 2D graphic designer, but even for just web browsing and looking at desktop icons I can see a huge quality difference with the Matrox. I admit that it does need a very fast system to back it up (I guess my 2185 MHz Barton and 1 GB of RAM help - the 9700 Pro had a 1050 MHz Duron with 384 MB).

                          I'm a very light gamer - since the 9700 I have really only played QuakeWorld, Homeworld 2 and C&C Generals - but all have been great performers at medium res and high detail levels. Even Far Cry was OK at low res with 16x FAA. Talk about bugs? Well, my bro's R9800 crawls at 2-5 fps when zooming in on Homeworld 2, while I am fine... bizarre.

                          I haven't even tried dual head yet - and that is supposed to be the strongest point of the Matrox!

                          Oh, and I've used 3DS Max 6 with a 70,000-polygon shaded model that I could rotate in real time at good fps - I can't remember if it was in OpenGL mode or DirectX.

                          To the people who want the features that don't work so well, I agree you got screwed (Linux etc.), but there are situations where a Matrox is just perfect - for me, it is a godly Photoshop card and I can still whoop my friends at C&C Generals without playing in wireframe mode.

                          However, I will want a new card for Longhorn, and if Matrox doesn't deliver by then, I'll go grab that 3DLabs card I was too scared to get this time.

                          • #29
                            blebleble...
                            Whirl, the Parhelia is a dedicated card for graphics work (for 3ds Max too). Your model had 70,000 polygons - nice, because it worked fine - but a full scene, when I create one, has more than one object! For example 100 or more objects, sometimes with far more polygons each. Then the Parhelia is like a turtle. OK, it's a production card, but have you tried some scene demos? I mean the PC demoscene (where freaky computer music plays and there is a lot of art and many nice 3D scenes rendered in real time). If you have, you know how badly they run. It's terrible that many of them work just fine on an FX5200 and badly on the Parhelia, because of drivers that sometimes don't support one simple instruction, which makes everything lag. And you say the gamer has woken up in us - well, the Parhelia is/was advertised as a card for GAMERS!!! Wake up, Whirl.
                            Last edited by Nicram; 22 July 2004, 01:12.
                            A CRAY is the only computer that runs an endless loop in just 4 hours...

                            • #30
                              Weren't the 9xxx cards meant to have good 2D quality? I swear the 9600 a relative has looks worse than a Matrox Millennium.
                              Matrox G4x0 32mb SG RAM DVI

