Matrox and Parhelia questions?

This topic is closed.

  • #31
    Well, according to benchmarks, even quad-textured games like Serious Sam SE are running rather slow. When you consider that Parhelia has 4 TMUs per pipe, it could be because of non-optimized drivers, but it isn't performing like one would expect.
    Even the pixel/vertex shaders could need some optimization for "simple" shader programs. I have been wondering about the performance too: either something is holding it back, or present applications don't utilize Parhelia's features in an efficient way, because they have been optimized for other, "less advanced" GPUs.
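    To put rough numbers on that expectation, here is a back-of-the-envelope fill-rate sketch. The pipeline counts and clocks are the published specs of the day; the GeForce4 Ti 4600 is included purely for comparison, and nothing here accounts for drivers or memory behaviour.

[code]
# Peak texel fill rate = pipes x TMUs per pipe x core clock.
def texel_fill_rate(pipes, tmus_per_pipe, core_mhz):
    """Peak fill rate in Mtexel/s."""
    return pipes * tmus_per_pipe * core_mhz

cards = {
    "Parhelia-512":     (4, 4, 220),  # 4 pipes, 4 TMUs each, 220 MHz
    "GeForce4 Ti 4600": (4, 2, 300),  # 4 pipes, 2 TMUs each, 300 MHz
}

for name, spec in cards.items():
    print(f"{name}: {texel_fill_rate(*spec)} Mtexel/s peak")
# Parhelia-512: 3520 Mtexel/s peak
# GeForce4 Ti 4600: 2400 Mtexel/s peak
[/code]

    On paper a quad-textured game should favour the Parhelia, which is exactly why the Serious Sam SE numbers look odd.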
    This sig is a shameless attempt to make my post look bigger.

    • #32
      I found this:
      An email exchange with Matrox revealed that Parhelia's 512-bit data paths are actually a kind of memory bus, where the five main processing units (vertex processor, pixel processor, 2D rendering engine, video engine, and DualHead display controller) each have within them sub-units that each make memory requests that the memory controller arbitrates and handles. There are about sixteen of these sub-units contained in the five main processing blocks.
      from: http://www.extremetech.com/article2/0,3973,29699,00.asp

      Could it be that the memory bandwidth doesn't get fully utilized unless all five processing units are used at the same time? Perhaps Matrox decided that they didn't need all the bandwidth for simple games that don't use all five units: for example, a game that doesn't use pixel shaders wouldn't get optimal use of the bandwidth and memory controllers related to that processing block.
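      As a toy illustration of that arbitration idea (the unit names follow the quote above, but every number here is invented; this is not the actual Parhelia design):

[code]
# Toy model of a memory controller shared by several client units.
# It only shows why peak bandwidth can sit idle when few clients are
# active; all per-unit demand figures are made up.
PEAK_BYTES_PER_CYCLE = 64  # a 512-bit internal path moves 64 bytes/cycle

units = {  # hypothetical per-cycle demand of each processing block
    "vertex": 16, "pixel": 24, "2d": 8, "video": 8, "display": 8,
}

def utilization(active):
    """Fraction of peak bandwidth the active units can consume."""
    demand = sum(units[u] for u in active)
    return min(demand, PEAK_BYTES_PER_CYCLE) / PEAK_BYTES_PER_CYCLE

print(f"all five blocks busy: {utilization(units):.0%}")                # 100%
print(f"vertex + pixel only:  {utilization(['vertex', 'pixel']):.0%}")  # ~62%
print(f"pixel engine only:    {utilization(['pixel']):.0%}")            # ~38%
[/code]

      If the controller really arbitrates between many sub-units like this, a simple game that lights up only a couple of blocks would indeed leave bandwidth on the table.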
      Last edited by TdB; 3 July 2002, 05:30.
      This sig is a shameless attempt to make my post look bigger.

      • #33
        There still seem to be indications that the Parhelia has a lot more muscle than we've seen yet. Drivers may explain it to a certain extent, but I'm wondering if the way games are currently programmed has much to do with it. All of the other cards on the market are severely bandwidth-limited at the moment. Developers are also used to cards only supporting 4 or 8 lights in hardware, among many other limitations. I'm wondering if software designed specifically for the Parhelia (or, more accurately, software designed for next-gen cards) will be able to show some of the Parhelia's real potential.

        Looking at the shots from the Reef demo, it certainly looks like having that running smoothly across 3 screens should use an enormous amount of bandwidth.

        As a side note, there have been rumours floating around regarding NVidia's NV30 - not that reliable, but there are two interesting items. Firstly, although it was supposed to be released in August, it looks to have been pushed back until 2003. Secondly, they are apparently not planning on using a 256-bit DDR bus. There is the possibility of a QDR bus, but the rumours all seem pretty sure that it will use 128 bit only...
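        The bandwidth arithmetic behind those bus-width rumours is simple enough to sketch. The 275 MHz figure is Parhelia's published memory clock; the same clock is reused for the hypothetical NV30 configurations purely to make the bus widths comparable.

[code]
# Memory bandwidth = (bus width / 8) x clock x transfers per clock.
def bandwidth_gbs(bus_bits, clock_mhz, transfers_per_clock):
    """Peak bandwidth in GB/s (decimal)."""
    return bus_bits / 8 * clock_mhz * transfers_per_clock / 1000

print(f"256-bit DDR @ 275 MHz: {bandwidth_gbs(256, 275, 2):.1f} GB/s")  # Parhelia: 17.6
print(f"128-bit DDR @ 275 MHz: {bandwidth_gbs(128, 275, 2):.1f} GB/s")  # 8.8
print(f"128-bit QDR @ 275 MHz: {bandwidth_gbs(128, 275, 4):.1f} GB/s")  # 17.6
[/code]

        So a 128-bit QDR bus would match Parhelia's 256-bit DDR bus at the same clock, while plain 128-bit DDR would halve it.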

        • #34
          I saw a review (can't remember which one now...) that had some benches of the Parhelia running Q3 in TripleHead (TH).

          As far as I remember it did not lose that much performance when running in TH.

          It was almost as fast running in TH as with a slightly higher resolution in single head! That seemed strange, as the overall resolution in TH was nearly twice as much!

          Maybe they have optimised Parhelia's memory bus/bandwidth utilisation for good fps in TH gaming, rather than high fps in single head...
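          The "nearly twice as much" part checks out as a rough pixel count, assuming the common surround mode of 3 x 1024x768 against a 1280x1024 single screen (the review's actual settings may have differed):

[code]
# Pixel-count sanity check for the TripleHead (TH) observation.
th_pixels = 3 * 1024 * 768   # 2,359,296 pixels per frame
single_pixels = 1280 * 1024  # 1,310,720 pixels per frame

print(f"TH / single-head pixel ratio: {th_pixels / single_pixels:.2f}x")  # ~1.80x
[/code]

          Similar fps at 1.8x the pixels really would suggest that raw fill rate isn't what limits the single-head numbers.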

          • #35
            Originally posted by Cheesekeeper
            Drivers may explain it to a certain extent, but I'm wondering if the ways the games are currently programmed has much to do with it.
            So Matrox designed this card/gpu like Intel did with the P4?

            • #36
              Reckless, I stated facts only. He did admit that this was the case when I called a spade a spade. I do not consider myself, nor any of the BBz per se, more knowledgeable than Joe Schmoe off the street, but the MURC has us on the front lines unlike any other place around.

              This is for your benefit; we don't have to do it!

              We're doing our best to help others with their questions (albeit restricted by our NDAs), but that does not mean that we should be bashed/slandered when trying to assist.

              You attract more bees with honey than you do with vinegar, you know.
              "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

              "Always do good. It will gratify some and astonish the rest." ~Mark Twain

              • #37
                Originally posted by isochar
                So Matrox designed this card/gpu like Intel did with the P4?
                Not exactly. The design method employed by Intel for the P4 is known as "incompetence". I'm pretty sure Matrox has better techniques

                What I'm thinking is more that since the architecture of the Parhelia chip is fairly different to most existing chips, none of the current games are able to properly test what the card can do. For example, Matrox's SharkMark shows the Parhelia running miles in front of the competition. Big surprise. However, this probably isn't only because the benchmark is heavily optimised for the Parhelia: it's also most likely using capabilities that the Parhelia has and the other cards don't.

                Although the benchmarks are (mostly) based around DirectX, DirectX isn't really hardware independent. Games, benchmarks, and demos are all written to stress the capabilities of *current* hardware. Certain tests in 3DMark will only run on a GeForce3 (or a more recent card). When demos are written to test the features that the Parhelia and other future cards will use, this may show things in a very different light. At the time I got my GeForce3, it was the only card that could run the nature benchmark in 3DMark - not because of speed, it simply used features that only the GF3 had at the time. All current cards have them now.

                When they start writing games and benchmarks using more than 8 concurrent light sources, larger numbers of pixel shaders, and so on, that is where the Parhelia can really show off. No-one has written these yet, since there were no cards around to support them. It's not a matter of if, just when...
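                That gating on hardware capabilities is easy to picture. Here is a toy sketch of how a benchmark might decide which tests a card can run; the test names, caps fields, and device entries are all invented for illustration (real DirectX code would query the driver's reported caps instead):

[code]
# Toy capability gating, in the spirit of 3DMark's nature test only
# running on pixel-shader hardware. All names and values are invented.
tests = {
    "fill rate":      {"max_lights": 0, "pixel_shader": 0.0},
    "8-light scene":  {"max_lights": 8, "pixel_shader": 0.0},
    "nature (PS1.1)": {"max_lights": 8, "pixel_shader": 1.1},
}

def runnable_tests(caps):
    """Tests whose requirements the device's reported caps meet."""
    return [name for name, req in tests.items()
            if caps["max_lights"] >= req["max_lights"]
            and caps["pixel_shader"] >= req["pixel_shader"]]

geforce2 = {"max_lights": 8, "pixel_shader": 0.0}
geforce3 = {"max_lights": 8, "pixel_shader": 1.1}
print("GeForce2:", runnable_tests(geforce2))  # nature test skipped
print("GeForce3:", runnable_tests(geforce3))  # all three tests run
[/code]

                Until tests exist whose requirements only the Parhelia (and the coming next-gen cards) can meet, its extra capability simply never gets exercised.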
                Last edited by Cheesekeeper; 3 July 2002, 07:20.

                • #38
                  You stated that it only takes time. But game developers will only develop for techniques that are widely supported across video cards. Since the Parhelia is the only card supporting these features, it wouldn't be lucrative to support them in newer games, since Matrox will sell few Parhelias with this performance/price issue.

                  Secondly, I want to add that the word 'NDA' could be banned on this forum. It's really getting annoying to hear every admin and BB go on about the NDAs, like it's some sort of special word which makes them better!? Also, as stated earlier in this thread, don't think you're any better or gain more respect because you've got a Parhelia.

                  Driver NDAs are a load of crap imo. If Matrox suddenly succeeds in getting a driver that boosts performance to da max, they are crazy not to put it out immediately. So the only explanation I can give for those NDAs is that the drivers still aren't getting any better.

                  btw, I've been reading this forum for the last couple of months (4-5), so don't think I'm just a n00b because this is my first post. This is just something I had to say.

                  I really hope the Parhelia will get better because in many ways it's the best card ever! Shame it sucks in 3D.

                  • #39
                    It still provides playable frame rates in almost any resolution.

                    Just because it's not the fastest card does not mean 'it sucks in 3D'.

                    • #40
                      Originally posted by mort
                      You stated that it only takes time. But game developers will only develop for techniques that are widely supported across video cards. Since the Parhelia is the only card supporting these features, it wouldn't be lucrative to support them in newer games, since Matrox will sell few Parhelias with this performance/price issue.
                      That's true. But the R300 and NV30 are due sometime soon. As I mentioned earlier, it is rumoured that the NV30 has been pushed back until next year, but that's only a rumour. Carmack's comments regarding the Parhelia were concerning: they suggest that the Parhelia is not likely to be as effective for Doom3 as the R300. However, the video card industry moves very quickly, and it won't take long for other companies to release their next-gen cards. When they do, the games and benchmarks will change. The Parhelia is really the first of the next-gen cards; the question is how it will compare with other manufacturers' offerings.

                      • #41
                        IMO, as the Parhelia does not 'seem' to offer a great deal of speed for its price, game developers will most likely stick to what they 'know' sells well. Unless the Matrox developer relations team can do an AMAZING job with assistance, code, consultancy, etc., the Parhelia will quickly become a niche games-market unit only - much like the G400 did as soon as the GF DDR hit the shelves. Games developers touted T&L, not EMBM...
                        Cheers, Reckless

                        • #42
                          Originally posted by McElvis
                          It still provides playable frame rates in almost any resolution.

                          Just because it's not the fastest card does not mean 'it sucks in 3D'.
                          I can't remember anyone saying it 'sucked', but most merely reflected that $400 for average performance wasn't what they expected. Sure, it's playable at medium res and high quality, but the competition is - no doubt - about to make the Parhelia look bottom of the range.
                          Cheers, Reckless

                          • #43
                            Originally posted by Reckless
                            IMO, as the Parhelia does not 'seem' to offer a great deal of speed for its price, game developers will most likely stick to what they 'know' sells well. Unless the Matrox developer relations team can do an AMAZING job with assistance, code, consultancy, etc., the Parhelia will quickly become a niche games-market unit only - much like the G400 did as soon as the GF DDR hit the shelves. Games developers touted T&L, not EMBM...
                            I'm not counting on game developers supporting the Parhelia in a big way. It would be nice, but it's not too likely. However, many of the Parhelia's features can be expected in pretty much all next-gen cards: more efficient AA, better hardware lighting support, more pixel shaders, and so on. They won't need to support the Parhelia specifically - all new cards are likely to have them. If the R300 and NV30 have them, the game developers will follow...

                            edit: Ooops, I mean R400...
                            Last edited by Cheesekeeper; 3 July 2002, 07:58.

                            • #44
                              50-80 fps is shit fps, believe me.

                              For example, Q3. I play that game a lot.
                              You need a framerate that never drops below 125 fps; that is the magical number in the Q3 engine. You jump further and move smoother. Now I have a GeForce2 Ti, which is nice, but the 2D is killing me on my 22".
                              If you have a framerate of about 60 and you make a quick turn, it is really noticeable that the framerate drops and the smoothness is gone. Sure, in non-FPS games you don't have that element, but I really would like a card which is suitable for the future. And what I mean by that is that with next-generation games it can keep up and give me at least 60 fps; anything below that is just unplayable.
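                              The frame-time arithmetic behind those thresholds (the 125 fps figure is the well-known Q3 quirk where movement physics are computed per frame, so certain framerates round jump heights in the player's favour):

[code]
# Frame time at the framerates mentioned above.
for fps in (60, 125):
    print(f"{fps:3d} fps -> {1000 / fps:.1f} ms per frame")
#  60 fps -> 16.7 ms per frame
# 125 fps ->  8.0 ms per frame
[/code]

                              A dip from 125 to 60 fps more than doubles the frame time, which is why a quick turn suddenly stops feeling smooth.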

                              Don't get me wrong, I'm not just a gamer. I really love the features of the Matrox because I use the computer for designing also, and I've heard that the 2D capability is unmatched, which is a big asset! But 450 euro for a card that doesn't give adequate fps on today's benchmarks (and I don't mean like 200, because that's crazy) just saddens me.

                              For me it's a very hard decision. I'd hoped for the ultimate solution, good 2D and fast 3D... seems too good to be true though.

                              • #45
                                - all new cards are likely to have them. If the R300 and NV30 have them, the game developers will follow...
                                I agree. It seems the feature set of the Parhelia will end up being ubiquitous this fall/winter, in which case industry support will be a given.
                                Ryu's PCs: http://www.unspacy.com/ryu/systems.htm
