Matrox Parhelia benches! weee I saw it in person!!!


  • #61
    Greebe, agreed that picture generation would be better VSync-locked, so that each frame calculated and shown is exactly synced and in phase with the monitor's refresh rate.
    I have a non-game demonstration of this: I am in PAL land and watch TV on my computer screen. Unfortunately, the DVI output vertical frequency of the G550 (yes, I have a digital 17" LCD monitor, lucky me) cannot go as low as 50 Hz, nor as high as 75 Hz. The only available choice is 60 Hz, which obviously is not a multiple of the PAL 25 Hz frame rate. Result: slight but noticeable stuttering of the TV picture, even though there is no flickering whatsoever, thanks to the way LCD screens work. What a shame for Matrox.
    I am watching the TV and it's worthless.
    If I switch it on it is even worse.
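    A quick Python-style sketch of that cadence mismatch (the 60 Hz and 25 fps numbers are the ones above; the little simulation itself is only illustrative): 60 / 25 = 2.4, so each TV frame can never occupy a whole number of monitor refreshes and instead alternates between 2 and 3 of them, which is exactly the stutter I see.

    refresh_hz, content_fps = 60, 25
    display_times = []                  # refreshes each source frame stays on screen
    next_flip = 0.0
    for refresh in range(12):           # simulate 12 refreshes (0.2 s)
        t = refresh / refresh_hz
        if t >= next_flip:              # time to show the next source frame
            display_times.append(0)
            next_flip += 1.0 / content_fps
        display_times[-1] += 1
    print(display_times)                # -> [3, 2, 3, 2, 2]: never a steady cadence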

    Comment


    • #62
      why wasn't this said?

      OK, here's a point that hasn't been made, and I feel it should be.

      I honestly think the highest frame rate detectable by the human eye is lower than what everyone has been suggesting... you have to take into account the delay of the electrons traveling between your brain and your eye. I know movies only run at 27fps and look fine... so let's pick some value between mine and the previously stated 60 fps (which, mind you, is LOWER than the average frame rate in Quake 3 in 3-monitor mode).

      BUT here's one reason to care about higher frame rates. These frame rates are taken from demos recorded within the game. I don't know about you, but when I play first-person shooters or even 3D-based strategy games, I tend to get myself into pretty sticky situations not usually covered by these demos. Ever play RTCW and have 5 enemies camped out above the bunker shooting down at 3 of your teammates trying to place dynamite at the door of their bunker, both groups firing away while you try to move up from the side? Or how about trying to rocket-jump over a firefight in Rocket Arena 3? Or an epic battle in a 3D RTS where 2 or 3 full players' forces are battling it out in a base? These are times when frame rates can drop to half or less of what a simple demo shows. Anyone remember back in Quake 1 how huge a difference there was between demo1 and bigass1.dem? I think we kind of forgot this....

      OK, here's the point I have to make. I don't care how un-optimized these drivers are: you can't run an OLD game at highest quality in 3-monitor mode on a card that isn't even out yet, for Christ's sake. I could understand if it were only running a new title or an unreleased title at a minimal frame rate, but the card's not even out yet and it's already outdated, and you people are talking about using it for 3 years to come. You obviously don't believe in upgrading...


      Now, I'm not a huge fan of Nvidia; I'm actually a fan of a dead company, 3dfx. I felt they had the best card release scheme: a new, affordable card came out every 8 months or so and it ran current games amazingly, and when new features were supported by software and you needed a new card, well, what do you know, a new card was out. They even had the foresight to see that FSAA, motion blur, and more advanced texture compression were necessary. HELL, they were even testing out 3-monitor support back around when Voodoo2s were still current, and you know what they did with them: they hooked up those LCD shutter glasses and got a picture that appeared to pop out of the 3 monitors for even more immersion. But people wanted to get the most they could out of the cards they had, squeezing the penny to try to make their cards last longer, so we have no more 3dfx.



      What I'm trying to say here is: what do you get when you buy this card? A whole bunch of features you can't use; in a few months, yes, as a single-monitor card it's almost as good as the card Nvidia has out now, but what does that say?

      If you bought a GeForce 3 when it came out, it still runs games just fine if you're sentimental about your video cards... and I bet you if you buy a GeForce 4 it'll be good for quite a few days to come too... a card that will run Doom 3 at 1600x1200 with most or all options on and at an acceptable frame rate... If the Matrox card can do that with some more bells and whistles, great, but it seems like they're trying to sell it on multi-monitor, and if it can't handle Quake III at highest quality with multi-monitor, how can we expect it to do Doom III multi-monitored, let alone anything yet to be announced?



      The only argument I feel Matrox actually has is image quality, and that I have to say they may hold for a while. But I don't tend to want to see how crisp a character's textures are before I frag them... But I guess you have to appreciate image quality a lot more when your screen isn't changing too often and you have to stare at the same picture for a while. Image quality kicks ass, but not at the cost of usable performance; if Matrox can deliver both, bravo.

      Comment


      • #63
        I referred bubblbobbl to this thread, and I think his response is tainted by reading only this thread.

        He missed a big point about the Parhelia. Its major selling point and feature set is not the triple-head gaming; on the contrary, it is the revolutionary features and futureproofing the card is based on. It is designed to be futureproof, to be useful 3 years from now, instead of being the FPS king for a few months before a new generation of games is released that makes it slow to a crawl. If the benchmarks were modernized to include newer games, which demand higher graphics standards, the GeForce series would fall off quicker than the Parhelia. During those high-complexity scenes, the Parhelia is less likely to dip, and won't dip nearly as much as a GeForce, FPS-wise.

        As to the apparently slow performance compared to a GeForce 4: the Parhelia in 3-head mode is pumping out 23% more pixels (I did the math) than single-head 1600x1200, and about 3 times as many vertices. How would a GF4 do with that? And how much of a hit does a GF4 take when anisotropic filtering is enabled? And if they are comparable in performance now (assume equal, for this point), how well will they perform in a year? 2 years? I would rather drop 300 on a card that'll last me fine for about 3 years than spend 200 on a card that'll be dog-ass slow on the next-gen games in a year and make me upgrade again.
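        (A quick back-of-the-envelope check of that 23% figure, in Python, assuming the triple-head gaming resolution is 3 x 1024x768 side by side; that exact resolution is an assumption for illustration, not something stated above.)

        single_head = 1600 * 1200       # 1,920,000 pixels per frame
        triple_head = 3 * 1024 * 768    # 2,359,296 pixels per frame
        extra = triple_head / single_head - 1
        print(round(extra * 100))       # -> 23 (percent more pixels than single head)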


        But I don't tend to want to see how crisp a character's textures are before I frag them...
        And that's just funny to me.

        /Rant
        P=I^2*R
        Antec SX1240|Asus A7V333WR|Athlon XP2200 1.80Ghz|512 MB PC2700|TDK VeloCD 24-10-40b|Samsung 16x DVD|SBAudigy2|ATI Radeon 8500 128MB|WinTV Theater|15/20/60GB Maxtor|3x 100GB WD100JB RAID0 on Promise Fastrak Lite|WinXP-Pro|Samsung SyncMaster 181T and 700p+|Watercooled

        IBM Thinkpad T22|900Mhz|256MB|32GB|14.1TFT|Gentoo

        Comment


        • #64
          Someone in that thread on Sharky said it won't run well with UT2003; well, UT2003 runs just fine and smooth on my Athlon 1GHz with a GF3 Ti200... Doom 3, on the other hand...

          Comment


          • #65
            Well, so far I think this card will be better for gaming than, for example, a GF4 Ti4600, because the Parhelia has been designed from the ground up NOT to make those low peaks in the framerate under stressful conditions.
            I think it is about time somebody had the guts to ignore the foolish focus on average framerates, and their importance to benchmarks, and actually make a GPU that can RUN games great (rather than benchmarks).
            Sometimes I feel most GPUs are made for reviewers, to do well in benchmarks, and NOT to do well in real games.
            Matrox's focus has always been quality above speed, so now they are delivering quality framerates (with a focus on avoiding low peaks) rather than speedy framerates (with a focus on high peaks and average framerates), and I think it is a great idea. I just hope reviewers are smart enough to understand this; however, I fear that the usual current benchmarking tools aren't suited for the Parhelia.
            This sig is a shameless attempt to make my post look bigger.

            Comment


            • #66
              Originally posted by TDB
              Well, so far I think this card will be better for gaming than, for example, a GF4 Ti4600, because the Parhelia has been designed from the ground up NOT to make those low peaks in the framerate under stressful conditions.
              I think it is about time somebody had the guts to ignore the foolish focus on average framerates, and their importance to benchmarks, and actually make a GPU that can RUN games great (rather than benchmarks).
              Sometimes I feel most GPUs are made for reviewers, to do well in benchmarks, and NOT to do well in real games.
              Matrox's focus has always been quality above speed, so now they are delivering quality framerates (with a focus on avoiding low peaks) rather than speedy framerates (with a focus on high peaks and average framerates), and I think it is a great idea. I just hope reviewers are smart enough to understand this; however, I fear that the usual current benchmarking tools aren't suited for the Parhelia.
              That's one of the things I liked about the G400 (a long time ago).
              P4 2,4@2,6 AsusP4T-533 C 512 mb PC 1066 Quantum Atlas 10KII SCSI ATI R9700

              Comment


              • #67
                To rectify a statement by BubbleBobble earlier on: film runs at slightly under 24fps, not 27 fps - if I'm not mistaken.
                --

                David Van Dromme
                A.K.A. Stormlord/WOW
                Former C64 Scener and Advanced Gravis P'n'p betatester.

                Comment


                • #68
                  Originally posted by Stormlord
                  To rectify a statement by BubbleBobble earlier on: film runs at slightly under 24fps, not 27 fps - if I'm not mistaken.
                  Well, TV screens have motion blur; you can't compare them to a fast-paced shooter on a PC monitor.
                  P4 2,4@2,6 AsusP4T-533 C 512 mb PC 1066 Quantum Atlas 10KII SCSI ATI R9700

                  Comment


                  • #69
                    futureproofing
                    I had to cut your quote short as soon as I saw this word.

                    The term "futureproofing" has to be the most meaningless term in the hardware/software industry... period. There is NO such animal.

                    Let's look at Doom III... does anybody _really_ think that it's going to play that game with acceptable framerates? I wish the answer were 'yes'... but I have very little doubt as to the answer.

                    By the time Doom III is released, I might look at an NV30/R300 as not really powerful enough.

                    I'm sorry, but I just hate it when certain terms are thrown around, and this one has never panned out in this particular industry.

                    Comment


                    • #70
                      It of course depends on your definition of acceptable framerates, but seriously??
                      Do you really think Carmack is going to make a game engine that a Parhelia isn't good enough for? Of course he is rich enough to be able to do that, but personally, if I were making a game engine (and in many ways that's what it is) that I expect to license out, I would make sure at least 5% of the actual market can play the damn game....
                      He's recommending an 8500 or a Ti4200 himself, and I'm 100% sure the Parhelia will match either of those two in actual gameplay.....

                      Cobos
                      My Specs
                      AMD XP 1800+, MSI KT3 Ultra1, Matrox G400 32MB DH, IBM 9ES UW SCSI, Plextor 32X SCSI, Plextor 8x/2x CDRW SCSI, Toshiba 4.8X DVD ROM IDE, IBM 30GB 75GXP, IBM 60GB 60GXP, 120GB Maxtor 540X, Tekram DC390F UW, Santa Cruz Soundcard, Eizo 17'' F56 and Eizo 21'' T965' Selfmodded case with 2 PSU's.

                      Comment


                      • #71
                        Originally posted by Edward


                        Well, TV screens have motion blur; you can't compare them to a fast-paced shooter on a PC monitor.
                        Correction: TV screens don't; the recordings displayed on them do. Even films use special techniques, like very calculated camera movements, to keep the flow smooth at that framerate.

                        Rags

                        Comment


                        • #72
                          Well, historically speaking, no 3D game from id Software has ever run really well, with all the eye candy enabled, on hardware that was out before the game's release.
                          But their 3D games have always had options to turn some of the eye candy off, and my guess is that it will be necessary to turn some of it off.

                          IIRC John Carmack recommended the 12MB Voodoo2 cards for Quake 3 way before Quake 3 was released, and we all know how fantastic a Voodoo2 is in Quake 3. So just because he says that card X is best for Doom 3 at the moment, it doesn't necessarily mean that it will run Doom 3 great, just that it will work.

                          Personally, I don't feel like basing my hardware-purchasing timeframe on the release date of one single game. There are lots of great games out there NOW that I want to play with all the eye candy enabled.
                          I have a mid-end DX7 card, and DX8 games are finally starting to appear, so I need a new card ASAP. Cards that will run Doom 3 great with all the eye candy enabled in high-res will probably not be available until 6 months AFTER Doom 3 is out.
                          Last edited by TdB; 15 June 2002, 17:27.
                          This sig is a shameless attempt to make my post look bigger.

                          Comment


                          • #73
                            1. Most games now depend on culling to keep the speed up in complex scenes. OK, but not very sophisticated or efficient. Displacement mapping does it better, and there are two methods for doing this: preimaged and sampled displacement mapping.

                            2. Many hardware companies talk about support for preimaged displacement mapping, but Matrox has gone them one better by supporting and implementing a form of sampled displacement mapping.

                            SDM's advantage is that it can be used with both depth-adaptive tessellation and LOD (level of detail), while PDM cannot.

                            3. In DX9 you'll find that M$ has licensed Matrox's hardware displacement mapping (HDM), which includes their depth-adaptive tessellation and other technologies. As such, Matrox is first out of the barn with these very advanced features.
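                            To illustrate the idea only, a minimal Python-style sketch: the function names, thresholds, and sampling scheme below are made up for illustration, not Matrox's actual HDM hardware or the DX9 interface.

                            def tess_level(view_depth, near=1.0, far=100.0, max_level=5):
                                # Depth-adaptive tessellation: nearer patches get more
                                # subdivisions, distant ones fewer (a form of LOD).
                                t = min(max((view_depth - near) / (far - near), 0.0), 1.0)
                                return round(max_level * (1.0 - t))

                            def displace(vertex, normal, height, scale=1.0):
                                # Sampled displacement: push the vertex along its normal
                                # by a height value read from a displacement map.
                                return tuple(p + n * height * scale for p, n in zip(vertex, normal))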

                            Dr. Mordrid

                            Last edited by Dr Mordrid; 15 June 2002, 19:19.
                            Dr. Mordrid
                            ----------------------------
                            An elephant is a mouse built to government specifications.

                            I carry a gun because I can't throw a rock 1,250 fps

                            Comment


                            • #74
                              I disagree with point 2, Doc, because:

                              A) It's wrong and...
                              A) It's right!
                              Meet Jasmine.
                              flickr.com/photos/pace3000

                              Comment


                              • #75
                                Excuse me, but what is culling? A kind of LOD for polys?

                                AZ
                                There's an Opera in my macbook.

                                Comment
