Matrox Parhelia benches! weee I saw it in person!!!


  • #46
    No, what was actually said was:

    As for 3DMark, we should expect around 11k with the fastest available AXP on a KT333
    It does not say that that is the system it was actually run on, only what should be expected when run on the fastest available XP.

    Joel
    Libertarian is still the way to go if we truly want a real change.

    www.lp.org

    ******************************

    System Specs: AMD XP2000+ @1.68GHz(12.5x133), ASUS A7V133-C, 512MB PC133, Matrox Parhelia 128MB, SB Live! 5.1.
    OS: Windows XP Pro.
    Monitor: Cornerstone c1025 @ 1280x960 @85Hz.



    • #47
      The fastest 3DMark score currently is about 15k (AFAIK).

      I have seen a GeForce 3 do 12,500 on a high-end PIV system.

      So we can expect the Parhelia to be just short of a top-of-the-line GeForce 3. Hmmm...

      Let's just hope this is reflected in the price, then...

      ~~DukeP~~
      Last edited by DukeP; 6 June 2002, 14:45.



      • #48
        The fastest 3DMark score is about 16,000, on an overclocked PIV with an overclocked Ti4600; the normal score for a Ti4600 is about 9,800-10,000, and when you O/C the CPU and graphics card people usually get higher. If the Parhelia can get around 12,000 on a normal system, I think that would be very good and most people would be satisfied with those kinds of scores.
        Last edited by Jazzz; 7 June 2002, 07:53.



        • #49
          That is the problem with benchmarks.
          Most of my friends who are into computers modify their systems toward one end or another.

          It is clear that using a PIV overclocked to 145MHz (x4) will heavily influence any sort of benchmark. But then again, I am only getting around 5K on my GeForce2 MX400 with a 133MHz PIV.
          Must wait and see...



          ~~DukeP~~



          • #50
            Of course, the other caveat is what would happen if 3DMark were modified to, say, apply more textures. If you normalized the results so the Parhelia's score stayed the same, I'd bet the GF4 Ti boards would drop substantially in comparison.

            Right now there really isn't all that much out there that can take advantage of the Parhelia's features. Doom III will, but it's not out yet.
            I'm thinking I'll probably wait on this board until Doom III or UT2003 is out, or until I acquire some game that really doesn't run acceptably on my G400. GTA3 took some tweaking... I also have to wait for 2D Linux support, since I use this machine for work.
            Plus it's summer and I have sailing to do as soon as Gerry gets his motor working... you can't really paddle a 30 ft. boat, and you can't sail into the slip he's got.
            Mike



            • #51
              I'm not a pro in games or hardware, but I'd like to say a few things about the card's performance. Firstly, I'm certainly not an FPS fan. I play rarely, and I concentrate on game quality and the gameplay itself rather than getting excited about how fast it turns or how high the FPS number in the corner of the screen is. To me that would be like watching a movie and concentrating on the ad label stuck to the screen, or on how great the remote control works, rather than on what the characters say and the story itself. My favourite 3D game is Jedi Knight (the first, not the second), so if you've seen its poor graphics you'll see I'm not kidding. Picture quality from the graphics card does matter to me, though, so I guess I'd prefer a wise balance between quality and speed.

              I think LCDs are very common now, and given their digital nature it is a waste of time for a graphics card developer to concentrate on raw framerates, whose usefulness is increasingly questionable as CRTs leave the market. The ability of a card to render a complex 3D scene at high speed is certainly exciting, but I guess most gamers who do nothing but play games should buy an Xbox or PlayStation, while the PC is such a wide-ranging device that it would be a crime to limit its use to running games and having fun on the Internet. That's why Matrox is my favourite card maker: it never sells its customers on silly things like frame rates and clock speed, which are the main keys to a "revolution" in processors nowadays. Simply higher speed (thanks to a bigger fan) is not progress to me. An optimized architecture and clever, resource-conscious solutions are what make things faster the right way.

              I heard some talk about Matrox not supporting DX9 fully. To make a comparison with browsers: as far as I know, no browser today has complete support for the CSS2 standard, which has been out for four or more years, because it has some features that are required only by specialized software. Standards are like that. I guess there aren't many games that fully support DX9 either, so I wonder how the new ATI and NVidia cards are supposed to be tested?

              Though I'm not a pro gamer, I do play games from time to time, and regarding those FPS fights I must say that what really matters is how the game actually looks on the PC and how it feels, and that depends on more than FPS, the number of features implemented, or any other spec sheet. For example, it matters HOW a feature is implemented and what it really looks like on screen, because I've seen cards that could make any game look terrible while having a nice list of features. And of course, things like a stable, clear picture are very important. I'm very glad that Matrox made its signal even more stable.

              About the old Quake 3 engine: I was recently reading an interview with John Carmack, and he said that their next-generation Doom 3 engine uses bump mapping everywhere. I thought its main advantage was the pixel shaders and lighting effects, but he kept coming back to bump mapping. I mean, hasn't the G400 had it as "EMBM" for four years already? Then what's so new about it? Displacement mapping would sound more exciting to me.

              About surround gaming: I'd like to know whether the side monitors can run at a lower resolution, because I don't really need them to sweat at the highest settings; I'll turn toward an enemy to shoot it anyway, and then I'll see it on the primary screen. It would also be nice if they skipped all the eye candy, but I'm afraid today's game engines won't allow that.

              Ooh, I guess I wrote too much, but in short I'd say there is no way to claim Matrox is better than NVidia or ATI; they are all good companies, just targeting different markets. I'm glad Matrox exists and makes cards that are perfect for me, and it's fine that NVidia makes cards that can easily impress people with framerates and other numbers; I mean, they make PC makers' lives easier. Money and wisdom rarely walk together, so when a rich boy buys a PC he wants to hear himself say "wow" and "cool", and with Matrox hardware that effect, IMHO, is a little harder to achieve than with a GeForce or Radeon. You have to care about details to be picky about quality, while speed is so easy to measure. Just like size; I agree with everyone who draws that comparison. So I hold no aggression toward people who hate Matrox and love NVidia; let them be happy with what they want, as long as we will soon(TM) get what we want.



              • #52
                I thought its main advantage was the pixel shaders and lighting effects, but he kept coming back to bump mapping. I mean, hasn't the G400 had it as "EMBM" for four years already? Then what's so new about it? Displacement mapping would sound more exciting to me.
                I think you answered your own question: it takes some time before a feature gets used.
                However, there are different kinds of bump mapping, and I don't think Doom 3 uses EMBM but rather a different method called dot3.
                Besides, pixel shaders give the developer better control over how the bump mapping is performed; traditional bump mapping doesn't allow the "bumps" to cast shadows on each other (self-shadowing), which is one of the main features of the Doom 3 engine (AFAIK).

                Not everybody plays with a framerate counter in the corner; personally I think it is distracting, and as long as the framerate doesn't take my attention away from the game itself, I'm happy.

                You don't need to be a "pro gamer" to be allowed to have an opinion. I don't know anybody who plays games for money (not counting beta testers), and I think it is quite a rare job.
                Last edited by TdB; 8 June 2002, 05:59.
                This sig is a shameless attempt to make my post look bigger.



                • #53
                  Not to mention that hardly any beta testers for games play for money... unless you count one copy of the game as money (compared to how much most closed beta testers play the game, it's really, really crappy payment).

                  Cobos
                  My Specs
                  AMD XP 1800+, MSI KT3 Ultra1, Matrox G400 32MB DH, IBM 9ES UW SCSI, Plextor 32X SCSI, Plextor 8x/2x CDRW SCSI, Toshiba 4.8X DVD ROM IDE, IBM 30GB 75GXP, IBM 60GB 60GXP, 120GB Maxtor 540X, Tekram DC390F UW, Santa Cruz Soundcard, Eizo 17'' F56 and Eizo 21'' T965. Selfmodded case with 2 PSUs.



                  • #54
                    oh hey Dr. M

                    Some good info in your post, but it fails to take into account what the eye is doing when playing a game. Re: fusion from visual latency: this only holds true if the eye is relatively fixated, i.e., not scanning. Overlap causes fusion only when the flickering source image stays on the same part of the retina, re-stimulating the same set of retinal cells.

                    It's easy to demonstrate this for yourself: turn your monitor refresh rate down to 60Hz, then pick one word on this page and stare hard at it for a few seconds; there's some flicker, but not bad. Now look rapidly from the top of the screen down to your keyboard and back several times; ouch, massive flicker, and a massive headache if you have to do this for very long. Now watch your fingers type a few words on the keyboard, and notice that screen flicker varies by large amounts as you do. This is because of the rapid tracking and targeting motions your eyes make in finding the keys to spell the words, what researchers call saccadic eye movements, or saccades. Table 2 at http://www.tchain.com/otoneurology/practice/saccade.htm shows saccadic eye movements over a 20-degree span hitting 650 degrees per second for normal subjects. When playing a game, even an RTS, your eyes are doing a lot of this: ranging over the whole screen looking for targets, shifting rapidly between targets, or between target and crosshair as you bring them together. So I think it's pretty obvious that all of us can detect flicker above 60Hz while gaming, despite motion blur and visual latency.
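
                    To put rough numbers on that (a quick back-of-the-envelope sketch of my own; the only figure taken from the cited table is the 650 deg/s peak velocity), here is how far a peak-velocity saccade drags the image across the retina between refreshes:

                    # Rough arithmetic only; 650 deg/s is the peak saccade velocity cited above,
                    # and the refresh rates are just illustrative values.
                    SACCADE_DEG_PER_S = 650.0

                    for refresh_hz in (60, 85, 100):
                        sweep_deg = SACCADE_DEG_PER_S / refresh_hz
                        print(f"{refresh_hz} Hz: image sweeps ~{sweep_deg:.1f} degrees of retina between refreshes")

                    At 60Hz that is roughly ten degrees per refresh, so successive flashes land on different retinal cells and fusion breaks, which is the point above.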

                    I play Q2 & Q3A, and I especially like rail instagib mods. While 60fps might appear seamless if you're not moving in game and have your eyes fixated on the crosshair, try pulling a running 120-degree turn and flickshot with the rail on a moving target and you'll appreciate every extra frame you can get. In an incredibly short slice of time you bring your target into your field of view, coordinate your relative positions and velocities to bring the crosshair to bear, and fire. I'll guesstimate that a good flickshot like this takes about 0.1 sec for the sake of argument; that would mean at 60fps you get 6 frames of information to find and center your target, versus the 8.5 frames you get at 85fps and above. It makes a difference. Add in that when you pull a shot like that it's likely to be under conditions that produce a framerate minimum (fast motion, multiple moving targets, projectiles, particles, etc.) and it seems reasonable to want 200fps average in your game, if that means fps never dips below 85 even on a large, crowded, poorly optimized custom map (like mine so far 8-)).
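
                    The same guesstimate as a tiny script (a minimal sketch; the 0.1 s flick-shot window is the post's own guess and the frame rates are just example values):

                    FLICK_WINDOW_S = 0.1  # guesstimated time to acquire and centre a target

                    for fps in (40, 60, 85, 120, 200):
                        frames = FLICK_WINDOW_S * fps
                        print(f"{fps:>3} fps -> {frames:4.1f} frames of information during the flick shot")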

                    I'm so bored with this whole "anything over 60fps is about penis size" chestnut; it's like the "nobody can see more than 32 bits of color" garbage.
                    It's about being competent at your game and using the best tool for the job. So to speak, lol. It's also about inferring performance in future games, and thereby assessing how long a card will last in your machine. It's nice that some casual gamers are happy with 40fps average, but patently silly for them to keep publicly insisting that it's not necessary or possible to play at a higher level. That they so often mention phallic insecurity is just an embarrassing bit of psychological projection.

                    What was that about growing up?
                    8-)
                    ---e
                    Iwill KK266-R
                    Athlon Tbird 1GHz AYHJAR oc'd to 1.5 GHz
                    128 megs Corsair PC133
                    Windows 98 SE
                    Matrox G400 MAX DH 32mb
                    -----------------------------------



                    • #55
                      Lol.

                      Mostly accurate, PX5R.
                      (Hey, I played in one of the national CS leagues once - prob the oldest player around).


                      ~~DukeP~~



                      • #56
                        The question isn't about Quake 3. We all know it will run it comfortably. Whether it's 100 or 200 in Q3 is irrelevant. Whether it's 35 or 70 in Doom III will matter though.

                        Let's all stop debating it until the card arrives =)

                        P.
                        Meet Jasmine.
                        flickr.com/photos/pace3000



                        • #57
                          Nope. Not for me. Sorry.

                          I've been off TV for almost 3 months now. The only displays I've looked at these last 3 months are 25ms total refresh LCDs at 1280x1024 and higher resolutions.

                          Guess what? At the beginning of this month, when I watched TV for the first time in 3 months, I hated my family's 34" PAL CRT Sony Vega TV. The display flickers like mad to me and is so jaggy or blurry (if you use the fake doubled-resolution mode) that I get a headache every time I watch it. Gonna switch to plasma once I win BIG at football betting.

                          (It's World Cup football season, for you Yanks out there. And yes, the USA had a GREAT game and a HUGE upset against Portugal. And yes, I won BIG. 7 to 1 odds. Very nice indeed.)

                          Originally posted by Dr Mordrid
                          The eye has a built-in latency of between 1/30th and 1/60th of a second, which is why most people perceive the 29.97 fps / 59.94 fields/sec of NTSC TV as smooth.

                          As for myself, I have excellent vision and very fast reaction times... but I still find it very difficult to tell the difference between 60 fps and 200 fps.

                          IMHO arguing over boards with rates >100fps is just a bunch of people in a "mine's bigger!" contest. Pointless.

                          Dr. Mordrid



                          • #58
                            I can actually play all FPS games today with my G400 MAX, but some features like T&L are missing, so sometimes things can get distorted. Even so, I'm very confident that the new Parhelia-512 will be fine for years to come.
                            Last edited by icemaker; 9 June 2002, 00:45.
                            AMD Athlon™ 64 processor 3200+
                            Microsoft® Windows® XP Professional Edition
                            MicroStar K8T Neo-FIS2R MS-6702 System Board
                            1GB composed of 2- 512MB DDR400 SDRAM 184-pin DIMMs
                            3.5" 1.44MB Floppy Disk Drive
                            160GB 7200RPM Ultra ATA/100
                            16x DVD-ROM Drive
                            4x DVD±R/±RW Drive
                            e-GeForce FX 5950 Ultra 256MB DDR VIVO Graphics Card
                            Integrated 6 Channel AC'97 Audio CODEC
                            56K V.92 PCI Internal Modem
                            Realtek Integrated 10/100/1000 Ethernet Controller
                            IIM IEEE 1394 Host Controller- 2 Ports



                            • #59
                              PXR5, I agree totally with you, and would also like to mention that the eye's response time is not uniform throughout the field of vision, being far shorter at the periphery. Try looking at a fluorescent light shining near the edge of your field of vision.
                              However, it must also be pointed out that there is no advantage in having a GPU capable of calculating 200 fps if the monitor can display only 100 or fewer.
                              I am watching the TV and it's worthless.
                              If I switch it on it is even worse.



                              • #60
                                Michel/Doc et al,

                                One can reason that 200+ fps doesn't matter when the page flip of the monitor is nothing more than, for shits'n'giggles let's say, 100Hz. But the eye can see the difference when Vsync is enabled and a frame is skipped because it would have been rendered even one nanosecond too late. In this scenario the screen gets updated with the same frame as the previous one (or two, or three), and/or there is a noticeable lag between what the brain anticipates as fluid motion and what it actually gets.
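
                                A minimal sketch of that point (my own illustration, not from the thread; the 100Hz refresh is the example above and the per-frame render times are made up): with Vsync a frame only reaches the screen at a refresh boundary, so missing the deadline by a hair keeps the previous frame up for a whole extra refresh.

                                import math

                                REFRESH_HZ = 100                  # the example monitor above
                                refresh_ms = 1000.0 / REFRESH_HZ  # 10 ms per refresh

                                for render_ms in (9.9, 10.1):     # one frame just under budget, one just over
                                    refreshes_held = math.ceil(render_ms / refresh_ms)
                                    print(f"render {render_ms} ms -> previous frame stays up {refreshes_held * refresh_ms:.0f} ms "
                                          f"({refreshes_held} refresh(es))")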

                                This is essentially the same argument that went on for years in high-end audio. Many people insisted the ear couldn't possibly hear all those nuances, and they were essentially disproved by the military studies of 3D sound positioning intended for fighter aircraft (done at Wright-Patterson AFB).

                                In that case, the ear/brain samples at roughly a 20ms rate, yet it can differentiate tonal shifts, amplitude changes, and arrival times far smaller than that sampling rate alone would suggest. A3D, EAX, et al. were derived from this military research.

                                Bottom line is this... go get yourself whatever speedy new card is out and compare directly. You WILL see the difference. It might not be that much of an issue if you don't play extremely fast-action FPS games, but it does matter to those of us who do.
                                "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

                                "Always do good. It will gratify some and astonish the rest." ~Mark Twain

