Anyway, is your refresh rate set to 100+ Hz with V-Sync turned ON? Don't forget that without a reg edit, all GF3/GF4 cards are limited to 60Hz in 3D...
Matrox Parhelia benches! weee I saw it in person!!!
-
What was necessary was done yesterday;
We're currently working on the impossible;
For miracles, we ask for 24 hours' notice ...
(Workstation)
- Intel - Xeon X3210 @ 3.2 GHz on Asus P5E
- 2x OCZ Gold DDR2-800 1 GB
- ATI Radeon HD2900PRO & Matrox Millennium G550 PCIe
- 2x Seagate B.11 500 GB SATA
- ATI TV-Wonder 550 PCI-E
(Server)
- Intel Core 2 Duo E6400 @ 2.66 GHz on Asus P5L-MX
- 2x Crucial DDR2-667 1GB
- ATI X1900 XTX 512 MB
- 2x Maxtor D.10 200 GB SATA
-
Originally posted by Ali
The odd thing is that Matrox say the Parhelia will be able to turn on all the eye candy without having much of a performance drop, but from that link:
Quake3, medium detail, 1024x768 32-bit: 55 fps in surround gaming and 160 fps in single-monitor mode
Why use medium detail? If you are only going to use 1024*768 (I would have thought 1280*1024) then why not pump up the eye candy?
Ali
amish
Despite my nickname causing confusion, I have no religious affiliations.
-
Realize that these drivers are rather fresh, and that Matrox never claimed to provide faster raw performance than the GeForce4. Nonetheless, when the board is out and shipping, there will be no need to complain about performance.
OFFICIAL EX-EMPLOYEE
"So now I'm dreaming
For myself I'm understanding
Performing there, one hundred thousand fans would gather one and all
And so decided, we could rule it all if we should
Dance all away across the greatest city in the nether world..."
- Central Park 09/24/03
-
Really? How can they afford that?
Asus P2B-LS, Celeron Tualatin 1.3 GHz (PowerLeap adapter), 256 MB PC100 CAS 2, Matrox Millennium G400 DualHead AGP, RainbowRunner G-series, Creative PC-DVD Dxr2, HP CD-RW 9200i, Quantum V 9 GB SCSI HD, Maxtor 20 GB Ultra-66 HD (52049U4), Sound Blaster Audigy, ViewSonic PS790 19", Win2k (SP2)
-
Originally posted by Ant
I hear you sales guys are giving away one of those 3 screen monitor setups you picture there with every Parhelia. Is that true?
I'd be more than a little surprised to see a $10,000+ display bundled with Parhelia. Hey, maybe we can bundle a little delivery guy to help ship that monstrous package to each customer.
OFFICIAL EX-EMPLOYEE
-
I'm a bit disappointed by the benchmarks as well.
If an ancient engine like QIII only gets 55fps with medium detail it might be quite a bit lower with high detail. This also means that surround-gaming might not be that fun with anything newer.
And I'd really like to see a 1600x1200x32 x anisotropic x AA benchmark with some newer titles...
-
Originally posted by Indiana
I'm a bit disappointed by the benchmarks as well.
If an ancient engine like QIII only gets 55fps with medium detail it might be quite a bit lower with high detail.
As far as I've read, the nVidia 4600 can muster around 180 FPS at 1024x768 in HQ mode (Q3 Demo001). OK, so the Parhelia doesn't quite hit that, but it's not far off: 160 at medium detail (assuming it was a timedemo score?), which would drop again for HQ. Has anyone ever benchmarked a 4600 running Q3 in dual-screen mode? I think not... so anyone getting upset at "only" 55 FPS in triple-head mode is not comparing apples with apples. If several E3 stands were willing to show off UT2 with surround gaming, then I'm sure I'd be happy with the speed. It's a huge marketing fair and you don't make your product look bad, so you choose the best kit available, just as JC did with the Doom3 demo.
I think we should also look at what the Q3 game engine was designed for: a fixed T&L system that was no doubt nVidia-focused, using OpenGL, which again was probably nVidia-focused. The benchmarks for the game are ridiculous these days; 100 FPS at any resolution is the norm, so it's showing its benchmarking age. If Matrox have good developer relations for the Parhelia, then there'd be more incentive for developers like id to support DM or the other new tech featured in the Parhelia.
If you want speed, then sure, I've no doubt you need to buy a current- or upcoming-generation card from nVidia/ATI. If you want to stick with the big M (for whatever reasons!), then you know you'll be getting a card that lasts a lot longer than any GF. Whether you'll pay for the privilege, we shall have to wait and see!
Cheers, Reckless
-
I'll tell you what I want... 85 FPS Surround Gaming on most all games, current and through the next year, with all features enabled. I can settle for less, though.
The world just changed, Sep. 11, 2001
-
Gamers and game card makers need to grow up.....
Putting on my anatomist's hat for a minute:
Basic visual physiology: it takes time to see things.
Visual Latency:
As such, there is always a time delay between the instant when something happens in the world and the instant when we see that something. This is because a long sequence of neural events must occur before the nerve impulses caused by an event can reach the visual cortex.
This delay is called visual latency. The graph of this function has a rise time of about 25 ms and a decay that varies by retinal cell type. Cones, the color receptors, have a decay of up to 200 ms. Rods, the brightness (or luma) receptors, have a decay of between 200 ms and 400 ms. Because of this long decay, sequential visual stimuli overlap.
Persistence of Vision:
Vision is also sluggish: a briefly flashed visual stimulus appears to last much longer than its actual physical duration due to the decay of the visual latency response curve. This "persistence" of the visual response implies that we must be limited in our ability to tell a light that is continuously on from one that is flashing on and off. We are, and this has implications for how we sense video frames.
Beyond about 60 Hz, the critical flicker frequency (CFF), a flickering light (or sequence of video frames) appears to be steadily on. This is known as fusion, and the threshold is called the flicker fusion frequency (FFF).
The FFF applies not only to flashing lights but to display refresh and frame rates.
The exact frequency at which fusion occurs depends on the type and brightness of the source and the eye color of the individual (blue eyes have a higher FFF by about 2 Hz), but even for high-intensity sources it rarely (if ever) exceeds 100 Hz (or 100 fps).
For the record: brown eyes have an FFF of about 26 fps on an interlaced display. For blue eyes it's about 28 fps. Double these numbers for a non-interlaced display and you get 52 and 56 fps respectively.
As such, the human eye is physically incapable of discerning differences in flicker/frame rates above the FFF for a given source intensity.
Combined with the fact that video/game screens are not that intense (how many games take place in dim environments?), arguing over flash/frame rates beyond the FFF is silly because of the limits placed on us by our wetware (neural anatomy).
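To put some numbers on this, here's a quick Python sketch. It's my own illustration, not from any reference: the only inputs are the FFF figures quoted above (26/28 fps interlaced for brown/blue eyes, doubled for progressive scan).

```python
# Flicker check against the post's flicker-fusion figures.
# These eye-color numbers are the ones quoted above, nothing more official.
FFF_INTERLACED = {"brown": 26, "blue": 28}

def fusion_threshold(eye_color: str, interlaced: bool = False) -> int:
    """Approximate flicker fusion frequency (fps) per the figures above."""
    base = FFF_INTERLACED[eye_color]
    # Non-interlaced (progressive) displays double the effective threshold.
    return base if interlaced else 2 * base

def flicker_visible(refresh_hz: float, eye_color: str, interlaced: bool = False) -> bool:
    """True if a display at refresh_hz would still appear to flicker."""
    return refresh_hz < fusion_threshold(eye_color, interlaced)

print(flicker_visible(60, "blue"))   # 60 Hz progressive vs 56 fps threshold -> False
print(flicker_visible(50, "brown"))  # 50 Hz progressive vs 52 fps threshold -> True
```

So by these figures a plain 60 Hz progressive display is already past fusion for either eye color; everything beyond that is headroom.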
Motion Blur:
Some people claim to be able to exceed the FFF's limitations, but in reality what they are sensing is the absence of something found in normal vision: motion blur.
Motion blur is part of our normal sensing of moving objects and is a side effect of the retinal cells' visual latency. The eye's sensing of motion blur can be best simulated by taking a photograph of a moving object with a stationary camera at an exposure of about 125 ms (1/8 second).
Think of this as temporal anti-aliasing; the purpose of which is to smooth out our visual perception.
Properly done motion blur is something that should be addressed by digital content creators, but unfortunately many don't do a very good job of it. This is why so many players feel the need for frame rates above the FFF. They know something doesn't "look right," but can't quantify it and presume they need more FPS to smooth out the game's "look." What they really need is a well-placed and well-timed blur.
Also unfortunate is that it's not always easy to achieve "properly done" motion blur in content, but this doesn't mean that hyperkinetic frame rates will help either. Our nervous systems just don't work fast enough for them to.
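For the curious, here's a toy sketch of that "temporal anti-aliasing" idea. It's my own illustration (no engine does it exactly this way): frames are plain lists of grey values, sub-frames are assumed to arrive at 60 Hz, and the "exposure" is the 1/8-second window from the camera analogy above.

```python
# Toy model of motion blur as temporal anti-aliasing: average the sub-frames
# that fall inside one ~125 ms "exposure", per the 1/8-second camera analogy.
def motion_blur(frames, exposure_ms=125.0, frame_ms=1000.0 / 60.0):
    """Average the frames that fit in one exposure window."""
    n = max(1, int(exposure_ms // frame_ms))  # sub-frames per exposure window
    window = frames[:n]
    # Average each pixel position across the frames in the window.
    return [sum(px) / len(window) for px in zip(*window)]

# A bright pixel moving one step per frame across a 5-pixel strip:
frames = [[255 if x == t else 0 for x in range(5)] for t in range(5)]
blurred = motion_blur(frames)
print(blurred)  # the bright spot smears evenly across the whole strip
```

Instead of a sharp dot jumping from cell to cell, the averaged frame shows an even smear, which is exactly what the retina's sluggish decay produces for a real moving object.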
Uppance:
Matrox is on the right path in pursuing visual quality issues. Chasing FPS past the human FFF for a given scene is beating a dead horse.
The creatures that really need high FFFs are birds. Pigeons have an FFF of ~146 Hz, but I doubt they'll be playing Doom III when it comes out. Let NVIDIA market to that segment :-))
PS:
The guys at Epic Games "get" this. They have flatly stated that having VSync turned off, thereby allowing a frame rate higher than the monitor's refresh rate, is pointless from an image quality point of view, and they recommend against it. Since most people are running refreshes of <100 Hz, the whole >100 fps issue becomes moot.
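A back-of-envelope sketch of why, using my own simplified model of plain double-buffered VSync (not Epic's code, and ignoring triple buffering): a finished frame has to wait for the next refresh tick, so the displayed rate is always the refresh rate divided by a whole number.

```python
import math

def vsync_fps(render_fps: float, refresh_hz: float) -> float:
    """Effective displayed fps with VSync on (simple double buffering)."""
    refresh_interval = 1.0 / refresh_hz
    render_time = 1.0 / render_fps
    # Each frame is shown on the first refresh tick after it finishes
    # rendering, so it occupies a whole number of refresh intervals.
    intervals = math.ceil(render_time / refresh_interval)
    return refresh_hz / intervals

print(vsync_fps(200, 85))  # rendering faster than refresh -> capped at 85.0
print(vsync_fps(70, 85))   # just misses each tick -> drops to 42.5
```

Two things fall out of this: rendering above the refresh rate buys you nothing (the extra frames are never displayed), and rendering slightly below it quantizes you down to the next divisor, which is why a 70 fps card on an 85 Hz monitor displays at 42.5.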
And don't forget that any benches you are reading about are being done with late alpha or neonatal beta drivers.
Dr. Mordrid
Last edited by Dr Mordrid; 6 June 2002, 08:21.
Dr. Mordrid
----------------------------
An elephant is a mouse built to government specifications.
I carry a gun because I can't throw a rock 1,250 fps