Maggi, luminosity may or may not be the reason for it.
I personally can recognize a difference between playing at 30 fps and 70 fps. There was an excellent page that explained why the human eye can detect the difference on a monitor. However, I do not recall the author referring to it as a problem of how much light is being emitted.
<font face="Verdana, Arial, Helvetica" size="2">Originally posted by az: Maggi: How come I notice flickering up to 85 Hz? (And I'm NOT bragging about it, it actually hurts my eyes to use my girlfriend's Monitor )
AZ</font>
You cannot really see it, but you can feel it.
I.e. your eyes get tired, because at 85 Hz every second line is drawn only 42.5 times per second, and below that even less.
<font face="Verdana, Arial, Helvetica" size="2">Originally posted by isochar: Maggi, luminosity may or may not be the reason for it.
I personally can recognize a difference between playing at 30 fps and 70 fps. There was an excellent page that explained why the human eye can detect the difference on a monitor. However, I do not recall the author referring to it as a problem of how much light is being emitted.</font>
It seems that the editor has since revised that article, because I cannot remember this bit at the very end:
<font face="Verdana, Arial, Helvetica" size="2">Editor: In recent view of several findings, the human visual system can detect very small amounts of light (or changes in light). The US Air Force has done tests with their pilots to see how responsive the visual system is. In one experiment a picture of an aircraft was flashed on a screen in a dark room at 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This is a very specific situation though, but it shows how sensitive to light our visual system is.
In terms of games though, that 72 fps number is a bit low. Many manufacturers are aiming at 85 fps for an acceptable frame rate, but are now concentrating on adding more quality to a scene. While many gamers are tweaking their systems for 150+ fps in games, that may actually be more of a mental adjustment in terms of their performance ("Q3A runs at 166 fps, so I obviously hold an edge over the other guy!"). Many industry entertainment tests (such as visual simulation) show that the sweet spot for humans to "distance themselves from reality" is in between 85 and 120 fps. The second article to this will more effectively show how anything over 100 fps will give exceptionally minimal improvements in visual quality or the "suspension of reality" effect that higher framerates give.</font>
Cheers,
Maggi
Despite my nickname causing confusion, I am not female ...
<font face="Verdana, Arial, Helvetica" size="2">Originally posted by Maggi: your eyes get tired, because @ 85Hz each 2nd line is drawn 42.5 times per second and below it even less.</font>
Aren't you talking about interlacing here?
As for games and framerates, I read something a while back that made a lot of sense (to me at least). A movie runs smoothly at 24 fps, while a game runs very poorly at 24 fps. This is because a movie is captured with a shutter speed of about 1/24 sec, thereby blurring each image slightly, while a game takes still shots with an effectively infinite shutter speed, thereby giving the well-known jerkiness of a game running at low fps.
3dfx did try to compensate for this with their motion blur feature; I've never seen it in action, so I have no idea how effective it is.
However, if an accumulation buffer stored all the frames created and only displayed a frame every 1/24th (or 1/30th) of a second containing the combination of all the frames calculated, it should be able to mimic the effect you get in a movie. IMHO, anyway.
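Something like this, maybe; this is just a rough, untested sketch of the idea using the old OpenGL accumulation buffer, and render_scene() is a made-up helper that draws the world at a given time:

/* Rough sketch of accumulation-buffer motion blur (legacy OpenGL).
 * Assumes the GL context was created with an accumulation buffer. */
#include <GL/gl.h>

extern void render_scene(double t);   /* hypothetical: draws the world at time t */

void render_blurred_frame(double frame_start, int sub_frames)
{
    const double exposure = 1.0 / 24.0;            /* one "film" frame */
    glClear(GL_ACCUM_BUFFER_BIT);
    for (int i = 0; i < sub_frames; ++i) {
        /* render an intermediate moment inside the exposure window */
        render_scene(frame_start + (exposure * i) / sub_frames);
        glAccum(GL_ACCUM, 1.0f / sub_frames);      /* add this sub-frame, weighted */
    }
    glAccum(GL_RETURN, 1.0f);                      /* write the blended result back */
    /* then swap buffers as usual (SwapBuffers / glXSwapBuffers / SDL, etc.) */
}

The obvious catch is that you still have to render all those intermediate frames, so you only gain smoothness, not speed.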
"That's right fool! Now I'm a flying talking donkey!"
P4 2.66, 512 MB PC2700, ATI Radeon 9000, Seagate Barracuda IV 80 GB, Acer AL732 17" TFT
A little OT, but I find it humorous to see what happens when you view a computer screen through a camcorder. That flickering shows up no matter what! Anyone care to explain this phenomenon?
Yes, it has to do with the refresh rate and the camera's shutter rate not matching each other. If you watch a professionally done video, you won't see the scan line or the flickering: they use a special setup with a connection box that locks the shutter speed and the refresh rate to each other exactly. If it were out of sync even a bit, you would see blank frames.
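Just to put rough numbers on it (the rates below are only examples, not anything from this thread): the dark band you see beats at the difference between the monitor's refresh rate and the nearest multiple of the camera's frame rate, roughly like this:

/* Back-of-the-envelope: how fast the dark band appears to roll or flicker
 * when an unsynchronised camera films a CRT. Example rates only. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double camera_fps = 29.97;                  /* e.g. an NTSC camcorder */
    const double monitor_hz[] = { 60.0, 75.0, 85.0 }; /* example refresh rates */

    for (int i = 0; i < 3; ++i) {
        /* nearest whole number of refreshes per camera frame */
        double n = round(monitor_hz[i] / camera_fps);
        double beat = fabs(monitor_hz[i] - n * camera_fps);
        printf("%.0f Hz monitor, %.2f fps camera -> band beats at roughly %.2f Hz\n",
               monitor_hz[i], camera_fps, beat);
    }
    return 0;
}

At 60 Hz the beat works out to only a few hundredths of a Hz, which is why the band crawls so slowly on ordinary camcorder footage.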
<font face="Verdana, Arial, Helvetica" size="2">Originally posted by az: Maggi: But why can I see flickering? I can tell you if a Monitor uses 60, 75, or 85Hz with a white background just by looking at it...
AZ</font>
Didn't you understand it?
At 60 Hz, every second line is drawn only 30 times per second, thus you can see it flickering.
From 85 Hz upwards it gets harder for the eye to distinguish between those alternating lines.
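Spelled out as a quick throwaway calculation; note that this arithmetic assumes an interlaced scan, where odd and even lines alternate between passes, so any given line is only redrawn on every second refresh:

/* Quick check of the per-line redraw rates quoted above,
 * assuming an interlaced scan (each line drawn every second refresh). */
#include <stdio.h>

int main(void)
{
    const double refresh_hz[] = { 60.0, 75.0, 85.0 };
    for (int i = 0; i < 3; ++i) {
        printf("%.0f Hz refresh -> each line redrawn %.1f times per second\n",
               refresh_hz[i], refresh_hz[i] / 2.0);
    }
    return 0;
}

If the scan is progressive instead, every line is redrawn on every refresh, so the halving doesn't apply.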
Despite my nickname causing confusion, I am not female ...
<font face="Verdana, Arial, Helvetica" size="2">do you think that VGA monitors don't use interlacing?</font>
Yes, I was under the impression that they didn't. I remember that, some years back, the biggest rave when looking at the brand new 14" monitors (!) was whether or not they could display 1024x768 non-interlaced.
I distinctly remember that the one I had could only display 1024x768 at 87 Hz interlaced, which looked awful, although that might just be because the monitor was displaying a resolution far beyond its actual number of pixels.
"That's right fool! Now I'm a flying talking donkey!"
P4 2.66, 512 MB PC2700, ATI Radeon 9000, Seagate Barracuda IV 80 GB, Acer AL732 17" TFT
I've NEVER seen something like this... some guy's review gets trashed and not only is he not mad, he comes here and apologizes... well, kudos to Nathan "nza" Davison for that...
As for the review, well, it's normal for a GeForce to beat the Kyro in 3DMark2001... we all know it relies heavily on T&L, which the Kyro doesn't have; also, let's not forget the drivers... damn, the Kyro isn't even for sale yet, and we all know how the first sets of drivers are for every card... so I would say that *RIGHT NOW* you'll be able to play games better with a GeForce rather than with a Kyro... but tomorrow... who knows?
And please, don't call me an nVidiot; I never owned, and maybe never will own, an nVidia card... (I've owned Cirrus Logic, S3, 3dfx and Matrox cards, and currently a Radeon 64MB DDR VIVO... waited too long for the G800... damn you, Matrox...)
My system:
PIII 450@558
Abit BE6
128 MB Hitachi PC100 RAM
OEM G400SH 16MB@160/180
(MGA Tweak Sys clk 360/2.25/2/2.25)
IBM 10 Gigger
SB Live! value
Hyundai 17" monitor
Oh yeah, PD6.10+TGL1.30+DX7.0A, all this on WIN98SE...