I'm doing HUGE testing of my G400 in D3D / OpenGL at different CPU / G400 frequencies. I'm about to finish the OpenGL/TurboGL runs (I'll post them tomorrow; very interesting results). What about D3D? I used 3DMarkMAX and... I hope you can reproduce this. Can you explain it?
1280x960x32, 16-bit Z : overall 3DMark score / Game 1 (fps) / Game 2 (fps)
P3 600 : 3720 / 35.3 / 39.3
P3 558 : 3986 / 37.8 / 42.8
P3 504 : 4393 / 41.9 / 46.2
P3 450 : 4818 / 46.7 / 49.7
P3 300 : 5718 / 60.8 / 54.1
The CPU's nominal speed is 450 MHz. The G400 is oc'd to 165/206 (at its standard clock the situation is the same). Since I know this is IMPOSSIBLE in real life, I took a quick look at NFS4: at 300 MHz it's MUCH slower than at 600 MHz.
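
To show just how backwards the scaling is, here is a minimal sketch (plain Python, numbers copied straight from the table above; it's only arithmetic on the posted results, not part of the benchmark itself) that compares relative clock against relative score:

# Sanity check on the numbers above: as the CPU clock goes down,
# the 3DMarkMAX score and Game 1/Game 2 fps go *up*, the opposite
# of normal CPU scaling. Data copied from the table in this post.
clocks = [600, 558, 504, 450, 300]          # P3 clock in MHz
scores = [3720, 3986, 4393, 4818, 5718]     # overall 3DMarkMAX result
game1  = [35.3, 37.8, 41.9, 46.7, 60.8]     # Game 1 fps
game2  = [39.3, 42.8, 46.2, 49.7, 54.1]     # Game 2 fps

base_mhz, base_score = clocks[0], scores[0]
for mhz, score, g1, g2 in zip(clocks, scores, game1, game2):
    rel_clock = mhz / base_mhz
    rel_score = score / base_score
    print(f"{mhz:3d} MHz: clock x{rel_clock:.2f} of 600 MHz, "
          f"score x{rel_score:.2f}, game1 {g1:.1f} fps, game2 {g2:.1f} fps")
# At 300 MHz (half the clock of the 600 MHz run) the score comes out
# about 1.54x higher, which is why it looks impossible for a real
# game like NFS4.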