I installed the 4.25 patch to test whether all the z-buffer problems would finally be gone.
All I get is a completely corrupted display... coloured lines walking over my screen and so on... luckily I know how to quit UT without seeing the menu.
So I am back at the Win2k desktop, but now there is screen corruption in Win2k too (not much, luckily). So I reboot and install the new beta d3ddrv.dll that Epic released.
Now I am able to start UT properly, with no more display corruption. I enable Use32BitZBuffer and start the game... STILL Z-BUFFER ERRORS (the 32-bit z-buffer IS enabled in my PowerDesk properties). So I read the 4.25 UT fixes on the website, and they include the following text:
- fixed D3D 32 bit zbuffer allocation
- The 'Use32BitZBuffer' switch in the advanced Direct3D options, which is disabled by default, will force the use of a 32-bit Z buffer even in 16-bit color display modes. This can be used to fix flickering world surfaces and similar visual corruption when running in 16-bit mode on several cards, like the Matrox G400
Then I think: WTF? Even in 32-bit colour, UT always gave me z-buffer corruption...
So I set the colour depth to 32-bit in UT, and I even enable 32-bit textures... and yes, again, STILL Z-BUFFER CORRUPTION!
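For reference, the same switches can also be set directly in UnrealTournament.ini instead of through the advanced options menu. This is a sketch from memory, so the exact section and key names may differ on your install; check your own ini before copying it:

```ini
; UnrealTournament.ini -- Direct3D renderer section (names assumed, verify locally)
[D3DDrv.D3DRenderDevice]
Use32BitZBuffer=True    ; force a 32-bit Z buffer, the switch the 4.25 notes describe
Use32BitTextures=True   ; 32-bit textures, if your card/driver supports them
```

Editing the ini with the game closed avoids the settings being overwritten when UT exits.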
Now I am wondering, WHAT DID EPIC FIX???
Or is it the new beta d3ddrv.dll that causes the z-buffer problem again, while the only version it is fixed in causes huge display corruption on my PC?
[This message has been edited by dZeus (edited 04 August 2000).]