Baldurs gate and Win2k PD5.51

Do you still have to use 'OpenGL optimize for accuracy' with these new drivers to fix the 16-bit texture stuff in 32-bit colour mode?
Err... I never had to do "optimize for accuracy". *shrug*
- Gurm
------------------
Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc. The Internet - where men are men, women are men, and teenage girls are FBI agents!
I'm the least you could do
If only life were as easy as you
I'm the least you could do, oh yeah
If only life were as easy as you
I would still get screwed
Nope. I never experienced the kind of corruption described there. I just had to force VSYNC to compensate for the flickering fade-ins.
- Gurm
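(For reference, the driver's force-VSYNC option does roughly what an application could request on its own through the WGL_EXT_swap_control extension. A minimal sketch in C, assuming an OpenGL rendering context is already current on the calling thread; this is purely illustrative and not taken from Baldur's Gate's own code:)

#include <windows.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

/* Try to enable vsync; returns 1 on success, 0 if the driver does not
   expose WGL_EXT_swap_control. Assumes an OpenGL rendering context is
   already current on this thread. */
static int enable_vsync(void)
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");

    if (!wglSwapIntervalEXT)
        return 0;  /* extension not available; fall back to the driver setting */

    /* An interval of 1 means SwapBuffers waits for one vertical retrace,
       which is what suppresses the flickering during fade-ins. */
    return wglSwapIntervalEXT(1) ? 1 : 0;
}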
Well, I don't know exactly what is going on in Baldur's Gate 2, but in Indy3D there certainly still is the bug where certain OpenGL texture modes are displayed at 16 bit instead of 32 bit unless 'optimize for accuracy' is used. I just tested it.
Of course I don't know what textures to look for in Baldur's Gate 2, because I don't have that title. The effect you see is that they look dithered when they shouldn't.
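(A guess at the mechanism behind that setting: when a game uploads textures with a generic internal format such as GL_RGBA, the OpenGL driver is free to choose the storage precision itself, and a 16-bit choice is exactly what shows up as dithering or banding; an explicitly sized format such as GL_RGBA8 asks for 8 bits per channel. A minimal sketch in C; the helper name and the checkerboard test data are made up for illustration:)

#include <windows.h>
#include <GL/gl.h>

/* Upload a texture with an explicitly sized internal format (GL_RGBA8)
   rather than the generic GL_RGBA, so the driver is asked for 8 bits per
   channel instead of being left free to store it at 16 bits per texel. */
static GLuint make_rgba8_texture(void)
{
    static GLubyte pixels[64 * 64 * 4];
    GLuint tex;
    int i;

    for (i = 0; i < 64 * 64; ++i) {
        GLubyte v = ((i / 64 + i % 64) & 1) ? 255 : 0;  /* checkerboard */
        pixels[i * 4 + 0] = v;
        pixels[i * 4 + 1] = v;
        pixels[i * 4 + 2] = v;
        pixels[i * 4 + 3] = 255;
    }

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* GL_RGBA8 (sized) instead of GL_RGBA (generic): the generic form lets
       a "performance" driver setting cut the texture to 16 bit. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 64, 64, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    return tex;
}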