Hi
I have just recently bought an ASUS-7100 GeForce2 MX with 32 MB SDRAM. My video card before that was a Matrox G200. To my eyes the Nvidia card seems to quantize colors more than the G200 did. This is especially visible with the Unreal engine (even in software rendering) in scenes that use volumetric lighting.
To investigate this further I used the drawing programs in StarOffice and MS Office. I created a rectangle filled with a gradient running from the brightest shade of a color down to black. The color bands are clearly visible, so it is possible to count how many of them there are. This can be done simply with the rulers: count how many color bands there are per unit of length, then scale that up to the full length of the rectangle.
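If anyone wants to reproduce this without eyeballing the rulers, here is a rough Python sketch of the same idea: take a screenshot of the gradient rectangle and count how often the pixel value changes along one row. The filename and the assumption of a clean horizontal single-color gradient are mine, not part of the original test.

    # Rough sketch: count distinct color bands along one row of a screenshot
    # of the gradient rectangle. Assumes a horizontal gradient saved as
    # "gradient.png" (hypothetical filename) and uses the Pillow library.
    from PIL import Image

    img = Image.open("gradient.png").convert("RGB")
    width, height = img.size
    row = [img.getpixel((x, height // 2)) for x in range(width)]

    # A new band starts wherever the pixel value changes along the row.
    bands = 1 + sum(1 for prev, cur in zip(row, row[1:]) if cur != prev)
    print(f"{bands} distinct bands across {width} pixels")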
The problem with the Nvidia card is that I only get about 130 gradient bands, while there really should be 256 of them per channel in 32-bit color mode. When using 16-bit color the values come out exactly as expected: 32 bands for blue and red and 64 for green. Are Nvidia using 7-bit DACs?!
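To spell out the arithmetic behind that suspicion (assuming the standard channel layouts, 8:8:8 for 32-bit and R5:G6:B5 for 16-bit):

    # Expected levels per channel for the usual bit layouts.
    depths = {"32-bit (8:8:8)": (8, 8, 8), "16-bit (5:6:5)": (5, 6, 5)}
    for name, (r, g, b) in depths.items():
        print(name, "->", 2**r, "red,", 2**g, "green,", 2**b, "blue levels")
    # 32-bit should give 256 levels per channel; 16-bit gives 32/64/32,
    # which matches what I measured. About 130 bands in 32-bit mode is
    # close to 2**7 = 128, hence the 7-bit suspicion.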
I have paid for a card that has TrueColor capabilities, not a fake TrueColor mode. Are Nvidia having us on? The method is primitive, but it clearly indicates that there are too few gradient bands in 32-bit mode.
The color quantization seems to be even worse in Direct3D. While the Matrox card renders colors beautifully and accurately, the Nvidia card's colors look grayish, as if there were a lot of smog between me and the screen, and in Direct3D they are even more grayish than on the desktop. If a game also supports software rendering, the color contrast becomes much better when using that.
I've had it with this card and will replace it in the near future. The problem is that I don't know yet which one to choose. Maybe I'll wait for the rumored G550.
I would very much like a second opinion on this, please.