This may sound like a stupid question to many of you, but it's one that has nagged me ever since this 32-bit rendering argument started.
I keep hearing over and over that 32-bit rendering is 16.7 million colors, but the math doesn't seem to support that. A few years ago, 24-bit was the standard desktop color depth, and 24-bit also gives 16.7 million colors: raise 2 to the power of 24 and you get 16.7 million (2^24 = 16,777,216 colors).
Now this seems very strange to me, because by the same math 32-bit should have a hell of a lot more colors than 24-bit (2^32 = 4,294,967,296 colors)!
So could someone please explain why 32-bit rendering only supports 16.7 million colors? And if that's the case, why even bother with 32-bit? Why not just use 24-bit? As far as I know, games would look perfectly fine in 24-bit color and probably run faster.
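Just to show the arithmetic I'm working from, here's a quick Python sketch that counts the possible values for each bit depth (the depths listed are only the ones mentioned above):

```python
# Sanity check: the number of representable values for a given color
# depth is 2 raised to the number of bits.
for bits in (24, 32):
    print(f"{bits}-bit: {2 ** bits:,} possible values")

# Output:
# 24-bit: 16,777,216 possible values
# 32-bit: 4,294,967,296 possible values
```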