OpenGL ICD only using 16 bit texel storage space?


  • OpenGL ICD only using 16 bit texel storage space?

    I just tried the Sense8 Indy3D v3.0 benchmark (http://www.sense8.com/support/indy3d.html) on the latest Matrox OpenGL ICD, and this test seems to indicate that the ICD is only storing textures using 16 bits.

    This results in noticeable banding in the textures in the image quality test. An example can be seen here: http://www.sense8.com/indy3d/Help/texcolspc2.jpg

    The test results report the following, which shows the driver exposing a 32-bit color pixel format:

    Graphics Driver: 1.1.3 Dec 15 2000
    Color Depth: 24 bits
    Z Buffer Depth: 32 bits
    OpenGL PFD: ID[15], Color[32], Z[32], DB[1], Stereo[0], Stencil[0], Alpha[0]

    Is anyone else seeing the same thing with these latest drivers?
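
    For anyone who wants to reproduce the check outside of Indy3D: below is a minimal sketch, using standard OpenGL 1.1 queries, of how a benchmark can detect the stored texel precision. Request 8 bits per channel, then ask the driver what it actually allocated; the function name and the 1x1 test texture are just illustration, and a current GL context is assumed.

    ```c
    #include <GL/gl.h>
    #include <stdio.h>

    /* Upload a tiny RGBA texture explicitly requesting 8 bits per
     * channel, then query how many bits the driver really stored.
     * A driver quietly falling back to 16-bit texels will typically
     * report 4 or 5 bits per channel here instead of 8. */
    void check_texel_storage(void)
    {
        GLubyte pixel[4] = { 255, 255, 255, 255 };
        GLuint tex;
        GLint r, g, b, a;

        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 1, 1, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixel);

        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &r);
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &g);
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &b);
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &a);

        printf("stored texel bits: R%d G%d B%d A%d\n", r, g, b, a);
        glDeleteTextures(1, &tex);
    }
    ```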

  • #2
    Well, I just tried Quake 3 with the latest Win2k drivers; it's using 32-bit textures.

    • #3
      This could explain the problem that a number of people experience with Baldur's Gate 2. I know this is a games issue, but I've noticed it in games as well as in Sense8.

      • #4
        Why are you using 24-bit color depth? You should be using 16 or 32.

        Rags

        • #5
          Yup, I've seen the same thing. G200, Win2k, 5.31 drivers, 32-bit desktop color.

          • #6
            Originally posted by Rags:
            Why are you using 24-bit color depth? You should be using 16 or 32.

            Rags

            You see, the program WAS actually using 32-bit color. The output says 24 bits of color because that is 8 red, 8 green and 8 blue, giving you 24 bits.

            The remaining 8 bits are the alpha channel.

            And yes, the desktop was set to 32-bit color >:P
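
            A minimal Win32 sketch of where those numbers come from, assuming hdc is a valid device context with a pixel format already selected (the function name is just illustrative): in a PIXELFORMATDESCRIPTOR, cColorBits counts only the R, G and B planes, while alpha is reported separately in cAlphaBits, so 24 + 8 = 32.

            ```c
            #include <windows.h>
            #include <stdio.h>

            /* Print how the current pixel format's bits break down:
             * 24 color bits (8R + 8G + 8B) plus 8 alpha bits add up
             * to 32 bits per pixel on a 32-bit desktop. */
            void report_pixel_format(HDC hdc)
            {
                PIXELFORMATDESCRIPTOR pfd;
                int fmt = GetPixelFormat(hdc);

                DescribePixelFormat(hdc, fmt, sizeof(pfd), &pfd);
                printf("color %d (R%d G%d B%d) + alpha %d = %d bpp\n",
                       pfd.cColorBits, pfd.cRedBits, pfd.cGreenBits,
                       pfd.cBlueBits, pfd.cAlphaBits,
                       pfd.cColorBits + pfd.cAlphaBits);
            }
            ```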

            • #7
              Hey Guys,
              We resolved these issues recently; it's actually a bit of both. To put it simply, it comes down to the way texture formats are enumerated in OpenGL: when you select 32-bit in Quake 3, the texture formats used are different from the ones in Baldur's Gate 2. To solve the problem, run the Matrox Tweak Util and select the option "Optimize for Accuracy" to enable the 32-bit texture format that the application is requesting; this will use true 32-bit textures in both applications.

              Hope that helps and happy new year from Matrox!
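
              A sketch of what that enumeration difference plausibly looks like at the API level, using standard OpenGL calls (the function and its parameters are illustrative, not Matrox's or id's actual code): Quake 3's 32-bit setting requests a sized internal format such as GL_RGBA8, while an application that passes the generic GL_RGBA leaves the texel precision up to the driver, which may pick a 16-bit format unless "Optimize for Accuracy" is set.

              ```c
              #include <GL/gl.h>

              /* Two ways an application can ask for a "32-bit" texture.
               * The generic format lets the driver choose the stored
               * precision; the sized format pins it to 8 bits per
               * channel. A current GL context and valid pixel data
               * are assumed. */
              void upload_rgba(const GLubyte *pixels, int w, int h,
                               int want_sized)
              {
                  GLint internal = want_sized
                      ? GL_RGBA8   /* explicit 8:8:8:8 */
                      : GL_RGBA;   /* "4 components", driver's choice */

                  glTexImage2D(GL_TEXTURE_2D, 0, internal, w, h, 0,
                               GL_RGBA, GL_UNSIGNED_BYTE, pixels);
              }
              ```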


              ------------------
              "The opinions or comments expressed by me do not necessary reflect those of Matrox Graphics Inc."
              "The opinions or comments expressed by me do not necessary reflect those of Matrox Graphics Inc."

              • #8
                Thanks R0M! That resolved the issue!

                • #9
                  This same problem is evident in the "Glaze" demo from Evans and Sutherland.

                  Same problem, so I'd assume the same workaround.

                  - Gurm

                  ------------------
                  Listen up, you primitive screwheads! See this? This is my BOOMSTICK! Etc. etc.
                  The Internet - where men are men, women are men, and teenage girls are FBI agents!

                  I'm the least you could do
                  If only life were as easy as you
                  I'm the least you could do, oh yeah
                  If only life were as easy as you
                  I would still get screwed
