Force 32 bit Zbuffer

  • Force 32 bit Zbuffer

    Is there any way to force a 32-bit Z-buffer, not just enable it?
    Or how do I get UT to use a 32-bit Z-buffer? It looks kind of ugly when I look at distant objects...

  • #2
    Unless you're using the BETA 5.50 drivers, you will have to live with this for now. It's a driver bug, and will be fixed soon.
    Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



    • #3
      I am using the beta 5.50 drivers, but I still have the problem.



      • #4
        Even if 32-bit Z is enabled, it is up to the application to choose to use it.

        If Unreal isn't choosing a 32-bit Z, then it is a problem with the game, not the drivers.
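
        To illustrate the point about the application choosing: under DirectX 7 (which UT's Direct3D renderer targets), the game enumerates the depth formats the driver exposes and attaches one itself. A minimal sketch, assuming the DirectX 7 SDK headers and an already-created IDirect3D7 interface; UT's actual renderer code is not public, so this is illustrative only:

        // Illustrative only: how a DX7 application picks its own Z-buffer format.
        #include <windows.h>
        #include <ddraw.h>
        #include <d3d.h>

        // Called once per depth format the driver supports.
        static HRESULT CALLBACK EnumZCallback(DDPIXELFORMAT* pddpf, void* pContext)
        {
            DDPIXELFORMAT* pChosen = static_cast<DDPIXELFORMAT*>(pContext);

            // Prefer a pure 32-bit Z-buffer (no stencil bits) if the driver offers one...
            if ((pddpf->dwFlags & DDPF_ZBUFFER) &&
                !(pddpf->dwFlags & DDPF_STENCILBUFFER) &&
                pddpf->dwZBufferBitDepth == 32)
            {
                *pChosen = *pddpf;
                return D3DENUMRET_CANCEL;   // found it, stop enumerating
            }

            // ...otherwise remember the first format seen as a fallback.
            if (pChosen->dwZBufferBitDepth == 0)
                *pChosen = *pddpf;

            return D3DENUMRET_OK;           // keep looking
        }

        // pD3D comes from elsewhere; deviceGuid is e.g. IID_IDirect3DHALDevice.
        DDPIXELFORMAT ChooseZFormat(IDirect3D7* pD3D, REFCLSID deviceGuid)
        {
            DDPIXELFORMAT zbuf;
            ZeroMemory(&zbuf, sizeof(zbuf));
            zbuf.dwSize = sizeof(zbuf);

            // The driver reports its depth formats through the callback above.
            pD3D->EnumZBufferFormats(deviceGuid, EnumZCallback, &zbuf);
            return zbuf;
        }

        If the driver never reports a 32-bit format in that callback, the game has nothing to pick; and if it does but the game still asks for 16-bit, no driver checkbox can override that.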



        • #5
          Yeah, and that's the problem: I want to force UT to use a 32-bit Z-buffer if possible.



          • #6
            The UT problem with distant objects and such is not a 16/32-bit Z-buffer issue... it's a bug in the Matrox drivers.

            But the latest UT patch, 4.05, seemed to make it better though...

            You'll just have to wait for Matrox to fix it.
            Primary System: ASUS P4B533-E, Intel Pentium4 1.6A GHz, 512MB Samsung PC2700, Leadtek GF4 Ti4200 64MB, SB Audigy, 2xSeagate Barracuda IV 80GB, Pioneer DVD 106S, NEC CD-R 40/10/40, InWin Q500 Case w/ Enermax 353W PSU, Windows XP Pro, Samsung SyncMaster 753DFX.

            Secondary System: ECS K7S5A, Athlon XP 1600+, 256MB PC133, Asus GF2 GTS 32MB, Seagate Barracuda IV 20GB, Aopen HQ08 Case, Windows XP Pro.



            • #7
              I've heard of other people seeing the same thing with other cards as well. I'm not so convinced it's a Matrox problem. If it were, why would a game patch make it better, as Edguy says...
              Core2 Duo E7500 2.93, Asus P5Q Pro Turbo, 4gig 1066 DDR2, 1gig Asus ENGTS250, SB X-Fi Gamer ,WD Caviar Black 1tb, Plextor PX-880SA, Dual Samsung 2494s



              • #8
                I haven't noticed it on any cards other than the Matrox G200 and G400... I've seen other bugs, but not this one.

                And I have noticed it in other games as well... only with the G200 and G400.

                And from what I've heard, Matrox is aware of the problem.
                Primary System: ASUS P4B533-E, Intel Pentium4 1.6A GHz, 512MB Samsung PC2700, Leadtek GF4 Ti4200 64MB, SB Audigy, 2xSeagate Barracuda IV 80GB, Pioneer DVD 106S, NEC CD-R 40/10/40, InWin Q500 Case w/ Enermax 353W PSU, Windows XP Pro, Samsung SyncMaster 753DFX.

                Secondary System: ECS K7S5A, Athlon XP 1600+, 256MB PC133, Asus GF2 GTS 32MB, Seagate Barracuda IV 20GB, Aopen HQ08 Case, Windows XP Pro.



                • #9
                  Yes, they are aware of it, but they have not clearly stated it's a driver issue. During their in-house testing, the same clipping errors were found on ATI cards. They are trying to find the source of the error.

                  But if it were a driver problem, why would a game patch make it better? That somewhat suggests it's in the game engine...
                  Core2 Duo E7500 2.93, Asus P5Q Pro Turbo, 4gig 1066 DDR2, 1gig Asus ENGTS250, SB X-Fi Gamer ,WD Caviar Black 1tb, Plextor PX-880SA, Dual Samsung 2494s



                  • #10
                    Odd... the 5.50 beta drivers fixed the 'Z-problem' for me. Am I the only one? (would be cute)

                    [This message has been edited by Scytale (edited 16 January 2000).]
                    P3@600 | Abit BH6 V1.01 NV | 256MB PC133 | G400MAX (EU,AGP2X) | Quantum Atlas 10K | Hitachi CDR-8330 | Diamond FirePort 40 | 3c905B-TX | TB Montego A3D(1) | IntelliMouse Explorer | Iiyama VisionMaster Pro 17 | Win2K/NT4



                    • #11
                      Yes, I have noticed that the 4.05 patch makes it a little bit better, but not that much. But I'm quite sure the problem lies within the drivers. A game patch can make it better by moving the far clipping plane closer to the viewer, which packs the available Z-buffer values into a smaller depth range. In other words: higher precision, but a smaller range (see the sketch below).

                      And if Mr. Scytale says beta 5.50 fixed his problem, that makes it more likely that it's the drivers' fault.
                      I'm going to do a complete reinstall of Windows soon (not because of this...), and then install the beta 5.50 drivers; I hope the problem disappears...
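
                      To put rough numbers on the precision-versus-range idea above, here is a small stand-alone sketch (my own illustration, not UT code). It uses a deliberately simplified linear depth mapping; real perspective Z is non-linear, so the gain from pulling the far plane in is smaller there, but the trade-off is easiest to see in the linear case:

                      // Illustration only: linear depth quantised to 16 bits shows the
                      // "higher precision, smaller range" trade when the far plane moves in.
                      #include <cstdio>

                      // Smallest eye-space distance that still changes the 16-bit value,
                      // assuming depth = (z - near) / (far - near) over 65536 levels.
                      static double DepthStep(double nearPlane, double farPlane)
                      {
                          return (farPlane - nearPlane) / 65535.0;
                      }

                      int main()
                      {
                          const double nearPlane = 1.0;                    // made-up planes
                          const double farPlanes[] = { 65536.0, 16384.0, 4096.0 };

                          for (double farPlane : farPlanes)
                              std::printf("far %7.0f -> one 16-bit step spans ~%.3f units\n",
                                          farPlane, DepthStep(nearPlane, farPlane));

                          // Surfaces closer together than one step share a depth value
                          // and can pop in front of each other at a distance.
                          return 0;
                      }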



                      • #12
                        PD 5.50 doesn't fix the Z-buffer problems in UT, sorry; some texturing errors are better, but that's about it. The problem with UT is that I don't think it uses a Z-buffer; I believe it uses a W-buffer (see the sketch below). That is why NVIDIA cards appear fine, while ATI and Matrox cards do not.

                        Hugh G.
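
                        For what it's worth, here is the mechanism Hugh is describing, sketched against the DirectX 7 API (headers and device creation assumed; whether UT actually does this is exactly what's in question):

                        // Sketch only: switch a DX7 device from Z- to W-buffering when the
                        // driver says it can. Drivers without the cap stay on the Z-buffer.
                        #include <windows.h>
                        #include <d3d.h>

                        bool EnableWBufferIfSupported(IDirect3DDevice7* pDevice)
                        {
                            D3DDEVICEDESC7 caps;
                            if (FAILED(pDevice->GetCaps(&caps)))
                                return false;

                            // The driver must explicitly advertise W-buffer support.
                            if (!(caps.dpcTriCaps.dwRasterCaps & D3DPRASTERCAPS_WBUFFER))
                                return false;

                            // Depth testing now uses W instead of Z.
                            pDevice->SetRenderState(D3DRENDERSTATE_ZENABLE, D3DZB_USEW);
                            return true;
                        }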



                        • #13
                          Hmmm... Hugh - I'm running UT at 16bpp and with 16-bit Z. With the 5.50 drivers distant objects no longer seem to pop in front of things, and the blinking textures we all know and hate are gone.

                          Except for the obvious 16-bit Z related imperfections, I'm not aware of any other Z related problem. If you can give me a pointer, I'll give it a closer look.
                          P3@600 | Abit BH6 V1.01 NV | 256MB PC133 | G400MAX (EU,AGP2X) | Quantum Atlas 10K | Hitachi CDR-8330 | Diamond FirePort 40 | 3c905B-TX | TB Montego A3D(1) | IntelliMouse Explorer | Iiyama VisionMaster Pro 17 | Win2K/NT4

