New G400 Driver

  • New G400 Driver

    I'd like to see a new G400 driver with a gaming version of the X...
    Sat on a pile of deads, I enjoy my oysters.

  • #2
    Speaking of drivers, I wondered if M is ever going to release a new driver for the G-series line after the launch of P, since they must be working hard to boost P's performance and fix the broken AF (I wonder if they will ever fix it, though). Many months ago, if I'm not mistaken, I heard M would eventually release a unified driver for P and the "new" G-series lines. I wonder if that will ever happen. It seems odd, because the architectures of the P and G-series chips are completely different. But hey... nVIDIA has unified drivers covering everything from the TNT2 to the GF4, though...

    Comment


    • #3
      Matrox has stated on their forums that there will be no further development on the Gxxx drivers (unless there is a glaring bug that needs to be fixed to make the card usable).
      All resources are now on Parhelia drivers.
      Core2 Duo E7500 2.93, Asus P5Q Pro Turbo, 4gig 1066 DDR2, 1gig Asus ENGTS250, SB X-Fi Gamer ,WD Caviar Black 1tb, Plextor PX-880SA, Dual Samsung 2494s

      Comment


      • #4
        Does this also mean that there won't be any TV-out support for the G450/550 in Linux, and that mgadriver-2.0 will forever remain beta?
        Loose bits sink chips.

        Comment


        • #6
          Yes, there are new drivers

          But what I was wondering is whether Matrox will give us G4xx drivers with a version of the Power of X for games...
          Sat on a pile of deads, I enjoy my oysters.

          Comment


          • #7
            Sorry, but I cannot follow you... what are you talking about, Drizzt?
            Despite my nickname causing confusion, I am not female ...

            ASRock Fatal1ty X79 Professional
            Intel Core i7-3930K@4.3GHz
            be quiet! Dark Rock Pro 2
            4x 8GB G.Skill TridentX PC3-19200U@CR1
            2x MSI N670GTX PE OC (SLI)
            OCZ Vertex 4 256GB
            4x2TB Seagate Barracuda Green 5900.3 (2x4TB RAID0)
            Super Flower Golden Green Modular 800W
            Nanoxia Deep Silence 1
            LG BH10LS38
            LG DM2752D 27" 3D

            Comment


            • #8
              The Power of X combines the best of both approaches to deliver peak video editing performance, at a very aggressive price point, without quality compromises.

              The Power of X, as implemented in Matrox RT.X10, now exploits the full power of the CPU for:
              * Ultra-fast DV decoding without quality compromises
              * Smooth slow and fast motion control
              * Colour correction

              The Power of X, as implemented in Matrox RT.X10, exploits the full power of dedicated hardware for:
              * Analogue input and output
              * Compositing and true 3D geometric effects
              * High-quality bi-cubic and anisotropic effects filtering
              This one?

              I don't think it will be released for the normal G400 cards.
              This sig is a shameless attempt to make my post look bigger.

              Comment


              • #9
                I think the same, but I have the right to hope for it!
                Sat on a pile of deads, I enjoy my oysters.

                Comment


                • #10
                  Er, you can't implement the Power of X like that. The Power of X means the CPU handles some features in software.

                  Example: The CPU can't suddenly add TnL to the card for instance...as TnL means moving transform and lighting from the CPU to the card. The whole point of graphics cards is to do the graphics stuff in hardware, rather than through the CPU in software. The power of X means that the X10x relies on the CPU to do some video stuff in software. It makes the X10/X100 cheaper, but less powerful.
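
                  For anyone wondering what "transform" work actually means when the card can't do it: the CPU ends up multiplying every vertex by a 4x4 matrix each frame, which is exactly the per-vertex grind hardware TnL was meant to take over. A tiny sketch of that arithmetic (just an illustration, not real driver code):

                  // software_transform.cpp - the per-vertex cost of doing transforms on the CPU
                  #include <cstdio>

                  struct Vec4 { float x, y, z, w; };

                  // Column-major 4x4 matrix times a vector: the core of a software transform path.
                  Vec4 transform(const float m[16], const Vec4& v) {
                      return {
                          m[0]*v.x + m[4]*v.y + m[8]*v.z  + m[12]*v.w,
                          m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13]*v.w,
                          m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14]*v.w,
                          m[3]*v.x + m[7]*v.y + m[11]*v.z + m[15]*v.w,
                      };
                  }

                  int main() {
                      // Identity matrix stands in for a full model-view-projection matrix.
                      float mvp[16] = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};
                      Vec4 v = {1.0f, 2.0f, 3.0f, 1.0f};
                      // A real scene repeats this for tens of thousands of vertices, every frame.
                      Vec4 out = transform(mvp, v);
                      printf("%.1f %.1f %.1f %.1f\n", out.x, out.y, out.z, out.w);
                      return 0;
                  }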

                  P.
                  Meet Jasmine.
                  flickr.com/photos/pace3000

                  Comment


                  • #11
                    Speaking of drivers: does anyone know what the new drivers fix/do?
                    Let those who want to be simple, be simple.

                    Comment


                    • #12
                      Originally posted by Pace
                      Er, you can't implement the Power of X like that. The Power of X means the CPU handles some features in software.

                      Example: The CPU can't suddenly add TnL to the card for instance...as TnL means moving transform and lighting from the CPU to the card. The whole point of graphics cards is to do the graphics stuff in hardware, rather than through the CPU in software. The power of X means that the X10x relies on the CPU to do some video stuff in software. It makes the X10/X100 cheaper, but less powerful.

                      P.
                      I don't agree with this.
                      Think about a game like the new Unreal. How much CPU time do you think it uses? Even with a demanding AI, I don't think it uses much more than 5-10% of it.
                      And think how much processor power goes to waste in a dual system like a dual 2000+...

                      That leaves a lot of room for other uses... and you could easily use it to improve the visual quality of games.
                      Sat on a pile of deads, I enjoy my oysters.

                      Comment


                      • #13
                        I think you're mistaken. It takes CPU to handle all that geometry, physics, AI, etc. Flight sims especially are still very CPU-limited in general.
                        Blah blah blah nick blah blah confusion, blah blah blah blah frog.

                        Comment


                        • #14
                          Sorry Drizzt, but UT2003 uses 100% CPU time on my XP2000+. So will any game I play (UT1, Q3) if I run it at a lower res. At low resolutions, almost all games are CPU-limited, so adding more work for the CPU is only going to slow them down. And once you get to high resolutions, it's the actual fillrate that slows you down, so offloading work from the graphics card won't help much there either.
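
                          To put rough numbers on that (made up for illustration, not measurements from any game): the frame takes as long as whichever of the CPU or the card is slower, so piling more onto the CPU only pays off if the card was the bottleneck.

                          // frame_model.cpp - toy model of CPU- vs fillrate-limited frames (illustrative numbers)
                          #include <algorithm>
                          #include <cstdio>

                          int main() {
                              // Hypothetical per-frame costs in milliseconds; not measured from any real game.
                              double cpu_ms      = 12.0;  // game logic, physics, AI, driver overhead
                              double gpu_low_ms  = 6.0;   // fill/raster cost at a low resolution
                              double gpu_high_ms = 25.0;  // fill/raster cost at a high resolution
                              double extra_cpu   = 4.0;   // extra work shifted from the card onto the CPU

                              // The frame is paced by whichever side is slower.
                              printf("low res,  card does the work:   %.1f ms\n", std::max(cpu_ms, gpu_low_ms));
                              printf("low res,  CPU takes extra work:  %.1f ms\n",
                                     std::max(cpu_ms + extra_cpu, gpu_low_ms));   // slower: the CPU was already the limit
                              printf("high res, card does the work:   %.1f ms\n", std::max(cpu_ms, gpu_high_ms));
                              printf("high res, CPU takes extra work:  %.1f ms\n",
                                     std::max(cpu_ms + extra_cpu, gpu_high_ms));  // no gain: fillrate is still the limit
                              return 0;
                          }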

                          Even in the realtime video editing scene, it'd be much faster if everything was done in dedicated hardware - just the cost would be prohibitive!

                          P.
                          Meet Jasmine.
                          flickr.com/photos/pace3000

                          Comment


                          • #15
                            Originally posted by Pace
                            Sorry Drizzt, but UT2003 uses 100% CPU time on my XP2000+. So will any game I play (UT1, Q3) if I run it at a lower res.
                            P.
                            Pace, if UT1 REALLY used 100% CPU time on an XP2000, it would need 200%+ of the CPU on a Duron 800, so the game couldn't be played on it at all. Yet it runs smoothly on my old processor, and even on a K6-III 400 (where it would then need more than 400%, or run at 1-2 FPS, which is not the case).


                            The reported CPU usage is ALWAYS 100% when you use DirectX, even if the processor is barely being used.
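
                            For what it's worth, the usual reason for that reading is the game's own main loop: most games of this era poll in a tight loop instead of blocking, so the OS counts the spinning as busy time whether or not there's real work. A rough, platform-neutral sketch of the difference (hypothetical frame pacing, not any game's actual loop):

                            // busy_vs_idle.cpp - why a polling game loop reads as 100% CPU even with little real work
                            #include <chrono>
                            #include <thread>

                            int main() {
                                using clock = std::chrono::steady_clock;
                                const auto frame = std::chrono::milliseconds(16);   // pretend 60 Hz frame budget

                                for (int i = 0; i < 300; ++i) {                      // ~5 seconds of "gameplay"
                                    auto deadline = clock::now() + frame;

                                    // (real per-frame work would go here; assume it finishes early)

                                    // Polling loop: spins until the next frame is due, so the OS reports ~100% CPU.
                                    while (clock::now() < deadline) { /* busy-wait */ }

                                    // The cooperative alternative: sleep instead of spin, and usage drops to near idle.
                                    // std::this_thread::sleep_until(deadline);
                                }
                                return 0;
                            }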
                            Sat on a pile of deads, I enjoy my oysters.

                            Comment
