NEW hardware from both Nvidia and 3dfx. Where the hell is Matrox?


  • #16
    www.3dforce.com

Here's a link to tons of GeForce 2, Voodoo 5, and ATI Radeon previews. I gathered them all up and stuck them in one place. I just hope Matrox's next-gen offering can compete. From what I've read, the GeForce 2 smokes. It does take a whopping hit in 32-bit color, but even then it's fast as hell.

    The Rock
    Bart



    • #17
Graphics chips' raw power is way ahead of video memory technology. DDR was the miracle solution for the GeForce, but now that Nvidia has raised the bar again, memory has become a severe bottleneck once more.
Let's face it, the most effective solution right now is several chips working in parallel, each with its own dedicated memory (VSA-100, anyone?). The quick bandwidth arithmetic sketched below shows why.

But I have a feeling Matrox will make a major breakthrough with the 512-bit quad-bus / 256-bit DDR SDRAM memory bus on the G800...
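
To put some rough numbers on that bandwidth argument, here is a minimal back-of-the-envelope sketch in Python. The 128-bit bus / 166 MHz DDR figures are the commonly quoted GeForce2 GTS memory specs; the 256-bit case is only the G800 rumour from this post, and the helper function itself is purely illustrative:

    # Peak memory bandwidth = (bus width in bytes) x (transfers per second),
    # where DDR moves data on both clock edges.
    def peak_bandwidth_gbs(bus_width_bits, clock_mhz, ddr=True):
        bytes_per_transfer = bus_width_bits / 8.0
        transfers_per_sec = clock_mhz * 1e6 * (2 if ddr else 1)
        return bytes_per_transfer * transfers_per_sec / 1e9

    print(peak_bandwidth_gbs(128, 166))  # GeForce2 GTS-class 128-bit DDR: ~5.3 GB/s
    print(peak_bandwidth_gbs(256, 166))  # rumoured 256-bit DDR bus: ~10.6 GB/s

At the same memory clock, doubling the bus width doubles the theoretical peak, which is the whole point of the wider-bus and multi-chip approaches being argued for here.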



      • #18
It does seem that bandwidth problems hamper performance at very high resolutions (think 1280x1024 32-bit and 1600x1200 32-bit), but just like on the existing GeForce, you'll be able to overclock the memory anyway. I got my DDR running at 340 MHz even though the card is only rated for 300, so I'm thinking that with the memory already clocked at 333 MHz (DDR), it can likely make it as high as 380 MHz. And not all card makers will ship their GeForce2s with the same memory speed, since apparently the ATI card will ship with 400 MHz DDR memory as standard (5 ns), which means 400 MHz DDR memory is available now... (the ns-to-MHz arithmetic is sketched below)
        note to self...

        Assumption is the mother of all f***ups....

Primary system:
P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum and SCSI all the way...
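
Since the post above juggles ns ratings and DDR "MHz" figures, here is a minimal sketch of the usual rule-of-thumb conversion. Only the 5 ns / 400 MHz pairing comes from the post; mapping 6 ns parts to the stock 333 MHz DDR speed is my assumption for illustration:

    # Rule of thumb: rated clock = 1 / cycle time, and DDR transfers on both
    # clock edges, so the marketing "MHz" figure is double the real clock.
    def ddr_effective_mhz(cycle_time_ns):
        real_clock_mhz = 1000.0 / cycle_time_ns   # e.g. 5 ns -> 200 MHz
        return 2 * real_clock_mhz                 # two transfers per clock

    print(ddr_effective_mhz(6))  # 6 ns parts: ~333 MHz DDR (assumed stock GeForce2 speed)
    print(ddr_effective_mhz(5))  # 5 ns parts: 400 MHz DDR (the ATI card's quoted spec)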



        • #19
          Sharky
          "The GeForce2 GTS you're going to need to spend $350+ on for the fastest 3D accelerator on the block. Further down the pipeline expect to see 64MB DDR versions of the GTS selling for a plump $450. So if you've got a TNT2, Voodoo3 or G400, now would probably be a good time to consider the plunge with the GTS. "

          Yea sure...

          How about this line on his site:
          "Matrox is a behind on the 3D scene. "

          A Behind??? Bwhahahahahahhaha http://www.sharkyextreme.com/hardwar...guide/27.shtml




          • #20
            I'd have to agree with Sharky. Matrox is behind the scene. (not "a" =) So is everyone else though. nVidia is calling the shots.

The G400 MAX was a TNT2 Ultra/V3 stomper. Doesn't really hold its own against a GeForce, though. The G800 is going to have to put up a good fight against the NV20/Rampage for it to be reviewed well.



            • #21
              Matrox better hurry up or they are going to miss their window. The G800 is probably going to be a nice card, but if it has to go head to head with the NV20...oh man. The NV20 is the architecture that is being used in the X-Box. The actual chip that is going into the X-Box is the NV25, but the NV25 is only a spring refresh of the NV20. See where I'm going with this? The NV20 is going to be something else...absolutely amazing. Matrox just needs to hurry the hell up!

              Sensei



              • #22
Hmm, I think that perhaps other companies are in too much of a hurry to put out cards and make a quick buck, and just maybe Matrox are wiser.

After all, at the moment D3D 7.x only supports the most basic T&L on static geometry, whilst 8 should contain a full implementation.

I would much rather have a really good yearly update than a semi-update every 6 months. I for one won't be getting a GeForce 2 or G450, or any kind of 3DFX card at all, in the near future. My opinion of 3DFX is very low, particularly as my main use for video is 3DS MAX at 1280x1024, often in 32-bit D3D.

I also use LightWave, which is a bitch, and even worse now that the GL rendering doesn't work at all with the latest drivers. LW 5.6, that is.

