G800/G450 April Announcements?


  • #16
    I think that Matrox is very different from the other companies. For starters, they are not publicly owned. Also, they make their own boards, as does ATI. It makes a huge difference in revenues. I think that this is why 3dfx started making their own boards too, but they have met a different level of success.

    Last time I checked, and that was a while back, Matrox was the second biggest video card manufacturer. Even if nvidia has 40% of the market, they only make about $30 per card sold. Matrox makes much more. And it's only recently that nvidia has gotten to 40% of the market. So maybe Matrox is still second. And let's not forget the other things like professional video and networking.

    As I said previously, Matrox is privately owned. So maybe they don't feel the pressure to grab 40% of the market. All they care about is profit, and I bet they make lots of it. Maybe they balance the number of sales against the costs of manufacturing and distribution to maximize their profits. Also note that their advertising budget is practically zero (we don't see too many ads from Matrox). nvidia, on the other hand, just tries to sell as many chips as possible and the card manufacturers take care of everything else (support, marketing, etc).

    Anyway, I feel like it sucks for me because I would rather get a Matrox card, but Matrox is happy to keep everything under NDA until the last second, at which point there is still more waiting until the product ships.

    Salmonius



    • #17

      First off, for kicks, I OCed my cpu by 25% and upped my g400 to Max speeds. The result was a 35% increase in framerates. So it does look like a PIII 600 or 700 would get me to playable frame rates. That would more than double the cpu speed and allow use of the SSE optimizations, which I have read increase frame rates by up to 25%. This would make a good short term solution, allowing me to ditch the V2 and still play Tribes and Tribes2 (when it comes out). Hopefully UT would run ok as well.
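      Rough back-of-the-envelope version of that estimate, as a sketch: the current frame rate, current clock, and SSE gain below are my own assumptions for illustration, not measurements, and it assumes the game stays mostly CPU-limited.

```python
# Estimate frame rate after a CPU upgrade, assuming a mostly CPU-limited game.
# My 25% overclock gave ~35% more fps, so scaling with CPU clock is at least
# roughly linear on this setup. All inputs here are assumptions, not specs.

current_fps = 25.0      # assumed current average in Tribes (illustrative)
current_clock = 300.0   # MHz, assumed current CPU clock
new_clock = 700.0       # MHz, a PIII 700
sse_gain = 1.25         # up to +25% from SSE-optimized code (figure I've read)

estimated_fps = current_fps * (new_clock / current_clock) * sse_gain
print(f"Estimated fps after upgrade: {estimated_fps:.0f}")   # ~73
```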

      If this works out I would then be in a position to wait till the G800 comes out. If it has twice the fill rate of the g450 in a single chip setup (something like 400MPS/800MTS), that would be acceptable. I hope it has FSAA as well as T&L (actually, it sounds like hardware clipping would be a real plus too). The FSAA screen shots I've seen for the v5 have been impressive.

      If the "twice the fill rate of the g450" rumor is true, I hope that it's true for a single chip. If it's only the case for the dual configuration, then the single chip G800 wouldn't be a good choice. (If it is true, that would put a dual chip G800 at ~800MPS/1600MTS, likely slower than the competition at that time, but damn fast!)
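      For what it's worth, here's the fill-rate arithmetic behind those numbers as a quick sketch; the G450 figures are rumors, not confirmed specs, and the dual-chip line assumes perfect two-chip scaling.

```python
# Rumored G450 single-chip rates (MPixels/s, MTexels/s) -- assumptions, not specs.
g450_mps, g450_mts = 200, 400

# "Twice the fill rate of the G450" for a single G800, then ideal 2x chip scaling.
single_g800 = (2 * g450_mps, 2 * g450_mts)
dual_g800 = (2 * single_g800[0], 2 * single_g800[1])

print("single-chip G800:", single_g800)   # (400, 800)   -> 400MPS/800MTS
print("dual-chip G800:  ", dual_g800)     # (800, 1600)  -> ~800MPS/1600MTS
```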

      I sure hope that Matrox releases preliminary specs for the G800 at WinHEC; it would sure make planning a lot easier.

      -AJ

      PS

      Although Matrox relies less on the gaming market than, say, 3dfx, it is still important for them to have a strong presence in all the major markets, not just in workstation graphics. The graphics industry is consolidating and they will need to be vertically integrated to survive.

      Everyone in the industry will need to look at their development cycles if Nvidia keeps the pace up. Honda and the other Japanese automakers put enormous pressure on US makers to reduce their redesign times (Honda was at 36 weeks while GM was at 60, and it was starting to hurt). The US automakers became more competitive (though not quite as good) and reaped the rewards of doing so. The same will need to occur at Matrox.

      So they don't have to match the leaders in every category, but they will need to be close (and that may be good enough for me, time will tell).

      Trying to figure out what Matrox is up to is like trying to find a road that's not on the map, at night, while wearing welder's goggles!



      • #18
        I love matrox cards, BUT I need more. I understand that matrox is geared more towards the whole OEM market. I just feel they aren't capitalizing on the gamer market. If a relatively unknown nvidia can come in and squash 3dfx, then I know matrox could do it. Too little marketing, no hype. And this is bad for the whole OEM strategy too. I'm just frustrated that matrox doesn't have the killer instinct that companies like nvidia, intel, and microsoft have. What they need is a new leader, someone from microsoft to come in and light some fires under some feet.
        Asus K7V
        Athlon 700
        128mb PC133 HSDRAM
        Matrox Millennium g400max
        Adaptec 2940U2W
        IBM 9gb U2W
        Plextor 8/20 cdr
        Diamond MX300
        3com 905b-tx



        • #19
          Am I the only one who has read the interview with Dan Wood at MURC (http://www.murc.ws)? Or am I just the only one who can read between the lines...
          Q: What's the baseline for a next-generation videocard?
          A: The year 2000 videocard will have at least four times the fill of a cutting-edge videocard today. Fill rates these days are well above 300 megapixel, so it's got to be well above a gigapixel; it's got to have support for all the features of DirectX 7; it's got to have some ability, of course, to do things like bump-mapping and multi-head display, and framerates have to be extremely high, along with color quality. You must support higher resolution textures. And I think there's a real interest in solving the anti-aliasing problem at the next step of the game.
          Now that pretty much describes to me what can be expected from G800. Note that AA is mentioned but no word on T&L.
          So reading that, I'd say the things new in the G800 compared to the G400 are:
          1) better overall performance
          2) better visual quality (if possible)
          3) FSAA


          B

          [This message has been edited by Buuri (edited 13 April 2000).]



          • #20
            Yo AJ,

            My computer has an Athlon 500@728 and my G400 is running 160/200 in Win2000Pro.

            UT runs perfectly with everything turned on at 800*600; 1024*768 needs some stuff turned off to keep it smooth. I have a Taxan 19" TCO99 monitor and don't mind playing in 800*600 in any game.

            Get a P3-600e, you can get them up to 800 reliably and that will work a treat.

            Check http://board.3dfiles.com for overclocking tips... that should keep you going until the big G's are released.
            Cheers,

            archangle



            • #21
              On what date was that article posted?

              I don't think he specifically mentioned T&L because it is supported in DX (and he said they needed to support _all_ the features of DX7). The cool thing is that the G800 is looking like it will be very feature rich!

              Reading other posts on the MURC, it sounds like the G800 will likely clock around 200-250MHz. If it has double the fill rate of the G450, then I'm guessing that they've added a second pipeline (and hopefully two extra texture blocks). So a single G800 would make a much better choice than a g450. If they have a good implementation for the two chip G800, it will be outstanding, but just looking at the price will likely burn a hole in my wallet ;-)
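              Spelling out the guesswork behind that pipeline estimate (the clock range and fill-rate target are rumors, and the model is simply fill rate ≈ clock × pixel pipelines):

```python
# If pixel fill rate ~= core clock * number of pixel pipelines, then hitting the
# rumored ~400 MPixel/s (double the G450) at 200-250 MHz points to ~2 pipelines.

target_fill = 400                     # MPixels/s, rumored single-chip G800 target
for clock in (200, 225, 250):         # MHz, rumored clock range
    pipes = target_fill / clock
    print(f"{clock} MHz -> {pipes:.1f} pixel pipelines needed")
```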

              I'm leaning more and more towards doing a cpu upgrade and waiting for the fall to check out the G800, NV20 and 3dfx Rampage. I'm not real fond of 3dfx because of driver problems I've had with my V2 (vid card lockups suck). NVidia will need to improve their 2d quality to catch my interest. So that leaves the G800 looking pretty good for me.

              Archangle, thanks for the tips. I'll likely go for a PIII 650-700 (depending on price) after the July price cuts. I don't like to keep my machines OCed, though I do experiment with OCing to gauge performance gains (ok, plus it's fun!). I seem to recall that on Athlon MBs the cpu bus is decoupled from the PCI and AGP clocks (unlike my BX board), so OCing is much safer.

              I also think that a PIII in that range will have enough power, given that the vid cards out this fall will be offloading a lot of work from the cpu. I wish I could pick up a Willamette (double-pumped SSE ought to rock in 3d apps!), but I'm sure they'll be starting at $800+ when they ship.

              -AJ


              Trying to figure out what Matrox is up to is like trying to find a road that's not on the map, at night, while wearing welder's goggles!



              • #22
                Well, the rumor mill seemed to mark T&L as a definite feature of the G800, and from the looks of the interview with Dan Wood, Matrox feels the anti-aliasing issue is an important factor in their next chip design. If Matrox can deliver, then it looks like they will have all of the features of the "Major Players" in the 3D gamers' market in their next chip, and we know they'd be nuts to give up on EMBM and DualHead, so this card is looking better by the minute. The final factor looks to be under control too; as Dan Wood puts it, "it's got to be well above a gigapixel." If Matrox can pull it off, we might see an upset in the market as gamers who realize what's going on sell their NV15's and voodoo5's and pick up a G800. I know this card sounds better than either of those chipsets, not only for the T&L and AA ability but also heavily on the DualHead and EMBM, which hopefully with a higher fillrate will play extremely smooth!!
                -Chris K.



                • #23
                  I think the G800 will be out in a time frame closer to that of the 3dfx Rampage and NV20, so the competition will be quite a bit stronger. It would be cool if you could do 4x FSAA on a single g800, but that would take something on the order of 1GPS/TPS, which would be possible with a clock running at 250MHz with 4 pipelines/texture units. That would be super cool. Add some decent T&L power (something on the order of what the Rampage will have) and the single chip would be a real good next gen purchase.
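                  Here's the rough arithmetic behind that 1GPS figure, as a sketch; the resolution, frame rate, and overdraw targets are my own assumptions, and it assumes supersampled FSAA (each output pixel costs N rendered samples).

```python
# Supersampled 4x FSAA renders 4 samples per output pixel, so it eats raw fill
# rate fast. All targets below are assumptions for illustration.

raw_fill = 250e6 * 4                   # 250 MHz * 4 pipelines = 1 GPixel/s raw
fsaa_samples = 4                       # 4x FSAA
width, height, fps = 1024, 768, 60     # assumed target resolution and frame rate
overdraw = 2.5                         # assumed average overdraw per frame

fill_needed = width * height * fps * overdraw * fsaa_samples
print(f"raw fill available: {raw_fill / 1e9:.2f} GPixel/s")     # 1.00
print(f"fill needed:        {fill_needed / 1e9:.2f} GPixel/s")  # ~0.47
```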

                  Recently, I ran some tests on a celeron 400 with a 16MB vanilla g400 and Tribes; at 640x480 I was getting frame rates in the mid 30s (with TurboGL). The more I think about it, a g450 (@ rumored specs) with a decent PIII will probably give really great frame rates, it just won't support all the eye candy features that the g800 will.

                  -AJ
                  Trying to figure out what Matrox is up to is like trying to find a road that's not on the map, at night, while wearing welder's goggles!



                  • #24
                    I bet the G800 will have vertex shaders and pixel shaders from the DirectX 8 spec. They would be foolish not to. Very cool stuff, being able to write your own pseudo-assembly that is converted to the chip's own microcode to do massive amounts of vertex perturbation or color manipulation...
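                    Just to illustrate the kind of per-vertex program that "pseudo-assembly" would express, here's a conceptual sketch in Python; it's not the actual DX8 shader language or anyone's microcode, just the sort of vertex perturbation such a shader would run on the chip instead of the CPU.

```python
import math

# Conceptual stand-in for a DX8-style vertex shader: displace each vertex along
# a sine wave based on its x position and the current time.

def perturb_vertex(x, y, z, time, amplitude=0.1, frequency=4.0):
    return (x, y, z + amplitude * math.sin(frequency * x + time))

mesh = [(0.0, 0.0, 0.0), (0.5, 0.0, 0.0), (1.0, 0.0, 0.0)]
animated = [perturb_vertex(x, y, z, time=1.0) for (x, y, z) in mesh]
print(animated)
```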

                    Photo-realistic 3D, here we come!!!

                    AlgoRhythm

