
GeForce FX is announced...


  • #46
    The problem is not the praise for Matrox (that's your choice) but your malcontent for everything else, which leads to the hilarity of this board. It brings me and many others back to see the most recent posts.

    I have owned plenty of Matrox cards, BTW.



    • #47
      Like what, for example? The GF FX is 4 months away? That's a fact. It doesn't have 48 GB/s of bandwidth? That's a fact. Previous GeForces don't have Matrox 2D quality? Fact. Can't do triple or even dual head as well as Matrox? Fact. Faster in 3D than Matrox? Fact. Malcontent? I don't want a GeForce unless it can do great 2D and multi-screen; that's taste and preference, not a vendetta against anything non-Matrox, just like NVIDIA fanboys don't want a card unless it gives them 5,000,000 fps in Quake 3 at 640x480.
      Is a flower best picked in its prime or better withered away by time?
      Talk about a dream, try to make it real.



      • #48
        The rumor is that the card will be available mid-January (first cards) to early February (in large quantities) in the $350-400 price range; there are also rumors about an Ultra version at around $500 (like the GF2 Ultra?) and about NV31 and NV34 lower-end cards.
        So if it's close to Radeon 9700 Pro pricing, I'd say it would be somewhat worth it (provided ATI doesn't go for the R350 by then).
        The only game that might justify the price of the R300, R350 and NV30 cards is Doom III, and the temptation of playing it with high AF and AA settings at 1024x768 or 1280x1024.

        As a price/performance card right now, I'd go for one of those Albatron 8X AGP GF4 Ti4200 Turbo boards. That card can overclock past Ti4600 speeds (305/730 MHz, reported by Guru3D) and is priced around $180 ($215 here).
        It would probably still manage decent fps in Doom III at 1024x768x32 with trilinear filtering, no AF or AA, and geometry and texture details on medium or high.



        • #49
          Originally posted by borat
          Like what, for example? The GF FX is 4 months away? That's a fact. It doesn't have 48 GB/s of bandwidth? That's a fact. Previous GeForces don't have Matrox 2D quality? Fact. Can't do triple or even dual head as well as Matrox? Fact. Faster in 3D than Matrox? Fact. Malcontent? I don't want a GeForce unless it can do great 2D and multi-screen; that's taste and preference, not a vendetta against anything non-Matrox, just like NVIDIA fanboys don't want a card unless it gives them 5,000,000 fps in Quake 3 at 640x480.
          Borat gets it bang on. Matrox still leads the way in 2d quality.
          Chief Lemon Buyer no more Linux sucks but not as much
          Weather nut and sad git.

          My Weather Page



          • #50
            When the pricing of the Radeon9700 non-Pro boards is in fact about GF4Ti4600-level, I think this is the most reasonable card to go for.

            Here we have to wait for the actual cards to become available - as with the currently nonexistent GeForce FX.
            But we named the *dog* Indiana...
            My System
            2nd System (not for Windows lovers )
            German ATI-forum



            • #51
              Originally posted by Indiana
              When the pricing of the Radeon9700 non-Pro boards is in fact about GF4Ti4600-level, I think this is the most reasonable card to go for.

              Here we have to wait for the actual cards to become available - as with the currently nonexistent GeForce FX.
              A Radeon 9700 would be nice, since it doesn't have its memory bandwidth cut in half like the 9500 cards. Still, it should be around $299, and since prices here are pretty much US prices plus our 19% VAT, that would make it about $355.
              That particular GeForce seems more appealing from where I stand.
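The VAT arithmetic above is just US price times (1 + VAT rate); a quick sketch, assuming the 19% German VAT mentioned in the post and ignoring shipping or retailer margin:

```python
def local_price(us_price, vat=0.19):
    """Rough local price: US price plus VAT (shipping/margin ignored)."""
    return us_price * (1 + vat)

# $299 US with 19% VAT on top
print(round(local_price(299)))  # -> 356, close to the ~$355 the post quotes
```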
              Last edited by Admiral; 19 November 2002, 10:47.



              • #52
                Afronaut: Did you have a point? If it was aimed at me, I'm afraid I missed it... could you perhaps clarify?

                borat: Nope, I don't have a Parhelia; my point rests on my experience with the G200 and G400. I have never seen Parhelia output, or R9700, so I won't compare them.

                Afronaut: Anything else?
                Meet Jasmine.
                flickr.com/photos/pace3000



                • #53
                  Afronaut NEVER has a point, except to try to belittle others.
                  Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                  • #54
                    Originally posted by chaoliang
                    Hi Novdid: How do you compare the 2D quality of 9700Pro and Matrox cards? (I don't know which ones you have used.) Thanks.
                    Well, it was a looong time ago that I used a Matrox board, but the 2D in my opinion sits somewhere between the Radeon 7500 (the same quality as the original Radeon) and the G400. I think the 2D is very acceptable, at least on the first head; I haven't tried out the second head yet. I have another HQ 19" here, so I'm more or less ready to test out the Win2K dual-screen capabilities (yes, it's supposed to support individual resolutions in 2K). Note that I got a BBA 9700 Pro; some boards manufactured by others than ATI, Sapphire and Hercules may have output of lesser quality.



                    • #55
                      That's exactly correct: the Radeon 9700 has 2D quality just above the original Radeon and still a bit below the G400.
                      But it's way above the Radeon 8500 and all GeForce cards I've seen so far, as well as well above a KyroII (Hercules).
                      But we named the *dog* Indiana...
                      My System
                      2nd System (not for Windows lovers )
                      German ATI-forum



                      • #56
                        And I do have an NVIDIA video card, and I know exactly its weaknesses and strong points. NVIDIA has a very strong image built basically on hype and on net fanboy propaganda. We're talking about a Ti 4400 here.

                        Driver quality is well below what a non-NVIDIA user might expect from all the praising of its drivers and especially all the bashing of ATI drivers. Their "official" releases are very scarce, the driver updates mostly come from "leaks", and despite really fixing some bugs, they have the strange habit of always screwing up a game or two. Matrox isn't the best example on drivers either; it's a shame that some 1999 G400 bugs still made their way into Parhelia (GLQuake dynamic lights). Everyone talks about Matrox not enabling more than 2x aniso, and if you want my opinion, what's the point? Even on a GF4 you really can't play many games at more than that without a huge performance dive. And I do think it's NOT a hardware problem but a driver issue, because they fixed it some time ago in OpenGL.

                        Fixing the 60Hz refresh rate in 3D (Win2k/XP) was only possible some weeks ago with the 41.xx NVIDIA drivers. It's been available since the beginning with Matrox PowerDesk. Creating custom resolutions is nearly impossible, while with MTSTU it's very simple.

                        Video playback and 2D quality aren't anything special. They also aren't as terrible as some report, but surely the G400 was the sharpest card on my desktop.

                        The whole point is that NVIDIA is in a position where all their products are seen as the latest and greatest and the big innovation of the year. When the R9700 was launched (and it's an excellent product) there wasn't a single review that didn't imply "well, it's a good product, but now that NV30 is just around the corner...". That was EXTREMELY unfair, because NV30 is no quantum leap over the R300, far from it; it actually plays in the very same league, with no more and no fewer features. It's just as exciting as the R300; heck, it's a DX9 card, and DX9 isn't out yet... People even complained about the power connector on the R300, how it would stress the power supply, what poor design it showed, how crazy it was to release the R300 on 0.15µ... well, the NV30 also has a power connector and I still haven't seen one negative comment about it.



                        • #57
                          Nuno, you're totally right. But now it's really easy to shut up most NVidiots: all you have to do is a search, and you'll most likely find posts bashing other gfx cards in the past for things that the NV30 does now as well.
                          There's not only the external power connector; think about all the (stupid and uninformed) bashing ATI took from the NVidiots for going to an 8x1 pipeline. Many of the ones now praising nView were trying to play down Matrox' DualHead earlier...


                          Still I have to say that I've yet to experience better OpenGL drivers than the ones from NVidia.
                          DirectX, now this is another case...
                          But we named the *dog* Indiana...
                          My System
                          2nd System (not for Windows lovers )
                          German ATI-forum



                          • #58
                            Originally posted by Nuno
                            The whole point is that NVIDIA is in a position where all their products are seen as the latest and greatest and the big innovation of the year. When the R9700 was launched (and it's an excellent product) there wasn't a single review that didn't imply "well, it's a good product, but now that NV30 is just around the corner...". That was EXTREMELY unfair, because NV30 is no quantum leap over the R300, far from it; it actually plays in the very same league, with no more and no fewer features. It's just as exciting as the R300; heck, it's a DX9 card, and DX9 isn't out yet... People even complained about the power connector on the R300, how it would stress the power supply, what poor design it showed, how crazy it was to release the R300 on 0.15µ... well, the NV30 also has a power connector and I still haven't seen one negative comment about it.
                            Couldn't agree more.

                            There are people I have spoken to in person who say that the next GeForce is the thing and that the ATI product is merely second-grade hardware. I remember that we mostly spoke about Doom 3, where I pointed out that I believe an R8500/GF4Ti and up will run the game very well, whereas he disagreed and said that NV30 is going to be "da bomb" just because some NV fanboys on Fragzone (a Swedish gaming site) said so. We had this conversation maybe six weeks ago, when the first rumours about NV30 started to appear, and the guy has no idea whatsoever what he's talking about.



                            • #59
                              In retrospect, NV's PR machine impresses me: not only are they good at doing some serious marketing, their fanboys manage to do much of it for free!



                              • #60
                                I am also one who sees the 9700 and the GF FX as being very close.
                                For one, it will be a while before DirectX 9 games make it out. Granted, the GF FX one-ups the 9700's graphics features by a bit, but considering the GF FX won't be out until after Christmas, I think the 9700 will be the level most programmers aim for as a standard. My example being that even though the 8500 was technically better than the GF3, both ran DirectX 8 games the same. Same with this new generation: anything above the DirectX 9 spec is just not going to be used except in demos.

                                ATI has said the 9700 can adopt DDR2 now if they want, and I'm sure its die shrink is already in the works. And considering it's already a 256-bit bus part, I think that gives it more room to grow than the GF FX. Also, the NVIDIA part has to ship with 1000 MHz DDR2 to be competitive with its 128-bit bus, and that has to cut hugely into their profit margins. And you know ATI will chop the 9700's price the second the NVIDIA part hits the street.
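The bus-width argument above can be made concrete with a rough peak-bandwidth calculation. This is only a sketch using the commonly quoted clocks of the time; note the 48 GB/s figure mentioned earlier in the thread was NVIDIA's "effective" number assuming compression, not the raw bus bandwidth:

```python
def peak_bandwidth_gbs(bus_width_bits, effective_mts):
    """Raw peak memory bandwidth in GB/s.

    bus_width_bits: memory bus width in bits
    effective_mts:  effective transfer rate in mega-transfers/s
                    (DDR/DDR2 transfer twice per clock)
    """
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_mts / 1000

# GeForce FX: 128-bit bus, 500 MHz DDR2 -> 1000 MT/s effective
print(peak_bandwidth_gbs(128, 1000))   # 16.0 GB/s raw

# Radeon 9700 Pro: 256-bit bus, ~310 MHz DDR -> 620 MT/s effective
print(peak_bandwidth_gbs(256, 620))    # 19.84 GB/s raw
```

So even with much faster (and pricier) DDR2, the 128-bit bus leaves the GF FX with less raw bandwidth than the 9700 Pro's 256-bit bus, which is the "room to grow" point above.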

                                Both do AA and AF amazingly, and any card that doesn't from this point on will not compete. There are no more fps guzzlers! The V3/GF2 days of stripped-down features for high fps are gone.

                                PS: Why do ATI and NVIDIA release their fastest card first and then release stripped-down models, while Matrox, SiS, PowerVR and others release a new card and then faster models come after? It seems a reversal of marketing strategy.
                                Oh my god MAGNUM!

