Ok Murcers GIVE UP!

This topic is closed.

  • #91
    I find it amusing that people are trying to peddle non-Matrox solutions to loyal Matrox fans. Why are these people so frustrated? Why can't they appreciate that some people can be happy with their current cards?

    Personally, my G400 Max is not the bottleneck in my gaming platform. I don't play many games at any rate. I do appreciate the other features of this card. I might consider a new graphics solution after I upgrade my CPU and network link. When I do go shopping for a new card I'll probably want a bit more FPS but image quality will be paramount. I'll also look for the ability to encode/decode HDTV and control that image. I've been waiting for someone to express what spoogenet posted ... not every gamer is interested in only games that benefit from a high FPS and not every user buys a 3D card for games.
    The world just changed, Sep. 11, 2001

    Comment


    • #92
      I don't know if you're referring to me, but I'm most definitely not trying to peddle anything to anyone. I just know that there are people out there who are waiting for the G800 to come out, and it just isn't going to be at all what you're expecting. Anyway, far be it from me to tell people how to spend their money. Wait away.

      Comment


      • #93
        nihilist:

        You sound like you know what the G800 will be like. Any particular reason why you would know?

        But see, speculation is just that...speculation. It doesn't imply any real accuracy. Personally, I haven't speculated anything on the G800. I could give a list of what I would like to see, but I quite frankly have no clue what it will be like, when, where, how, whatever. But I do know Matrox is hiring in Boca Raton, FL.

        Note: I don't ask to be an ass, I ask because I don't know, and I'm curious why you're so confident about it, that's all.

        Why do today what you can put off until tomorrow? But why put off until tomorrow what you can put off altogether?

        Comment


        • #94
          Hey spoogenet,

          nihilist doesn't know dick. At this point I would take anything you hear about the G800 with a grain of salt until Matrox makes an official statement. Sorry I can't say anything but you know how those NDAs can be.

          Joel



          [This message has been edited by Joel (edited 19 January 2001).]
          Libertarian is still the way to go if we truly want a real change.

          www.lp.org

          ******************************

          System Specs: AMD XP2000+ @1.68GHz(12.5x133), ASUS A7V133-C, 512MB PC133, Matrox Parhelia 128MB, SB Live! 5.1.
          OS: Windows XP Pro.
          Monitor: Cornerstone c1025 @ 1280x960 @85Hz.

          Comment


          • #95
            Ok, so the only companies you will buy video cards from are those who make their own cards? Aha, so you have considered the Radeon then? It has new technology (3D textures, quad vertex skinning, HyperZ, etc.), not the fastest of the pack, not N****, does that qualify? What am I thinking, of course not, the ATI MACH 64 had crappy drivers back in 1996 or something.

            (Note: I don't care what card anybody uses or when they want to upgrade. I still have a V3, who am I to judge? I just find it amusing to see the contradictions people put up to try to hide a simple brand preference. I'm sure if this were the Pepsi Users Resource Center, people would be talking about worker conditions in Coke canneries in Ethiopia as a reason they would never consider Coke.)

            [Was going to say coca-cola to avoid remarks about cocaine, but I don't know how to spell that. ]

            Comment


            • #96
              BTW, there is only one supplier of Kyro cards as far as I know, and they also happen to be the only maker of them: PowerColour. They don't make the chips, to my knowledge, but it's not exactly the same situation you get with N***** cards.

              Comment


              • #97
                yes, I would consider the Radeon after Matrox
                System 1:
                AMD 1.4 AYJHA-Y factory unlocked @ 1656 with Thermalright SK6 and 7k Delta fan
                Epox 8K7A
                2x256mb Micron pc-2100 DDR
                an AGP port all warmed up and ready to be stuffed full of Parhelia II+
                SBLIVE 5.1
                Maxtor 40g 7,200 @ ATA-100
                IBM 40GB 7,200 @ ATA-100
                Pinnacle DV Plus firewire
                3Com Hardware Modem
                Teac 20/10/40 burner
                Antec 350w power supply in a Colorcase 303usb Stainless

                New system: Under development

                Comment


                • #98
                  My G400 Max IS the gaming bottleneck in my system. But since it does everything else so well, I can forgive it (it's not like it was when I had it in a K6-3 system). And it's not really that it's bad at gaming, but compared to the effortless brawn of the GF2 and Radeon in 3D it is lagging a bit and at its limits. If it could magically go faster in 3D it would be the best card out, although the Radeon would still be close overall. So Matrox cards have only one real weakness, but with some more horsepower they would really be something else. At this point I would not recommend Matrox to a gamer, but I do recommend them to business users all the time.

                  Comment


                  • #99
                    Himself,

                    If you read what I posted carefully, I think I explained it fairly well. At any rate, I will reiterate.

                    The reason the T&L on the GeForce is not a compelling feature is that it is very limited in its capabilities. Everyone knows this. DX8 is the API built around T&L, but the GeForce was designed for DX7, and there is no real way for it to take advantage of the other nice features in DX8.
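
                    To put some rough code to that: the sketch below is written from memory of the Direct3D 8 interfaces, not a tested sample, and the function names are made up for illustration. With the fixed-function path you just hand the card matrices and lights; with the DX8 path the per-vertex work becomes a small program you assemble and download yourself.

                    #include <d3d8.h>
                    #include <d3dx8.h>   // D3DXAssembleShader comes from the D3DX utility library
                    #include <string.h>

                    // DX7-style fixed-function T&L: this is the path the GeForce's hardwired
                    // unit accelerates. Hand the pipeline matrices and lights; it does the rest.
                    void UseFixedFunctionTnL(IDirect3DDevice8* pDevice,
                                             const D3DMATRIX& world, const D3DMATRIX& view,
                                             const D3DMATRIX& proj, const D3DLIGHT8& light)
                    {
                        pDevice->SetTransform(D3DTS_WORLD,      &world);
                        pDevice->SetTransform(D3DTS_VIEW,       &view);
                        pDevice->SetTransform(D3DTS_PROJECTION, &proj);
                        pDevice->SetRenderState(D3DRS_LIGHTING, TRUE);
                        pDevice->SetLight(0, &light);
                        pDevice->LightEnable(0, TRUE);
                    }

                    // DX8 programmable path: the transform (and whatever lighting you want)
                    // is a tiny program you write and hand to the card. Error checks omitted.
                    DWORD UseVertexShaderTnL(IDirect3DDevice8* pDevice,
                                             const D3DMATRIX& wvpTransposed)
                    {
                        const char* szShader =
                            "vs.1.1             \n"
                            "dp4 oPos.x, v0, c0 \n"   // position dotted with the combined matrix rows
                            "dp4 oPos.y, v0, c1 \n"
                            "dp4 oPos.z, v0, c2 \n"
                            "dp4 oPos.w, v0, c3 \n"
                            "mov oD0, c4        \n";  // constant colour instead of fixed lighting
                                                      // (c4 would be loaded the same way as c0-c3)

                        LPD3DXBUFFER pCode = NULL;
                        D3DXAssembleShader(szShader, strlen(szShader), 0, NULL, &pCode, NULL);

                        DWORD dwDecl[] = { D3DVSD_STREAM(0),
                                           D3DVSD_REG(0, D3DVSDT_FLOAT3),   // one stream: vertex position -> v0
                                           D3DVSD_END() };

                        DWORD hShader = 0;
                        pDevice->CreateVertexShader(dwDecl, (DWORD*)pCode->GetBufferPointer(),
                                                    &hShader, 0);
                        pDevice->SetVertexShader(hShader);
                        pDevice->SetVertexShaderConstant(0, &wvpTransposed, 4);  // fills c0-c3
                        pCode->Release();
                        return hShader;
                    }

                    A GeForce or GeForce 2 can run the first path in hardware but not the second; shader programs either fail or fall back to CPU vertex processing on it, which is the limitation I'm talking about.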

                    If you want to see what I mean, stay tuned. There will be cards coming up that will smoke in DX8.

                    Rags

                    Comment


                    • I don't think TCL will be worth a damn for the next year, whether DX8 is fully implemented in cards or not. When games are designed for TCL and DX8, and it actually makes some visual difference or real speed difference, then is the time to think about whether your card can only do static TCL instead of programmable. You might even be able to afford one by then, too.

                      Comment


                      • 100 useless posts
                        System 1:
                        AMD 1.4 AYJHA-Y factory unlocked @ 1656 with Thermalright SK6 and 7k Delta fan
                        Epox 8K7A
                        2x256mb Micron pc-2100 DDR
                        an AGP port all warmed up and ready to be stuffed full of Parhelia II+
                        SBLIVE 5.1
                        Maxtor 40g 7,200 @ ATA-100
                        IBM 40GB 7,200 @ ATA-100
                        Pinnacle DV Plus firewire
                        3Com Hardware Modem
                        Teac 20/10/40 burner
                        Antec 350w power supply in a Colorcase 303usb Stainless

                        New system: Under development

                        Comment


                        • Does that make 882 useful posts then?
                          clatto verata nectKRHMM...

                          Comment


                          • closer to 2,000 due to the last server crash
                            System 1:
                            AMD 1.4 AYJHA-Y factory unlocked @ 1656 with Thermalright SK6 and 7k Delta fan
                            Epox 8K7A
                            2x256mb Micron pc-2100 DDR
                            an AGP port all warmed up and ready to be stuffed full of Parhelia II+
                            SBLIVE 5.1
                            Maxtor 40g 7,200 @ ATA-100
                            IBM 40GB 7,200 @ ATA-100
                            Pinnacle DV Plus firewire
                            3Com Hardware Modem
                            Teac 20/10/40 burner
                            Antec 350w power supply in a Colorcase 303usb Stainless

                            New system: Under development

                            Comment


                            • For the record...

                              The only real difference in T&L support between DX7 and DX8 is that the transform and lighting operations can be programmed and modified by the developer; in essence, T&L operations on upcoming DX8 video cards get their own sub-API, if you will.

                              That can't be done on GeForce 2 cards because their T&L support is hardwired into the chip and follows the DX7 standard very closely, so it's less flexible, and the developer has to optimize the T&L routines following Nvidia's guidelines for maximum performance.

                              That's why a lot of developers weren't/aren't too enthusiastic about supporting it in DX7: you'd essentially have to make two versions of the same game, one for T&L cards and one for non-T&L cards (right down to the game engine itself).

                              The actual calculations performed are the same regardless of API.

                              And I don't know about you guys, but the Unreal tech demos that showed outdoor scenes with over 100,000 polys per frame running on a GeForce card in real time at over 30 fps looked pretty impressive to me.

                              That's roughly 5 to 6 times more than any currently shipping game, so while you can say that the GeForce's T&L unit is less flexible than upcoming video cards', it's far from useless.
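
                              The "two versions of the same game" problem looks roughly like this in code. This is just a sketch of the usual Direct3D 8 caps check, written from memory, with a made-up function name, so treat the details loosely:

                              #include <d3d8.h>

                              // Sketch: decide at device creation whether the card's T&L unit gets used
                              // at all, then the engine has to branch on the answer from here on out.
                              bool CreateRenderer(IDirect3D8* pD3D, HWND hWnd,
                                                  D3DPRESENT_PARAMETERS* pParams,
                                                  IDirect3DDevice8** ppDevice, bool* pbHardwareTnL)
                              {
                                  D3DCAPS8 caps;
                                  pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

                                  // Does the chip transform and light vertices itself (GeForce/Radeon style)?
                                  *pbHardwareTnL = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) != 0;

                                  DWORD dwBehaviour = *pbHardwareTnL
                                      ? D3DCREATE_HARDWARE_VERTEXPROCESSING   // let the card do the T&L work
                                      : D3DCREATE_SOFTWARE_VERTEXPROCESSING;  // the CPU does it (G400, Voodoo, ...)

                                  return SUCCEEDED(pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                                                      hWnd, dwBehaviour, pParams, ppDevice));
                              }

                              And that flag is only the start: a hardware T&L path wants big static vertex buffers and per-vertex lights, while a CPU path usually does its own lighting and hands the card vertices it has already transformed, which is why the split runs right down into the engine.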

                              Hope that clears up the confusion...

                              note to self...

                              Assumption is the mother of all f***ups....

                              Primary system :
                              P4 2.8 GHz, 1 gig DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum and SCSI all the way...

                              Comment


                              • I will go on the record as saying that the current GeForce's T&L unit is useless. Because it is. It has been proven time and again that unless a game is hard-coded to support T&L through and through, you will see NO difference between software and hardware T&L if you are running a fast enough processor. So I guess if you are into tree demos and 3DMark 2K, then sure, you have two uses.
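
                                If anyone wants to check that for themselves, the quick-and-dirty way is below. It is a sketch only, assuming a device created with D3DCREATE_MIXED_VERTEXPROCESSING, and RenderTestScene() is a made-up stand-in for whatever draws your benchmark scene:

                                #include <windows.h>   // GetTickCount
                                #include <d3d8.h>

                                void RenderTestScene(IDirect3DDevice8* pDevice);   // hypothetical: draw the scene and Present it

                                // With a mixed-vertex-processing device you can flip between the card's
                                // T&L unit and the CPU path at runtime and time the same scene both ways.
                                DWORD TimeTnLPath(IDirect3DDevice8* pDevice, BOOL bSoftware)
                                {
                                    pDevice->SetRenderState(D3DRS_SOFTWAREVERTEXPROCESSING, bSoftware);

                                    DWORD dwStart = GetTickCount();
                                    for (int i = 0; i < 500; ++i)
                                        RenderTestScene(pDevice);
                                    return GetTickCount() - dwStart;
                                }

                                // DWORD swMs = TimeTnLPath(pDevice, TRUE);    // CPU transform & lighting
                                // DWORD hwMs = TimeTnLPath(pDevice, FALSE);   // the card's T&L unit
                                // On a fast CPU, in a game that was not built around T&L, the two numbers
                                // tend to come out about the same.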

                                Rags

                                Comment
