Parhelia!!!


  • Don't forget that the G400 came out when the TNT did, and held up against the TNT2, the TNT2 Ultra, and often the GeForce. For that period of time, the other companies were playing catch-up.
    Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



    • Quite true... but unfortunately for Matrox, that was nearly 3 years ago... A lot has changed since then, and it isn't getting any easier to develop high-end graphics cards, because the investment in R&D is huge...


      For instance, one of the reasons the GF3 was delayed was that, at the time, 0.15 micron production on the scale the GF3 required (with its 57 million transistors) wasn't quite ready yet, and the basic design was so complex (for its time, of course) that it kept breaking the validation tools needed to check that everything was operating properly... So what did Nvidia do? Simple: release higher-clocked versions of the GF2 GTS (the Pro and Ultra versions) while delaying the GF3's intro another six months to get all the kinks ironed out...



      Now the basic problem today is still the same, but instead of 0.15 micron it's the 0.13 micron process that isn't widely available yet (save for Intel, of course), and upcoming graphics chips will be using close to 80 million transistors, assuming of course they stay the same physical die size as today's GF3s and GF4s, with their 57 and 63 million transistors at 0.15 micron, respectively... (a rough scaling check is sketched after this post).



      So I'd like to believe that Matrox is able to make such a huge leap forward, but going from what are essentially DX6 cards (G400/450/550) directly to a full DX9 card is just a little too much to believe, even more so when, by Ant's own admission, the G550 is only a shadow of what the G800 was supposed to be...
      note to self...

      Assumption is the mother of all f***ups....

      Primary system :
      P4 2.8 ghz,1 gig DDR pc 2700(kingston),Radeon 9700(stock clock),audigy platinum and scsi all the way...
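
      A rough back-of-the-envelope check of that "close to 80 million" figure (editor's sketch only, assuming transistor density scales with the inverse square of the feature size and the die area stays the same; the 57M/63M counts and the 0.15/0.13 micron nodes are taken from the post above):

      # Die-scaling sketch (assumption: density ~ 1 / feature_size^2, same die area).
      def scaled_transistor_count(transistors, old_um, new_um):
          """Estimate how many transistors fit in the same die area on a finer process."""
          return transistors * (old_um / new_um) ** 2

      for name, count in [("GF3", 57e6), ("GF4", 63e6)]:
          millions = scaled_transistor_count(count, 0.15, 0.13) / 1e6
          print(f"{name}: ~{millions:.0f} million transistors' worth of area at 0.13 micron")
      # Prints roughly 76 and 84 million, which brackets the "close to 80 million" claim.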



      • superfly... how would it be possible for Matrox to release a full DX9 part in the middle of the summer (assuming it's released then) when DX9 won't be completed until the fall?

        I think you're assuming too much on one hand and not enough on the other. Guess we can call that a Royal Mother f*ckup of Assumption
        "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

        "Always do good. It will gratify some and astonish the rest." ~Mark Twain



        • Originally posted by superfly
          Quite true... but unfortunately for Matrox, that was nearly 3 years ago... A lot has changed since then, and it isn't getting any easier to develop high-end graphics cards, because the investment in R&D is huge...
          Three years, huh? Wow, Matrox tends to release a brand-new ass-kicking core about every 3 years.

          Yes, and nVidia was late with the GF3 because the tools weren't up to it....right.... There are/were plenty of things out there more complicated than a GF3, and they managed to get done.

          Originally posted by superfly
          Now the basic problem today is still the same, but instead of 0.15 micron it's the 0.13 micron process that isn't widely available yet (save for Intel, of course), and upcoming graphics chips will be using close to 80 million transistors, assuming of course they stay the same physical die size as today's GF3s and GF4s, with their 57 and 63 million transistors at 0.15 micron, respectively...
          And?
          Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



          • I remember back when nV was boasting that their 3D capabilities were so bloody awesome that their newly released GeFart was in the same category as SGI's cards... this was not only grossly untrue, but SGI took offence and slammed their arrogant little arses for it.

            LOL How they do forget

            Me thinks this time around SGI will nod their head welcoming Matrox into the field of Pro 3D.
            "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

            "Always do good. It will gratify some and astonish the rest." ~Mark Twain



            • Greebe... the specifications for DX9 are decided upon well in advance, so that once it's actually been fully tested and is available for release, chip makers can/will have products that can use those features...


              Currently, the DX9 spec is in early beta testing, so the feature list is finalised, and the target release date, assuming everything goes to plan, is this fall... That doesn't stop any given company from (potentially) releasing products that at least support some of its features a couple of months early.


              As far as the SGI comments go, that's just marketing hype that any company will boast; anyone with half a brain will simply treat it as that... hype.



              Wombat...

              All I'm saying is that going from relatively simple designs like the G400 series (with, I believe, about 10 million transistors) to something as complex as a partial/full DX9 part after a nearly 3-year hiatus, and bearing in mind what happened with the G800, as well as the employee poaching by Nvidia, doesn't help any when it comes to making a beyond-state-of-the-art (Vigilant's own words) chip and memory tech that can beat anything Nvidia and ATI have to offer.
              note to self...

              Assumption is the mother of all f***ups....

              Primary system :
              P4 2.8 ghz,1 gig DDR pc 2700(kingston),Radeon 9700(stock clock),audigy platinum and scsi all the way...



              • While Matrox's on-the-shelf products may have been out of the loop, I don't think their R&D products have, and Matrox have some damn fine R&D. On the face of it, it looks like Matrox have stood still since the G400, and yes, as far as on-the-shelf products go they have, but I feel confident saying their R&D has been on the ball and up to date, just let down by some technology transfer problems and certain marketing decisions dictating where Matrox were to focus. Hence we saw the G550 instead of the G800, which could have lived up to the hype.

                While nV did indeed poach several people from Matrox, I doubt that it had any effect on product development; it seems to have led to a technology transfer of sorts between the companies, but I'd presume that was old technology, i.e. DualHead.

                If there is any truth to Parhelia, which we can only wait and see, then I would doubt it would be a full DX9 part, but I would expect there to be some elements of it in there. As far as the memory technology goes, there is no need to be thinking of exotic forms of technology for the proclaimed bandwidth; it is quite easy to achieve. Just do a little homework and look at how Matrox have managed things in the past.

                Originally posted by superfly
                All I'm saying is that going from relatively simple designs like the G400 series (with, I believe, about 10 million transistors) to something as complex as a partial/full DX9 part after a nearly 3-year hiatus, and bearing in mind what happened with the G800, as well as the employee poaching by Nvidia, doesn't help any when it comes to making a beyond-state-of-the-art (Vigilant's own words) chip and memory tech that can beat anything Nvidia and ATI have to offer.



                • I find all of this talk about Matrox not being able to produce a card because they haven't in the past 3 years comical...

                  They have very complex cards in production now; I wouldn't think it's all that easy to produce an RT2500 with all those realtime video effects...

                  And if I do remember, when Matrox came out with the G200, no one thought it possible because the Mystique was so old and Matrox hadn't explored the "new" 3D... Matrox had their success with the Millennium series in the Pro and Business market... [Hmm, hauntingly familiar]
                  The first foray into 3D was the Mystique, which for its time was at the top of the game; then they tried to patch up their 3D support with an add-in card, à la 3dfx, with the M3D...
                  Then they came out with a 32-bit, full-featured 3D card that had features none of the other cards had...

                  Then we started the series of rehashing again... And we also got intro'd to nV's quick product cycle... Which is the ONLY reason for basing the argument of
                  "they've been too idle to do anything big like this"


                  As always with rumours, time will tell... Seems to me there is something in the queue that will replace my G400 nicely...



                  Craig
                  1.3 Tualatin @ 1600 - Watercooled, DangerDen waterblock, Eheim 1046 pump, 8x6x2 HeaterCore Radiator - Asus TUSL2C - 256 MB Corsair PC150 - G400 DH 32b SGR - IBM 20Gb 75GXP HDD - InWin A500



                  • Originally posted by superfly
                    So I'd like to believe that Matrox is able to make such a huge leap forward, but going from what are essentially DX6 cards (G400/450/550) directly to a full DX9 card is just a little too much to believe, even more so when, by Ant's own admission, the G550 is only a shadow of what the G800 was supposed to be...
                    Nobody said they haven't worked on DX7/DX8 cards before; they simply chose not to release them. I do believe that their R&D staff has learned something in those 3 years; they were probably busy making the G800/G550, and they probably learned something from that.

                    I wouldn't believe it if someone said: "they designed the G800/G550 and learned nothing about high-end 3D".
                    This sig is a shameless attempt to make my post look bigger.



                    • Of course I couldn't help posting on that Ars board. But seriously, these fools are out of it. Matrox's VGA card business is the smallest thing they deal with! It's more of a convenience, really, than anything else. After all, their DigiSuite cards are still very highly regarded in the NLE market, and the RT2500 is one of the most popular DV editing cards out there. Nvidia sells VGA cards. Same as ATI. Period. They do not sell $40,000 editing systems. To even think that Nvidia and ATI are superior companies because they make fast gaming cards is such a joke, I'm still laughing at it. Yes, they might have good gaming cards out there, but that's their only revenue source, and of course they are going to keep making new parts (every 6 months, costing everybody more money, but helping them stay afloat); due to their business structure that makes sense, but Matrox doesn't fit in this mold.

                      Oh whatever, I still can't believe the conceit of some of those people at the other board.

                      BTW Ant, please, even if Parhelia shows up, don't bother heading over, as I'm sure you'll be lambasted as some Matrox flunky who was just stringing them along, or some other illogical crap.
                      A computer is like sex. You're never 100% sure what you're doing, but when all goes well, it feels REAL good.



                      • Originally posted by Ant

                        If there is any truth to Parhelia, which we can only wait and see, then I would doubt it would be a full DX9 part, but I would expect there to be some elements of it in there. As far as the memory technology goes, there is no need to be thinking of exotic forms of technology for the proclaimed bandwidth; it is quite easy to achieve. Just do a little homework and look at how Matrox have managed things in the past.


                        I'd like to see the card myself, I really would... but until it's actually out and hopefully handing Nvidia their ass back (if the specs are real and the card lives up to its potential)... it still falls into the vaporware category...


                        Ant... regarding the memory bandwidth figures quoted, as far as I can see, the only ways to get that much bandwidth (~19 GB/sec) are to either use embedded memory (eDRAM), which was promptly shot down by Greebe, or to resort to an external 256-bit wide bus between main memory and the graphics chip itself, using 300 MHz DDR SDRAM (with a 256-bit wide bus, that dishes out just over 19 GB/sec; the arithmetic is sketched after this post)...


                        But building a card with a 256-bit wide bus is anything but easy or cheap, considering the extra traces needed on the PCB, which would make the board either physically larger or in need of more layers.

                        The timing issues with all those extra traces are one problem, the extra pins on the graphics chip are another, and memory granularity issues are yet another.
                        note to self...

                        Assumption is the mother of all f***ups....

                        Primary system :
                        P4 2.8 ghz,1 gig DDR pc 2700(kingston),Radeon 9700(stock clock),audigy platinum and scsi all the way...
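
                        A quick sketch of that bandwidth arithmetic (editor's illustration only; the 256-bit width and the 300 MHz DDR clock come from the post above, and the numbers are peak theoretical throughput with no efficiency losses assumed):

                        # Peak memory bandwidth for a given bus width and clock (DDR = 2 transfers per clock).
                        def peak_bandwidth_gb_s(bus_width_bits, clock_mhz, ddr=True):
                            transfers_per_sec = clock_mhz * 1e6 * (2 if ddr else 1)
                            return transfers_per_sec * (bus_width_bits / 8) / 1e9

                        print(peak_bandwidth_gb_s(256, 300))  # 19.2 GB/s -> the ~19 GB/s quoted above
                        print(peak_bandwidth_gb_s(128, 300))  # 9.6 GB/s  -> a conventional 128-bit bus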



                        • Our beloved Ant has posted at
                          The Tech-Report

                          and I also started a thread at
                          Anandtech

                          There are, surprisingly, quite a few G400 users at Anandtech.

                          amish
                          Despite my nickname causing confusion, I have no religious affiliations.



                          • Doesn't the GF4 Ti 4600 have a 256-bit bus?

                            AZ
                            There's an Opera in my macbook.



                            • No... the GF3 and GF4 series still use a 128-bit memory bus that's split into 4 separate 32-bit buses, each one with its own memory controller...


                              The main difference the GF4 adds is that, apart from the higher operating frequencies, each memory controller has a small cache added to it that acts much like the L2 caches found on current CPUs, which overall makes the GF4 about 10% more memory-bandwidth efficient than the GF3, even if both were running at the same speed... (a toy model of that claim follows this post).
                              note to self...

                              Assumption is the mother of all f***ups....

                              Primary system :
                              P4 2.8 ghz,1 gig DDR pc 2700(kingston),Radeon 9700(stock clock),audigy platinum and scsi all the way...
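
                              A toy model of that claim (editor's sketch only; the 4 x 32-bit split and the ~10% figure come from the post above, while the 230 MHz clock and the 0.6 baseline efficiency are purely illustrative assumptions):

                              # Four independent 32-bit DDR channels; the GF4's per-controller caches are
                              # modelled as a ~10% bump in achieved (not peak) bandwidth at the same clock.
                              CHANNELS, CHANNEL_BITS = 4, 32

                              def peak_gb_s(clock_mhz):
                                  return CHANNELS * CHANNEL_BITS / 8 * clock_mhz * 1e6 * 2 / 1e9

                              baseline_eff = 0.6                               # hypothetical GF3 efficiency
                              gf3 = peak_gb_s(230) * baseline_eff
                              gf4 = peak_gb_s(230) * baseline_eff * 1.10       # ~10% better, per the post
                              print(f"GF3 ~{gf3:.1f} GB/s vs GF4 ~{gf4:.1f} GB/s at the same clock")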



                              • Ant... judging by your silence regarding the 256-bit bus issue... should we take that as part of the real specs or not?
                                note to self...

                                Assumption is the mother of all f***ups....

                                Primary system :
                                P4 2.8 ghz,1 gig DDR pc 2700(kingston),Radeon 9700(stock clock),audigy platinum and scsi all the way...

