anyone else a little disappointed?

  • #16
    If you can turn all the bells and whistles on and have 3 displays running at once with playable framerates, then that is all I want. I do not need a high 3DMark score at settings I do not use. I really hope this card can implement all its features and display smooth graphics; if it can, I am sure it will move the goalposts for the graphics card industry, and Matrox will have scored the first goal.
    Is a flower best picked in its prime, or better withered away by time?
    Talk about a dream, try to make it real.



    • #17
      IF THEN. "End of the year" can slip to "Sometime NEXT year" faster than you can say "Pipefitters Union on strike!!"
      Exactly.

      I'll bet the same people who are saying "why didn't they do this on 0.13, add more features, crank up the MHz..." etc. would be the same ones saying "They should have released this 6 months ago!" if Matrox had done that.

      The bottom line is, there is a limit to how many transistors (read: features, MHz, etc.) you can cram into any given process size. At 80 million transistors and 250-350 MHz, Matrox is REALLY pushing the 0.15 micron process to its limits.
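To put the "pushing the process to its limits" point in numbers: to first order, die area for the same design scales with the square of the feature size. A back-of-the-envelope sketch (this is the standard rule of thumb with illustrative numbers, not Matrox figures):

```python
def relative_area(new_um: float, old_um: float = 0.15) -> float:
    """Area of the same design after a process shrink, relative to the
    old process; to first order, area scales with (feature size)^2."""
    return (new_um / old_um) ** 2

# A hypothetical 0.15 -> 0.13 micron shrink of the same design:
shrink = relative_area(0.13)
print(f"0.13 um die would be ~{shrink:.2f}x the 0.15 um area "
      f"(~{(1 - shrink) * 100:.0f}% smaller)")
```

By this rule of thumb, a straight shrink would save roughly a quarter of the die area, which is where the cost argument later in this thread comes from.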

      0.13 is not yet ready, and like it or not, the "high volume" vendors like ATI and nVidia will get first crack at it. That's just the way the business is. The Matroxes and 3D Labs of the world will be a product cycle or so later. If Matrox were targeting 0.13, I'd bet it would be closer to a year, rather than 6 months or "this fall" like many people think, before we'd see such a part.

      Unless you want Matrox to share the same fate as BitBoys...there comes a point when you have to ship a product. With DX8 fully mature, and DX8 titles finally starting to become the standard, now is the perfect time.

      The Parhelia exceeds my expectations for a 0.15 micron design. Matrox is setting a new standard for overall product quality and performance at that specification. They are pioneering the 256-bit DDR bus (along with 3D Labs). Gaming with AA and anisotropic filtering at high resolution should FINALLY be a reality with this part in the majority of games. DX9 compatibility would just be a waste of silicon right now. Real gamers know it. Ask anyone with the original GeForce3 if they aren't going to want to upgrade to something new this summer or fall, now that the DX8 titles the GeForce3 was supposed to excel at are finally starting to ship.
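For scale, the peak bandwidth of a 256-bit DDR bus is simple arithmetic. The 275 MHz memory clock below is my own assumption for illustration, not a quoted spec:

```python
def ddr_peak_gbs(bus_bits: int, clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s for a DDR bus:
    (bus width in bytes) x (2 transfers per clock) x (clock rate)."""
    return bus_bits / 8 * 2 * clock_mhz * 1e6 / 1e9

print(f"256-bit DDR @ 275 MHz: {ddr_peak_gbs(256, 275):.1f} GB/s")
print(f"128-bit DDR @ 275 MHz: {ddr_peak_gbs(128, 275):.1f} GB/s")
```

Doubling the bus width doubles peak bandwidth at the same clock, which is the whole appeal of a 256-bit bus for AA and anisotropic filtering at high resolution.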

      This is not to say that, assuming NV30 and R300 are fully DX9 compatible, they will be a "waste." It all depends on how much they improve the performance and quality of *existing DX8* titles, as well as providing DX9 functionality. It's just not possible to do on 0.15 microns.

      Note to all: don't expect Matrox to follow with a 0.13 part "soon." I would guess about one year.
      If a bear shits in the woods, and no one is there to smell it, does it stink?



      • #18
        Originally posted by Electric Amish
        That would have been nice, Jazzz, but the .13 process isn't up and running yet at the Fab. If they were going .13 they would have to wait for release until the end of the year.

        amish
        I thought Haig said it was possible for them to mass-produce .13 right now. Also, I think he said that they contract it out, so the fab they contract with might not have a .13 process set up.



        • #19
          I thought Haig said that it was possible to mass-produce 0.13 right now, not that Matrox could. And yes, it is possible; Intel does it with their CPUs. However, GPUs are larger, so it may not be possible yet.
          This sig is a shameless attempt to make my post look bigger.



          • #20
            I think Joe hit it pretty well.

            Also, .13 would have more hangups for Matrox right now than it would be worth. Intel is making CPUs at .13 right now, but they have an unusual advantage in the amount of communication they get between design and fab, something you can only expect if the same company is doing both.

            I think Matrox was smart to go for the .15. The complexity, wasted effort, and time delay of .13 would probably not have been worth it for them at this time.

            Edit: What makes you think that 130nm is cheaper?
            Last edited by Wombat; 14 May 2002, 11:56.
            Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



            • #21
              Originally posted by Wombat
              I think Joe hit it pretty well.

              Edit: What makes you think that 130nm is cheaper?
              I've always been under the impression .13 is cheaper. I have read CPU articles that usually say something like, e.g., "AMD's switch should be helpful, as the chips would cost less, consume less power, and overclock better." When Intel switched to 130nm, I also remember reading that 130nm is cheaper than 180nm, so I thought the same applied to graphics cards.



              • #22
                Jazzz
                Building new fabs costs vast amounts of money and takes time. Not only that, but I also believe it takes time to ramp up production. In a year's time, .13 will be a lot more readily available than it is now, and the price will come down.

                Regards MD
                Interests include:
                Computing, Reading, Pubs, Restaurants, Pubs, Curries, More Pubs and more Curries



                • #23
                  It depends. If you use 0.13 micron to put more features in the chip, things could even turn out to be more expensive. If you create the same core in 0.15 and 0.13, the 0.13 will be cheaper because of a smaller die size and probably better yield.
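The die-size/yield argument can be sketched with the classic Poisson yield model, Y = exp(-D * A). All numbers below (wafer size, die areas, defect density) are made-up illustrations, not real fab data:

```python
import math

def gross_dies(wafer_mm: float, die_mm2: float) -> int:
    """Crude gross die count: wafer area / die area (ignores edge loss)."""
    return int(math.pi * (wafer_mm / 2) ** 2 / die_mm2)

def poisson_yield(die_mm2: float, defects_per_cm2: float) -> float:
    """Poisson yield model: fraction of defect-free dies, Y = exp(-D*A)."""
    return math.exp(-defects_per_cm2 * die_mm2 / 100.0)  # mm^2 -> cm^2

# Same core on 0.15 vs. a ~25% smaller 0.13 die (hypothetical areas):
for die_mm2 in (180.0, 135.0):
    good = gross_dies(200, die_mm2) * poisson_yield(die_mm2, 0.5)
    print(f"{die_mm2:.0f} mm^2 die -> ~{good:.0f} good dies per wafer")
```

The smaller die wins twice: more candidates per wafer, and a higher survival rate per candidate. That is why a straight shrink of the same core comes out cheaper per good chip, while spending the shrink on extra features can eat the savings.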



                  • #24
                    I am not disappointed at all.
                    I really want this card NOW, not some fantastic 'catch-it-all' card in some distant (NOT-SO-SOON) future.
                    I am confident Parhelia will fit my needs.
                    IMHO Matrox is making a very impressive comeback.
                    Hati



                    • #25
                      Yield could go either way, but is likely to go down. .13 is new, and expensive. It *will* be cheaper than .15, but not for anything made in the near future. Also, it takes volume for those cost savings to pan out.



                      • #26
                        I'm not disappointed about the technical features of the card itself. There is a pile of future-proofing built into it, plus many features that other cards on the market don't have.

                        What I am really wondering is: what will this card be like in OpenGL? Everything I've read so far only hints at DX8.1 & 9...
                        ECS K7S5A Pro, Athlon XP 2100+, 512 Megs PC-3200 CAS2.5, HIS Radeon 9550/VIVO 256Meg DDR

                        Asus A7N8X-E Deluxe C Mobile Athlon 2500+ @ 2.2GHz, 1GB PC-3200 CAS2.5, Hauppauge MCE 150, Nvidia 6600 256DDR

                        Asus A8R32 MVP, Sempron 1600+ @ 2.23GHz, 1 Gig DDR2 RAM, ATI 1900GT



                        • #27
                          What the ****!

                          We get to see the specs for one hell of a card and people are pissing and moaning that it might not play quake at 1000fps.
                          If that’s all you want then there are other companies to choose from. If you want real innovation and something that will last more than six months then you should be smiling right now.

                          How many times have you read about people buying a new card, and one week later putting their old MATROX card back in because the new card's visual quality alone sucks? This is something I have read many times about other manufacturers, and I have had the experience myself. I found out that my computer can do other things than play Quake.

                          All I can say is “MATROX BRING IT ON”.



                          • #28
                            You go girl!
                            I'm with the ugly guy below me

                            (It's amazing how many threads I kill with that line )



                            • #29
                              I don't understand what the fuss is all about!!!!

                              Parhelia will kick major booty, and that's that! I don't, and never will, care what Tom or Anand has to say, because as far as I'm concerned none of them has actually seen the final product in action.

                              Wait, people, just wait for actual benchmarks, and then drool even more.

                              Cheers,
                              Elie



                              • #30
                                Can't say I'm disappointed at all.
                                I'm pushing harder than ever to finish my education just to get a job so I can pay for a Parhelia.
                                Can't really expect a card to fulfill every single wish out there and I'm really happy with what I've seen today!

                                But sure, there is a point in the reviews: we still have to see an actual card (I doubt it'll let anyone down, though).

                                /Flyke

