Parhelia needs .13 micron

  • #31
    Nvidia ISN'T shipping them cheaper. Their prices have gone up over time.

    Also, there isn't any 130nm ready out there. Nothing, nada. Intel has it, and they don't share. Nobody else is doing it well.

    The only way Parhelia could come out on 130nm would be to wait. Months. The process wouldn't be mature even then. Months after that, they'd be able to ship in enough volume to meet demand. This is also assuming it takes zero effort to port a design from one process to another, when it really takes a lot of effort and time. The only way Matrox could ship now, and in volume, was 150nm. It was the wise choice.
    Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



    • #32
      Smaller process gives more processor per wafer, thereby increasing yield while keeping the costs of raw materials the same.
      This isn't necessarily true, as Wombat has alluded to. AMD's TBred is so small that they are issuing special guidelines about which heatsinks to use. You're talking about an area the size of a pinky nail trying to dissipate 70 watts of heat here... it gets difficult, even WITH a heat spreader. I think the more difficult problem once you get to processes like .09 and .065 is making them practical. The process work of getting the traces that small is a real bitch, but then you find that after all that effort, the damn things have their heat concentrated into such a small area that it becomes impossible to run them at targeted speeds... now that's REALLY gotta suck. Si-28 and SOI are there to combat these problems to a degree, but things are gonna get pretty tight over the next few years at any rate. Chips really can't get much smaller unless they can be made to generate less heat intrinsically.
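
      Rough back-of-the-envelope numbers, purely illustrative (a sketch with made-up figures, not real measurements), just to show why a die shrink makes the cooling problem worse even when total power stays the same:

      # Illustrative Python sketch: power density before and after a hypothetical die shrink.
      def power_density(power_watts, die_area_mm2):
          # Watts per square millimetre the heatsink has to pull out of the die.
          return power_watts / die_area_mm2

      old_die = power_density(70.0, 120.0)  # ~70 W spread over a ~120 mm^2 die (made-up numbers)
      new_die = power_density(70.0, 80.0)   # the same ~70 W on a ~80 mm^2 die after a shrink

      print(f"before: {old_die:.2f} W/mm^2, after: {new_die:.2f} W/mm^2")
      # Same wattage, roughly 50% higher power density -- exactly the pinky-nail problem above.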



      • #33
        Well, this really has nothing to do with games, and as the talk has turned to a more general discussion of fab processes, this thread will now be moved to General Hardware.
        Core2 Duo E7500 2.93, Asus P5Q Pro Turbo, 4gig 1066 DDR2, 1gig Asus ENGTS250, SB X-Fi Gamer ,WD Caviar Black 1tb, Plextor PX-880SA, Dual Samsung 2494s



        • #34
          To make things harder, making chips smaller makes them generate *more* heat intrinsically.

          This may be nit-picking, but yield is generally considered the percentage of good die. If you go from 20 of 50 die being good to 30 of 100 die being good, your yield has gone *down*.
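
          To put numbers on that definition (a trivial sketch, nothing more):

          # Yield = good die / total die, expressed as a percentage.
          def yield_pct(good_die, total_die):
              return 100.0 * good_die / total_die

          print(yield_pct(20, 50))   # 40.0 -> 20 good out of 50
          print(yield_pct(30, 100))  # 30.0 -> more good die in absolute terms, but lower yield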
          Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



          • #35
            Originally posted by Kruzin
            Well, this really has nothing to do with games, and as the talk has turned to a more general discussion of fab processes, this thread will now be moved to General Hardware.
            I was wondering when this was going to be moved.


            What Wombat and KvHagedorn are saying is very true. The smaller you make things, the harder it is to dissipate the heat, and the faster you go, the more heat you generate. Work = heat.

            As to prices dropping, I would think that has more to do with current economic conditions than with moving to a .13 process.

            This may be nit-picking, but yield is generally considered the percentage of good die. If you go from 20 of 50 die being good to 30 of 100 die being good, your yield has gone *down*.
            Hit that one right on the head! Going to a .13 process increases the chance that die yield will go down. Smaller lines = a greater chance that a particle will short out the die. It just takes one in the right place.
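
            One way to picture it (a sketch using the textbook Poisson yield model with made-up defect densities, nothing specific to TSMC or Matrox):

            import math

            # Poisson yield model: Y = exp(-D * A)
            # D = killer-defect density (defects per mm^2), A = die area (mm^2).
            def poisson_yield(defect_density, die_area_mm2):
                return math.exp(-defect_density * die_area_mm2)

            mature_150nm = poisson_yield(0.0005, 120.0)  # mature process, larger die (illustrative numbers)
            young_130nm = poisson_yield(0.0015, 80.0)    # immature process, smaller die

            print(f"mature .15: {mature_150nm:.0%}  young .13: {young_130nm:.0%}")
            # Even though the .13 die is smaller, the higher defect density of a young
            # process can leave it with the worse yield.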

            Oboy
            Time to make the wafers!
            Oboy Inside!

            Intel P4 2.26 @ 2.957GHz

            "Life isn't like a box of chocolates...it's more like a jar of
            jalapenos. What you do today, might burn your ass tomorrow."



            • #36
              Soory, your wrong

              "Also, there isn't any 130nm ready out there. Nothing, nada. Intel has it, and they don't share. Nobody else is doing it well"

              Not true, my friend. TSMC has their .13 micron process up. How do I know, you ask? Simple: I have tested and used the NV30. It was/is built on TSMC's .13 micron process.

              Then there is the statement about Nvidia's chips costing video card makers more money. They don't. The TNTs used to set companies back around 60 bucks per chip; the GF4s are in the neighborhood of 35-40 bucks per chip.



              • #37
                Cool Wombat!!!

                We have something else in common. Our fab down here is the sole source for McKinley.

                Paul
                "Never interfere with the enemy when he is in the process of destroying himself"



                • #38
                  Not true, my friend. TSMC has their .13 micron process up. How do I know, you ask? Simple: I have tested and used the NV30. It was/is built on TSMC's .13 micron process.
                  And the NV30 is out in bulk *when*? Thought so. Notice I said that nobody else is doing it well. I'm sure they can get a couple of parts working just by luck. "Soory" to you. I also left out AMD. Why? Because they're doing a shit job with it and everybody knows it.
                  Last edited by Wombat; 22 June 2002, 00:47.
                  Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



                  • #39
                    Logan,

                    Mind sharing any info, from what you had a chance to see?

                    Is it a monster? Was their CEO just talkin' smack when he stated that this thing would be the biggest/greatest contribution to the field of 3D graphics?



                    • #40
                      Originally posted by [Ch]amsalot
                      RedRed,

                      Thank you too for the welcome. I've been the lead moderator at the MaximumPC Magazine Commport forums for several years, so dealing with people of all personalities goes with the territory. However, as you can guess by my willingness to use an NVIDIA logo on a Matrox forum, I won't let others push me around, and I back up my "NSHO's" with facts when my hand is called.

                      Perhaps Wombat and I will earn each others' respect, perhaps not. But seeing as he has good taste in hardware components he can't be too bad.

                      Lest anyone get confused, I'm no NVIDIA fanboy. I've had cards from just about every major manufacturer, including Matrox, ATI, NVIDIA, and 3dfx. I'm using the NVIDIA logo because that's what I have in my current rig.

                      -[Ch]amsalot
                      First, welcome to the forums. I hope you enjoy it here as much as the rest of us who call it home.

                      Second, why would showing an nVidia logo be intimidating, show your resilience, or show anything other than that you may want attention?

                      In forums, you don't show resilience by displaying an unpopular logo; you earn respect through good debate.

                      I will address your first post later on, and maybe you can enlighten me with some good debate.

                      Rags



                      • #41
                        Originally posted by [Ch]amsalot
                        You see the world differently from me. I see the end results... Intel, AMD, and NVIDIA shipping out faster procs cheaper than ever before on a smaller process. I think to myself, if NVIDIA can do it, why not Matrox?

                        So, not to beat a dead horse, but why not Matrox? Politics? Limited access to .13 micron fabs? Money? I find it difficult to believe that the next gen parts from ATI and NVIDIA won't be based upon .13 micron.

                        -[Ch]ams
                        Because Matrox decided to get their next gen product to market on time. They needed it to get out, and it's worth the wait.

                        I am sure they will move to the next process "when they are ready".


                        Rags



                        • #42
                          Re: Soory, your wrong

                          Originally posted by Logan
                          "Also, there isn't any 130nm ready out there. Nothing, nada. Intel has it, and they don't share. Nobody else is doing it well"

                          Not true, my friend. TSMC has their .13 micron process up. How do I know, you ask? Simple: I have tested and used the NV30. It was/is built on TSMC's .13 micron process.

                          Then there is the statement about Nvidia's chips costing video card makers more money. They don't. The TNTs used to set companies back around 60 bucks per chip; the GF4s are in the neighborhood of 35-40 bucks per chip.
                          It's obvious you have a reading problem. TSMC does NOT have the process down well enough to make it viable cost-wise yet. I am sure they will get it refined, but that takes time, and that time is definitely not now.

                          Rags



                          • #43
                            Re: Parhelia needs .13 micron


                            My question should be taken literally.
                            Then my answer should be taken as such too.

                            At $399 for a 128MB card, Matrox is effectively keeping the Parhelia out of the hands of the masses and even out of the hands of most die-hard gamers.
                            I don't think Matrox is aiming for the masses. I am certain they will cover the mass market with a different product. This is not a toy; it is for people who are serious about their graphics.

                            the price is not acceptable. Fine image quality and acceptable framerates can be had with sub-$200 GeForce4 Ti 4200 cards.
                            No triple head, poorer gameplay in complex games, and far fewer features. A Ti4200 is not a card I would recommend to many, really. The Parhelia is a high-end product deserving a high-end price tag. That price tag happens to be cheaper than the price of nVidia's top-of-the-line cards, both now and at their release.

                            In a stagnant market, the Parhelia would do very well. But we all know that the video card market is extremely fast paced. How soon will NVIDIA and ATI adopt their own form of 16xFAA?
                            Judging by the speed at which they were stealing Matrox engineers to help them do dual head for the GF3, I would say about 2 years to play catch-up.

                            How soon will these 3D card giants introduce parts that are twice as fast as the Parhelia with even more advanced feature sets (e.g., fully DX9 compliant pixel shaders)? How soon will your $399 investment be rendered obsolete?
                            Your investment is not rendered obsolete because a faster product has come out. It is rendered obsolete when it won't perform at the level you wish it to.


                            In my NSHO,
                            I see some heaping helpings of humility in your future if you continue like this.

                            Matrox needs to move Parhelia immediately to a .13 micron process,
                            No fab is ready to mass-produce and debug a .13 process on a large scale yet. Call me in 4 months and then let's talk about it. Because as it sits now, Matrox has their next gen part out and ready, and is tweaking it quickly while the others are trying to figure out how to keep their PCBs from crosstalking on a 256-bit memory bus.


                            both to increase clock speeds as well as to drastically reduce cost. If not, Parhelia may be only slightly more successful than another extremely high-end (and very high-priced) board that never saw the light of day --- 3dfx's Voodoo5 6000.
                            Sorry, but 3dfx and Matrox don't compare. We shall soon see how close or far off the mark you are. But my gut tells me Matrox has a great platform from which to build current and future products. They opened up a wide range of options, and made a smart move by bringing it to market now to get it out there and let it see some daylight before the window of opportunity passed them by.



                            Rags



                            • #44
                              Nicely said, Rags!
                              NocturnDragon



                              • #45
                                I don't think Matrox is aiming for the masses. I am certain they will cover the mass market with a different product. This is not a toy; it is for people who are serious about their graphics.
                                As I said before, it's too slow for gamers on the current crop of games. When new games come out that would actually run faster on Parhelia, all the other companies will have their next-gen parts available. The $400 price will certainly scare gamers away, especially since, on currently available titles, excellent performance and quality can be had for around $200 (GF4 Ti 4200/4400).

                                For 3D professionals, the card has no OpenGL track record and therefore OEMs will be extremely hesitant to adopt it. It also will be slower for most professionals than NVIDIA Quadro.

                                For business execs. who want multi-monitor, there are far cheaper alternatives from ATI.

                                So, Matrox has targeted a relatively small audience: very wealthy gamers who value 16x FAA over sheer FPS, professionals who must have 3-monitor support, and of course, Matrox loyalists. The R&D costs for the card must have been tremendous, and the costs of manufacture (including preparing the fab) must have been astronomical. Even at $400, it will take many, many card sales for Matrox to break even. I do not see this card being popular with OEMs.
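
                                A crude break-even sketch with made-up figures (nobody outside Matrox knows the real R&D or per-board costs), just to show the scale involved:

                                # Break-even units = up-front costs / contribution margin per card.
                                def break_even_units(fixed_costs, price, unit_cost):
                                    return fixed_costs / (price - unit_cost)

                                # Every figure below is a guess for the sake of the arithmetic.
                                cards = break_even_units(fixed_costs=30_000_000,  # R&D plus fab preparation
                                                         price=399,               # the retail price quoted above
                                                         unit_cost=250)           # chip, board, memory, channel margin
                                print(f"roughly {cards:,.0f} cards to break even")  # ~201,000 with these guesses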

                                Only time will tell whether the Parhelia will be an economic success or a disaster for Matrox (and I certainly hope it does well). Regardless, Matrox's Parhelia is a technological achievement that will raise the bar for all 3D cards to come.

                                -[Ch]amsalot

