Parhelia!!!


  • Originally posted by Helevitia


    This is such crap. What is the point of signing an NDA if you are going to f*ck it up by telling people what you know? Credibility comes from years of respect, which Ant has earned time and time again. Sounds like you are just trying to bait him and others into giving out more information. Gawd, I am getting tired of all of this negativity. Why can't people just accept the information for what it's worth and be happy about it? They could have told us nothing and we'd all still be sitting here with our thumbs in our butts wondering if we should be upgrading to a GF4 or ATI. I, for one, am extremely grateful that the info was released, and I hope to god they announce the card... "The S word"... so everyone will quit whining. Sheesh!

    Dave

    Non-disclosure agreements (if any were signed at all) only apply if you're sure that the actual specs are indeed true, which in that case is fully understandable... nobody in their right mind would disclose them...

    This news, however, still falls under the pure-speculation category, just like vigilant said it does. Even though he may have informed sources in the know about what Matrox may or may not release in the near term, and what form it might take (specification-wise), nothing is certain at this point in time...


    In my case, I'm not even considering an upgrade at all until there are cards out there that significantly outperform my own (50~60% better) and bring full DX9 support as well... so just chalk my questions up to curiosity really, nothing more.
    note to self...

    Assumption is the mother of all f***ups....

    Primary system:
    P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum, and SCSI all the way...



    • Are you just now getting clued in on this fact, Technoid? Regardless, our NDA still applies.
      "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

      "Always do good. It will gratify some and astonish the rest." ~Mark Twain



      • Greebe:
        dZeus:
        GNEP:

        You did not exactly understand why I said what I said, but anyway, it does not matter at all!

        Long live Matrox!
        Long live quality products!
        Long live Parhelia?


        Macbeth



        • Non-disclosure agreements (if any were signed at all) only apply if you're sure that the actual specs are indeed true, which in that case is fully understandable... nobody in their right mind would disclose them...
          You got that right

          This news, however, still falls under the pure-speculation category, just like vigilant said it does. Even though he may have informed sources in the know about what Matrox may or may not release in the near term, and what form it might take (specification-wise), nothing is certain at this point in time...
          Some of us may only drop remotely interesting hints of what's to come (as above, i.e., No Damn Answers). But don't think for a second that it's based on pure speculation!

          In my case, I'm not even considering an upgrade at all until there are cards out there that significantly outperform my own (50~60% better) and bring full DX9 support as well...
          It's ok superfly, we already know where your money is going to be spent
          "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

          "Always do good. It will gratify some and astonish the rest." ~Mark Twain



          • Where did these full DX9 specs come from?

            We know displacement maps will be included.

            We know shaders will be included, following the ATI specs, as Matrox and ATI have an agreement about those (well, at least in OpenGL), and it would be silly for ATI/Matrox to make silicon for OpenGL that is incompatible with DX9, given that ATI has a large say in the development of DX9.

            I would not expect the next card to be 100% DX9 compatible, but it will almost definitely be DX9 compliant.

            As it will probably be another 3 years until you NEED a DX9 card to play games, it's not an issue.
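
            To make the compliant-vs-compatible distinction concrete: once the DX9 runtime actually ships, an app will be able to query the device caps and see which bucket a card lands in. A minimal sketch (it assumes the DX9 headers exist and treats shader model 2.0 as the bar for "full DX9" hardware, which is my assumption, nothing official):

            ```cpp
            // Sketch only: assumes the DX9 runtime/headers, and treats shader
            // model 2.0 as the cutoff for "full DX9" hardware (an assumption).
            #include <d3d9.h>
            #include <cstdio>

            int main() {
                IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
                if (!d3d) return 1;

                D3DCAPS9 caps;
                d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

                // "Compliant" = runs under the DX9 runtime at all;
                // "compatible" = actually exposes the new shader models in hardware.
                bool fullDX9 = caps.VertexShaderVersion >= D3DVS_VERSION(2, 0)
                            && caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0);

                std::printf("%s\n", fullDX9 ? "full DX9 shader support"
                                            : "DX9-compliant only");
                d3d->Release();
                return 0;
            }
            ```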

            Ali



            • OK guys, ponder this:

              Why would anyone give a single-chip card a plural name?

              All the hints are in this thread.

              rubank



              • Originally posted by Greebe


                It's ok superfly, we already know where your money is going to be spent

                You could be right, if it turns out to be THAT good, since I don't mind spending $400~500, even up to $600, on a card as long as it performs the way I expect and demand it to and has a realistic 18~24 month life expectancy as a viable card, be it for gaming or otherwise...


                Just remember that I will be comparing it to both the NV30 and ATI's R300 (Radeon 9*** series) before I pass final judgement...


                Ali... It isn't so much the extra features in DX9 that I'm looking for in my next card, although ultimately they'll be a nice thing to have; it's mostly having the raw fillrate, texture handling, and bandwidth needed to handle any title that'll be released within the next 2 years at least...

                Everyone is emphasizing polygon transform engine power these days, even though no game will likely use even a fraction of what current cards can already do, polygon-wise, since overall bus speeds, while steadily improving, aren't nearly fast enough to feed current cards with enough polys to make full use of their transform engines in an actual game environment...
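
                Rough numbers support this. A quick back-of-the-envelope sketch (the AGP 4x peak and the 32-byte vertex are assumed round figures, not measurements):

                ```cpp
                // Back-of-the-envelope: how many vertices/s can the bus even deliver?
                // Assumed figures: AGP 4x peak ~1066 MB/s; 32 bytes per vertex
                // (float position + normal + one set of UVs).
                #include <cstdio>

                int main() {
                    const double bus_bytes_per_sec = 1066e6;
                    const double bytes_per_vertex  = 32.0;

                    // Ceiling if every vertex has to cross the bus each frame:
                    double verts_per_sec = bus_bytes_per_sec / bytes_per_vertex;
                    std::printf("bus-fed ceiling: ~%.0f million vertices/s\n",
                                verts_per_sec / 1e6);  // ~33 million/s, far below
                                                       // advertised T&L peaks
                    return 0;
                }
                ```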


                Even next-gen games like Doom 3 will average about 150,000 polys per frame, which comes out to about 9 million per second if we want 60 fps, less than a quarter of what a GF3 can do. The main performance hit will come from the use of full hardware lighting (no more lightmaps) and heavy multitexturing (entire scenes using environment bump maps)...
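
                For what it's worth, that arithmetic checks out (the GF3 rate below is the commonly quoted marketing peak, so treat it as an assumption):

                ```cpp
                // Checking the Doom 3 estimate above. The GF3 triangle rate is the
                // oft-quoted ~40M/s marketing figure -- an assumption, not a benchmark.
                #include <cstdio>

                int main() {
                    const double polys_per_frame = 150000.0;
                    const double target_fps      = 60.0;
                    const double gf3_tris_per_s  = 40e6;

                    double polys_per_sec = polys_per_frame * target_fps;  // 9 million/s
                    std::printf("%.0fM polys/s = %.0f%% of the claimed GF3 peak\n",
                                polys_per_sec / 1e6,
                                100.0 * polys_per_sec / gf3_tris_per_s);  // ~22%
                    return 0;
                }
                ```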


                Yet chip makers keep boasting higher numbers for their transform engines, because building faster ones is a relatively straightforward process: the basic calculations are always the same, there are just more of them to process...


                The next step is seriously boosting the hardware lighting capabilities, but that's hugely more difficult to do and costs a lot more in transistor budget to accomplish, since the calculations involved are way more complex than just processing vertices, especially for the more complex forms of lighting, like raytraced lighting for instance... Even hardware costing several thousand dollars still can't do that in real time...
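
                To put rough numbers on "way more complex", compare the per-element cost (the op counts below are loose approximations, purely to show the scale):

                ```cpp
                // Loose op-count comparison, illustration only:
                // - one vertex transform: 4x4 matrix * vec4 = 16 muls + 12 adds = 28 ops
                // - one Phong-style light: ~40 ops assumed (normalize, two dot
                //   products, specular pow), run once per PIXEL per light.
                #include <cstdio>

                int main() {
                    const double transform_ops = 28;                 // per vertex
                    const double light_ops     = 40;                 // per pixel (assumed)
                    const double verts_per_s   = 9e6;                // the Doom 3 estimate
                    const double pixels_per_s  = 1024.0 * 768 * 60;  // one 60 fps pass

                    std::printf("transform: ~%.2f Gops/s, lighting: ~%.2f Gops/s per light\n",
                                transform_ops * verts_per_s / 1e9,
                                light_ops * pixels_per_s / 1e9);
                    // Overdraw and multiple lights multiply the second figure further.
                    return 0;
                }
                ```
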
                note to self...

                Assumption is the mother of all f***ups....

                Primary system:
                P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum, and SCSI all the way...



                • The only thing I'm worried about is whether Matrox has enough money to mass-produce and distribute the Parhelia at all. I heard two months back that they're nearly hitting Chapter 11...

                  .
                  .
                  .

                  Don't get me wrong though. I really hope they release something that can bring 'em back to the high-end video card market. I remember how much I loved my Millennium G400.



                  • They should have more than enough resources to launch that something we are all hoping to get.



                    • We'll see in 2 months... right, Greebe?
                      note to self...

                      Assumption is the mother of all f***ups....

                      Primary system:
                      P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum, and SCSI all the way...



                      • I only need one answer: when....

                        and don't f%$%%g say Soon $%$^$%



                        • Soo...aouch...don't hit me!!!



                          • (chainsaw on standby)



                            • just a thought....

                              The feature set for DirectX 9 has probably already been finalized, or is close to finalization. Microsoft has probably been working with several graphics card manufacturers (Matrox included) on sorting out which features to include and how they should work...

                              Keep in mind that DirectX is more than just the portion that interfaces with the card. It is both an API and a full-blown SDK. The first stage of the design process would (logically) be to complete a feature set. The second stage would be making the DirectX libraries capable of such features and updating the source headers to include them. The third stage would probably consist primarily of bug hunting, documenting the changes to the API, and writing examples for using it; in other words, rounding out the SDK.

                              The GeForce3 was released (behind schedule) 5 months after DirectX 8 was released. My understanding is that it was supposed to ship around the time DirectX 8 did. There is no way that they started implementing these features as late in DirectX 8's development as the start of beta testing. It takes far longer to design a chip, develop a product around the chip, write/modify drivers to work with the hardware, make the drivers work with OpenGL and Direct3D, and then debug all of the above so that it works right. Keep in mind that when a company announces a chip, it's finished: they are making them, they are working with other companies to ensure their designs work, they are polishing drivers; they are pretty much working on shipping the product, not finishing it.

                              Anyway, my point was going to be somewhere along the lines that the features of DirectX 9 could feasibly be implemented, before it is released (or beta testing has even started), in a product that isn't released yet. All such a product would need is DirectX 9 native drivers. Remember that what has not been finalized is the DirectX 9 API; the feature set could very well be finalized.
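
                              A small illustration of that last point: the SDK headers pin the feature set down, so the same source can be rebuilt against the new SDK the day it appears. (Hypothetical app code; the only hard fact used is that the DirectX headers define DIRECT3D_VERSION as 0x0800 for DX8 and 0x0900 for DX9.)

                              ```cpp
                              // Hypothetical engine snippet: swap d3d8.h for d3d9.h once the
                              // DX9 SDK exists; DIRECT3D_VERSION comes from whichever header
                              // is included at build time.
                              #include <cstdio>
                              #include <d3d8.h>

                              int main() {
                              #if DIRECT3D_VERSION >= 0x0900
                                  std::printf("built against a DX9-era SDK\n");
                              #else
                                  std::printf("built against the DX%x headers\n",
                                              DIRECT3D_VERSION >> 8);
                              #endif
                                  return 0;
                              }
                              ```
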
                              "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



                              • Seems like Matrox has had the Parhelia planned for H1 2002 for a while...

                                Quote from MURC April 20, 2001:
                                "A Matrox rep at the AMD Tech Team dinner in Philadelphia tonight provided solid info regarding Matrox's current and future plans. To quote: '"G800 is completely dead." Apparently, the G800 was 'based on G400 design' and featured doubled pipelines, 128bit DDR memory, hardware transform and lighting, and some form of dual-chip technology. A slimmed down version will be hitting markets shortly, however. G550 is a single-chip solution featuring the same doubled rendering pipelines, 128-bit DDR support, but only a limited form of T&L. According to the rep, Matrox analyzed T&L and found only one feature which needed immediate implementation: four-vertex matrix skinning. For those who didn't read up on Radeon's new tech, this skinning method allows two polygon objects to be "glued" together, and the hardware provides an interpolated join, resulting in smoother looking joints in 3D models, especially when moving.

                                The G550 will, of course, feature Matrox's great eDualHead tech and their renowned 2D clarity. The card is expected to fall between the GeForce 2 MX 400 and the GeForce 2 Pro in speed, most often landing just behind a GeForce 2 GTS. The G550 is expected to run in the $150 to $200 price range and be available within the next few weeks, "end of May at the latest."

                                In a more nebulous statement, the rep said, "Remember when the G400 MAX first came out, and it was the fastest thing and looked best? We are doing that again in the first half of 2002. We will be kings of 3D again, until someone jumps past us again. You know how 3D is." Sounds like the G1000 will be a kicker next spring. Here's hoping Matrox can field a six-month cycle and get back in the game."
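
                                Since the quote only gestures at what four-matrix skinning actually computes, here is a minimal sketch of the blend it describes (types and numbers are made up for illustration; the real hardware did this in the T&L unit):

                                ```cpp
                                // Minimal four-matrix vertex skinning, as described above: each
                                // vertex is transformed by up to four bone matrices and the results
                                // are blended by weights summing to 1 -- the "interpolated join".
                                #include <cstdio>

                                struct Vec3  { float x, y, z; };
                                struct Mat34 { float m[3][4]; };  // 3x4 affine transform, row-major

                                static Vec3 apply(const Mat34& b, const Vec3& v) {
                                    return { b.m[0][0]*v.x + b.m[0][1]*v.y + b.m[0][2]*v.z + b.m[0][3],
                                             b.m[1][0]*v.x + b.m[1][1]*v.y + b.m[1][2]*v.z + b.m[1][3],
                                             b.m[2][0]*v.x + b.m[2][1]*v.y + b.m[2][2]*v.z + b.m[2][3] };
                                }

                                static Vec3 skin(const Vec3& v, const Mat34 bones[4], const float w[4]) {
                                    Vec3 out{0, 0, 0};
                                    for (int i = 0; i < 4; ++i) {  // weighted blend of four transforms
                                        Vec3 t = apply(bones[i], v);
                                        out.x += w[i] * t.x;
                                        out.y += w[i] * t.y;
                                        out.z += w[i] * t.z;
                                    }
                                    return out;
                                }

                                int main() {
                                    Mat34 rest  = {{{1,0,0,0}, {0,1,0,0}, {0,0,1,0}}};  // identity
                                    Mat34 moved = {{{1,0,0,1}, {0,1,0,0}, {0,0,1,0}}};  // shifted +1 in x
                                    Mat34 bones[4] = { rest, moved, rest, rest };
                                    float w[4]     = { 0.5f, 0.5f, 0.0f, 0.0f };        // vertex at a joint

                                    Vec3 p = skin({0, 0, 0}, bones, w);
                                    std::printf("(%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);  // (0.50, 0.00, 0.00)
                                    return 0;
                                }
                                ```
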
                                a Rebel at heart

