Vertex Shader 2.0 on Parhelia


  • #61
    If games requiring 2.0 shaders are (more than likely) going to run slowly on a Parhelia, most probably too slowly to be playable (Doom 3), then why invest all the extra time into enabling them on the current range of Parhelia cards? It would be a complete waste of time and money. I'm sure Matrox knows this, which is why they are stalling development; they are in business, after all, and time is money.

    An industry like the graphics card industry is full of hype, and we as the general public are often taken in by the hype and meaningless 'buzz-words'. I bet most of the people harping on about the lack of 2.0 shaders don't really have much of a clue as to how such an upgrade could benefit them.

    I stand by what I said before, it really isn't that big a deal. Matrox would be much better off investing their time and money into developing the next generation of Parhelia cards, which will more than likely have 2.0 shaders or better as standard.

    Why flog a dead horse? Sad perhaps, but true: the Parhelia is old hat now, already several generations old. Matrox has a slower release cycle than the major players, but it also manufactures (IMHO) far better cards.
    What do you want a signature for?



    • #62
      perhaps...

      maybe it's something like this.



      • #63
        If that says anything, it's that Matrox treads carefully, sussing out the potential demand for a technology before developing for it.

        Matrox is not a particularly large company, and as such almost certainly cannot afford too many expensive mistakes; why not sit back, let the big players get on with it, and see what happens?
        What do you want a signature for?



        • #64
          Then what about EMBM / triple head in games / displacement mapping?

          great features if you ask me...but not many games use them seriously...



          • #65
            Why do people think that the hardware requirements of Doom 3 will be representative of DX9 games (or games that can make use of VS 2.0)? Just because the Doom 3 alpha runs slowly on the Parhelia doesn't mean that every game that uses VS 2.0 will run just as slowly.

            id Software's games have always been more demanding than the other games out at the time. And in the case of Doom 3 performance, it is more a question of an optimized codepath than of the Direct3D drivers for the Parhelia (which, BTW, don't matter at all with Doom 3, since it uses OpenGL).

            VS 2.0 is Direct3D and has absolutely nothing to do with Doom 3. Matrox said the Parhelia supported OpenGL 1.3 and nothing more, AFAIK (though they mentioned a fragment-shader extension for OpenGL, and (I think) an ARB_vertex_program or something like that, which also goes beyond OpenGL 1.3), but that is a completely different topic!

            My personal guess about future DX9 games is that they:

            1: will have a fallback to DX8 pixel shaders if DX9 pixel shaders aren't present, OR will use the DX9 HLSL, in which case it will be up to Matrox to write a shader compiler that can utilize the Parhelia's DX8.1 shaders in the best possible way.
            2: won't require more of VS 2.0 than a GeForce FX 5200 (non-Ultra) can handle, which is significantly slower than the Parhelia.
            Last edited by TdB; 10 May 2003, 07:06.
            This sig is a shameless attempt to make my post look bigger.



            • #66
              I believe that we, as customers, stand to benefit from the present implementation of Vertex Shader 2.0, and as such I would like to see it implemented now.
              Let us return to the moon, to stay!!!



              • #67
                I believe that we, as customers, stand to benefit from Matrox optimizing the drivers (and removing any current bugs) first, and only worrying about implementing Vertex Shader 2.0 when real programs, as opposed to a benchmarking suite, actually require it.
                Juu nin to iro


                English doesn't borrow from other languages. It follows them down dark alleys, knocks them over, and goes through their pockets for loose grammar.



                • #68
                  Originally posted by joonie
                  Why don't you stop trolling Indiana.
                  I don't read responses of your bull shit but I read others.
                  If you want to troll and badmouth others, go home and take care of your mama.
                  You got rhythm dude!!

                  I believe that we, as customers, stand to benefit from Matrox optimizing the drivers (and removing any current bugs) first, and as such worrying about implementing VertexShader2.0 when real programs as opposed to a benchmarking suite actually require them.
                  But we have been waiting for bug fixes (especially for the Linux driver) for sooooo long. And they are still buggy.
                  P4 Northwood 1.8GHz@2.7GHz 1.65V Albatron PX845PEV Pro
                  Running two Dell 2005FPW 20" Widescreen LCD
                  And of course, Matrox Parhelia | My Matrox history: Mill-I, Mill-II, Mystique, G400, Parhelia



                  • #69
                    I believe that no matter how silly the benchmarking suite may be, people use it. And by enabling VS2 Matrox could potentially gain sales.



                    • #70
                      And you would prefer to wait longer so that we get VS 2.0?
                      Juu nin to iro


                      English doesn't borrow from other languages. It follows them down dark alleys, knocks them over, and goes through their pockets for loose grammar.



                      • #71
                        @bsdgeek - I'm not saying that we shouldn't get them, but right now they are not my main concern; real *nix drivers and getting the other bugs sorted bother me more. If Futuremark worried Matrox so much, I'm sure we'd see optimized *cough* ATI *cough* type drivers...
                        Juu nin to iro


                        English doesn't borrow from other languages. It follows them down dark alleys, knocks them over, and goes through their pockets for loose grammar.



                        • #72
                          Originally posted by joonie
                          Why don't you stop trolling Indiana.
                          I don't read responses of your bull shit but I read others.
                          If you want to troll and badmouth others, go home and take care of your mama.
                          What? Are you blind? Go to www.rage3d.com and look at the forums... there are AT LEAST as many different complaints about ATI drivers/hardware as there are about Matrox's, and then multiply the number of whine posts by how many more ATI users there are than Parhelia owners.

                          examples:
                          Half-Life engine runs like SHIT on the R300 (I stopped playing CS because of terrible performance), all the things Indiana said, horrible TV-Out on the R300, etc...

                          Due to the complexity of current videocard architectures, all products will have flaws, and the manufacturer has to decide whether they are worth fixing in drivers (if they can be fixed at all).

                          And this is something you can fully expect to happen for almost any product in the current market.



                          • #73
                            Just to add to that, nVidia's not innocent either, just read this thread.



                            • #74
                              Originally posted by dZeus
                              Half-Life engine runs like SHIT on the R300 (I stopped playing CS because of terrible performance)
                              You might want to have a look at the post-Catalyst-3.2 driver that was recently released by Dell. It has the CS bugs fixed (both the low-fps bug and the ESC-crash bug) and features better overall performance in FSAA and anisotropic filtering, plus working hardware-assisted deblocking in the new DivX player. (Note: you'll have to install that driver manually through Device Manager, since the setup will only accept Dell's cards.)

                              horrible TV-Out on the R300, etc...
                              Hmm, the TVOut of my Radeon9700 here is definitely not bad. You have to make sure that you use the 800x600 resolution and enable overscan via one of the tweakers.
                              But then I'm normally using the VGA-in of our Philips 104" TFT-TV, anyways.


                              Back to the topic of promised, but unsupported features:
                              I'm a bit mad about ATI not supporting the promised supersampling FSAA, since this (although a stupid "brute force" AA approach) has advantages over the technically more advanced multisampling they do use, particularly for older games with crappy textures or for games with alpha textures.
                              I was mad at nVidia for their faked anisotropic filtering of the TNT (although anisotropic was clearly stated on the feature-list of the box) at that time.
                              I was mad at Matrox for their somewhat "lacking" OGL support of the G400.

                              But I would've never started myriads of whine-threads because of this, those are IMO all minor faults and I'm clearly aware that this is the way the PC industry works (unfortunately).

                              The lack of VS 2.0 doesn't do any Parhelia user any harm, so what's all this fuss about? I do have a Radeon 9700, and the only things that use PS 2.0 / VS 2.0 are the (admittedly great-looking) ATI demos and the (less great-looking) 3DMark03. No game needs or even uses these - so I really can't understand the sense in whining about this.
                              Last edited by Indiana; 10 May 2003, 15:22.
                              But we named the *dog* Indiana...
                              My System
                              2nd System (not for Windows lovers )
                              German ATI-forum



                              • #75
                                Well, in PAL mode my 9500 Pro has satisfactory TV-out quality... the problem is when I change it to NTSC mode... it looks like complete crap... weird 'dithering/checkerbox'-like effects. With my G400 the output in NTSC mode was perfect.

                                I'm running in overscan, and desktop resolution should not matter since I'm using theater mode (btw. 16:9 scaling craps out completely here too).

                                as for the Half-life bug... I'll wait for the official Catalyst 3.4 and see how well it will perform with that one.

