Don't buy Parhelia for Doom3

  • #31
    Or maybe, with that HUGE transistor count, there are some things still disabled in the card, waiting for proper driver support (deluded hope, anyone?)

    • #32
      I certainly can understand JC's lack of desire to implement a backend specific to the Parhelia when its OpenGL drivers are still fairly buggy. I suppose that with a year left it could be considered proper time management.

      I am, however, surprised at how little he apparently cares about doing it. I would think that the card could offer fairly decent performance in an accelerated mode because of some of the features on the card. It might not be as fast as some other cards, but I could see it being faster than a GF3 or Radeon 8500.

      Some interesting quotes...

      Q - What kind of play length are you looking at for Doom III?

      A - Well, it will not be real long and if you are an experienced gamer, you might be able to beat it in a weekend, however the game will feature a lot richer content so that alone will add to the playability.

      Q - What about the new 3D Labs video cards?

      A - From what I have seen so far, the new 3D Labs cards are not going to be competitive with the highest end video cards. The card has good drivers, however, and the virtual texture mapper is good, and I think other companies will adopt this. I don't think that it will be a card that will be adopted by consumers.

      Kind of a shame it will be short (supposedly), and also that he has focused on the 3Dlabs card, which he admits is not going to be that popular.

      oh well, such is life...
      "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

      • #33
        I guess D3 is so short because he wants to show off his engine; id sells engines, not games. This has been discussed here a few times before already.
        And I guess he wrote a backend for the 3DLabs card because it has excellent OpenGL drivers, so it was probably a piece of cake to do.
        Last edited by thop; 19 August 2002, 03:14.
        no matrox, no matroxusers.

        • #34
          Originally posted by thop

          And I guess he wrote a backend for the 3DLabs card because it has excellent OpenGL drivers, so it was probably a piece of cake to do.
          I was of the understanding that this was because the 3DLabs P10 had OpenGL 2.0 drivers. This being the first card with them (even though the spec is not final), it gives JC the chance to write a generic OpenGL 2.0 backend which most cards will eventually support.

          So it's mainly for future cards.
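
          To put the "generic backend" idea in concrete terms, this is roughly what a vendor-neutral GL2-style setup looks like: one compiled shader program instead of per-vendor register-combiner or vertex-program plumbing. Just a sketch of mine in C, not anything from id's code; the shader and function names are made up for illustration.

          /* Minimal GL 2.0-style shader setup (assumes a GL 2.0 context and
           * that the entry points have been loaded, e.g. via glext.h).
           * The sampler defaults to texture unit 0, so no glUniform1i needed
           * for this trivial case. */
          #include <GL/gl.h>

          static const char *frag_src =
              "uniform sampler2D diffuse;\n"
              "void main() {\n"
              "    gl_FragColor = texture2D(diffuse, gl_TexCoord[0].st) * gl_Color;\n"
              "}\n";

          GLuint build_diffuse_program(void)
          {
              GLuint sh   = glCreateShader(GL_FRAGMENT_SHADER);
              GLuint prog = glCreateProgram();

              glShaderSource(sh, 1, &frag_src, NULL);
              glCompileShader(sh);          /* real code would check the info log */
              glAttachShader(prog, sh);
              glLinkProgram(prog);
              return prog;                  /* bind with glUseProgram(prog) */
          }

          The same program would run on any card whose driver exposes GL 2.0, which is the whole point versus the NV10/NV20-specific paths.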
          "I contend that we are both atheists. I just believe in one fewer god than you do. When you understand why you dismiss all the other possible gods, you will understand why I dismiss yours." - Stephen Roberts

          µße®LørÐ - A legend in his underwear
          Member of For F*ck Sake UT clan
          DriverHeaven administrator
          PowerVR Network administrator

          • #35
            Originally posted by UberLord


            I was of the understanding that this was because the 3DLabs P10 had OpenGL 2.0 drivers. This being the first card with them (even though the spec is not final), it gives JC the chance to write a generic OpenGL 2.0 backend which most cards will eventually support.

            So it's mainly for future cards.
            From what I remember of his .plan, the 3Dlabs card only had preliminary support for OpenGL 2.0, and from the way he talked about it, it was only in the form of OpenGL 1.3 extensions. The backend would still have to be 3Dlabs-specific, and since there are fair language differences between GL 1.3 and GL 2.0, he would wind up having to write a whole new renderer around GL 2.0, which he has committed to doing as it evolves.


            I got a 3Dlabs P10 card in last week, and yesterday I put it through its
            paces. Because my time is fairly over committed, first impressions often
            determine how much work I devote to a given card. I didn't speak to ATI for
            months after they gave me a beta 8500 board last year with drivers that
            rendered the console incorrectly. :-)

            I was duly impressed when the P10 just popped right up with full functional
            support for both the fallback ARB_ extension path (without specular
            highlights), and the NV10 NVidia register combiners path. I only saw two
            issues that were at all incorrect in any of our data, and one of them is
            debatable. They don't support NV_vertex_program_1_1, which I use for the NV20
            path, and when I hacked my programs back to 1.0 support for testing, an
            issue did show up, but still, this is the best showing from a new board from
            any company other than Nvidia.
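
            For what it's worth, the path selection he describes boils down to probing the driver's extension string and taking the richest path it advertises. A rough sketch of my own (made-up names, not id's actual code):

            /* Pick a render path the way the .plan above describes: prefer the
             * NV20 path, fall back to NV10 register combiners, then to the
             * generic ARB_ path. Assumes a GL context is already current. */
            #include <string.h>
            #include <GL/gl.h>

            typedef enum { PATH_ARB, PATH_NV10, PATH_NV20 } render_path_t;

            static int has_ext(const char *exts, const char *name)
            {
                return exts != NULL && strstr(exts, name) != NULL;
            }

            render_path_t choose_render_path(void)
            {
                const char *exts = (const char *)glGetString(GL_EXTENSIONS);

                /* NV20 path needs the 1.1 vertex program extension the P10 lacks */
                if (has_ext(exts, "GL_NV_vertex_program1_1") &&
                    has_ext(exts, "GL_NV_register_combiners"))
                    return PATH_NV20;

                /* NV10 path: register combiners only */
                if (has_ext(exts, "GL_NV_register_combiners"))
                    return PATH_NV10;

                /* fallback ARB_ path (no specular highlights) */
                return PATH_ARB;
            }

            So a P10 lands on the NV10 path today, and a GL 2.0 path would presumably just become another branch once the drivers are there.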
            "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

            • #36
              I think "Don't buy Parhelia for Doom3" is a bit extreme. Maybe D3 lacks one optimization or two, but I think the general idea is that everyone will need a super-high end system to run it. It won´t be so. That´s just marketing, it´s smart to associate a game with high-end hardware, so it makes it a high-end game.

              Carmack himself said that the game engine was built around NV10 hardware. That´s a GF1, for God sake. Of course Parhelia will run it just fine. If you can live with like 70 fps instead of 90, that is.

              This remembers me of all the hype around Unreal Tournament 2003/Unreal 2 games. The developers stated clearly that it wouldn´t run on a non-T&L card, blah, blah, and they actually even insinuated KyroII users were balantly ripped of because their new card won´t even run next-gen games.

              Guess what? I´m sure many of you have tried that *cough*demo*cough* that´s floating around the net. What I can say is that a beta, non-optimized, almost non-tweakable UT2003 demo runs at *almost* playable framerates on my POS laptop S3 integrated Savage4 video, that should have a 3d performance roughly equal to a G200...

              • #37
                One year too early...

                To people who want to buy a card now that fits Doom3 best, the following can be said: guys, you are one year too early.
                Of course there are some rumours that say Doom3 will be released early 2003, but I somehow doubt that.
                If you want the best card for Doom3 you should wait for the real next-gen chips, which would be R400, NV40 or whatever they might be called.

                For now you should only be looking for a card that fits UT2003 best, and I think our Parhelia is prepared to run that game well.
                Specs:
                MSI 745 Ultra :: AMD Athlon XP 2000+ :: 1024 MB PC-266 DDR-RAM :: HIS Radeon 9700 (Catalyst 3.1) :: Creative Soundblaster Live! 1024 :: Pioneer DVD-106S :: Western Digital WD800BB :: IBM IC35L040AVVN07

                • #38
                  Well, if the game will only take a weekend to beat, I probably wouldn't buy it anyhow. I'm looking more towards the games that Raven will make using the Doom3 engine, just like SoF2 and JK2 are way cooler than Q3A, IMHO. So go ahead, followers of Carmack, buy Doom3 and tell me how it plays, and hopefully they'll release it for Linux around the time the Parhelia drivers (2D and 3D) are done; THEN I'd probably buy it. Not necessarily for the game itself, but to support Linux gaming. Lord knows they need all the help they can get.

                  Once again though, as someone else posted, couldn't Matrox themselves create the backend? If all goes well, the Parhelia will totally kick ass in the next generation of games.

                  Leech
                  Wah! Wah!

                  In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship.
