Carmack on Parhelia

  • #46
    On the other hand, we should keep in mind that JC is quite conservative about technologies.
    The fact that he doesn't use the most recent technological gadgets doesn't mean they're bad.

    JC loves polygons; does that automatically mean that voxels are bad?
    JC uses OpenGL; does that mean DX sucks?

    The funniest thing is that even people who hate FPS games still take everything he says as gospel.

    Since Unreal came out, I like their work better than the Quake/Doom series. Unreal uses DX, and they might decide to use some of the features JC so easily dismisses as irrelevant.



    • #47
      Could it be? I'm not sure, but probably having nice software running at about 40% of Italian medium and large companies, plus another specialized product sold worldwide, makes me qualify somehow as a programmer.
      I'm a qualified programmer too, but keep in mind that medium and large companies aren't running games. The type of programming we are talking about has to run on current systems; businesses want something that fulfills their needs right now.

      Gamers are a very different target market. No one wants to buy a game that looks just as good as everyone else's; they want a game that looks better than anything they've ever seen before. When I'm at work, I target my solutions at businesses. When I get home, I want to play games that look like Doom 3.

      Brand loyalty can take us a long way. When the new cards are out, if Matrox offers a solution that's nearly as good as those from NVidia and ATI, I'll buy the Matrox in a heartbeat. At the moment, Matrox is asking us to pay a GeForce4 price for something delivering Radeon 8500 performance. No amount of brand loyalty can justify that.
      Last edited by Cheesekeeper; 26 June 2002, 03:17.



      • #48
        Hi Drizzt,

        thanks, it's clearer now. (and don't worry about your English, I can understand you just fine.)

        I think you are slightly mistaken, nonetheless, regarding the system-requirements question of a game engine. The new Doom engine is not only driving DooM3, but is also supposed to be sold as middleware to other developers. Hence, it should be top-notch for some time to come--the new Doom engine will be driving high-end games for the next 4-5 years, after all. So Carmack really has to push the envelope somewhat.

        Also, he's a CG nerd, so there you go.

        Generally, I am not too happy about JC's last couple of postings. He phrases his statements in a rather, well, closed-minded way: if a feature doesn't help people run DooM3 faster or better, the feature is uninteresting and not worth talking about. This was especially apparent in the latest .plan file's rather ham-fisted dismissal of higher-order surfaces.

        ta,
        -Sascha.rb
        Visit www.3dcenter.de

        www.nggalai.com — it's not so much bad as it is an experience.



        • #49
          Well, let's look at Doom 3's release date: Q3 2003. If we include the usual delays in the gaming industry, I would say Christmas 2003. By then the R400(?) and the NV40(?) should be available, and the R300/NV30 will be about one year old. Who knows how well the game will actually run on an R300? It seems like all the people who are considering a card with this chip assume it's guaranteed they'll get 60+ fps in Doom 3 with all the eye candy. But maybe it runs much slower....

          What I'm trying to say is that there's still so much time until we even see the box of Doom 3. Everybody who is thinking of buying a card this year for the best performance in a game that is over a year away from the stores should think again.... If you really want to play Doom 3 at its best, you should wait for next year's cards instead of thinking about this generation.
          Specs:
          MSI 745 Ultra :: AMD Athlon XP 2000+ :: 1024 MB PC-266 DDR-RAM :: HIS Radeon 9700 (Catalyst 3.1) :: Creative Soundblaster Live! 1024 :: Pioneer DVD-106S :: Western Digital WD800BB :: IBM IC35L040AVVN07



          • #50
            Good point, I never even thought about Doom 3's release date. Hopefully a revision of the Parhelia will be out by then.
            Fenrir(AVA)
            "Fearlessness is better than a faint-heart for any man who puts his nose out of doors.
            The length of my life and the day of my death were fated long ago"
            Anonymous lines from For Scirnis



            • #51
              Originally posted by 103er-Fan
              Well, let's look at Doom 3's release date: Q3 2003. If we include the usual delays in the gaming industry, I would say Christmas 2003. By then the R400(?) and the NV40(?) should be available, and the R300/NV30 will be about one year old. Who knows how well the game will actually run on an R300? It seems like all the people who are considering a card with this chip assume it's guaranteed they'll get 60+ fps in Doom 3 with all the eye candy. But maybe it runs much slower....

              What I'm trying to say is that there's still so much time until we even see the box of Doom 3. Everybody who is thinking of buying a card this year for the best performance in a game that is over a year away from the stores should think again.... If you really want to play Doom 3 at its best, you should wait for next year's cards instead of thinking about this generation.
              Full ACK.

              Either buying or damning a card on the grounds of current DooM3 "performance" is a rather pointless thing to do.

              ta,
              .rb
              Visit www.3dcenter.de

              www.nggalai.com — it's not so much bad as it is an experience.



              • #52
                I just would like to point out that software tuning and optimisation has virtually fallen by the wayside:

                Hardware improvement is the chosen way: it costs so much less to improve or design new hardware than it costs to do the same in software... There's no point being pissed about it; it's just a question of money. (Do you ever wonder why you aren't running Windows XP on a P133 with 64 MB of RAM?)

                There you are...

                The only trouble with the Parhelia is that in 6 months' time it is highly likely that you won't be able to play games and have fun.

                Moreover, pros aren't likely to spend $400 to get a Parhelia, because they already have a G400....

                So what do I get for $400?
                Mediocre game performance: I'm a gamer = not for me; I have a pro use = I don't play anyway, so the G400 is OK.

                10-bit colour
                Gamer = not for me; pro = let's see what they say. $400?

                3 heads
                Instead of 2 video cards I can have one... $400?


                What the Parhelia is screaming is: if you have a pro use and a shitload of money to spend, buy me.

                Remember Haig: he said "we expect to sell 1000 cards"...

                Parhelia = optical illusion..... No wonder



                • #53
                  Parhelia = optical illusion..... No wonder
                  Ouch, that hurts...

                  I myself read all the reviews, and I think the truth about the card lies a little above the middle of everything that's been said. I think Matrox did a good job with the Parhelia; they just released it a bit too early. But I don't care. On Linux everything will behave very differently, and all I want are these great features, not FPS in particular. I don't have any problem playing UT at 800*600 with AA, because it still looks very good.
                  A second card for triplehead is just a waste of PCI bandwidth, and I need all the bandwidth for DVB-T and RAID-5.



                  • #54
                    Originally posted by 103er-Fan
                    Well, let's look at Doom 3's release date: Q3 2003. If we include the usual delays in the gaming industry, I would say Christmas 2003. By then the R400(?) and the NV40(?) should be available, and the R300/NV30 will be about one year old. Who knows how well the game will actually run on an R300? It seems like all the people who are considering a card with this chip assume it's guaranteed they'll get 60+ fps in Doom 3 with all the eye candy. But maybe it runs much slower....

                    What I'm trying to say is that there's still so much time until we even see the box of Doom 3. Everybody who is thinking of buying a card this year for the best performance in a game that is over a year away from the stores should think again.... If you really want to play Doom 3 at its best, you should wait for next year's cards instead of thinking about this generation.
                    Sadly the UT2003 demo will be out in the next month and the P runs that like a (sun)dog as well.



                    • #55
                      People...I say calm down.

                      I have been running a Radeon 7500 for the last 8 months and I have found it more than adequate for every single game out there. It will probably even run UT2003 OK as well.

                      The Parhelia promised a lot more than it delivered, but it did deliver a lot that hasn't been delivered before: triplehead, FAA and GigaColor, for example. That's more than plenty, and it makes it the most complete card around, followed by the Radeon 8500 and then maybe some GF4 Ti card.

                      In my personal ranking I consider quality, too: not only 2D but also 3D and TV-out, just to mention a few.



                      • #56
                        Problem is that its performance in 90% of areas is equivalent to 1 1/2-year-old tech. For $400, I really hoped that it would run games better than "OK". I have my fingers crossed that my card will be able to overclock to 300 MHz and give me near-Ti4600 performance.



                        • #57
                          I was referring to the Radeon 7500 when I said OK; the point is that we will probably all be more than satisfied with the card's performance.

                          To me it seems that the only thing nagging people is that the Ti's and the 8500 are faster...



                          • #58
                            And almost the entire GF3 series (the Ti 200 plays at its level in 32-bit). As for the Radeon comment, the Parhelia is right around the performance of the 7500, too.



                            • #59
                              Sadly the UT2003 demo will be out in the next month and the P runs that like a (sun)dog as well
                              Excuse me?
                              Meet Jasmine.
                              flickr.com/photos/pace3000



                              • #60
                                Not a real comment on Parhelia (good or bad) from me, instead, this is about Carmack's comment about quad approaches-

                                I think he's trying to make the point that so far those approaches, while being one method of adding a lot of detail to a 3D scene, don't lend themselves well to the final solution (reference his comments about shadows in relation to the Radeon and TruForm).

                                For more realistic and accurate shadowing, he's NOT using those sorts of effects. I think he sees the future of hardware as continuing to get more powerful (features AND performance), which will allow future games based on future versions of the engine to be even more realistic (more polies, and more realistic shadows and lighting in general). If he used one of the higher-order-surface methodologies as the basis for the engine, he'd possibly have to rewrite much of the code when the hardware gets to the level that it can handle the greater workload (meaning that it could use more polies and show more accurate lighting / shadows without resorting to HOS).

                                Not being a graphics programmer, I don't have an opinion on the issue, but at least Carmack seems to be fairly consistent in his responses.
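                                (To make the shadow-volume point concrete, here's a minimal Python sketch of silhouette-edge detection: the step that needs the final, post-tessellation triangle list available to the engine. The toy mesh, names, and light position are my own illustration, not anything from id's code. A silhouette edge is one shared by a light-facing and a back-facing triangle; if the hardware generated extra triangles from a higher-order surface after this step, the extruded shadow volume wouldn't match the rendered geometry.)

```python
# Illustrative sketch only -- not id's actual implementation.
# Stencil shadow volumes are extruded from silhouette edges, which
# requires inspecting every final triangle. Hardware-side tessellation
# of higher-order surfaces happens after the engine hands geometry off,
# so these edges couldn't be found against the real rendered mesh.

def face_normal(a, b, c):
    """Unnormalised normal of triangle (a, b, c) via cross product."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def faces_light(tri, light):
    """True if the triangle's front side faces the light source."""
    a, b, c = tri
    n = face_normal(a, b, c)
    to_light = [light[i] - a[i] for i in range(3)]
    return sum(n[i] * to_light[i] for i in range(3)) > 0

def silhouette_edges(tris, light):
    """Edges shared by one light-facing and one back-facing triangle."""
    edge_flags = {}
    for tri in tris:
        lit = faces_light(tri, light)
        a, b, c = tri
        for e in ((a, b), (b, c), (c, a)):
            key = tuple(sorted(map(tuple, e)))   # direction-independent key
            edge_flags.setdefault(key, []).append(lit)
    # a silhouette edge separates a lit face from an unlit one
    return [e for e, flags in edge_flags.items()
            if len(flags) == 2 and flags[0] != flags[1]]
```

With a toy "tent" of two triangles sharing a ridge edge and a light off to one side, the shared ridge comes back as the only silhouette edge; extruding such edges away from the light is what builds the stencil shadow volume.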
                                Last edited by Snake-Eyes; 26 June 2002, 09:33.
                                "..so much for subtlety.."

                                System specs:
                                Gainward Ti4600
                                AMD Athlon XP2100+ (o.c. to 1845MHz)

