Carmack on Parhelia

  • #76
    I think you have a point, Thop.

    Personally, I never bothered with Q3 (except for some benchmarking, while I was trying to extract performance from my own system).

    But I was gladly gifted by id's engine, because I just loved playing RTCW and MOHAA... and I even want to buy myself a copy of Jedi Knight...

    The "laziness" of game programmers does exist, but that complaint (in my opinion) certainly does not apply to JC.
    "Dadinho, my @$$! My name is Zé Pequeno now." - City Of God
    A64 @ 2.25GHz + 1GB + GT6600



    • #77
      Hi there,
      Originally posted by xortam
      Christmas 2003 I was reading that as the end of this year. I'm not going to hold off 18 months to buy a replacement for my Max. I'm surprised its still that far away considering the demos that were recently shown. I can't be too concerned about what Doom III will require if it won't appear until 12/03. The graphics industry moves too fast for me to plan that far off. I don't game much and what I do play is the Myst/Riven and Quake/Doom games. I bought my Max with Q3A in mind. I was hoping to find a snazzy immersive single player game like Doom III to enjoy next but I guess I'll need to enter the world of UT or something. This concern over Doom III is meaningless to me know.
      I'm one of the blokes touting the Christmas 2003 number. Let me take the sting out of it.

      The Christmas 2003 date was mentioned by CNN in their (TV) E3 recap. The only official "date" that has ever been directly communicated by id software is "2003", no month (cf. the huge E3 ad they had up). It's possible that the Christmas bit was an extrapolation by CNN, or an overly excited and inaccurate reporter who felt like providing something more than a mere year.

      Considering past id titles, though, it would make sense--release the doom3 test sometime in Q2-Q3 2003, and the final game in Q4. id has always aimed for Christmas releases, for good reasons, so I think it's highly probable.

      Mark both the "I" and the "think" in that last sentence, please.

      ta,
      .rb
      Visit www.3dcenter.de

      www.nggalai.com — it's not so much bad as it is an experience.



      • #78
        Secondly, it sounds to me like JC may well be developing something that is just too demanding. There is no way that the majority of PC Gamers are going to spend $400 or so just to play one game!!! It may look beautiful, but I doubt that many people have vast sums of money hanging around in their pockets!
        That's what people said about Quake when we were all playing Doom and DN3D. Then it was said about Q2, and said a lot about Quake3 (and its infamous curved surfaces).
        Gigabyte P35-DS3L with a Q6600, 2GB Kingston HyperX (after *3* bad pairs of Crucial Ballistix 1066), Galaxy 8800GT 512MB, SB X-Fi, some drives, and a Dell 2005fpw. Running WinXP.



        • #79
          Originally posted by xortam
          This concern over Doom III is meaningless to me know.
          Oops ... corrected that typo.

          Originally posted by nggalai
          ... and the final game in Q4.
          Still too far away. I'd have to look at the HW available around the time of its release if I expect it to run the game well.
          The world just changed, Sep. 11, 2001



          • #80
            Originally posted by Jon P. Inghram
            Just curious, what does he mean by "quad based approaches"?
            He seems to be assuming that displacement mapping is done on quad-based primitive types, which isn't the case [with Parhelia] -- he's human, everyone makes mistakes from time to time.

            Higher-order surfaces don't fit in the scope of the DOOM3 engine because of the lack of a standard implementation and the lack of support on some higher-end cards (nVidia, for example). I think John would rather stick to technology that is proven to be standard and/or widely adopted, to ensure his engine performs as intended on all platforms without the hassle of chipset-specific features that would require redesigning his rendering architecture at every turn.

            Just my 2 cents worth.



            • #81
              Originally posted by Frost
              He seems to be assuming that displacement mapping is done on quad-based primitive types, which isn't the case [with Parhelia] -- he's human, everyone makes mistakes from time to time.

              Higher-order surfaces don't fit in the scope of the DOOM3 engine because of the lack of a standard implementation and the lack of support on some higher-end cards (nVidia, for example). I think John would rather stick to technology that is proven to be standard and/or widely adopted, to ensure his engine performs as intended on all platforms without the hassle of chipset-specific features that would require redesigning his rendering architecture at every turn.

              Just my 2 cents worth.
              Not so much that; it's also the fact that you cannot read back what polys are handed to a scene, since that happens later in the engine. The Doom3 engine makes multiple passes, and without knowing where the polys are, later passes are affected. In Doom3, the major thing is the lighting.

              If Matrox *really* wanted to give JC something to work with, they'd provide a higher-order surface implementation that lets the GPU recalculate the vertices and *then* lets you read them back, so that it doesn't break his lighting engine.

              The thing that scares me about Carmack's comments on the Parhelia is that the Parhelia can do 4 textures on 4 pixels each clock. He has stated that it basically does 6-8 texture ops each frame. The fact that it's not performing well at the very thing it's designed to do *really* disturbs me, just like the relatively poor vertex and pixel shading support.

              About JC being an idiot and not optimizing:

              I challenge *anyone* here who has the balls to say that he is an idiot and doesn't optimize his engines to write an engine capable of doing *real-time* shadow casting with 10k polys on screen at any given time that can pull 30fps on an 'old' GF3.

              And not just crappy stencil shadows like the Q3 engine has. I mean 100% shadows: every single poly affected by every single light.

              Do the math. It takes a lot of resources. There's a reason no one has bothered to do it before now: it's not a lack of knowing how, it's a lack of hardware power to do it in real time.
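              To put rough numbers on that, here's a minimal back-of-the-envelope sketch in C (my own figures, not id's). It assumes a multipass structure of one depth-fill pass, then per light a shadow pass plus an additive lighting pass; both the pass structure and the count of 8 visible lights are assumptions for illustration.

                  #include <stdio.h>

                  int main(void)
                  {
                      const long polys  = 10000; /* polys on screen, per the challenge above */
                      const long lights = 8;     /* assumed number of visible lights         */

                      /* One depth-fill pass, then per light: a shadow pass (e.g. volume
                         extrusion) plus an additive lighting pass over the lit geometry.
                         The pass structure is an assumption, not id's published design. */
                      const long passes = 1 + 2 * lights;
                      const long polys_per_frame = polys * passes;

                      printf("%ld geometry passes -> %ld polys pushed per frame\n",
                             passes, polys_per_frame);
                      printf("at 30fps that is %ld polys per second, before any fill cost\n",
                             polys_per_frame * 30);
                      return 0;
                  }

              With those assumptions you're already pushing the geometry 17 times per frame, over five million polys a second on a GF3-class card, before counting a single fill operation.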
              "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



              • #82
                OT: UT2003 performance

                Hi there,
                Sadly the UT2003 demo will be out in the next month and the P runs that like a (sun)dog as well
                Excuse me? Have you actually compared the UT2003 benchmark results?

                The Parhelia is on par with a Radeon8500 at normal settings (sometimes even 10% faster). With 16xFAA, it's a lot faster than the R8500 and even beats the GF4Ti4600's 4xMSAA scores. I think 43fps with unoptimised drivers is quite impressive for 1024x768x32 / 16xFAA; considering this is still not the final build of the engine, things can only get better.

                Also, according to Anand, the Parhelia delivers the best visual quality of the three boards with FAA and AF enabled, and, unlike the Radeon8500's Catalyst drivers, produces hardly any visual artefacts.

                So, what's the problem?

                I am constantly surprised at how people bitch about the Parhelia on the grounds that it's too expensive for the raw performance it delivers. A stretch limo can be more expensive than a Porsche 911, too--is it a bad car for that reason alone?

                ta,
                .rb
                Visit www.3dcenter.de

                www.nggalai.com — it's not so much bad as it is an experience.



                • #83
                  Regarding Drizzt's post.


                  I seem to remember that I am a programmer.
                  Could be, I'm not sure, but having a nice piece of software running in about 40% of Italian medium and large companies, and another specialized product sold worldwide, probably qualifies me somehow as a programmer.
                  I'm sorry to disagree with you. I am a consultant. What does that tell you about my job? Nothing, really, unless I state the area of consulting I work in. Likewise, programming has many different areas, from databases and financial software to networking, 3D, etc. Therefore, just saying you're a qualified programmer doesn't automatically qualify you as a programmer in every single area.

                  Now, I have two ways to solve the problem:
                  1) Optimize the code.
                  2) Say to da bozz: "It's not really a problem. I'm targeting the software at 2004-2005 processors and database engines. There will be no problems on an 8000MHz Athlon."

                  The results are:
                  Case 1: nobody says "Good job!" to me, simply because I've done what I'm paid for.
                  Case 2: I'm fired.
                  Wrong again. It all boils down to cost. If it's cheaper to get more powerful hardware than to optimize the code, it isn't really viable to go into optimizing, is it?

                  JC is an idiot. The task of a programmer is to write code NOW, to optimize it NOW, and to work with the hardware which we have NOW.
                  As stated in a previous post, that's your job, not Carmack's.

                  Where has JC innovated?
                  Where has JC made innovative effects?

                  ...

                  When there was only texturing on the video cards, all the programmers wanted more texturing power. After the implementation of T&L, all the programmers want is more powerful T&L units so they can add polys.
                  So what are they paid for, then?
                  "A.A.A.: searching for a programmer.
                  Requirements: the ability to buy the most powerful cards available in shops. No coding skills needed; in fact, better without."
                  I disagree once more (isn't that a surprise?). Do you honestly believe you could run Doom3 on Quake 3's engine? Or Quake 3 on Quake 2's, for that matter? And isn't the engine the main concern of the programmers when it comes to FPS games?

                  (A man who says that it's only 2 bits more of color is more than an idiot. He's a poor mathematician. THEY ARE 2 BITS MORE, BUT THEY ARE THE RIGHTMOST BITS!!!! IDIOT!!!)
                  I think you misunderstood what he meant (and I really am a mathematician). Before the Parhelia was launched I read an explanation of 10-bit color (which I can't really recall where, so no, I can't give you the link), and it goes something like this (sorry for being so sketchy; maybe someone can provide a fuller explanation):

                  32-bit color is organized as 8 bits for each of the RGB colors plus 8 bits for an additional purpose I really can't remember; those last 8 bits are used in some operations when it comes to 3D programming. By implementing 10-bit color, Matrox is still working in 32 bits: you have 10 bits for each of the RGB colors and 2 bits remaining. And those are the 2 bits Carmack refers to as being not enough, not the difference from 8 to 10 bits.
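                  To make the bit layout concrete, here's a small illustrative sketch in C (mine, not from the thread; the channel ordering within the word is arbitrary). Both formats pack into the same 32-bit word: 8+8+8+8 = 32, and 10+10+10+2 = 32.

                      #include <stdint.h>
                      #include <stdio.h>

                      /* Conventional 32-bit color: 8 bits each for R, G, B plus an 8-bit alpha. */
                      static uint32_t pack_8888(uint32_t r, uint32_t g, uint32_t b, uint32_t a)
                      {
                          return (a << 24) | (r << 16) | (g << 8) | b;
                      }

                      /* 10-bit color: 10 bits each for R, G, B leaves only 2 bits for alpha. */
                      static uint32_t pack_1010102(uint32_t r, uint32_t g, uint32_t b, uint32_t a)
                      {
                          return (a << 30) | (r << 20) | (g << 10) | b;
                      }

                      int main(void)
                      {
                          /* alpha gets 256 levels in the first format, only 4 in the second */
                          printf("8:8:8:8    = 0x%08X\n", (unsigned)pack_8888(255u, 128u, 0u, 255u));
                          printf("10:10:10:2 = 0x%08X\n", (unsigned)pack_1010102(1023u, 512u, 0u, 3u));
                          return 0;
                      }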
                  My Rig:
                  Duron 600@1Ghz on Asus A7V, Alpha Pal 8045, 384Mb RAM, GF4 Ti4200, SB Audigy, 3com Ethernet, Maxtor 40Gb, Creative DVD 5x, CDRW LG 24x10x4x, Intellimouse Explorer USB, Studio PCTV Pro, 21'' Samsung 1100p, Videologic Digitheatre LC.



                  • #84
                    Hi tenebrus,
                    Originally posted by tenebrus
                    (sorry for being so sketchy; maybe someone can provide a fuller explanation):

                    32-bit color is organized as 8 bits for each of the RGB colors plus 8 bits for an additional purpose I really can't remember; those last 8 bits are used in some operations when it comes to 3D programming. By implementing 10-bit color, Matrox is still working in 32 bits: you have 10 bits for each of the RGB colors and 2 bits remaining. And those are the 2 bits Carmack refers to as being not enough, not the difference from 8 to 10 bits.
                    The Parhelia renders with an internal precision of 10 bits per colour channel plus an alpha channel: 10+10+10+10 = 40 bits of internal precision. That's what the pipeline does at all times. A GF4, on the other hand, has a precision of 9+9+9+9 = 36 bits. Internal precision, that is.

                    Now, there are good reasons to keep the framebuffer at 32 bits, mainly cost. So the final image has to fit into the 32-bit precision range of the framebuffer: 10:10:10:10 is, without "GigaColor", reduced to 8:8:8:8 = 32 bits. You get the usual 24-bit colour + 8-bit alpha channel image, albeit rendered at a higher precision and thus showing less colour banding in extreme situations than on the GF4, which also reduces its internal precision to 32 bits but loses less precision in the process (precision that wasn't there to begin with, though).

                    With GigaColor enabled, you keep the 10 bits of internal precision for each colour channel all the way through to the framebuffer. As the Parhelia "only" features a 32-bit framebuffer, the 8 bits used for alpha have to be reduced to 2 bits: 10:10:10:2. You get the full render quality actually onscreen, without interpolation or capping. Unless you need an 8-bit alpha channel, that is:

                    The issue with the DooM3 engine is that Carmack uses these 8 alpha bits for multipass blending operations, hence the Parhelia won't work with GigaColor enabled.
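                    A tiny sketch of why that hurts (my own illustration, not Carmack's code): a blend factor written to destination alpha by an earlier pass survives an 8-bit channel nearly intact, but a 2-bit channel can only hold the four levels 0, 1/3, 2/3 and 1.

                        #include <stdio.h>

                        /* Round a blend factor in [0,1] to the nearest value an n-bit channel can hold. */
                        static double quantise(double a, int bits)
                        {
                            const int levels = (1 << bits) - 1;
                            return (double)(long)(a * levels + 0.5) / levels;
                        }

                        int main(void)
                        {
                            const double a = 0.35; /* hypothetical destination-alpha value from an earlier pass */
                            printf("stored in 8 bits: %.4f\n", quantise(a, 8)); /* 0.3490 -- close enough  */
                            printf("stored in 2 bits: %.4f\n", quantise(a, 2)); /* 0.3333 -- only 4 levels */
                            return 0;
                        }

                    Every later blending pass then reads back that coarsely rounded value, so the rounding error feeds into the rest of the multipass chain.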

                    ta,
                    .rb

                    Edit: LOL, constantly mistyped 11 for 10. Corrected. .rb
                    Last edited by nggalai; 27 June 2002, 05:21.
                    Visit www.3dcenter.de

                    www.nggalai.com — it's not so much bad as it is an experience.



                    • #85
                      Amazon says this. Not sure, though, whether this information is still up to date.

