Announcement


The tree demo Nvidia used to show off GeForce is available for download.

  • #61
    We can all bash on nVidia all we want, but the simple fact remains, as nVidia says, that bump mapping is an inferior way of adding relief to a surface. Think of how stupid it looks when the reflection on the water makes us think there are big waves, but the water level near a wall stays flat.

    I can't wait to see real waves.



    • #62
      Yeah, I won't disagree with that. But at the moment it is not required, nor will a huge number of developers bother making water (as in your example) out of thousands of polygons when they could bump map a handful. If they want, they can add a couple of dozen polygons to the edge where the water hits the wall, and bump map those.

      ------------------
      Cheers,
      Steve



      • #63
        wizz,

        I am sorry, but when you can only see the awesome performance of the GeSpot's geometry processor at low resolutions, fill rate has a huge impact. I, for one, love to play my games at resolutions of 1024x768 and above, and fill rate has a hell of a lot to do with a card's ability to do so. No amount of CPU processing can help at these higher resolutions; the polygon and triangle counts remain the same across all resolutions.
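        As a rough illustration of the fill-rate demand at different resolutions (the 2x overdraw factor and the 60 fps target below are assumed figures for the sketch, not measurements):

```python
# Rough, illustrative sketch: how many pixels per second a card must
# draw at a given resolution. The overdraw factor (how many times each
# screen pixel is written per frame) is an assumption, not a benchmark.
def required_fillrate_mpixels(width, height, fps, overdraw=2.0):
    """Approximate fill-rate demand in MPixels/s."""
    return width * height * fps * overdraw / 1e6

# The geometry (polygon) workload is the same at every resolution,
# but the pixel workload grows with the screen area:
for w, h in [(640, 480), (1024, 768), (1600, 1200)]:
    print(f"{w}x{h} @ 60 fps: {required_fillrate_mpixels(w, h, 60):.0f} MPixels/s")
```

        Going from 640x480 to 1600x1200 multiplies the pixel workload by more than six while the triangle count stays put, which is the point being made above.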

        Rags



        ------------------
        That's your wife on the back of my horse!!



        • #64
          But isn't everyone forgetting one thing? Won't games have to be written to take advantage of T&L and the GPU?

          Joel
          Libertarian is still the way to go if we truly want a real change.

          www.lp.org

          ******************************

          System Specs: AMD XP2000+ @1.68GHz(12.5x133), ASUS A7V133-C, 512MB PC133, Matrox Parhelia 128MB, SB Live! 5.1.
          OS: Windows XP Pro.
          Monitor: Cornerstone c1025 @ 1280x960 @85Hz.



          • #65
            OpenGL supports it with little effort. DirectX is a different story. DX7 has support, but the game will either have to be patched or written specifically to take advantage of the video card's "new" capabilities.

            Rags



            • #66
              "Himself. First you said how much 3dfx and other competitive companies must hurry up and get something out to compete with before Nvidia has taken over."

              "Then, you say you can wait for the Savage 2000 or whatever else comes along, and you don't trust Nvidia to deliver what the spec sheet suggests."

              "Please make up your mind, and don't post yes and no in the same post."

              What I personally like or not is beside the point: I am a disgruntled former NVIDIA owner; most people are not. So yes, NVIDIA is going to make a killing, and no, I am still not interested. I also know the Celeron is a faster CPU for games than my AMD K6-3, but I am not interested in that either. That doesn't change the fact that there are a lot of Celeron owners out there, and AMD would be out of business by now without the confidence inspired by the specs of the Athlon, if not actual profits yet.

              Nyah, nyah, nyah!



              • #67
                Rags,
                I get your drift; who wants to play games at such low resolutions? What I like about the GeForce is that it is trying something new. The same way we praised Matrox for hardware EMBM, and now the Voodoo4 for its hardware T-buffer and 32-bit colour, the GeForce has hardware T&L and a new geometry engine; believe me, it's going to get a lot of praise, so things can only get better. Remember the card is only running at 120MHz, for 480 MPixels/s of fill rate. I am in favour of cards that are not just chasing fps but opening up new areas of visual quality, cue Matrox and EMBM. I would rather have a photorealistic game at 1024x768 at 30fps than what we have now at 90fps.

                Tony
                To understand life we should remove complexity and find simplicity.
                Tony 1999



                • #68
                  I think Matrox should simply buy out Bitboys Oy and make the new Glaze3D a Matrox card.
                  It has T&L, great fill rate (on the 2400 model), an accumulation buffer (a "T" buffer like the V4), and you can make a few of them work in parallel on the same adapter.
                  I want a Matrox Mill Quad Glaze 2400 with 256MB of RAM on board!!!
                  Oh baby, oh baby, yes!!!

                  ------------------
                  Cloudy
                  Asus P2B-DS, 2 x Celeron 400@75Mhz, 128Mb Ram, Xitel Storm Platinum,
                  2 x IBM 4.3Gb scsi, Pioneer DVD rom scsi, ati rage fury.



                  • #69
                    The Glaze3D is really marketing hype. Everyone quotes that it has a fill rate of 1200 MPixels/s. Just read a bit through their site and you might notice that it only reaches that fill rate when FLAT SHADING is used (think of the old game X-Wing from LucasArts).

                    When Gouraud shading or a texture is used, the fill rate drops to 600 MPixels/s. And then the well-known equation again applies:

                    clock speed * pipelines = MPixels/s

                    In this case, 150 * 4 = 600 MPixels/s. So don't be fooled by their marketing. Who wants 1200 MPixels/s with flat shading only?
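                    The arithmetic above can be sketched in a couple of lines of Python (the function name is just for illustration):

```python
# The fill-rate equation from the post: clock (MHz) * pixel pipelines
# gives the theoretical fill rate in MPixels/s.
def fillrate_mpixels(clock_mhz, pipelines):
    return clock_mhz * pipelines

print(fillrate_mpixels(150, 4))  # Glaze3D, textured/Gouraud: 600 MPixels/s
print(fillrate_mpixels(120, 4))  # GeForce 256 for comparison: 480 MPixels/s
```

                    The 1200 MPixels/s headline figure only applies to flat shading, which would correspond to writing two pixels per pipeline per clock.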

                    With regards
                    Frank Schoondermark



                    • #70
                      You're all overlooking one very simple thing - we don't want an nVidia card, be it the G-Spot, the Prostate, or any other vaguely sexual model.

                      Why?

                      Because the image quality BLOWS. It's unacceptable. It's blurry in 2D at high resolutions. The colors are washed out. 3D is fast but icky-looking.

                      I'd prefer a Rage128 Pro over ANY nVidia card - and that's saying a lot since ATI sucks hardcore.

                      I mean c'mon, my wife's Voodoo3 is better than the nVidia cards.

                      - Gurm

                      P.S. Yes, they're fast. No question. Really really fast. And ugly.

                      ------------------
                      G. U. R. M. It's not hard to spell, is it? Then don't screw it up!
                      The word "Gurm" is in no way Copyright 1999 Jorden van der Elst.
                      The Internet - where men are men, women are men, and teenage girls are FBI agents!

                      I'm the least you could do
                      If only life were as easy as you
                      I'm the least you could do, oh yeah
                      If only life were as easy as you
                      I would still get screwed



                      • #71
                        Hmmm, are you sure a Voodoo3 looks better than a TNT2? Maybe in 2D, but in no way in 3D.

                        ------------------
                        Cheers,
                        Steve

                        PS: Some or all of the above message may be wrong, or, just as likely, correct. Depends on what mood I'm in. And what you know. ;¬)



                        • #72
                          I must tell you all something you don't know.
                          Windows uses the card's driver for displaying everything. Since I've installed BeOS on my computer and the Rage128 isn't supported, I use 2D software mode (which eats 50% of the CPU from time to time). The interesting part is that I've never seen ANY card show such vivid colors; Matrox comes closest, but still, this is truly amazing. The important thing is that this could be the best way to test your hardware and RAMDAC quality without any driver tweaks.
                          Anyway, I think I'm going to reinstall my old G200, since BeOS supports it, or I'll just get a G400. I hope the G400 is supported; I haven't seen any post about G400 BeOS compatibility yet.

                          ------------------
                          Cloudy
                          Asus P2B-DS, 2 x Celeron 400@75Mhz, 128Mb Ram, Xitel Storm Platinum,
                          2 x IBM 4.3Gb scsi, Pioneer DVD rom scsi, ati rage fury.



                          • #73
                            The tree demo file is gone. Does anyone else have it?

                            ------------------
                            Ami Y. Koriuchi - foxyviolet@hotmail.com
                            Asus P2B 1010 - P3-500
                            256MB 6NS 70 GB of 10k RPM SCSI UW

                            A laptop is advertised with a 'durable magnesium case'. Does that mean if I let it get too hot it'll flash into a blinding sphere of light?





                            • #74
                              It is really annoying that some people will never buy another product from a company just because one of its earlier products wasn't that good. The fact that the TNT2 may have bad image quality at very high resolutions absolutely doesn't mean that the GeForce 256 does.

                              Try to judge a product on what is known, not on what you assume because of a previous product. And BTW, at the resolutions I use, the G400 has the same image quality as my Intel 740 based card.



                              • #75
                                It's flying through the electronic ether on its way to you right now. Well, give it 5 mins to send... not all of us can have cable modems.

                                ------------------
                                Cheers,
                                Steve

                                PS: Some or all of the above message may be wrong, or, just as likely, correct. Depends on what mood I'm in. And what you know. ;¬)
