
bad news for non-NVidia Users? say it aint so!




    SANTA CLARA, CA — March 23, 2000 — In a move that will help bring stunning 3D graphics to internet users, NVIDIA™ Corporation (Nasdaq: NVDA) and Microsoft® Corp. today announced the adoption of NVIDIA's technology for Volume Texture Compression Format (VTC) for Microsoft DirectX APIs. Today's 3D internet sites are plagued with inadequate image quality due to bandwidth constraints of transmitting high-resolution textures over standard communications systems. Even for high-performance PCs, the limited amount of texture storage forces game developers to use lower-resolution textures, resulting in imagery that lacks detail. Scheduled for release in 2000, NVIDIA's VTC format enables a superior level of image quality that allows web and content developers to produce 3D objects that depict their natural characteristics.
    "NVIDIA has clearly demonstrated they are the technology leader for the 3D industry," said Tony Barkans, program manager for DirectDraw at Microsoft. "By incorporating NVIDIA's technology into DirectX applications, developers are empowered with the tools to develop and deploy more complex and visually compelling 3D applications."

    Limited texture storage has historically been a problem for game application developers, forcing compromises in image quality and performance. The texture-storage problem is exacerbated by volume textures, which are truly 3D data, unlike traditional 2D textures. Volume textures are so much larger than 2D textures that the texture compression format becomes extremely important. NVIDIA's volume texture compression format organizes 3D volume-texture data to take advantage of the 3D nature of the data, which increases the effective texture bandwidth by an enormous factor. NVIDIA has developed a proprietary method to reorder the 3D data within a volumetric image cube to account for the linear accessing required for the optimal use of the memory system of a typical computer system.

    "By incorporating NVIDIA's volume texture compression technology into DirectX, developers will be able to unleash a new level of realism," said David Kirk, chief scientist at NVIDIA. "3D textures are enormous in size, therefore a high quality volume texture compression format is needed in order to make 3D texture data manageable and usable for high-performance rendering. NVIDIA's 3D volume texture compression technology enables developers to incorporate cinematic realism in interactive 3D applications."

    Will everyone be able to take advantage of this or just people with NVidia cards?
    Enlighten the less intelligent!!
    Drugs are Bad
    Hugs are Good

  • #2
    I'm not positive, but it might be like when S3 had S3TC put into DirectX as DXTC. Basically, that just opened up S3TC in DirectX for any card manufacturer to use if their hardware supported it. This will probably be the same.

    Sen

    Comment


    • #3
      Aha!... like EAX and such... well, we'll see..

      After all, I'm a HAPPY non-NVidia user right now... & surely will be with the NEXT-GEN Matrox product...

      ------------------
      PIII450@504 (FSB@112), P2B, 128MB, Matrox Mill-G400 32SH, Diamond MONSTER Sound II, IntelliMouse Explorer (yahoooo! no fu..hm hm. ball any more...)

      PIII650@806 (FSB@124), ASUS P3B-F, 128MB, Matrox Mill-G400 32SH, SB Live! Value, IntelliMouse Explorer

      Comment


      • #4
        It ain't so!
        Asus A7V, Duron 600@900, 192MB PC133@100, G200, Guillemot MUSE, etc.

        Comment


        • #5
          Can anyone give us a definitive answer on this one? (HINT: ant or kruzin!)
          Drugs are Bad
          Hugs are Good

          Comment


          • #6
            Isn't Nvidia worth more than South America nowadays?

            It will all work out, don't worry.

            Comment


            • #7
              Let's put it this way: with the exception of being fast, and being the first to put T&L in hardware, the nVidia chipsets are NOT "the best". Their image quality isn't as good as that of the G400, and the G400 has many features that no other cards have. As for texture compression being important? I've yet to see anything that shows texture compression is all that important. There are 32 megs on cards now, and 64 megs in the future. Do we really need texture compression, or better-designed games? Do we need yet another first-person shooter to be the reason people would need texture compression?

              Comment


              • #8
                If something is incorporated into DirectX, then that means every manufacturer may include it in their hardware without paying any royalties, since Microsoft has bought the rights for DirectX. So this technique can be used by any manufacturer for free, at least for DirectX. OpenGL, etc. is another matter.

                Targon, texture compression is getting more and more important. With it you can use more detailed textures that still take up less memory. It certainly helps image quality, since textures no longer look blurry but sharp.

                Then consider that 3D textures, which this nVidia compression technology is meant for, add another dimension, increasing memory usage drastically.

                Let's say we use a 256x256 32 bit 2D texture. That would take up 256 * 256 * 4 = 262144 bytes = 256 kB. That's not that much.

                Now we will use a 256x256x256 32 bit 3D texture. That would take up 256 * 256 * 256 * 4 = 67108864 bytes = 64 MB.

                I say we indeed need 3D texture compression. ;-)
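
                For anyone who wants to check that arithmetic, here is a minimal C sketch; the 6:1 compression ratio at the end is purely an assumption for illustration, since nVidia hasn't published VTC's actual ratio:

                    #include <stdio.h>

                    /* Bytes needed for an uncompressed texture; pass depth = 1
                       for a plain 2D texture. */
                    static unsigned long texture_bytes(unsigned long w, unsigned long h,
                                                       unsigned long d, unsigned long bpp)
                    {
                        return w * h * d * (bpp / 8);
                    }

                    int main(void)
                    {
                        unsigned long tex2d = texture_bytes(256, 256, 1, 32);
                        unsigned long tex3d = texture_bytes(256, 256, 256, 32);

                        printf("256x256 2D texture:     %lu bytes\n", tex2d); /* 262144 = 256 kB */
                        printf("256x256x256 3D texture: %lu bytes\n", tex3d); /* 67108864 = 64 MB */

                        /* A hypothetical 6:1 block compression still leaves ~11 MB. */
                        printf("3D texture at 6:1:      %lu bytes\n", tex3d / 6);
                        return 0;
                    }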

                Comment


                • #9
                  Was nVidia the first to implement T&L in hardware? I thought Glint-based boards had had this feature for a while.

                  Paul
                  paulcs@flashcom.net

                  Comment


                  • #10
                    So basically, even though it's licensed by NVidia, Matrox could use this technology in, say, the G800?
                    Drugs are Bad
                    Hugs are Good

                    Comment


                    • #11
                      Hey franksch, I could be wrong, but I don't think there is really such a thing as a "3D texture". When dealing with a 3D world, you take a 2D texture and map it onto multiple sides of a 3D object.
                      The real problem is this: a 256 x 256 x 16 or 32 texture is not a very good texture. You can't get a whole lot of quality and detail into such an image. However, if you use a 1024 x 1024 x 32 image you get much better quality.
                      This is where your calculations really come in: ((1024 pixels) x (1024 pixels) x (32 bits per pixel)) / (8 bits per byte) = 4,194,304 bytes = 4 MB.
                      Now stop and think: if you increase that further, to say a 2048 x 2048 texture, you get even better quality, and if you run through the calculations again you get a 16 MB texture. Now think about how many textures you use in a single level of Q3 or UT. 10? 20? More? Less? Even a 512x512 image is a 1 MB image. So if you have a 16 MB card, that's 16 images you can fit in graphics memory. I expect most levels use way more than that many textures.

                      That is where the real need for texture compression comes in.
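
                      A quick C loop makes the point concrete (a 16 MB card is assumed here, and this ignores the frame buffer):

                          #include <stdio.h>

                          int main(void)
                          {
                              const unsigned long card_bytes = 16UL * 1024 * 1024; /* 16 MB card */
                              const unsigned long sizes[] = { 256, 512, 1024, 2048 };
                              int i;

                              for (i = 0; i < 4; i++) {
                                  /* 32 bpp = 4 bytes per texel, uncompressed */
                                  unsigned long bytes = sizes[i] * sizes[i] * 4;
                                  printf("%4lux%-4lu texture: %8lu bytes, %3lu fit on the card\n",
                                         sizes[i], sizes[i], bytes, card_bytes / bytes);
                              }
                              return 0;
                          }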

                      Ian
                      Primary System:
                      MSI 745 Ultra, AMD 2400+ XP, 1024 MB Crucial PC2100 DDR SDRAM, Sapphire Radeon 9800 Pro, 3Com 3c905C NIC,
                      120GB Seagate UDMA 100 HD, 60 GB Seagate UDMA 100 HD, Pioneer DVD 105S, BenQ 12x24x40 CDRW, SB Audigy OEM,
                      Win XP, MS Intellimouse Optical, 17" Mag 720v2
                      Seccondary System:
                      Epox 7KXA BIOS 5/22, Athlon 650, 512 MB Crucial 7E PC133 SDRAM, Hercules Prophet 4500 Kyro II, SBLive Value,
                      3Com 3c905B-TX NIC, 40 GB IBM UDMA 100 HD, 45X Acer CD-ROM,
                      Win XP, MS Wheel Mouse Optical, 15" POS Monitor
                      Tertiary system
                      Offbrand PII Mobo, PII 350, 256MB PC100 SDRAM, 15GB UDMA66 7200RPM Maxtor HD, USRobotics 10/100 NIC, RedHat Linux 8.0
                      Camera: Canon 10D DSLR, Canon 100-400L f4.5-5.6 IS USM, Canon 100 Macro USM Canon 28-135 f3.5-5.6 IS USM, Canon Speedlite 200E, tripod, bag, etc.

                      "Any sufficiently advanced technology will be indistinguishable from magic." --Arthur C. Clarke

                      Comment


                      • #12
                        Ian is quite right, except that he forgot that part of that 16 MB of memory on our theoretical video card is needed for the frame buffer.

                        So you'd probably have more like 12 MB left for textures, depending on the resolution you're running at...
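
                        To put rough numbers on that, a small C sketch (assuming double-buffered 32-bit color plus a 16-bit z-buffer; real drivers allocate differently, so treat this as an estimate):

                            #include <stdio.h>

                            /* Estimated frame-buffer footprint: front + back buffer at
                               32 bpp plus a 16-bit z-buffer (an assumption for illustration). */
                            static unsigned long framebuffer_bytes(unsigned long w, unsigned long h)
                            {
                                return 2 * (w * h * 4) + w * h * 2;
                            }

                            int main(void)
                            {
                                printf("640x480:   %lu bytes\n", framebuffer_bytes(640, 480));
                                printf("1024x768:  %lu bytes\n", framebuffer_bytes(1024, 768));
                                printf("1600x1200: %lu bytes\n", framebuffer_bytes(1600, 1200));
                                return 0;
                            }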

                        Anyone who doubts texture compression hasn't been up close and personal with the S3TC Unreal levels.

                        To be honest, though, I still play most of my games in 640x480 because I want the fps, so my most wished-for feature is FSAA right now.

                        Texture compression and T&L are great to have, but let's please get rid of those nasty jaggies first, mmmkay?

                        ------------------
                        "The iMac is for conformists who think they are non-conformists. They are also for morons."
                        John Misak

                        Cory Grimster
                        www.houseofhelp.com
                        www.2cpu.com

                        Comment


                        • #13
                          Actually, franksch3 understands it perfectly, except it sounds like they'll only use 128^3 textures. They ARE volumetric (3D) textures.
                          A good example of how it works: take a cube of wood with 128x128x128 dimensions and carve it into some arbitrary shape. Obviously the patterns in the wood are all seamless and realistic. That's about how a 3D texture is applied to an object.

                          I've seen the DX8 docs and pre-beta demos, etc. (We develop games where I work.)
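
                          A minimal C sketch of that wood-block idea, with a procedural "growth ring" pattern standing in for real volume data (nearest-neighbour lookup here; real hardware would filter):

                              #include <math.h>
                              #include <stdio.h>

                              #define DIM 128  /* a 128^3 volume, as described above */

                              static unsigned char wood[DIM * DIM * DIM];

                              /* Nearest-neighbour volume lookup: coordinates (u, v, w) in
                                 [0,1) pick a texel in 3D, so every point on a carved
                                 surface gets a consistent "interior" colour. */
                              static unsigned char sample_volume(float u, float v, float w)
                              {
                                  int x = (int)(u * DIM), y = (int)(v * DIM), z = (int)(w * DIM);
                                  return wood[(z * DIM + y) * DIM + x];
                              }

                              int main(void)
                              {
                                  int x, y, z;

                                  /* Concentric rings around the z axis -- a crude stand-in
                                     for the grain of a real block of wood. */
                                  for (z = 0; z < DIM; z++)
                                      for (y = 0; y < DIM; y++)
                                          for (x = 0; x < DIM; x++) {
                                              float dx = x - DIM / 2.0f, dy = y - DIM / 2.0f;
                                              float r = sqrtf(dx * dx + dy * dy);
                                              wood[(z * DIM + y) * DIM + x] =
                                                  (unsigned char)(128.0f + 127.0f * sinf(r * 0.8f));
                                          }

                                  printf("texel at the centre: %d\n", sample_volume(0.5f, 0.5f, 0.5f));
                                  return 0;
                              }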

                          AlgoRhythm

                          Comment


                          • #14
                            Paul,
                            You are right. nVIDIA was the first to implement it in high-end home-user boards, while professional adapters had it long before (as well as memory sizes like 96MB).

                            Mattias,
                            Basically yes, but if Matrox didn't see this coming long ago, I'd guess it's too late to implement it in the G800.

                            When it comes to 3D textures, I have no idea what nVIDIA means by the term. I wouldn't be too surprised if they "cheated" on this one as well, but the term really means a volumetric texture.

                            _
                            B

                            [This message has been edited by Buuri (edited 25 March 2000).]

                            Comment


                            • #15
                              Isn't this one of the things that ATI is saying its Charisma engine does?
                              Workhorse Athlon 1GHz, G400MAX
                              Gamebox Athlon 1.3GHz Gforce3

                              Comment
