nForce


  • #16
    The chip in the Xbox is almost identical to the GeForce3, with 2 vertex shaders instead of 1 (or something like that). I read the specs on the beyond3d forum (I think).



    • #17
      Paulr...

      The GeForce 3 is the NV20
      The GeForce 2 is the NV15
      The GeForce 2 MX is the NV11

      Here is a little clip from

      http://gamespot.com/gshw/stories/fla...689981,00.html

      Instead of such a brute-force approach, Nvidia has moved in a new direction with the GeForce3, code-named the NV20. Immediately, you'll notice the impact of several features that dramatically improve the chip's efficiency, breaking through the bottleneck in memory bandwidth.



      • #18
        OK, so I got the codenames wrong.
        I'm not professing to know all the code names!
        Either way, the X-Box features a chipset which, although not dissimilar to the GF3, is still different and more advanced.
        NV25 jumps to mind as a code name, but again, as said above, I can't remember them all.
        The next GeForce-based cards for the PC will be the ones to own, as they will contain exactly the same chipset as featured in the X-Box.
        Meaning that as games are ported over from X-Box to PC (which they will be, it's a very easy process), you'll have all the right hardware to run them at their best.



        • #19
          Hey Paulr...

          I didn't mean to sound like a prick... just meant it to be a minor correction.

          If the next chip released by Nvidia is what is in the X-Box, I will be disappointed.

          There is no other difference besides the extra T/L junk. They probably could have incorporated the same in the GF3... but chose not to, so that they could milk consumers out of more money... more than likely with an "enhanced ultra" version of the GF3.

          Anyways... however they take it... it's their decision... as long as people buy into it, they will get away with it.

          An embedded GF2 wouldn't be too bad... but if it turns out like the TNT2 did when it was integrated... that could be a poor decision... maybe it won't be so bad with the enhanced internal bus bandwidth... but if it leeches from main memory... the bonus could be severely hampered.
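
          The "leeching from main memory" worry really comes down to bandwidth arithmetic: peak bandwidth is just bus width times effective clock, and an integrated part has to share that figure with the CPU. Below is a minimal back-of-envelope sketch (my own illustration, with purely hypothetical bus widths and clocks, not actual nForce or GeForce numbers):

          #include <cstdio>

          // Peak theoretical bandwidth in MB/s: (bus width in bits / 8) * effective clock in MHz.
          static double peakBandwidthMBs(int busWidthBits, double effectiveClockMHz)
          {
              return (busWidthBits / 8.0) * effectiveClockMHz;
          }

          int main()
          {
              // Illustrative numbers only (assumptions, not vendor specs):
              // a 128-bit dedicated graphics memory bus vs. a 64-bit shared system bus.
              double dedicated = peakBandwidthMBs(128, 166.0); // e.g. 128-bit SDR at 166 MHz
              double shared    = peakBandwidthMBs(64, 133.0);  // e.g. 64-bit PC133 system memory

              std::printf("Dedicated graphics memory: %.0f MB/s\n", dedicated);
              std::printf("Shared system memory:      %.0f MB/s\n", shared);
              // On a shared bus the CPU competes for the same cycles, so the
              // graphics core sees even less than the peak figure printed above.
              return 0;
          }

          Dual-channel or DDR system memory on the integrated chipset would raise the shared figure, which is presumably the "enhanced internal bus bandwidth" mentioned above.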



          [This message has been edited by cbman (edited 08 June 2001).]



          • #20
            No, I think that the new board will come out with an integrated GeForce MX. Simple reasons:

            1: They don't want to annoy the OEM companies that sell the GF2/GF3 (GF3 Ultra by the time these boards are released). If they integrate a basic MX graphics core, then they will still have a market for the GF3 as an add-on...

            2: Cheaper. They don't want to put a huge, expensive graphics core on a board which will be built into base machines...

            There are also a couple of reasons for going down the Athlon route. Remember that the X-Box is based on PIII technology; to be easily portable to PCs, the work in converting from PIII to Athlon may not be much different from converting to PIV in terms of workload. They don't have any licensing agreement with Intel, which is why they will modify the chipset to suit the Athlon. Reasons:
            1. Intel wants people to move to Pentium IV-class processors (better margin).
            2. They won't want a new DDR chipset (if they can avoid it) on the PIII in competition with their Rambus PIV.


            Does this make any sense?

            Red
            Sneakilly at work


            [This message has been edited by RedRed (edited 08 June 2001).]
            Don't just swallow the blue pill.



            • #21
              S'ok, I didn't mean to sound upset or nasty!! :-)
              Damn code names, they can get a tad annoying, but I've been using them ever since I was an official beta tester for 'Chicago'.
              I don't really know if the NV25 is going to offer much extra, really.
              As the X-Box has been in design for a long time, and the fact that it would be using the NV25 has also been known for quite a while, I see the NV20 very much as a filler.
              When I was looking for a replacement for my G400MAX, the GF3 was an option for me, but with the nice discount on GF2 Ultras I decided to go for one of those instead.
              NVidia supporters get upset anyway when their product is outdated 6 months later; I just have a feeling they are going to be even more upset once the NV25 is released. I can really see some funky features in that chipset - methinks the 3DFX technology, etc.



              • #22
                The X-Box's graphics chip is called the NV2A, and the X-Box will ship with a special version of DirectX that will support the added vertex shader.
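
                On the PC side, "supporting the vertex shader" boils down to what the Direct3D 8 caps report. Here is a minimal, hypothetical sketch (not from any poster, and the helper name is my own) of how a DX8 title can check the exposed vertex shader version before enabling its shader path:

                #include <d3d8.h>
                #include <cstdio>

                // Returns true if the primary adapter's HAL device reports vs.1.1 or better.
                static bool SupportsVS11(IDirect3D8* d3d)
                {
                    D3DCAPS8 caps;
                    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
                        return false;

                    // VertexShaderVersion packs the major/minor shader version.
                    return caps.VertexShaderVersion >= D3DVS_VERSION(1, 1);
                }

                int main()
                {
                    IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
                    if (!d3d)
                        return 1;

                    std::printf("vs.1.1 supported: %s\n", SupportsVS11(d3d) ? "yes" : "no");
                    d3d->Release();
                    return 0;
                }

                (The caps don't report how many hardware vertex units there are; a second unit would presumably just show up as extra throughput.)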
