Oh no, GeForceFX to appear on street in April quarter


  • #16
    I remember once... once long ago when the 9700 was announced... when all the reviewers were allowed to do was issue percentage figures of performance against a Ti4600...

    50% faster is easy to attain when the competition is only getting 20 fps...
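    To put rough numbers behind that point, here is a minimal sketch; the fps figures are made up purely for illustration, not actual Ti4600/R9700 results.

    Code:

# Illustrative only: a relative "X% faster" headline hides the absolute framerates.
def relative_lead(new_fps: float, old_fps: float) -> float:
    """Return the 'X% faster' figure marketing likes to quote."""
    return (new_fps / old_fps - 1.0) * 100.0

print(relative_lead(30.0, 20.0))   # 50.0 -> "50% faster", yet both are barely playable
print(relative_lead(120.0, 80.0))  # 50.0 -> same headline, a completely different experience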
    "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



    • #17
      You apparently have never seen a GF4 Ti4600 and an R9700 run head to head in modern game titles at 1024x768 with 4xFSAA and anisotropic filtering enabled.
      Otherwise you wouldn't claim such things; you'd know that while the GF4 does indeed struggle with the 30-frame barrier at times, the Radeon mostly stays well over 80 fps. That is the difference between playable and unplayable we're talking about here.

      And at the time of these R9700 previews at least a few reviewers had actual cards to do actual tests.
      If reviewers had actual GF FX cards and ran tests on them against a R9700 in the same system and then gave results like
      QIII, Quaver 1600x1200, 6xFSAA, 8xaniso: R9700 100%, GF FX 170%
      then this would have at least some worth (if the reviewing site was not THG or HardOCP, of course )

      But all we have are some overhead slides from the marketing dept that seem to compare speeds against the GF4 (without stating the exact conditions, or even WHICH GF4 card they used for the comparison...)
      Last edited by Indiana; 24 November 2002, 08:45.
      But we named the *dog* Indiana...
      My System
      2nd System (not for Windows lovers )
      German ATI-forum



      • #18
        It's nice that your Radeon does so well, but quite frankly that has nothing to do with the topic at hand or the thread of the discussion... it doesn't change the fact that ATI was making reviewers post numbers that look better to them (50% faster looks a lot better than simply scoring 30 fps). Nor does it change the fact that while reviewers had the cards, they were under fairly strict guidelines as to what they could do and how they could publish figures...

        As far as NVidia goes... those are marketing slides... you can't just draw conclusions from them... saying that it's gonna be fast just because NVidia said so is the stupidest thing to do... even Tom's Hardware would be more trustworthy than those...

        I know that NVidia said that they were comparing it against a Ti4600...

        On the other hand, saying that the GeForce FX is gonna be fast because it has an insane fill rate and the ability to do 8 pixels/clock (and as such keep up with its fill rate) with 4-op pixel shaders is reasonable. That's certainly a lot better than what the 9700 or the GeForce 4 can do...
        "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



        • #19
          Originally posted by DGhost
          it doesn't change the fact that ATI was making reviewers post numbers that look better to them (50% faster looks a lot better than simply scoring 30 fps).
          I was just correcting your incorrect statement (which you felt it necessary to repost - see the quote); still repeating it doesn't make it any more correct.

          In short:
          The performance lead of the R9700 in those (admittedly ATI-friendly) FSAA + anisotropic settings was not 50%, but 100-200%.
          And this is NOT a 30 vs. 20 fps scenario as you like to imply, but more of a 70-80 fps vs. 30 fps one.

          saying that the GeForce FX is gonna be fast because it has an insane fill rate and the ability to do 8 pixels/clock (and as such keep up with its fill rate) with 4-op pixel shaders is reasonable. That's certainly a lot better than what the 9700 or the GeForce 4 can do...
          The NV30 and the R300 can both do 8 pixels/clock and 16 textures per pass; the basic design is very similar here. The higher fillrate of the GF FX derives only from its high projected clockspeed. But you can see from the Parhelia what a high fillrate is worth if you haven't got the bandwidth to back it up.
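          As a rough sketch of that point: peak fillrate scales with pipelines times core clock, while memory bandwidth caps what can actually be written out. The clock, bus width and memory speed figures below are illustrative assumptions, not confirmed specs for either card.

          Code:

# Back-of-the-envelope fillrate vs. bandwidth; all numbers are illustrative assumptions.
def peak_fillrate_mpix(pipes: int, core_mhz: float) -> float:
    """Theoretical pixel fillrate in Mpixels/s."""
    return pipes * core_mhz

def mem_bandwidth_gb(bus_bits: int, mem_mhz: float, pumps: int = 2) -> float:
    """Memory bandwidth in GB/s for a double-pumped (DDR) bus."""
    return bus_bits / 8 * mem_mhz * pumps / 1000.0

# Hypothetical "high clock, narrow bus" part vs. "lower clock, wide bus" part:
print(peak_fillrate_mpix(8, 500), mem_bandwidth_gb(128, 500))  # -> 4000 Mpix/s and 16.0 GB/s
print(peak_fillrate_mpix(8, 325), mem_bandwidth_gb(256, 310))  # -> 2600 Mpix/s and ~19.8 GB/s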
          The NV30 shaders are more flexible, but this doesn't automatically mean they're faster at the same tasks. If there were games actually using shader operations that complex, so that the R300 had to do it in multipass, then the GF FX would have a big advantage, but I really doubt we will see this within the lifetime of both cards. Even DX8 pixel shader use is very sparse nowadays.
          Last edited by Indiana; 24 November 2002, 15:45.
          But we named the *dog* Indiana...
          My System
          2nd System (not for Windows lovers )
          German ATI-forum



          • #20
            Originally posted by DGhost
            on the other hand, saying that the GeForce FX is gonna be fast because it has an insane fill rate and the ability to do 8 pixels/clock (and as such keep up with its fill rate) with 4-op pixel shaders is reasonable. That's certainly a lot better than what the 9700 or the GeForce 4 can do...
            Ain't that exactly what the 9700 does?



            • #21
              Novdid> Yes, more or less... See Indiana's post above yours...



              • #22
                He edited that in after I posted my comment. But that is basically what I meant, I just forgot the: ...



                • #23
                  Originally posted by Novdid
                  He edited that in after I posted my comment.
                  Yes, as can be seen by the "last edited" time.
                  After I had edited the post and hit the submit button, I saw that meanwhile Novdid had been a bit faster (and much shorter).
                  But we named the *dog* Indiana...
                  My System
                  2nd System (not for Windows lovers )
                  German ATI-forum



                  • #24
                    How many shader execution units does the 9700 have per pipeline?

                    What my post above was saying about the performance of the GeForce FX is that with 4-op-long shaders you can do 8 pixels/clock. With 3-op-long shaders you can get 10 pixels/clock. With 8-op-long shaders you get 4 pixels/clock. With the Parhelia (and its fairly traditional pixel shader pipeline) you get 4 pixels/clock if you use a max of 5 instructions; if you use 6-10 instructions you get 2 pixels/clock.
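                    A quick way to sanity-check that arithmetic is sketched below. The "32 shader ops per clock" pool for the GeForce FX and the 4-pipe / 5-instructions-per-pass model for the Parhelia are assumptions chosen only to reproduce the quoted figures, not published specs.

                    Code:

import math

def fx_pixels_per_clock(shader_len: int, ops_per_clock: int = 32) -> int:
    # Assumed pool of 32 shader ops issued per clock, spread over the pixels in flight.
    return ops_per_clock // shader_len

def parhelia_pixels_per_clock(shader_len: int, pipes: int = 4, ops_per_pass: int = 5) -> float:
    # Assumed fixed-length passes: every extra pass cuts the pixel rate.
    passes = math.ceil(shader_len / ops_per_pass)
    return pipes / passes

print(fx_pixels_per_clock(3), fx_pixels_per_clock(4), fx_pixels_per_clock(8))  # 10 8 4
print(parhelia_pixels_per_clock(5), parhelia_pixels_per_clock(8))              # 4.0 2.0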

                    Since this has a very big impact on the performance of a card when pixel shaders are in use, I have to ask again... how many ops can each pixel pipeline execute per clock on the 9700?

                    Why have people not been using these features? They are too slow... cards are simply not powerful enough to do it at any sort of decent pace, as well as being incredibly inflexible when it comes to it.

                    As far as the 8 pixels/clock, 16 textures/pixel thing goes... you can get 8 single-textured pixels per clock, 16 textures on 1 pixel in 2 clocks (or 16 textures on 8 pixels in 16 clocks). Because of this a higher clock speed makes a world of difference in multitextured applications.
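                    The same back-of-the-envelope arithmetic, assuming 8 texture lookups per clock in total (my reading of the figures above, not a verified spec):

                    Code:

import math

def clocks_for(pixels: int, textures_per_pixel: int, lookups_per_clock: int = 8) -> int:
    # Assumed: the chip can service 8 texture lookups per clock in total.
    return math.ceil(pixels * textures_per_pixel / lookups_per_clock)

print(clocks_for(8, 1))    # 1  -> 8 single-textured pixels per clock
print(clocks_for(1, 16))   # 2  -> 16 textures on one pixel
print(clocks_for(8, 16))   # 16 -> 16 textures on 8 pixels
# At a higher core clock those 16 clocks take less wall time, which is why clock
# speed matters so much for heavily multitextured scenes.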


                    And about my 50% numbers, I was simply looking at ATI launch figures. Look at Anand's R300 write-up... granted, when it was actually released it was showing more like 60 fps vs 100 fps...

                    My 20 fps vs 30 fps was merely a cynical remark...
                    "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz



                    • #25
                      Originally posted by Indiana
                      Novdid had been a bit faster (and much shorter)
                      Always keeping it simple.



                      • #26
                        I think we will see the GeForce FX by late January or early February.



                        • #27
                          Corstorphine scum! Don't believe ya!
                          Meet Jasmine.
                          flickr.com/photos/pace3000



                          • #28
                            My m8 is a reviewer for PCZONE and he said that's when it's set for release for the UK market, and that's when I'll be getting mine.

