Looks like someone's already decided that the new card sucks


  • #16
    I have had no problems at all getting FSAA to work on my Radeon 8500 - just turn it on and set the level you want, that's all. Besides, I personally find the Voodoo's solution to look a bit "overfiltered". The Matrox solution sounds (and, in the press-release papers, also looks) much better IMO, as it does exactly what antialiasing is supposed to do: smooth edges. All the stupid "full screen" AA algorithms just smooth the whole image out as if it were bilinear/bicubic filtered.
    And the R8500's FSAA, while looking really nice at the 4x quality setting and up, is just too slow to be usable even on an Athlon XP 2000+ with an R8500@300/300. Besides, I think anisotropic filtering / high LOD settings are much more important than AA, because you can sort of simulate the latter with ultra-high resolutions, but that won't help NVidia with its sometimes very blurry textures.
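
    To illustrate the difference (a toy sketch in plain Python, nothing vendor-specific - I'm only assuming a simple box filter for the "full screen" case and per-edge coverage blending for the edge-AA case; real hardware works on sub-pixel samples):
    Code:
      # Toy comparison of the two AA ideas (conceptual only, not how any chip actually does it):
      #  - full-scene supersampling: render bigger, then box-filter EVERY pixel down
      #  - edge AA: leave interior texels alone, blend only pixels flagged as polygon edges
      def downsample_2x2(hi_res):
          """Box-filter a 2x-supersampled greyscale image; every pixel gets averaged."""
          h, w = len(hi_res) // 2, len(hi_res[0]) // 2
          return [[(hi_res[2*y][2*x] + hi_res[2*y][2*x+1] +
                    hi_res[2*y+1][2*x] + hi_res[2*y+1][2*x+1]) / 4.0
                   for x in range(w)] for y in range(h)]

      def edge_blend(image, edge_mask, coverage, background=0.0):
          """Blend only the flagged edge pixels by their coverage; interior pixels stay untouched."""
          out = [row[:] for row in image]
          for y in range(len(image)):
              for x in range(len(image[0])):
                  if edge_mask[y][x]:
                      a = coverage[y][x]               # fraction of the pixel the polygon covers
                      out[y][x] = a * image[y][x] + (1 - a) * background
          return out

      # A hard vertical edge: bright on the left, dark on the right.
      hi = [[1.0] * 5 + [0.0] * 3 for _ in range(8)]            # 2x-supersampled rendering
      lo = [[1.0, 1.0, 1.0, 0.0] for _ in range(4)]             # normal-resolution rendering
      mask = [[False, False, True, False] for _ in range(4)]    # only the edge column is flagged
      cov = [[0.0, 0.0, 0.5, 0.0] for _ in range(4)]
      print(downsample_2x2(hi))         # every pixel went through the filter -> rows of [1.0, 1.0, 0.5, 0.0]
      print(edge_blend(lo, mask, cov))  # only the edge column was touched    -> rows of [1.0, 1.0, 0.5, 0.0]
    Both end up with the same smoothed edge pixel, but the supersampled path filtered the entire frame to get there.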

    To the NVidiots/FanATIcs over there: this is just the common "what I have in my computer right now is simply the best, always has been and always will be" syndrome - which is also true of some Matrox supporters; hell, even Creative seems to have a handful of such "supporters".

    It's absolutely irrelevant whether a card gets 85fps or 85000fps - you simply can't see the difference. What IS important, however, is that the card can deliver about a 50-60fps minimum framerate at the highest-quality settings, which the Parhelia seems to do but the GF4 fails at miserably. Ideally the framerate would be a constant 60fps regardless of the quality settings; that is much better than a 150fps average with maybe a 300fps maximum but only a 29fps minimum.
    Still, I'm sure certain writers at THG are gonna bench the Parhelia at 320x240@2bit to hand the GF4 a victory. No wonder, since they've been running an NVidia site since the old TNT days. How's that for being "unbiased" and "neutral"...

    P.S.: Another word on the oh-so-important "culling": there have been quite a few driver revisions for both the old Radeon and the 8500 that had major parts of HyperZ II broken and therefore disabled. Guess what: the performance delta was only a few percent... The culling technologies didn't bring a major breakthrough for the GF3 either, just a few percent. What did improve speed a lot, however, was the optimized crossbar memory controller on the GF4, and the Parhelia seems to have a similar if not better implementation.
    Hellbinder doesn't have a clue anyway; he's just another man without a head (or the important parts inside it, as it seems). Who does have a clue, however, is Humus, and his words are the only correct ones: it's useless to speculate until benchmarks of real cards are out.
    Last edited by Indiana; 16 May 2002, 15:17.
    But we named the *dog* Indiana...
    My System
    2nd System (not for Windows lovers )
    German ATI-forum

    • #17
      @Joel:
      The GF4 anisotropic "bug" was fixed in OpenGL several driver releases ago, giving something like a 20-30% fps improvement with 64-tap anisotropic filtering in Quake 3.
      I already mentioned that to you, BTW. So let's forget about the GF4 anisotropic bug for now, OK?

      • #18
        Originally posted by Wombat
        One thing that I dare not post over there (because I'll get flamed to death) is that Matrox doesn't jitter the way other cards do. My G400 doesn't have nearly as large a delta between min and max fps as GeForce cards tend to have. Also, when I got my GF3, I did some really quick before & after tests.
        Using fraps, Deus Ex, fairly complex scene - G400: FPS in the low 30s, dropping into the mid-20s if I spun around.

        GF3Ti - 56fps. If I did a 180 in the game, the display still said 5xFPS, but my G400 was a lot easier on my eyes. The GF3 seemed to jump a lot more, as if it only got 50FPS because it would blit the same frame 3 times before actually having something new to render.

        Unfortunately, there doesn't seem to be a way to quantify that. nVidia knows it, so they go for highest FPS benchmarks.
        Hmm! I think you are right here - I noticed a huge difference in the minimum fps in Serious Sam: The Second Encounter.

        This isn't based on a timedemo, but on my observations while playing the first level:
        G400 MAX, 640*480*16, P3-866 (speed setting): min = 25 fps
        Kyro2, 640*480*16, P4-2000 (speed setting): min = 3 fps

        Yeah, I know what you are thinking - that the P4 is f*cked up somehow - but I'm getting the same average framerate as all those reviewers on the net, at least according to their benchmarks. Of course, that level could be an extreme worst case for the Kyro2 (all the grass and trees and transparent textures, and almost no overdraw), but I still think it's impressive that a G400 MAX can run it (well, it doesn't exactly fly, but considering its age it's still impressive).

        Of course, it could just be a driver bug.

        Edit: it must be a driver bug, because in all other games (except Jedi Knight) the Kyro2 is 2-3 times faster than the G400 MAX. But the G400 MAX still has a rather small delta in framerates, IIRC.
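
        By the way, the jitter Wombat mentions can be put into numbers if you can get per-frame times out of whatever tool you use. A rough sketch in plain Python (the frame-time lists here are made-up illustrations, not measurements):
        Code:
          # Summarise per-frame render times (in ms): min/avg/max fps plus the "jitter" itself.
          def frame_stats(frame_times_ms):
              fps = [1000.0 / t for t in frame_times_ms]
              avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)   # frames / total time
              mean_t = sum(frame_times_ms) / len(frame_times_ms)
              stddev_t = (sum((t - mean_t) ** 2 for t in frame_times_ms) / len(frame_times_ms)) ** 0.5
              return {"min_fps": min(fps), "avg_fps": avg_fps,
                      "max_fps": max(fps), "frametime_stddev_ms": stddev_t}

          # Two hypothetical cards with comparable averages but a very different feel:
          steady  = [20.0] * 12                    # 20 ms every frame -> rock-solid 50 fps
          jittery = [10.0, 10.0, 10.0, 60.0] * 3   # fast frames punctuated by 60 ms spikes
          print(frame_stats(steady))    # min == avg == max == 50 fps, zero jitter
          print(frame_stats(jittery))   # max 100 fps, min ~17 fps, avg ~44 fps, stddev ~22 ms
        The average alone looks fine for both; the minimum fps and the frame-time spread are what show up as stutter.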
        Last edited by TdB; 16 May 2002, 15:49.
        This sig is a shameless attempt to make my post look bigger.

        • #19
          Originally posted by Cheesekeeper
          If anyone knows any girls who are impressed by 16-TAP Anisotropic filtering and low-bandwidth 16X FAA, I'd be much obliged if you could tell me where you found them
          Hehehe, that's what my girlfriend is like... and I managed to find her at school... and no, you can't have her.

          Beyond that... DX9 should be out by the end of the year at the latest... more realistically, probably Q1 or Q2 2003.

          A major reason I am personally looking forward to DX9 is that they have supposedly been reworking the API to make it easier for developers... the changes between DX7 and DX8 were impressive, and Microsoft is going to keep that up. Also, I believe vertex shaders 2.0 (dunno about pixel shaders) will get a high-level language to ease coding them...

          From what I understand, the motivation for releasing DX9 and then DX9.1 fairly shortly after is that they want to test the waters with DX9 and then implement some other stuff in DX9.1.

          For a developer, having a card that fully supports the DirectX version you are working on is almost a necessity. For a gamer, it's not gonna make a difference (until a few years from now).

          Between UT2003 and Doom3 there is not gonna be any competition... there are a lot of effects you can do in a single-player game that would be a bad idea in a multiplayer game...

          A large waterfall in the middle of a level, for instance... looking at one could cause horrible framerates, so you have to be careful using things like that... DM maps should ideally have most rooms performing similarly, and you don't want it to get choppy when looking at one side or corner of a room...

          Between Unreal 2 and Doom 3... mmm, hard call... I look forward to it, though - those are gonna be the games with all the eye candy.
          "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

          • #20
            Gah, looks like the DB has some jacked-up timestamps... your posts are dated a good 10+ hours in the future.
            "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

            • #21
              It's a joke that people are still comparing Quake 3 FPS as a benchmark. The game is so darn old that any new video card can push it past 200fps on a fast enough machine, no problem.

              One game that looks interesting is Neverwinter Nights; that has a lot of eye candy.
              A card should do those games at 100fps at least - you need at least 100fps, if not more, so that when you're playing with other people you don't get slowed down a lot.

              I have owned Matrox cards from the Millennium through the Millennium II, G200, and G400. I switched to a Gainward GeForce3 because the G400 just wasn't cutting it, plus the problems with OpenGL. When I switched, all the problems were gone.

              I find this new card extremely interesting. It's funny that people are worried about Dx9 - I guess people don't remember history very well. It will be at least a year from the time Dx9 comes out till you get a game that makes use of it. I expect Dx9 to be around in early 2004. Hell, Dx8 only started getting used late last year.

              I'll sit and wait; it sure will be interesting.

              • #22
                Originally posted by Galvin
                I expect Dx9 to be around in early 2004.
                LOL. DX9 will probably be released at the beginning of 2003 at the latest. I am betting it will be released a little earlier than that though.
                System Specs:
                Gigabyte 8INXP - Pentium 4 2.8@3.4 - 1GB Corsair 3200 XMS - Enermax 550W PSU - 2 80GB WDs 8MB cache in RAID 0 array - 36GB Seagate 15.3K SCSI boot drive - ATI AIW 9700 - M-Audio Revolution - 16x Pioneer DVD slot load - Lite-On 48x24x48x CD-RW - Logitech MX700 - Koolance PC2-601BW case - Cambridge MegaWorks 550s - Mitsubishi 2070SB 22" CRT

                Our Father, who 0wnz heaven, j00 r0ck!
                May all 0ur base someday be belong to you!
                Give us this day our warez, mp3z, and pr0n through a phat pipe.
                And cut us some slack when we act like n00b lamerz,
                just as we teach n00bz when they act lame on us.
                For j00 0wn r00t on all our b0x3s 4ever and ever, 4m3n.

                • #23
                  Yes, true, but by the time a game comes out that makes use of it, it will be 2004 - so it may as well just be 2004.

                  New DX releases take forever to reach the mass market in games. That's why I find it funny that people are so worried about DX9 support. The only people that need to worry about DX9 support are developers, and they can use NVidia cards for full compliance if they need to.

                  • #24
                    Originally posted by Wombat
                    Can't speak for ATi, haven't used them in any of my machines. Shit drivers.
                    New drivers are flying out of the door very rapidly and seem pretty good now. The only problem I've got is that their DVD player is broken - it has been for me, I think, since the 9021 drivers. Yes, I am using the latest DVD player.
                    Oh, I forgot the biggest bug. ATi for some reason disables the windoze plug-and-play monitor detection and sets all displays to 60Hz. You either have to use the Omega drivers (or another third-party set, I can't remember the name), edit the .inf file (hence an ATI fault), or edit the registry. The Omega drivers now give you a default set of refresh rates. Note this is separate from the WinXP stuck-at-60Hz problem.
                    Last edited by The PIT; 17 May 2002, 13:58.
                    Chief Lemon Buyer no more Linux sucks but not as much
                    Weather nut and sad git.

                    My Weather Page

                    • #25
                      id Software seems to provide the benchmark game - Doom III should give a fairer comparison. UT2003 as well, I guess.

                      How do D3 and UT2003 compare to each other, technology-wise?

                      P.
                      Meet Jasmine.
                      flickr.com/photos/pace3000

                      • #26
                        Besides UT2003 being D3D 8.1 and Doom3 being OpenGL, I believe Carmack is doing a lot more lighting work, creating that 'eerie' feeling. UT2003 still has those stupid lights without a source and the like, as far as I've seen. I was watching the gameplay more than the graphics with UT2003, though.

                        • #27


                          MURC exclusive
                          no matrox, no matroxusers.

                          • #28
                            A little off-topic perhaps, but can anyone here explain why those guys are so obsessed with 250FPS? I can't speak for anyone else, but I have a frikkin' expensive 21" Sony monitor, and it won't do anything even approaching that sort of refresh rate. When the debate was over whether a card could consistently turn out 30-60 frames, that I could understand, but this 250FPS thing is just getting silly.
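
                            To put rough numbers on it (a toy sketch in plain Python, illustrative figures only, just assuming the monitor shows whichever frame is newest at each refresh tick and ignoring tearing):
                            Code:
                              # How many distinct rendered frames can actually reach an 85 Hz screen?
                              def frames_actually_seen(render_fps, refresh_hz, seconds=1.0):
                                  """Count distinct frames the monitor can display in the interval."""
                                  shown = set()
                                  t = 0.0
                                  while t < seconds:
                                      shown.add(int(t * render_fps))   # newest completed frame at this tick
                                      t += 1.0 / refresh_hz
                                  return len(shown)

                              for fps in (60, 85, 250, 300):
                                  print(fps, "fps rendered ->", frames_actually_seen(fps, 85), "frames shown at 85 Hz")
                            Everything rendered past the refresh rate just gets thrown away.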

                            I suppose they may want to be able to boast a card with 300+ FPS to impress the girls. If this is the case, I'd be less interested in which video card they had, and more interested in where they found a stash of girls who are impressed by frame rates.

                            If anyone knows any girls who are impressed by 16-TAP Anisotropic filtering and low-bandwidth 16X FAA, I'd be much obliged if you could tell me where you found them

                            • #29
                              Originally posted by Cheesekeeper
                              A little off-topic perhaps, but can anyone here explain why those guys are so obsessed with 250FPS? I can't speak for anyone else, but I have a frikkin' expensive 21" Sony monitor, and it won't do anything even approaching that sort of refresh rate. When the debate was over whether a card could consistently turn out 30-60 frames, that I could understand, but this 250FPS thing is just getting silly.
                              Well, it's because they can't afford a penis-enlargement operation.
                              This sig is a shameless attempt to make my post look bigger.

                              • #30
                                Originally posted by Cheesekeeper
                                ... I'd be less interested in which video card they had, and more interested in where they found a stash of girls who are impressed by frame rates. ...
                                Funny!
                                The world just changed, Sep. 11, 2001
