Poor G400MAX 3DMark2000 score


  • #16
    Corwin the Brute

    I agree, factory default settings are the only way to compare benchmarks. But strangely enough I never gave that a thought in Q3?!? I've been busy comparing 'bi' vs. 'tri' and that sort of thing.
    I'll make some new benchmarks for comparison sometime this weekend.

    Abit BE6-2 (Rev 2.), P3-1000E@1050 (10x105/3), 768MB Kingston 7.5ns CAS3, G400MAX, SBLIVE, AHA 2940AU, IBM GXP75 60GB (x2), IBM GXP75 45GB, Mitsumi FX48, Yamaha 4416S, Zyxel Prestige 200, 3Com 905C-TX on W98SE Lite, DX 8.1, PD 6.82

    Comment


    • #17
      SundriedGrapes: the CPU test isn't measuring fps. There's some kind of CPU 3DMark score for this test; most likely there is a certain number of frames that are displayed, and the score is derived from the amount of time required to draw all the frames (a la the Quake2 timedemo).

      Comment


      • #18
        Sorry, Rob;
        'there is a certain number of frames that are displayed, and the score is derived from the amount of time required to draw all the frames' = performance = frames/time.
        ...supposedly as opposed to fps = performance = frames/time. Sorry, my friend - you're going to have to think of a new explanation.

        ..Either that, or see the current thread in 'General Hardware'. It really does look as though the CPUMark is either a fake or an even more poorly put-together benchmark than the rest of that infamously, erh... 'weighted' app! :-)

        Comment


        • #19
          Rob, don't bother... although there are issues with this benchmark, SDG just doesn't get it.
          "Be who you are and say what you feel, because those who mind don't matter, and those who matter don't mind." -- Dr. Seuss

          "Always do good. It will gratify some and astonish the rest." ~Mark Twain

          Comment


          • #20
            I certainly don't, at that...:-(

            Comment


            • #21
              I wasn't clear; SDG is right. What I meant to say is that most fps benchmarks vary in the number of frames drawn. The CPU test is different, as it's more of a movie: the number of frames stays the same, so the thing that changes is the amount of time the 'movie' takes to run. This is how Quake2 did its benchmarks.

              This is why the fps counter in this case sticks at 7.5; most likely when they made the helicopter movie they capped it at 7.5 fps.
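
              Just to illustrate the timedemo-style scoring described here: a minimal C sketch that draws a fixed number of frames and derives its score from the elapsed time. The frame count, 7.5 fps playback rate and scale factor are made up for the example, not 3DMark's actual numbers.

              /* Timedemo-style scoring sketch: the frame count is fixed,    */
              /* only the time needed to draw them changes from run to run.  */
              #include <stdio.h>
              #include <time.h>

              #define DEMO_FRAMES  900      /* made-up length of the 'movie'           */
              #define PLAYBACK_FPS 7.5      /* rate the on-screen counter is locked to */

              static void render_frame(int i)
              {
                  volatile double x = 0.0;  /* stand-in for the real per-frame work */
                  int j;
                  for (j = 0; j < 100000; ++j)
                      x += (double)i * j;
              }

              int main(void)
              {
                  clock_t start = clock();
                  int i;
                  for (i = 0; i < DEMO_FRAMES; ++i)
                      render_frame(i);
                  double elapsed = (double)(clock() - start) / CLOCKS_PER_SEC;

                  /* Score is frames/time scaled by an arbitrary constant; the */
                  /* on-screen fps counter would still read the playback rate. */
                  double score = (DEMO_FRAMES / elapsed) * 10.0;
                  printf("%d frames in %.2f s -> score %.0f (counter reads %.1f fps)\n",
                         DEMO_FRAMES, elapsed, score, PLAYBACK_FPS);
                  return 0;
              }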

              Rob

              Comment


              • #22
                Does anyone know when the G800 is due? We're starting to see a couple of games that really make use of T&L (Quake3, Soldier of Fortune, etc.), and those games really do show an improvement in frame rate regardless of the fill rate advantage that existing GeForce cards have. Both ATI and NVidia are announcing their new cards next week (April 24th and 25th), which are rumored to be at least twice as powerful as the existing GeForce, and Creative has already stated that as soon as NVidia makes the announcement, they'll have cards on retail shelves as early as the second week of May. On a friend's system with a GeForce (a DDR) and a P3 500, he gets faster frame rates than I do with my P3 700, regardless of resolution. I like Matrox for the visual quality and the extra features, but I'm also a gamer, and it would be nice if they would at least admit the G800 (or whatever it will be called) exists, and even better if they would state their performance goals for the new card, even if it won't show up in stores for a couple more months.
                note to self...

                Assumption is the mother of all f***ups....

                Primary system :
                P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum and SCSI all the way...

                Comment


                • #23
                  I seriously doubt that 3DMark's CPUMarks test actually tests anything...
                  After doing a few tests (look at http://forums.murc.ws/ubb/Forum3/HTML/001736.html for results - and feel free to try this with your CPU), it seems as if 3DMark merely checks the CPU type and speed at start-up, then pretends to measure CPU speed while simply playing a movie, and afterwards reports whatever result that CPU/MHz combo is supposed to get according to 3DMark (shouldn't it rather be called NVidiaMark or IntelMark?).
                  Just a big "Let's see if we can fool everyone with a completely bogus benchmark score"... (makes me wonder if those FutureMark/3DMark/MadOnion - or whatever they might call themselves by now - guys are doing that "for free" or if they are on NVidia/Intel's payroll)
                  But we named the *dog* Indiana...
                  My System
                  2nd System (not for Windows lovers )
                  German ATI-forum

                  Comment


                  • #24
                    Superfly: could you please explain the "etc." in "(Quake3, Soldier of Fortune, etc...)"? Q3 is only T, not L, and I'm not aware of any other game on the shelves using T&L... I thought ten or so would be out by last Christmas ;-) BTW, a G400 is more than enough for today's gaming requirements.

                    As for 3DMark, MadOnion make money by being a standard benchmark (you have to pay for the full version to post 3DMark results in an article, I think). I'm not sure they are paid by Intel or NVidia, because they use a game engine (Max Payne, actually) and make use of both the 3DNow! and SSE instruction sets. Also, remember the horror of true NVidia fans when 3DMark 2000 revealed that, under certain specific conditions, T&L could actually hurt performance... Sorry, but I can't help laughing.


                    ------------------
                    Corwin the Brute

                    Comment


                    • #25
                      Well... OK, Quake 3 uses only the transform features of the GeForce; the lights aren't real hardware-assisted lights, they're lightmaps (fancier textures). But Soldier of Fortune really does use both transform and hardware lights: the light sources actually change intensity depending on the distance and angle you're viewing them from, and if any object or character is in the direct path of the light, it casts a believable shadow that changes with your viewing angle. Regardless of resolution, on any CPU, there's a good 15 fps improvement when the GeForce is handling the lighting. There's also Starlancer coming up with full support for it (it's gone gold already); I'm getting it as soon as it hits store shelves (the screenshots look awesome).
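
                      For a rough sense of the per-vertex math that hardware lighting takes off the CPU, here is a tiny C sketch of the standard fixed-function diffuse term: intensity falls off with distance (attenuation) and with the angle between the surface normal and the light direction. The coefficients and sample values are invented for illustration, not taken from any game or driver.

                      /* Fixed-function style diffuse lighting term:       */
                      /* diffuse = max(0, N.L) / (kc + kl*d + kq*d*d)      */
                      #include <stdio.h>
                      #include <math.h>

                      int main(void)
                      {
                          const double PI = 3.14159265358979;
                          const double kc = 1.0, kl = 0.05, kq = 0.01; /* made-up attenuation */
                          const double angles[] = { 0.0, 30.0, 60.0, 85.0 }; /* degrees  */
                          const double dists[]  = { 1.0, 5.0, 10.0, 20.0 };  /* distance */
                          int a, d;

                          for (a = 0; a < 4; ++a)
                              for (d = 0; d < 4; ++d) {
                                  double ndotl = cos(angles[a] * PI / 180.0);
                                  double atten = 1.0 / (kc + kl * dists[d]
                                                           + kq * dists[d] * dists[d]);
                                  double diff  = (ndotl > 0.0 ? ndotl : 0.0) * atten;
                                  printf("angle %4.1f deg, dist %4.1f -> %.3f\n",
                                         angles[a], dists[d], diff);
                              }
                          return 0;
                      }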
                      note to self...

                      Assumption is the mother of all f***ups....

                      Primary system :
                      P4 2.8 GHz, 1 GB DDR PC2700 (Kingston), Radeon 9700 (stock clock), Audigy Platinum and SCSI all the way...

                      Comment
