Got my MAX


  • #31
    Fear not, fellow Celery owners. 3DMark is the culprit here. My roommate's P2-464 scores the same as mine on the 3DMark bench. The test needs to be less processor-intensive. The MAX is sitting back and yawning, I promise. There are still several games that will show a significant performance increase with the MAX and a Celery. Just not 3DMark 99.



    • #32
      I currently have a G200, hoping to get the G400 MAX in a few months. My CPU is a K6-2 300 (@350), and I just purchased a Celeron 366A, guaranteed to run at 550.

      Are you guys saying it *still* won't be enough to drive the G400?



      • #33
        Well, for what it's worth, I was looking at the numbers on Tom's last night and noticed that the Celeron 400 scores disproportionately lower than the PII-450, while the PII-450 trails the PIII-550 by about as much as you'd expect. The Celeron's scores were about 25% lower than the PII's most of the time, even though there's only a 12.5% difference in clock speeds. I think the G400 must hit the cache pretty heavily to show this kind of difference.

        3DMark scores much higher with a PIII because there are SSE optimizations in 3DMark's code itself -- the CPUMarks shoot up by like 50% on a PIII compared to a PII of the same clock speed.
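A quick back-of-the-envelope check of the figures in #33 (the score values below are invented placeholders, only the ratios matter):

```python
# Back-of-the-envelope check of the clock-vs-score gap described in #33.
# The PII score is a made-up placeholder; only the ratios matter.

celeron_mhz, p2_mhz = 400, 450
clock_gap = (p2_mhz - celeron_mhz) / celeron_mhz      # 0.125 -> 12.5%

p2_score = 4000                                       # placeholder value
celeron_score = p2_score * (1 - 0.25)                 # "about 25% lower"
score_gap = (p2_score - celeron_score) / p2_score     # 0.25 -> 25%

# If 3DMark scaled with clock speed alone, the two gaps would match;
# the extra ~12.5 points is what suggests cache/memory sensitivity.
print(f"clock gap: {clock_gap:.1%}, score gap: {score_gap:.1%}")
```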



        • #34
          The G400 is a documented chip: according to the documentation, if the memory clock is 200, the engine cannot possibly be 166. It could be 200, 150, 133 or even less, but not 166. Unlike the TNT or Savage4, Matrox chips do not have asynchronous clocks for engine and memory.



          • #35
            It's the cash you spend on the cache. For $3600 you can get a Xeon with 2 MB of cache right on the chip. I wonder how fast that would multitask?

            ------------------
            p3 450 underclocked to p2 266 Mill2 pci card running @ VGA 16 colors



            • #36
              I too wondered why a Celery won't cut it, but that's only because you are comparing it to a PIII in 3DMark MAX. As far as I know, 3DMark MAX has 3DNow! and SSE optimizations. When properly used (and that shouldn't be difficult in a benchmark), SSE optimizations could give a performance boost of around 30-40%. That's why Celeron scores are so "low". Just look at the CPU 3D speed: a Celery 450 gives about 4k, a PIII 500 gives 7k-8k...

              If you look for game benchmarks, you can find Q2 timedemos on Celerons and PII/IIIs, and you'll see that the difference in fps is closely proportional to the CPU speed in MHz, despite the cache size/speed.
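Putting the quoted CPU 3D numbers on a per-MHz basis makes the SSE point concrete (a rough sketch; 7500 is just the midpoint of the "7k-8k" quoted above):

```python
# Per-MHz comparison of the CPU 3D scores quoted above.
# 7500 is the midpoint of the "7k-8k" range; all figures are approximate.

celeron = {"mhz": 450, "score": 4000}
piii = {"mhz": 500, "score": 7500}

def per_mhz(cpu):
    """Score delivered per MHz of clock speed."""
    return cpu["score"] / cpu["mhz"]

advantage = per_mhz(piii) / per_mhz(celeron)  # roughly 1.69x per clock
# A ~69% per-clock advantage can't come from clock speed alone;
# SSE-optimized code paths in the benchmark can explain it.
print(f"PIII per-MHz advantage: {advantage:.2f}x")
```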




              • #37
                Oh, and BTW, G400s are available now here in Portugal, and I just ordered an OEM G400 DualHead 16 MB. Yes, 16 MB... I will pay about $180-190 for it; the 32 MB dual head was about $250... I was afraid to ask about the retail versions...

                I will get it next week.

                It's hard to live in a country like this; prices are just too high for what people earn...

                [This message has been edited by Nuno (edited 07-15-99).]



                • #38
                  Still, with the classy AGP implementation the G400 has, 16 MB shouldn't hold you back too much!

                  Cheers

                  Steve

                  ------------------
                  Yeah, you know the score...
                  (ICQ: 29468849)



                  • #39
                    Steve,

                    It will. The memory response times are at least twice as high when using main memory. At least 2 times, but could be much more (4-5 maybe).
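A toy model of the penalty described here; the texel count and local latency below are invented for illustration, and only the "at least 2 times, maybe 4-5" factor comes from the post:

```python
# Toy estimate of texturing from main memory over AGP vs local video RAM.
# texels_per_frame and local_ns_per_texel are invented; agp_penalty is
# taken from the "at least 2 times, but could be much more (4-5 maybe)"
# claim above.

texels_per_frame = 2_000_000
local_ns_per_texel = 10
agp_penalty = 4                  # within the quoted 2x-5x range

local_ms = texels_per_frame * local_ns_per_texel / 1e6
agp_ms = local_ms * agp_penalty
print(f"texture fetches: {local_ms:.0f} ms local vs {agp_ms:.0f} ms over AGP")
```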

                    Yeager,

                    About your "3DMark is a pathetic benchmark" flame, you must realize that there will never be a perfect balance between CPU and graphics adapter in which neither one bottlenecks the other. There will always be a bottleneck somewhere.
                    I just can't see how you can hold a benchmarking program responsible for not giving maximum results with a certain graphics adapter if the CPU is holding it back. Especially when that program is not meant for display-adapter benching, but for overall system 3D performance testing. "System 3D performance" depends on CPU performance as well.


                    B

                    [This message has been edited by Buuri (edited 07-15-99).]



                    • #40
                      As a junior member of the forum, I find the discussions most interesting.

                      Is there a supercomputer system located someplace that is being used as the benchmark "standard"? A system which gives the highest possible test results, or are we searching the Web trying to find that supercomputer? Even if such a computer existed, the benchmark results would change throughout a given day.

                      My point is: which system do we use for the benchmark testing? No matter; the final results are a point of reference, ballpark figures, or a way of keeping score and seeing how our systems compare.

                      Can I assume that we are doing this, because it is fun?

                      designer



                      • #41
                        That was an intelligent criticism, Buuri. I apologize. I was not focusing on 3DMark as a "system" benchmark, but rather as a 3D card bencher. I just don't like it when people get worried about their graphics card performance when in reality two generations of video cards turn in nearly identical scores if the CPU is the limiting factor. My roommate was pissed when his shiny new TNT2 was outperformed by my TNT. The only difference between our systems is that he has a P2 running at 450 and I have a Celery running at 464.

                        Thanks for the clarification.
                        Yeager


