Retail Parhelia not at retail clock speeds...

  • #46
    Dunno if I am stupid or something, but I wouldn't give a rat's ass if it ran @ 210/530...
    It's only 4% for heaven's sake! 1% in your case @ 217/532
    Clock it up a bit, or do a BIOS flash with "correct" PINS values and off you go...
    And what is the difference anyway? 1 fps in some shooter? Stop whining! And don't go into "I was cheated" arguments now. How many of you who aren't overclocking your CPUs know the exact number of MHz they are currently running at? I bet ya that in 90% of cases it ain't exactly what is written on the CPU but a few MHz lower... Go sue AMD and Intel... (although it's not their fault)
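For what it's worth, the percentages quoted above are roughly right; a throwaway Python check of my own (the clock figures are taken from the posts, nothing official):

```python
# Quick sanity check of the shortfall percentages quoted above.
advertised = 220  # MHz, the advertised core clock

for observed in (210, 217):  # hypothetical and reported core clocks
    shortfall = (advertised - observed) / advertised * 100
    print(f"{observed} MHz is {shortfall:.1f}% below the advertised {advertised} MHz")
```

That gives 4.5% and 1.4% - close enough to the 4% and 1% quoted.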

    C'mon...
    _____________________________
    BOINC stats

    Comment


    • #47
      Originally posted by Goc
      Dunno if I am stupid or something, but I wouldn't give a rat's ass if it ran @ 210/530...
      It's only 4% for heaven's sake! 1% in your case @ 217/532
      Clock it up a bit, or do a BIOS flash with "correct" PINS values and off you go...
      And what is the difference anyway?
      A voided warranty for a device that meets the advertised spec,
      or a working warranty for a device that doesn't meet the advertised spec.

      1% may not mean much in the real world of gaming, but that's not what's under dispute here.

      And you can't compare CPUs, as they're apples to oranges. A CPU's core clock is multiplier × FSB - so the math is easy to work out if there's a discrepancy of a few MHz compared to the advertised clock rate and FSB. I know that my 1.4 T-bird is actually clocked at 1396.5 MHz, as it's advertised at 133 FSB with a 10.5 multiplier.
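The multiplier arithmetic is trivial to write out; a throwaway Python sketch of my own, using the 133 FSB and 10.5 multiplier figures from the post:

```python
# A CPU's core clock is just FSB x multiplier, so any discrepancy is
# easy to work out from the advertised figures.
fsb = 133.0        # MHz -- advertised front-side bus
multiplier = 10.5  # fixed multiplier of the 1.4 GHz Thunderbird

actual = fsb * multiplier
print(f"actual: {actual} MHz, advertised: 1400 MHz, short by {1400 - actual} MHz")
```

133 × 10.5 comes out at 1396.5 MHz, i.e. 3.5 MHz short of the advertised 1400.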

      If there's a similar formula for the GPU - I have no idea. And if there is, we are not told. So we have no way of working out whether the missing MHz are down to bad marketing or Matrox stiffing customers by lowering the clock rate to get better chip yields.

      However, we know now that some customers are being stiffed by these missing MHz, as we're not being told at the point of purchase - and, more importantly, we're not being told why.

      Why are some Parhelias shipped @ 220 and some @ 217?

      My guess is that it's to increase chip yield per wafer. And if that's the case, and Matrox won't allow me to re-flash or RMA to get 220 without voiding the warranty, then a lot of people will get pissed off very quickly.

      IANAL, but this may result in a class action lawsuit (I've only got the vaguest idea what this actually means, but I think I'm right) which is even worse for Matrox if they take the hard assed line.
      "I contend that we are both atheists. I just believe in one fewer god than you do. When you understand why you dismiss all the other possible gods, you will understand why I dismiss yours." - Stephen Roberts

      µße®LørÐ - A legend in his underwear
      Member of For F*ck Sake UT clan
      DriverHeaven administrator
      PowerVR Network administrator

      Comment


      • #48
        Hrmph... I simply can't believe that 3 MHz would mean that much to Matrox...
        Anyone tried overclocking the cards that were shipped with 217/542 settings? If they fail to reach 220 MHz then you have a point, but if they overclock just as well as the cards that were shipped with 220/550, and reach 230 MHz or more on the core, then I fail to see where the problem is.

        Anyway, I can't find a single logical reason to conclude that Matrox is deliberately cheating their customers. And also, as Haig hinted, they can't find out whether you flashed a card's BIOS using the original PINS or someone else's, if you modified the serial number line in it before flashing the card's BIOS.
        Well, I'm confused here, but frankly I don't care about those missing MHz. If it turns out that the lower-clocked cards are in some way defective, I'll be the first one to go ranting...
        _____________________________
        BOINC stats

        Comment


        • #49
          And you can't compare CPUs, as they're apples to oranges. A CPU's core clock is multiplier × FSB - so the math is easy to work out if there's a discrepancy of a few MHz compared to the advertised clock rate and FSB. I know that my 1.4 T-bird is actually clocked at 1396.5 MHz, as it's advertised at 133 FSB with a 10.5 multiplier.
          Well, it depends on the model of your mobo... Some mobos use 133 MHz and others 133.33 MHz... Can't remember which are which, but the same processor actually runs at different speeds on different mobos...

          And you can't compare CPUs, as they're apples to oranges.
          I agree. I just wanted to stress the opinion above, and chose a pretty stupid way to do it...

          Edit: typos
          _____________________________
          BOINC stats

          Comment


          • #50
            Originally posted by Goc
            and reach 230 MHz or more on the core, then I fail to see where the problem is.
            230 means an instant lock-up in 3D apps. 220 seems stable. If this is a pattern across all the 217 boards, it does seem to be a yield issue.
            Tyan Thunder K7|2x AMD AthlonMP 1.2GHz|4x 512MB reg. ECC|Matrox Parhelia 128|Full specs

            Comment


            • #51
              The difference with processors is that the processor tells the motherboard what speed it wants to run at... it's up to the motherboard to properly set the clock speed... some do and some don't, but that depends on the manufacturer of the motherboard. With a motherboard that sets the bus speed correctly (not too high or too low), you would get the advertised clock speeds.

              However, all motherboards at least claim to be setting the processor to the right speed. The problem here is not the clock speed differences or the yields, it's the fact that the BIOS isn't even telling the card to set itself to the right speed - especially when they advertised the card as 220 MHz core and 550 MHz memory...

              This may be so horribly insignificant that it would be impossible to start a class action lawsuit... and personally, that little percent of a difference isn't even a noticeable difference... it would be nice to get some sort of fix for this, or some sort of comment from Matrox about why it's set that way; that's all I really care about.

              Personally, I have been using the overclocker in the support utility for a bit with my card, and it runs at the real speeds fine, without any stability problems.
              "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

              Comment


              • #52
                Personally, I have been using the overclocker in the support utility for a bit with my card, and it runs at the real speeds fine, without any stability problems.
                That's what I wanted to hear...

                And enough about CPUs already... I'm embarrassed enough for bringing it up in the first place. Stop rubbin' it in...
                _____________________________
                BOINC stats

                Comment


                • #53
                  "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

                  Comment


                  • #54
                    I have one of the 217 MHz cards and it overclocks extremely well. Without any mod I tried 8% (217 × 1.08 = 234.36 MHz) and it ran stable. This weekend, I put on a 50 mm fan (stock HS) and some RAM sinks. Tried 11% (217 × 1.11 = 240.87 MHz) without any problems. 240 MHz was my goal, but I may try a higher percentage soon.
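Those percentage-to-MHz conversions check out; a small Python sketch of my own, using the figures from this post:

```python
# Core clock after applying the support utility's percentage slider.
base = 217  # MHz, the as-shipped core clock

for pct in (8, 11):  # the two settings tried in the post
    print(f"+{pct}% -> {base * (1 + pct / 100):.2f} MHz")
```

Which prints 234.36 MHz and 240.87 MHz, matching the numbers above.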

                    Not all cards are the same, so don't think every card will perform this well.
                    I should have bought an ATI.

                    Comment


                    • #55
                      The reason this small discrepancy angers me is that it is yet another sign of Matrox falling short of providing an all-round quality product.

                      FAA - It's been stated by Haig that the algorithm used falls short of his expectations for this feature. Many feel this is one of the key selling points for justifying the cost of the card.

                      DH - I can't set my display to dual-head without it breaking PowerDesk, i.e., PowerDesk crashes after the first use, forcing a reinstall of the drivers. I am currently running a single-head display. I personally blame M$ for this. Matrox most likely had to agree to use the .NET Framework in order to get their displacement mapping technology into the DX9 spec. This meant creating PowerDesk from scratch as opposed to using the tried and true PowerDesk of old.

                      Gaming - regardless of how they market the Parhelia <i>now</i>, the initial focus of the product launch (at least for me and quite a few others) was its amazing 3D gaming capabilities. The instinctive expectation was performance, presumably achieved with a <i>clock speed</i> at least equivalent to or better than the cards currently on the market. It didn't take long for Haig to dispel that belief. Fortunately, the Matrox driver team has shown their skill and proven their worth by delivering a rather significant performance boost in the very first revision. This is the only reason I purchased this first generation of the card.

                      Bundled software - this is the first product since the original Millennium to ship with <i>no</i> bundled software. At least the G550 provided a DVD player, the key selling point for me in purchasing the retail version.

                      Clock speeds - now it has come to light that a healthy percentage of the cards shipping today don't match the advertised specs. I can't fathom any significant reason for this except poor quality control.

                      I can only wonder what will be discovered next.

                      </end rant>
                      Last edited by ravalox; 22 October 2002, 20:12.
                      Waiting on tech support...

                      Comment


                      • #56


                        Is this a typo, or have they changed the memory clock??
                        Waiting on tech support...

                        Comment


                        • #57
                          memory clock is correct on that page...
                          "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

                          Comment


                          • #58
                            Then why has this whole thread been quoting 250 as the retail memory clock instead of 275??

                            Edit: OK, now I remember... 275 × 2 = 550. My mistake.
                            Last edited by ravalox; 22 October 2002, 21:46.
                            Waiting on tech support...

                            Comment


                            • #59
                              275/550, not 250/500.
                              "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

                              Comment


                              • #60
                                shameless bump... wonder if Matrox has forgotten it already....
                                "And yet, after spending 20+ years trying to evolve the user interface into something better, what's the most powerful improvement Apple was able to make? They finally put a god damned shell back in." -jwz

                                Comment
