G400 vs G450MMS performance issues


  • G400 vs G450MMS performance issues

    Hello,

    I have been playing around with a G400 card in a P4 2.4 GHz HTPC setup, and a G450MMS in a VIA EPIA 1 GHz Mini-ITX computer that will be used as a streaming client, with the first machine as the server.

    Anyway, I noticed when trying out many different codecs on the HTPC that the only one that gave me full hardware decoding with the G400 was VideoLAN's VLC. Even the Ravisent software (with the update patch) that Matrox supply still uses around 35% CPU for MPEG-2 playback, whereas VLC only uses about 5%.

    I'm not sure whether this is how it should be, but as I may end up sticking with my XCard in the HTPC, it might not be an issue. I would, though, like to get the same hardware decoding in the VIA EPIA/G450MMS combination running VLC, but at the moment the CPU load averages about 60%, which is not that much less than other software decoders, and I think VLC is always less CPU-intensive anyway.

    I know the VIA machine is much less powerful than the P4, but by way of comparison, decoding with the same Ravisent software averages around 65% CPU load on the VIA, compared to 35% on the P4, so the big difference between the VLC results does suggest to me that the G450MMS is not providing the same hardware decoding that the G400 is.
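
    In case anyone wants to reproduce these figures, this is roughly how I sample the player's CPU use (a rough Python sketch; it assumes the psutil package and a player process named "vlc.exe", both of which you would adjust for your own setup):

        import time
        import psutil

        PLAYER_NAME = "vlc.exe"  # hypothetical process name; change to match your player

        def average_cpu(duration_s=30, interval_s=1.0):
            """Average a player's CPU load over a playback window."""
            # Find the player process by name.
            procs = [p for p in psutil.process_iter(["name"])
                     if (p.info["name"] or "").lower() == PLAYER_NAME]
            if not procs:
                raise RuntimeError(f"{PLAYER_NAME} is not running")
            player = procs[0]
            samples = []
            end = time.time() + duration_s
            while time.time() < end:
                # cpu_percent() with an interval blocks and measures over that window.
                samples.append(player.cpu_percent(interval=interval_s))
            return sum(samples) / len(samples)

        print(f"average CPU: {average_cpu():.1f}%")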

    So the question is: should it be, or is there something I have to set differently? (I already have the hardware acceleration set to maximum.) Should the MMS version of the G450 perform exactly the same as the normal G450? Or is there perhaps some performance difference between the PCI and AGP versions of the G450 that would explain the gap I'm seeing between the AGP G400 and the PCI G450MMS? (I need PCI for the EPIA system.)

    Thanks for any advice anyone can give,

    Frazer

  • #2
    G-series graphics cards don't have any kind of video/DVD decoding engine; the bulk of the work will always be done in software.
    The reason you see lower usage in VLC is probably simply that VLC uses more optimised algorithms...

    BTW, you don't have any performance issues. Everything plays nicely, yes? So what's the problem?

    • #3
      First, thanks for the reply.

      The only "problem" so to speak, is that if watching video on the client is going to take a constant 60% or more CPU load, there won't be any chance to use it for anything else at the same time, and more importantly, as I am using a fanless mini-ITX case, the CPU gets quite hot if it's under such load for extended periods.

      I have run it successfully with a Sigma Netstream card at very low CPU loads, but unfortunately the picture quality on the VGA output of that card is not particularly good. What I don't really understand is why, if VLC is only gaining from optimised algorithms, the CPU load is so much lower with the G400 (about 12% of an average MPEG-2 codec's load), whilst with the G450MMS it's using about 75% of the average codec's CPU load.
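
      To show where those percentages come from, here is the arithmetic on the Ravisent figures from my first post (a quick Python sketch; my 12% and 75% numbers were against a broader average of all the codecs I tried, so these ratios come out a little different):

          # CPU load figures from my tests (percent of one CPU)
          p4   = {"ravisent": 35, "vlc": 5}   # G400 AGP in the P4 2.4 GHz HTPC
          epia = {"ravisent": 65, "vlc": 60}  # G450MMS PCI in the VIA EPIA 1 GHz

          for name, loads in (("P4/G400", p4), ("EPIA/G450MMS", epia)):
              ratio = loads["vlc"] / loads["ravisent"]
              print(f"{name}: VLC needs {ratio:.0%} of Ravisent's CPU load")

          # prints roughly:
          #   P4/G400: VLC needs 14% of Ravisent's CPU load
          #   EPIA/G450MMS: VLC needs 92% of Ravisent's CPU load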

      Thanks again for your help,

      Frazer

      • #4
        Are you outputting to a TV or to the monitor? I seem to remember that the G450 (or maybe it was the G550) didn't use hardware to scale the output to fit the TV-out, which caused higher CPU usage.
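
        If the scaling really is falling back to software, the cost adds up quickly. A back-of-the-envelope sketch in Python (the source and output resolutions here are assumptions, just to show the order of magnitude):

            # Rough cost of scaling every frame in software (assumed figures).
            src_w, src_h = 720, 576   # PAL MPEG-2 frame (assumed)
            dst_w, dst_h = 800, 600   # assumed output/TV-out resolution
            fps = 25

            # Bilinear scaling touches every destination pixel, reading four
            # source pixels and blending each of the three colour channels.
            ops_per_pixel = 4 * 3
            ops_per_second = dst_w * dst_h * fps * ops_per_pixel

            print(f"upscale factor: {dst_w * dst_h / (src_w * src_h):.2f}x")
            print(f"{ops_per_second / 1e6:.0f} million pixel ops/s")  # ~144 million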

        • #5
          I'm outputting from the G450MMS to a monitor; the G400 in the HTPC (with the very low CPU load) is going into a TV.

          Thanks,

          Frazer
