I made a similar post in another thread but thought this was worth its own thread. If you didn't know, until now, without hardware decode, you needed roughly 2.4 GHz or more of Core 2 Duo power to decode H.264.
With a new nVidia 8500 or 8600 graphics adapter, both of which are very affordable, even an E6420 can handle the decode with only around 30% CPU usage.
According to the above article, even with the relatively slow E6420 (1.86 GHz), max CPU usage with the 8600 GTS is only 32.9%.
You can get a card like this for less than $200 at NewEgg. The 8500 series cards are less than $100 and supposedly decode nearly as well. HD decode is really not a big deal anymore. http://www.newegg.com/Product/Produc...82E16814125062
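If you want to check the numbers on your own machine, here is a rough sketch of how you could log peak CPU usage while a clip plays (this assumes Python with the psutil package installed; the player command and clip filename are just placeholders, swap in whatever player and test file you actually use):

import subprocess
import psutil

# Launch your media player with an H.264 test clip (placeholder command/file)
player = subprocess.Popen(["your_player.exe", "test_clip_1080p.mkv"])

peak = 0.0
while player.poll() is None:
    # Sample total CPU usage over the last second and track the maximum
    usage = psutil.cpu_percent(interval=1)
    peak = max(peak, usage)

print("Peak CPU usage during playback: %.1f%%" % peak)

Run it once with hardware acceleration enabled in your player and once with it disabled, and you can see the offload for yourself.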