Howdy!
I just spent two days in front of my browser teaching myself about playing AVI files. I'm now able to use WMP to play streams with DivX and XviD video encoding, plus MP3 and AC3 audio encoding. I've also managed to figure out how to have my sound card take care of the AC3 decoding, and to pass the AC3 undecoded to an external receiver/decoder. Whew! Lots o' reading and tired eyeballs!
Now, my question ... for fun, I monitored my CPU utilization while playing an AVI with AC3-encoded audio. With the AC3 DirectShow filter/codec from Nimo's Codec Pack doing the AC3 decoding work, my utilization hovered in the 20%-25% range.
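(For anyone who wants to watch the numbers the same way instead of squinting at Task Manager, here's a rough, untested sketch of polling overall CPU load on Windows. It assumes GetSystemTimes() is available on your version of Windows; older systems would need a different route.)

// Sketch only: sample overall CPU load once a second while the clip plays.
#include <windows.h>
#include <stdio.h>

static unsigned long long toUll(const FILETIME &ft) {
    return (static_cast<unsigned long long>(ft.dwHighDateTime) << 32) | ft.dwLowDateTime;
}

int main() {
    FILETIME idle0, kern0, user0;
    GetSystemTimes(&idle0, &kern0, &user0);

    for (;;) {
        Sleep(1000);

        FILETIME idle1, kern1, user1;
        GetSystemTimes(&idle1, &kern1, &user1);

        // Kernel time already includes idle time, so busy = (kernel + user) - idle.
        unsigned long long idle  = toUll(idle1) - toUll(idle0);
        unsigned long long total = (toUll(kern1) - toUll(kern0)) + (toUll(user1) - toUll(user0));

        if (total > 0)
            printf("CPU: %.1f%%\n", 100.0 * (total - idle) / total);

        idle0 = idle1; kern0 = kern1; user0 = user1;
    }
    return 0;
}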
Next, I forced WMP (6.4, not 8.0) to spit out undecoded AC3 ("Enable SPDIF output") to my Audigy sound card (via the PCI bus, of course) and enabled AC3 decoding for the Audigy. Well, that worked too, and to my ears there was perhaps an audible difference. But to my surprise, the CPU utilization in this case went *UP* to 25%-30% while playing back the same file and passages.
Intuitively, this doesn't make much sense to me. I expected CPU utilization to *DROP* when the Audigy card was doing the AC3 decoding.
Could one of you guys/gals with the colorful icons and stars provide some insight into this paradox?
My guess, and this is *ONLY* a guess, is that when the Audigy card is supposed to be doing the AC3 decoding, the DirectShow filter is still chugging away decoding the AC3, and then WMP takes on the added task of re-encoding it back to AC3 to send out to the sound card for decoding! That added re-encoding task would account for the extra 5% CPU utilization.
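One way I could try to check that guess (just a sketch, I haven't actually run this): build a DirectShow graph for the same AVI and list which filters end up in the chain while "Enable SPDIF output" is checked. If the software AC3 decoder filter still shows up, that would at least fit the theory. The file path below is only an example.

// Sketch: render the AVI the way a DirectShow player would, then list the filters.
#include <dshow.h>
#include <stdio.h>
#pragma comment(lib, "strmiids.lib")
#pragma comment(lib, "ole32.lib")

int main() {
    CoInitialize(NULL);

    IGraphBuilder *graph = NULL;
    CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                     IID_IGraphBuilder, (void**)&graph);

    // Example path -- point this at the AVI in question.
    graph->RenderFile(L"C:\\movies\\test.avi", NULL);

    IEnumFilters *enumFilters = NULL;
    if (SUCCEEDED(graph->EnumFilters(&enumFilters))) {
        IBaseFilter *filter = NULL;
        while (enumFilters->Next(1, &filter, NULL) == S_OK) {
            FILTER_INFO info;
            if (SUCCEEDED(filter->QueryFilterInfo(&info))) {
                wprintf(L"%ls\n", info.achName);   // e.g. the AC3 decoder, if present
                if (info.pGraph) info.pGraph->Release();
            }
            filter->Release();
        }
        enumFilters->Release();
    }

    graph->Release();
    CoUninitialize();
    return 0;
}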
Hydrogin