HD Video Decode Quality and Performance Summer '07
by Derek Wilson on July 23, 2007 5:30 AM EST - Posted in GPUs
The Test
Our test setup consisted of multiple processors, including high-end, low-end, and previous-generation test cases. Our desire was to evaluate how much difference hardware decode makes for each of these classes of CPU and to determine how much value video offload really brings to the table today.
Performance Test Configuration:
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB), Intel Core 2 Duo E4300 (1.8GHz/2MB), Intel Pentium 4 560 (3.6GHz)
Motherboard: ASUS P5W-DH
Chipset: Intel 975X
Chipset Drivers: Intel 8.2.0.1014
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 8.38.9.1-rc2, NVIDIA ForceWare 163.11
Desktop Resolution: 1920 x 1080 - 32-bit @ 60Hz
OS: Windows Vista x86
We are using PowerDVD Ultra 7.3 with patch 3104a applied. This patch fixed many of our playback issues and brought PowerDVD up to the level we wanted and expected. We did, however, have difficulty disabling GPU acceleration with this version of PowerDVD, so we will be unable to present CPU-only decoding numbers. From our previous experience, though, only CPUs faster than an E6600 can guarantee smooth decoding in the absence of GPU acceleration.
As for video tests, we have the final version of Silicon Optix HD HQV for HD-DVD, and we will score these subjective tests to the best of our ability using the criteria provided by Silicon Optix and the examples included on their disc.
For performance, we used perfmon to record average CPU utilization over 100 seconds (the default loop time). Our performance tests include three different clips: the Transporter 2 trailer from The League of Extraordinary Gentlemen Blu-ray disc (H.264), Yozakura (H.264), and Serenity (VC-1). All of these tests proved very consistent in performance under each of our hardware configurations, so for readability's sake we will only report average CPU overhead.
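For readers who want to reproduce a similar measurement, the sketch below samples system-wide CPU utilization once per second over the same 100-second window and averages the result. Note that our published numbers come from perfmon counter logging, not this script; it is an illustrative alternative that assumes the third-party psutil package is installed and that the clip is looping in the player while it runs.

```python
# Minimal sketch: average system-wide CPU utilization over a 100-second
# playback window. Illustrative alternative to perfmon counter logging,
# not the tool used for the results in this article.
# Assumes the third-party psutil package (pip install psutil).
import psutil

SAMPLE_SECONDS = 100  # matches the 100-second loop window used in testing

samples = []
for _ in range(SAMPLE_SECONDS):
    # cpu_percent(interval=1) blocks for one second and returns the
    # system-wide utilization over that second.
    samples.append(psutil.cpu_percent(interval=1))

average = sum(samples) / len(samples)
print(f"Average CPU utilization over {SAMPLE_SECONDS}s: {average:.1f}%")
```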
63 Comments
DigitalFreak - Monday, July 23, 2007 - link
Based on Derek's results, I ordered the parts for my new HTPC:
C2D E6850 (Newegg are bastards for pricing this at $325, but I didn't want to wait)
Intel P35 board
2GB PC2-800
Gigabyte 8600GTS (passive cooling)
Wished there was an 8600 board with HDMI out, but oh well...
SunAngel - Monday, July 23, 2007 - link
You confirmed my suspicions all along. I always wondered if the true motive for SLI and Crossfire was to double the benefits of GPU processing rather than separate the graphics performance of 3D and video acceleration. In my eyes, I see SLI and Crossfire being a "bridge" for 3D graphics and video acceleration cards. What I am referring to is the PCIe x16 (16 lane) slot being for high-powered 3D GPUs and the PCIe x16 (8 lane) slot being for video acceleration GPUs. It is obvious between the HD2900XT and the HD2600XT that one is great at rendering 3D game graphics while the other is great at accelerating motion picture movies.
Personally, this is an okay tactic by the card manufacturers. It segments the performance a little bit better. I do not game the least bit, so the high-end cards are something I don't want. But my tastes are different from others who do want them. Those that desire both can have their cake and eat it too by using a dual PCIe x16 motherboard and installing each type of card.
Overall, good article; it has informed my purchasing decision. With all the talk about futureproofing that was going around for a while, buying a dual PCIe x16 motherboard makes a lot of sense now.
TA152H - Monday, July 23, 2007 - link
I don't think you understand the point of the cards. If you buy the 2900 and a high-end processor, you will not have any problems with HD playback; that's the whole point. You don't need a 2600 to go with it. The number of people who buy something as expensive as the 2900XT along with a low-end processor that is incapable of playing back HD is very, very low, to the point where ATI decided it was a mistake to buy it.
So, no, you wouldn't get a 2600 to go with it; you'd get a good processor and the 2900, and that's all you'd need to have the best of both worlds.
Chunga29 - Monday, July 23, 2007 - link
Yes, if by "best" you mean:
- Higher CPU utilization when viewing any HD video content, compared to 8800
- Generally lower price/performance in games compared to 8800
- More flaky (in my experience) drivers than 8800 (though I believe AMD might actually be better on Vista - not that I really care at this point)
Don't pat AMD on the back for skipping UVD on R600. NVIDIA didn't bother to work on VP2 for G80, and yet no one is congratulating them on the decision. I agree that the omission is not the end of the world, mostly because I don't think people running 8800/X2900 cards are really all that concerned with H.264 video. If I were looking to go Blu-ray or HD-DVD, I'd be looking at a set-top box to hook up to my HDTV.
My PC is connected to a 24" LCD that I use for work, not watching movies, and trying to put it next to the TV is more effort than it's worth. Unless H.264 suddenly makes a difference for YouTube and the like (hey - I'd love to see higher quality videos online), I can't say I'm all that concerned. Seems to me there's just a vocal minority whining about the latest features that are used by less than 10% of people.
UVD, PureVideo HD, and a partridge in a pear tree: it's all moot to me!
TA152H - Tuesday, July 24, 2007 - link
OK, do you understand the meaning of the word "context"? I'm not going into the merits of Nvidia and ATI. I have used both; I consider Nvidia junk, and I do not buy them. If you have had better luck, then go with them. That's not the point, but anyone with any reading comprehension should have figured that out.
He was talking about putting a 2600 and a 2900 on the same motherboard to get the best of both worlds, meaning having all the performance of the 2900 yet getting the HD capabilities of the 2900. Do you understand that?
My point is you don't need the 2600 to get "the best of both worlds"; you just need a good processor and you will not miss that feature. I think Nvidia made the right choice too. Most people are morons, and they want just because they want, and they fail to realize nothing is free. Including useless features at a cost is a bad idea, and ATI did the right thing not to, even though you'll have the idiots who think they are missing out on something. Yes, you are: you're missing out on additional cost, additional electricity use, and additional heat dissipation. You don't need it if you buy a reasonable processor for the item. That's the point. Try to understand context better, and realize what he meant by the best of both worlds.
strikeback03 - Wednesday, July 25, 2007 - link
Assuming your "good processor" falls somewhere between the two tested C2D processors, dropping UVD boosts average processor usage by around 42% in Transporter 2, 44% in Yozakura, and 24% in Serenity. So which uses more electricity and generates more heat: the additional transistors needed for UVD on the 2900, or moving your CPU off idle to do the work?
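A back-of-envelope sketch of the comparison raised in this comment is below. The 42% utilization delta comes from the Transporter 2 results above; both wattage figures are purely hypothetical assumptions chosen only to illustrate the shape of the tradeoff, not measured values.

```python
# Back-of-envelope energy comparison: CPU-based decode vs. a dedicated
# UVD block. The utilization delta is from the article's Transporter 2
# results; both wattage figures are hypothetical assumptions for
# illustration only, not measurements.
CLIP_SECONDS = 100             # measurement window used in the article
CPU_EXTRA_UTILIZATION = 0.42   # ~42% higher CPU load without UVD
CPU_FULL_LOAD_DELTA_W = 40.0   # assumed extra CPU power, idle to full load (hypothetical)
UVD_BLOCK_POWER_W = 2.0        # assumed draw of a dedicated decode block (hypothetical)

# Simple linear model: extra CPU power scales with the utilization delta.
cpu_extra_energy_j = CPU_EXTRA_UTILIZATION * CPU_FULL_LOAD_DELTA_W * CLIP_SECONDS
uvd_energy_j = UVD_BLOCK_POWER_W * CLIP_SECONDS

print(f"Extra CPU energy without UVD: {cpu_extra_energy_j:.0f} J")
print(f"Dedicated UVD block energy:   {uvd_energy_j:.0f} J")
```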
TA152H - Tuesday, July 24, 2007 - link
Previous post should have said "HD capability of the 2600".
Chunga29 - Tuesday, July 24, 2007 - link
For someone trying to act superior, you need to take a look in the mirror (and the dictionary) for a moment. I agree it's silly to use something like a 2600 and a 2900 in the same system. However, if you want the "best of both worlds", let's consider for a minute what that means:
Best (courtesy of Merriam-Webster):
1 : excelling all others
2 : most productive of good: offering or producing the greatest advantage, utility, or satisfaction
So, if you truly want the best of both worlds, what you really want is:
UVD from ATI RV630
3D from NVIDIA G80
Anything less than that is not the "best" anymore (though I'm sure some ATI fans would put R600 3D above G80 for various reasons).
Try ditching the superlatives instead of copping an attitude and constantly defending every post you make. If you want to say that R600 with a fast CPU is more than sufficient for H.264 playback as well as providing good 3D performance, you're right. The same goes for G80. If you want to argue that it may have been difficult and not entirely necessary to cram UVD into R600, you can do that, but others will disagree.
Since they were at something like 700 million transistors, they may have been out of room. That seems very bloated (especially considering the final performance), but how many transistors are required for UVD? I'd say it was certainly possible to get UVD in there, but would the benefit be worth the cost? Given the delays, it probably was best to scrap UVD. However, the resulting product certainly isn't able to offer the best possible feature set in every area. In fact, I'd say it's second in practically every area to other GPUs (G80/86 and RV630, depending on the feature). As others have pointed out in the past, that's a lot like the NV30 launch.
autoboy - Monday, July 23, 2007 - link
What??
scosta - Monday, July 23, 2007 - link
I think this sentence on page 1 is wrong! Don't you mean ...
Regards