HD Video Decode Quality and Performance Summer '07
by Derek Wilson on July 23, 2007 5:30 AM EST - Posted in GPUs
HD HQV Image Quality Analysis
We have already explored Silicon Optix HD HQV in detail. The tests and what we are looking for in them have not changed since our first round. Fortunately, the ability of NVIDIA and AMD hardware to actually perform the tasks required of HD HQV has changed quite a bit.
Both AMD and NVIDIA told us to expect scores of 100 out of 100 with their latest drivers and hardware. We spent considerable time and effort evaluating this test, and we feel we have judged these solutions fairly and accurately despite the subjectivity involved. Here's what we've come up with.
Silicon Optix HD HQV Scores
                          | Noise Reduction | Video Res Loss | Jaggies | Film Res Loss | Stadium | Total
AMD Radeon HD 2900 XT     | 15              | 20             | 20      | 25            | 10      | 90
AMD Radeon HD 2600 XT     | 15              | 20             | 20      | 25            | 10      | 90
AMD Radeon HD 2600 Pro    | 15              | 20             | 20      | 25            | 10      | 90
AMD Radeon HD 2400 XT     | 0               | 20             | 0       | 25            | 10      | 55
NVIDIA GeForce 8800 GTX   | 25              | 20             | 20      | 25            | 10      | 100
NVIDIA GeForce 8600 GTS   | 25              | 20             | 20      | 25            | 10      | 100
NVIDIA GeForce 8600 GT    | 25              | 20             | 20      | 25            | 10      | 100
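For those who want to sanity check the math, the Total column is simply the sum of the five individual tests. Here is a minimal Python sketch of our own that re-derives it; the per-category maximums (25/20/20/25/10) are our inference from the perfect NVIDIA results and the 25-point noise reduction scale, not an official Silicon Optix breakdown.

```python
# Illustrative only: re-derive the HQV "Total" column from the per-test scores.
# Category maximums are inferred (they sum to the 100-point perfect score).
MAXIMUMS = [25, 20, 20, 25, 10]  # NR, Video Res Loss, Jaggies, Film Res Loss, Stadium

SCORES = {
    "AMD Radeon HD 2900 XT":   [15, 20, 20, 25, 10],
    "AMD Radeon HD 2600 XT":   [15, 20, 20, 25, 10],
    "AMD Radeon HD 2600 Pro":  [15, 20, 20, 25, 10],
    "AMD Radeon HD 2400 XT":   [ 0, 20,  0, 25, 10],
    "NVIDIA GeForce 8800 GTX": [25, 20, 20, 25, 10],
    "NVIDIA GeForce 8600 GTS": [25, 20, 20, 25, 10],
    "NVIDIA GeForce 8600 GT":  [25, 20, 20, 25, 10],
}

for card, scores in SCORES.items():
    print(f"{card:25s} {sum(scores):3d} / {sum(MAXIMUMS)}")
```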
The bottom line is that NVIDIA comes out on top in terms of quality. We've seen arguments for scoring these cards differently, but we feel that this is the most accurate representation of the capabilities offered by each camp.
On the low end, both AMD and NVIDIA hardware begin to stumble in terms of quality. The HD 2400 XT posts quite a lackluster performance, failing in noise reduction and HD deinterlacing (jaggies), though at least it deinterlaces video at full resolution, even if poorly. We excluded tests of NVIDIA's 8500 series, as NVIDIA's video drivers have not yet been optimized for its low-end hardware. Even so, we have been given indications not to expect the level of performance we see from the 8600 series. We would guess that the 8500 series will perform on par with AMD's HD 2400 series, though we will have to wait and see until NVIDIA releases a driver for it.
With video decode hardware built in as a separate block of logic and post-processing handled by the shader hardware, it's clear that the horrendous 3D performance of low-end parts has bled through to their video processing capability as well. This is quite disturbing, as it removes a good deal of potential value from low-cost cards that include video decode hardware.
Both AMD and NVIDIA perform flawlessly and identically in every test but the noise reduction test. AMD uses an adaptive noise reduction algorithm that the user is unable to disable or even adjust in any way. NVIDIA, on the other hand, provides an adjustable noise reduction filter. In general, we prefer having the ability to adjust and tweak our settings, but simply having this ability does not factor into HQV scores.
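To illustrate what an adjustable strength control buys you, here is a toy sketch of our own (not either vendor's actual algorithm, which neither company documents): the filter blends each pixel toward a local average, so a higher strength suppresses more of the "sparkling" noise but also begins to erase fine detail.

```python
import numpy as np

def denoise(frame: np.ndarray, strength: float) -> np.ndarray:
    """Toy spatial noise reduction: blend toward a 3x3 box blur.
    strength = 0.0 leaves the frame untouched; strength = 1.0 applies the
    full blur, which is where detail loss and banding start to show up."""
    h, w = frame.shape
    padded = np.pad(frame.astype(np.float32), 1, mode="edge")
    # Average the 3x3 neighborhood of every pixel (nine shifted views).
    smoothed = sum(padded[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return (1.0 - strength) * frame + strength * smoothed

# A synthetic noisy luma plane stands in for an HD frame; the standard
# deviation drops as the "sparkle" is averaged out, more so at higher strengths.
noisy = (np.full((1080, 1920), 128.0) + np.random.normal(0, 8, (1080, 1920))).astype(np.float32)
for strength in (0.0, 0.5, 0.75, 1.0):
    print(strength, round(float(denoise(noisy, strength).std()), 2))
```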
The major issue that led us to score AMD down in noise reduction was that noise was not reduced enough to match our expectations. In addition to the tests, Silicon Optix provides a visual explanation of the features tested, including noise reduction. It shows a side-by-side video of a yellow flower (a different flower than the one presented in the actual noise reduction test), with a noisy video on the left and a video with proper noise reduction applied on the right. The bottom line is that there is almost no noise at all in the video on the right.
During the test, although noise is reduced using AMD hardware, it is not reduced to the level of expectation set by the visual explanation of the test. Based on this assessment, we feel that AMD noise reduction deserves a score of 15 out of 25. Silicon Optix explains a score of 15 as: "The level of noise is reduced somewhat and detail is preserved." In order to achieve a higher score, we expect the noise to be reduced to the point where we do not notice any "sparkling" effect in the background of the image at all.
By contrast, with NVIDIA, setting the noise reduction slider anywhere between 51% and 75% gave us a higher degree of noise reduction than AMD with no quality loss. At 75% and above we noticed no remaining noise in the image, and detail loss only appeared once noise reduction was set very high. Tests done with the slider at 100% show some detail loss, but there is no reason to crank it up that high unless your HD source is incredibly noisy (which will not likely be the case). In addition, at such high levels of noise reduction, we noticed banding and artifacts in some cases. This was especially apparent in the giant space battle near the end of Serenity; computer-generated special effects seemed to suffer from this issue more than other parts of the video.
While, ideally, we would like to see artifacts avoided at all costs, NVIDIA has provided a solution that offers much more flexibility than the competition. With a little experimentation, a higher quality experience can be delivered on NVIDIA hardware than on AMD hardware. In fact, because NVIDIA leaves noise reduction off by default, we feel that the overall experience provided to consumers will be higher.
63 Comments
erwos - Monday, July 23, 2007 - link
Does it? Because I thought that was only for MPEG-2. Link?
smitty3268 - Monday, July 23, 2007 - link
Most drivers only support it with MPEG-2, but that doesn't mean it isn't capable of more. Looking again, I'm a little unclear about how much work would be required to get it working. I'm not sure if it is completely done and just requires support from the hardware vendors or if it also needs some additional work before that happens.
http://www.mythtv.org/wiki/index.php/XvMC
http://en.wikipedia.org/wiki/X-Video_Motion_Compen...
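As a rough sketch (assuming the /etc/X11/XvMCConfig mechanism the generic XvMC wrapper library used at the time to select a vendor implementation), something like this checks whether XvMC is even wired up before pointing MythTV or mplayer at an accelerated MPEG-2 decode path:

```python
# Rough sketch, assumptions noted: the generic XvMC wrapper is assumed to read
# /etc/X11/XvMCConfig for the vendor library name (e.g. libXvMCNVIDIA.so).
import ctypes.util
from pathlib import Path

config = Path("/etc/X11/XvMCConfig")
if not config.exists():
    print("No XvMCConfig found - the XvMC wrapper is not configured here.")
else:
    libname = config.read_text().strip()            # e.g. "libXvMCNVIDIA.so.1"
    short = libname[3:] if libname.startswith("lib") else libname
    short = short.split(".so")[0]
    found = ctypes.util.find_library(short)         # probe the linker search path
    print(f"XvMCConfig points at {libname}:", "found" if found else "missing")
```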
Per Hansson - Monday, July 23, 2007 - link
Hi, it would be really interesting to see similar tests done in Linux also. For example, how cheap of an HTPC rig can you build, with free software too, and still provide better features than any of the commercial solutions?
I think there are many of us who have some old hardware lying around, and seeing this article brings up ideas. Pairing the old computer with an (AGP?) ATI 2600 card in a nice HTPC chassis under the TV would perhaps provide an ideal solution?
jojo4u - Monday, July 23, 2007 - link
Linux is not practical. You would have to crack AACS and dump the disc first.
Per Hansson - Monday, July 23, 2007 - link
Hmm, I did not realize that. However, an HTPC can still be built to be a player for satellite data, for example, granted that configuring all that with a subscription card will not be for the faint of heart. But then again the Dreambox 8000 is not available yet, only a new decoder from Kathrein, the UFS910, with no decent software (yet).
jojo4u - Monday, July 23, 2007 - link
Hi Derek, good review. However, based on a review in the German magazine c't, I have some suggestions and additions:
PowerDVD patch 2911, Catalyst 7.6, Nvidia 158.24
- the GeForce G84/G85 parts lack not only VC-1 but also MPEG-2 bitstream processing.
- the HD 2400 does not have MPEG-2 bitstream processing, frequency transform, and pixel prediction, or it is not activated.
- A single-core Athlon is significantly worse than a single-core Pentium IV. The reason is AACS: decryption puts a huge load on the CPU and is optimized for Intel CPUs (9%->39%, H.264, Pentium IV, Casino Royale). Perhaps later patches made the situation better (like your Yozakura results show?)
- VC-1 on the Radeons and GeForces showed picture distortions, but based on your review this seems to be fixed now.
Combinations of an Athlon 3500+, X2 6000+, Pentium IV 3.2 GHz, or Pentium E2160 with an HD 2400/2600, GeForce 8600 GTS, or 690G that resulted in lagging in MPEG-2, VC-1, or H.264:
3500+ + 690G/2400/2600/8600
6000+ + 690G
Pentium IV + 8600
Chunga29 - Monday, July 23, 2007 - link
Why run with older drivers? If these features are important to you, you will need to stay on top of the driver game. Would have been interesting to see AMD chips in there, but then that would require a different motherboard as well. I think the use of a P4 560 was perfectly acceptable - it's a low-end CPU and if it can handle playback with the 2600/8600 then Athlons will be fine as well.
8steve8 - Monday, July 23, 2007 - link
Nice article.. but, while I usually think AnandTech conclusions are insightful and spot on,
it seems odd not to give props to the 2600 XT, which dominated the benchmarks.
For the occasional gamer who often likes watching videos, it seems the 2600 XT is a great choice, better than the 8600 GTS.
For example, for VC-1 on a low-end Core 2 Duo, the difference between 7% and 19.2% matters, especially if the person likes watching a video while working or browsing or whatever...
Can AMD add noise reduction options later with a driver update?
defter - Tuesday, July 24, 2007 - link
How can that matter? Even in the worst case you have 80% of idle CPU time.
Besides, how can you "work" while watching video at the same time? And don't try to tell me that a web browser takes over 80% of the CPU time on a Core 2 Duo system...
drebo - Monday, July 23, 2007 - link
We all know why this is.
I'll give you a hint: look at the overwhelming presence of Intel advertising on this site.
It doesn't take a genius to figure it out. That's why I don't take the video and CPU reviews on this site seriously anymore.