Not entirely sure; that's what Nvidia says, but ZDNet says the opposite.
In what sense, Ajds?
What the GeForce 8600 GTS has that its higher-end cousins and those older cards don't is Nvidia's PureVideo HD 2.0 technology. In contrast to our experience with Nvidia's original PureVideo HD decoding software, we were very impressed with the quality of HD video output under PureVideo 2.0. One of our main complaints with Nvidia's original version of PureVideo HD was that it would lose almost all detail in heavily shadowed areas. That wasn't the case here at all.
I have an 8600 GTS on XP SP2. The DirectX capabilities on XP only claim MPEG-2 support, but this is done in the drivers. The CPU utilisation is insanely high for MPEG-2 decoding: SD MPEG-2 takes as much CPU as HD MPEG-2 used to take with the 7600 GT. The deinterlacing is better, but I don't think it's done on the card. Most of the CPU utilisation is in the kernel, so the drivers are doing the decoding, but 'tis not being done on the card.
For example 1080i HD MPEG-2 on my machine (Sempron 3400 (2.0 GHz), XP SP2, 1.5 GB DDR400) with the 7600 used to take around 40% CPU on average with the nVidia Purevideo decoder. When the 8600 GTS is put in, SD MPEG-2 (576i25 PAL MCE recordings, DVR-MS MPEG-2) take 40-45% of the CPU. The deinterlacing is better but 'tis definitely not being done on the card even though the Purevideo decoder reports that 'tis in DXVA modes A/B.
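A rough back-of-the-envelope check of that report: the resolutions and frame rates below come from the post itself (576i25 PAL vs 1080i HD MPEG-2); the script just compares raw pixel throughput to show why roughly equal CPU load on SD strongly suggests a software decode path on the 8600.

```python
# Rough pixel-throughput comparison between the SD and HD streams
# mentioned in the post. Resolutions and frame rates are taken from
# the post (576i25 PAL MCE recordings vs 1080i HD MPEG-2); this is an
# illustrative ballpark, not a measured decoder cost.

def pixels_per_second(width, height, fps):
    """Raw luma pixel throughput of a video stream."""
    return width * height * fps

sd = pixels_per_second(720, 576, 25)      # 576i25 PAL MCE recording
hd = pixels_per_second(1920, 1080, 25)    # 1080i25 HD MPEG-2

ratio = hd / sd
print(f"SD: {sd:,} px/s, HD: {hd:,} px/s, ratio: {ratio:.1f}x")
# HD carries about 5x the pixels per second, so SD decode eating the
# same 40-45% CPU that HD needed on the 7600 points to a software path.
```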
KILMER wrote:
What do you make of this?!
For example 1080i HD MPEG-2 on my machine (Sempron 3400 (2.0 GHz), XP SP2, 1.5 GB DDR400) with the 7600 used to take around 40% CPU on average with the nVidia Purevideo decoder. When the 8600 GTS is put in, SD MPEG-2 (576i25 PAL MCE recordings, DVR-MS MPEG-2) takes 40-45% of the CPU. The deinterlacing is better but 'tis definitely not being done on the card even though the Purevideo decoder reports that 'tis in DXVA modes A/B.

In my opinion, the MPEG-2 codec he used isn't designed to take advantage of hardware acceleration.
ajds wrote:
Purevideo is not limited to hardware acceleration.
Purevideo HD = hardware decode acceleration, hardware deinterlacing, hardware scaling (sharpening), hardware noise reduction, hardware framerate adaptation, etc.
So it's entirely normal that the deinterlacing is done in software on his setup, since Purevideo isn't available; there's nothing mysterious about it!
Note that the offload NVIDIA has built into the G84/G86 GPUs is hardwired for H.264 decoding only; you get none of the benefit for MPEG-2 or VC1 encoded content. Admittedly H.264 is the most strenuous of the three, but given that VC1 content is still quite prevalent among HD-DVD titles, it would be nice to have. Also note that as long as your decoder supports NVIDIA's VP2/BSP, any H.264 content will be accelerated. For MPEG-2 and VC1 content, the 8600 and 8500 can only handle inverse transform, motion compensation and in-loop deblocking; the rest of the pipe is handled by the host CPU.
Although we haven't been terribly impressed with the gaming performance of the GeForce 8600, it is currently the best option for anyone looking to watch Blu-ray or HD-DVD on their PCs. The full H.264 offload onto the GPU makes HD movie playback not only painless but also possible on lower-speed systems.
Even more interesting than the GeForce 8600 is the $100 GeForce 8500 that we'll be looking at in the coming weeks. According to NVIDIA, the GeForce 8500 will have the same H.264 decoding power as the 8600, so if you don't need the added 3D gaming performance then the 8500 will be an even better solution for HTPCs.
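The codec split the quoted article describes can be summarized as a small lookup. The stage names and the per-codec split come from the article itself; the table and helper function are just an illustration, not NVIDIA's API.

```python
# Which decode stages the G84/G86 (GeForce 8600/8500) offload per codec,
# per the quoted article. For H.264 the VP2/BSP engine handles the full
# pipeline, including bitstream decode; MPEG-2 and VC-1 get only a
# partial offload, leaving bitstream decode to the host CPU.
FULL_PIPE = ["bitstream decode", "inverse transform",
             "motion compensation", "in-loop deblocking"]

GPU_OFFLOAD = {
    "H.264": FULL_PIPE,                       # full VP2/BSP offload
    "MPEG-2": FULL_PIPE[1:],                  # no bitstream offload
    "VC-1": FULL_PIPE[1:],                    # same partial offload
}

def cpu_side_stages(codec):
    """Stages left to the host CPU for a given codec on G84/G86."""
    return [s for s in FULL_PIPE if s not in GPU_OFFLOAD[codec]]

print(cpu_side_stages("H.264"))   # []
print(cpu_side_stages("VC-1"))    # ['bitstream decode']
```

This makes the article's point concrete: the CPU-heavy entropy/bitstream stage is only offloaded for H.264, which is why MPEG-2 and VC1 playback still lean on the host CPU.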