Since I haven’t seen this mentioned anywhere else really (it’s very well hidden in NVidia’s documentation), I thought I should mention it here in the interest of helping people fix odd playback problems (read: it’s part of my evil plan to make people stop using hardware acceleration).
Anyway, getting to the point: certain NVidia GPUs have a PureVideo decoding chip that does not support decoding H264 videos of certain widths. More specifically, the broken width ranges are 769-784, 849-864, 929-944, 1009-1024, 1793-1808, 1873-1888, 1953-1968 and 2033-2048 pixels. Any H264 video with a width in these ranges will fail to decode using CUDA, VDPAU, DXVA or whatever you please. This does not, of course, affect software decoding in any way whatsoever.
Particularly interesting and commonly used resolutions that are affected by this are, for example, 852×480, 864×480 and 1024×576.
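If you want to check whether one of your own files falls in a broken range, here’s a minimal sketch (Python; the ranges are copied straight from above, while the helper name `width_is_broken` is my own invention, not anything from NVidia’s code):

```python
# The broken width ranges from the post, in pixels (inclusive).
BROKEN_WIDTH_RANGES = [
    (769, 784), (849, 864), (929, 944), (1009, 1024),
    (1793, 1808), (1873, 1888), (1953, 1968), (2033, 2048),
]

def width_is_broken(width: int) -> bool:
    """True if an affected GPU's hardware decoder is expected to
    choke on an H264 stream of this width."""
    return any(lo <= width <= hi for lo, hi in BROKEN_WIDTH_RANGES)

# The resolutions mentioned above, plus a sane one for contrast:
for w in (852, 864, 1024, 1280):
    print(w, width_is_broken(w))  # 852/864/1024 -> True, 1280 -> False
```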
Affected GPUs
A list of the affected GPUs follows.
- GeForce 9300 GE
- GeForce 9300 GS
- GeForce 8400 GS
- GeForce 9300M GS
- GeForce G100
- GeForce 9200M GS
- GeForce G 105M
- GeForce G 103M
- GeForce 9100M G
- GeForce 8200M G
- GeForce 9100
- GeForce 8300
- GeForce 8200
- nForce 730a
- GeForce 9200
- nForce 980a/780a SLI
- nForce 750a SLI
- GeForce 9400M G
- GeForce 9400M
- GeForce 9300 / nForce 730i
- GeForce G102M
- ION
Fuck GPU decoding; GPU decoding sucks; GPU decoding is dying; GPU decoding is dead to me; GPU decoding hit WTC.
Source
This actually is mentioned in NVidia’s official documentation, but it’s pretty well hidden. First, go to Appendix A in the VDPAU documentation and look at the GPUs with a “1” in the “VDPAU features” column. Then go to Appendix H in the same documentation; in the “Implementation Limits” section, go to the “VdpDecoder” subsection, and at the very bottom you’ll find a small note saying:
VDPAU Features Note 1
GPUs with this note may not support H.264 streams with the following widths: 49, 54, 59, 64, 113, 118, 123, 128 macroblocks (769-784, 849-864, 929-944, 1009-1024, 1793-1808, 1873-1888, 1953-1968, 2033-2048 pixels).
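For the curious, the macroblock counts and the pixel ranges line up because H.264 codes frames in 16-pixel-wide macroblocks, so a display width of W pixels occupies ceil(W/16) of them. A quick sketch (Python; the names are mine) to confirm the arithmetic:

```python
from math import ceil

# Broken widths expressed in macroblocks, per NVidia's note.
BROKEN_MB_WIDTHS = {49, 54, 59, 64, 113, 118, 123, 128}

def mb_width(pixels: int) -> int:
    # H.264 pads the coded width up to a whole number of macroblocks.
    return ceil(pixels / 16)

# Every pixel width in a broken range maps to one of the macroblock
# counts above, e.g. 769..784 px -> 49 MBs, 852 px -> 54 MBs.
assert mb_width(769) == mb_width(784) == 49
assert mb_width(852) == 54
assert mb_width(1024) == 64
assert mb_width(1280) == 80 and 80 not in BROKEN_MB_WIDTHS
```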
Thanks Lord for pointing this out to me.
Comments (11)
U MAD?
Why all the hate on GPU decoding?
People still use that shit? Outside of frame-serving to AviSynth, that is.
Retardus vincit omni.
How would you be able to tell whether this issue is in the Linux driver or in the GPU itself? I can’t see any reason for this to be a software limitation, but it’s Nvidia we’re talking about here.
I remember reading about this in the Wiki about a year ago while researching what Nvidia GPU I needed to best support VDPAU.
http://en.wikipedia.org/wiki/Nvidia_PureVideo
The third-generation PureVideo HD line was the one to stay away from.
oh lol
Hey, it should’ve been “NVidia” instead of “GPU decoding”.
That’s what you get when you buy nvidia garbage: crappy hardware that doesn’t work as it’s supposed to.
Hm… why should this affect CUDA? CUDA is just a stream-processing framework and has nothing to do with PureVideo… As far as I can tell, a decoder implemented in CUDA (like CoreAVC) should work just fine…
The 8400GS can still be found with a VP2 chip in it, IIRC; the one I have, at least, decodes 1024×576 without a problem using VDPAU.
Those look like low-grade GPUs to me.
I don’t even recall them supporting this stuff in the first place.
Support starts at the 8600GT and up, IIRC.