I need some help figuring out why my PC and/or VLC uses the CPU instead of the GPU for playing 4K video. (I think it always uses the CPU for rendering video, regardless of the video, because system performance graphs always show maybe 2-5% GPU usage and 30-80% usage on some or all CPU threads.)
Here's what I know:
- Playing this video in the browser on YouTube at 2160p 60fps works just fine, with 80% CPU usage on half the cores - still not much GPU usage in my system monitor, though. Playing the same video in VLC, however, uses 100% of my Ryzen 5 (6 cores / 12 threads, 4.2 GHz) and stutters at best, but usually stands still. [EDIT] I got the video in 3 formats: MP4, MKV, and WEBM, all 4K@60. MKV and WEBM use the same resources (CPU) as in the browser/YouTube - however, the FPS is lower, definitely sub-20fps instead of 60fps. The MP4, however, uses 100% of my CPU and stutters as described above.
- VLC is set to use hardware acceleration, and I tried all the options (Automatic, Direct3D 11, DirectX DXVA 2.0), restarting VLC (and even my PC) after each change - no difference.
- I am using the latest VLC (3.0.11)
- The GPU is a GTX 1660 Ti, which should be more than overkill horsepower-wise (it can drive 120Hz VR, etc.), running the latest NVIDIA drivers. I don't have any other/integrated graphics card.
Is it possible that, because some codec/driver is missing from my system, VLC falls back to CPU rendering? Same question for my browser, which runs at better FPS than VLC but still appears to use only the CPU. Also, how can I find out which decoding is supported on which hardware, and what codecs are needed? How can I find a log where VLC tells me whether or not it is rendering the current video on the GPU, and why?
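For the "what codec is this, and where is the log" part, two things I'd try (a sketch, assuming FFmpeg's `ffprobe` is installed and on PATH; the filename is a placeholder): first, ask `ffprobe` what the video stream actually is, since the container extension (MP4/MKV/WEBM) says nothing about the codec inside it. Second, in VLC itself, Tools > Messages with verbosity set to 2 (debug) logs which decoder module was picked.

```python
import json
import subprocess

def video_codec(path):
    """Ask ffprobe (ships with FFmpeg) which codec the first video
    stream uses, plus its resolution. The container (.mp4/.mkv/.webm)
    is just a wrapper; this reports the actual bitstream codec
    (e.g. h264, vp9, av1)."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,width,height",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    stream = json.loads(out)["streams"][0]
    return stream["codec_name"], stream["width"], stream["height"]

# Placeholder path -- point it at one of the downloaded files:
# print(video_codec("video-2160p60.mp4"))
```

If the codec comes back as something the GPU's fixed-function decoder doesn't support, a hardware-acceleration setting in VLC can't help: the player has no choice but to fall back to software decoding.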
Why is it so hard to decode video on the GPU? If you can write a compute shader that does GPGPU computation for any old thing - calculating physics, generating meshes, sculpting point clouds, prime factorization, or running a 100% shadertoy-style demoscene videogame - why not a video? I understand some codecs are very peculiar and maybe designed for CPUs, but surely a 2019-2020 gaming GPU can brute-force it better than a 6-core CPU or a crappy smartphone's ARM video module.
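One likely reason (a toy sketch, not any real codec's bitstream): the entropy-decoding stage of modern codecs is inherently serial. The probability model is updated after every decoded symbol, so symbol N can't be decoded until symbols 0..N-1 are done - a loop-carried dependency chain, unlike the thousands of independent per-pixel jobs a compute shader is built around.

```python
def toy_entropy_decode(bits):
    """Toy adaptive decoder (NOT a real codec): illustrates why
    entropy decoding resists GPU parallelism. The probability
    estimate `p` feeds into decoding the NEXT symbol, so the loop
    iterations cannot be spread across GPU threads."""
    p = 0.5                        # adaptive estimate of P(bit == 1)
    symbols = []
    for bit in bits:
        symbols.append(bit)        # stand-in for the real decode step
        p = 0.95 * p + 0.05 * bit  # model update needed by the next symbol
    return symbols, p
```

The later stages (inverse transforms, motion compensation, filtering) parallelize fine, which is why GPU-assisted hybrid decoders exist; but the serial front end caps how much brute GPGPU force helps.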
[Update] I discovered it is encoding-related. The video in the MP4 container does not use the GPU at all, while MKV and WEBM do use the GPU:
It appears that the YouTube MP4 uses AV1 encoding and the other two use VP9 encoding. Both are open-source formats, so VLC should have no issue dealing with them.
It also appears that there aren't any video cards right now that incorporate an AV1 decoder into their chip. But the absence of a dedicated ASIC doesn't mean the GPU itself couldn't decode it in a GPGPU manner (as the Wikipedia entry for AV1 also mentions). Either way, this doesn't solve my problem, but it explains the reasons.