
I need some help figuring out why a PC and/or VLC would use the CPU for playing 4K video instead of the GPU. (I think it always uses the CPU for rendering video, regardless of the video, because system performance graphs always show maybe 2-5% GPU usage and 30-80% usage on some or all CPU threads.)

Here's what I know:

  • Playing this video in the browser on YouTube at 2160p 60fps works just fine with 80% CPU usage on half the cores - still not much GPU usage in my system monitor, though. Playing the same video in VLC, however, uses 100% of my Ryzen 5 6-core/12-thread 4.2 GHz CPU, and stutters at best and usually stands still. [EDIT] I got the video in 3 formats: MP4, MKV, and WebM, all 4K@60. MKV and WebM use the same resources (CPU) as in the browser/YouTube - however, the FPS is lower, definitely sub-20fps instead of 60fps. The MP4, however, uses 100% of my CPU and stutters as I explained.
  • VLC is set to use hardware acceleration, and I tried all the options (Automatic, Direct3D 11, DirectX DXVA 2.0), restarting VLC (and even my PC) after the changes, with no difference.
  • I am using the latest VLC (3.0.11)
  • The GPU is a GTX 1660 Ti, which should be more than overkill horsepower-wise (it can do 120 Hz VR, etc.) and uses the latest NVIDIA drivers. I don't have any other/integrated graphics cards.

Is it possible that because I'm missing some kind of codec/driver on my system, VLC defaults to CPU rendering? The same question goes for my browser, which runs at better FPS than VLC but still appears to use the CPU only. Also, how can I find out what decoding is supported on which hardware and which codecs are needed? And how can I find a log or something where VLC tells me whether or not it is rendering the current video using the GPU, and why?
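(From what I can tell, VLC has its own message log - Tools > Messages, Ctrl+M, with the verbosity set to 2/debug - and a -vvv switch that does the same thing when it's launched from a command line, as in the sketch below, where the file name is just a placeholder for my test file. I'm assuming hardware decoding would show up there as lines mentioning DXVA2 or D3D11VA, but I'm not sure exactly what to look for.)

    # assumes vlc is on the PATH; "video.mp4" stands in for the local test file
    vlc -vvv video.mp4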

Why is it so hard to decode video on the GPU? If you can write a compute shader that does GPGPU computations for any old thing, like calculating physics, or generating meshes, or sculpting point clouds, or prime factorization, or running an entire demoscene/Shadertoy-style video game, why not a video? I understand some codecs are very peculiar and maybe designed for CPUs, but surely a 2019-2020 gaming GPU can brute-force it better than a 6-core CPU or a crappy smartphone's ARM video module.


[Update] I discovered it is encoding-related. The video in the MP4 container does not use the GPU at all, while MKV and WebM use the GPU:

[screenshot of GPU usage comparison]

It appears that the YouTube MP4 uses the AV1 codec and the other 2 use VP9 encoding. Both of these are open-source formats, so VLC should have no issue dealing with them.

It also appears that there aren't any video cards right now that incorporate the decoder into their chip. But the lack of an ASIC for it doesn't mean that the GPU itself couldn't decode it in a GPGPU manner (as is also mentioned in the Wikipedia entry for AV1). Either way, this doesn't solve my problem, but it explains the reasons.

  • I tried to repro on Mac, just in case there was something I could see in VLC, but I can't repro. YouTube runs about 'one core' at nearly full speed, VLC only about 25% of one core. GPU usage goes to about 30%. Playing both together, one on each screen, doesn't seem to affect those figures. It's a powerful Mac, but it's an old Mac - 2012, 12-core Xeons.
    – Tetsujin
    Commented Jul 17, 2020 at 15:32
  • @Tetsujin I realized it can vary depending on what version of the video you have locally. MP4, MKV, and WebM can give very different performance results.
    Commented Jul 17, 2020 at 15:33
  • Sorry, I just noticed I can only get the 1080p 60Hz versions, so it's not a valid test. My bad. From my YTDL I can get 2160p max; can't get the 8K at all, it seems.
    – Tetsujin
    Commented Jul 17, 2020 at 15:34

2 Answers


I discovered the problem is encoding-related. YouTube packages the video in 3 containers: the MP4 container, when played in VLC, does not use the GPU at all, while the other 2 containers, MKV and WebM, use the GPU:

[screenshot of GPU usage comparison]

It appears that the YouTube MP4 uses the AV1 codec and the other 2 use VP9 encoding. Both of these are open-source formats, so VLC should have no issue dealing with them, but for some reason it can't play AV1 on the GPU. So the question becomes: why can't VLC / Windows play AV1 on the GPU?
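If you want to double-check which codec a given download actually contains, one way - assuming ffprobe from FFmpeg is installed, and with the file name as a placeholder - is:

    # print the codec of the first video stream (e.g. av1, vp9, h264)
    ffprobe -v error -select_streams v:0 -show_entries stream=codec_name -of default=noprint_wrappers=1:nokey=1 video.mp4

In this case it should report av1 for the MP4 and vp9 for the MKV/WebM versions.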

It also appears that there aren't any video cards right now that incorporate an AV1 decoder into their chip (like they did with H.264 and H.265). But whether or not there's an ASIC for it doesn't mean that the GPU itself couldn't decode it in a GPGPU manner (as is also mentioned in the Wikipedia entry for AV1). Either way, this doesn't 100% solve my problem, but it explains the reasons, and I know what to look for now (how to play AV1 on the GPU - but I don't know how often I'll encounter AV1 in my daily life, so I'm happy without it for now).
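In the meantime, one workaround sketch, assuming you fetch the files with youtube-dl (as mentioned in the comments) and with the URL as a placeholder: list the available formats and pick a VP9 video stream instead of the AV1 one, since VP9 is what gets hardware-decoded here.

    # list all available formats with their codecs (vp9, av01, avc1, ...)
    youtube-dl -F "https://www.youtube.com/watch?v=VIDEO_ID"
    # prefer a VP9 video stream over the AV1 one, muxed with the best audio
    youtube-dl -f "bestvideo[vcodec^=vp9]+bestaudio/best" "https://www.youtube.com/watch?v=VIDEO_ID"

The result usually comes out as a WebM/MKV file rather than MP4, which matches the containers that played back fine on the GPU.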


The source of the difference is that VLC uses its own codecs, while the browser uses the system-installed codecs.

I downloaded the video in several resolutions; it uses a codec identified as "isom (isom/iso2/mp41)". I tried it out:

  • With VLC, 8K version - no video
  • With VLC, 4K version - video jumpy with large intervals between frames (freezes)
  • With MPC-HC (x64), 8K & 4K versions - fluid, but used all my cores at 100% and about 30-50% of the GPU
  • With Chrome YouTube - fluid with little use of CPU & GPU, but 720p is the largest resolution offered
  • With Firefox YouTube - 4K version, fluid, little CPU, 20-50% GPU
  • BUT when the 8K & 4K video files were dropped directly into Firefox or Chrome, video playback basically didn't work.

My conclusion is that this codec needs lots of resources and is pretty troublesome for most players. When using YouTube, the browsers cheat by using a much lower resolution than the one they are supposed to be using.

