On Windows 10, I have an NVIDIA GTX 970 GPU, and a 4 core Intel CPU.

When I play a video up to 4K (any codec, any player), the GPU usage increases a little (about 10%), and CPU usage stays pretty much unchanged; everything runs quiet and smoothly.

But if I play a video larger than 4K (any resolution above 4096 × 2160, even by just a few pixels), the GPU stays idle and CPU usage climbs to about 98%. This results in choppy playback and a loud experience, as the CPU fan goes crazy.

It seems to me that the GPU has plenty of unused power and should be able to handle >4K videos, but for some reason it refuses to play/decode them, and all the work is passed to the CPU instead.

Is there a configuration, player, decoder, or something else that would let me play >4K videos while taking advantage of the GPU?

  • You need a player that supports offloading video decoding. There's no other way this could work.
    – Daniel B
    Commented Apr 14, 2023 at 17:29
  • @DanielB More than that: you also need a GPU that contains a hardware decoder capable of decoding the codec at the resolution and bitrate supplied. Not all hardware supports all codecs, and even if it does support a codec, that does not mean it can decode all videos of that format. Many cheaper Android streaming devices might support AVC (H.264) level 5.1, meaning they might only just be able to do 4K @ 30 Hz and behave very badly beyond that. There is only so much work any given blob of electronics can do.
    – Mokubai
    Commented Apr 14, 2023 at 19:46
  • While that is all true, hardware support doesn’t mean all video player software is magically going to use it. That was my point: Software support is required, too.
    – Daniel B
    Commented Apr 14, 2023 at 21:11

2 Answers


Your GPU has a fixed-function video decoder called NVDEC. It does not use the CUDA cores for video decoding, so while your GPU might be powerful, it cannot apply that power to this particular task.

From that page: [table of NVDEC decode support by GPU generation and codec]

Depending on the video codec, you may simply not be able to decode it using your graphics card. If it is encoded with H.265 or AV1 instead of H.264, you simply lack the hardware to decode it on the graphics card.
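
If you are not sure which codec and resolution a file actually uses, you can check before blaming the hardware. Here is a minimal Python sketch that calls ffprobe (shipped with FFmpeg); the path sample.mkv is a placeholder, and the supported-codec set is an assumption based on the Maxwell-era decoder described below:

    import json
    import subprocess

    # Ask ffprobe (part of FFmpeg) for the first video stream's codec
    # and resolution. "sample.mkv" is a placeholder path.
    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", "sample.mkv"],
        capture_output=True, text=True, check=True,
    )
    stream = json.loads(result.stdout)["streams"][0]
    codec, width, height = stream["codec_name"], stream["width"], stream["height"]

    # Assumption for a GTX 970 (Maxwell): fixed-function decoding of
    # H.264 and MPEG-2 only, up to 4096 x 4096 depending on the driver.
    hw_ok = codec in {"h264", "mpeg2video"} and width <= 4096 and height <= 4096
    print(f"{codec} {width}x{height} -> likely hardware-decodable: {hw_ok}")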

In theory, you should be able to decode 4K H.264 video. From Nvidia PureVideo, sixth generation:

The sixth generation of PureVideo HD, introduced with the Maxwell microarchitecture, e.g. in the GeForce GTX 750/GTX 750 Ti (GM107) and also included in the Nvidia GeForce 900 (Maxwell) series GPUs, has significantly improved performance when decoding H.264 and MPEG-2. It is also capable of decoding Digital Cinema Initiatives (DCI) 4K resolution videos at 4096 × 2160 pixels and, depending on the driver and the used codec, higher resolutions of up to 4096 × 4096 pixels. GPUs with Feature Set E support an enhanced error concealment mode which provides more robust error handling when decoding corrupted video streams.

The sixth generation PureVideo HD is sometimes called "PureVideo HD 6" or "VP6", although this is not an official Nvidia designation. This generation of PureVideo HD corresponds to Nvidia Feature Set E (or "VDPAU Feature Set E").
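
A quick way to confirm whether hardware decoding actually works for a specific file is to ask FFmpeg to decode it on the GPU and discard the frames. A sketch of that idea, assuming an FFmpeg build with CUDA/NVDEC support and using sample.mkv as a placeholder path:

    import subprocess

    # Decode the first five seconds on the GPU (NVDEC, requested via
    # FFmpeg's CUDA hwaccel) and throw the frames away. A non-zero
    # exit code means the GPU could not hardware-decode this stream.
    cmd = [
        "ffmpeg", "-v", "error",
        "-hwaccel", "cuda",   # request hardware decoding
        "-i", "sample.mkv",   # placeholder input path
        "-t", "5",            # stop after five seconds
        "-f", "null", "-",    # null muxer: decode only, write nothing
    ]
    proc = subprocess.run(cmd, capture_output=True, text=True)
    print("hardware decode OK" if proc.returncode == 0 else proc.stderr.strip())

If this errors out, the stream is outside what the decoder on this card can handle, and playback will fall back to the CPU.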


Your GPU was described as "4K on a budget" and was, and still is, an extremely successful card, although it is outdated today. But it was never designed to play video larger than 4K; at the time, it was at the forefront of the technology.

If you have installed codecs that can handle larger-than-4K video, then all that remains is to force the video player to use the GPU.

You can force any program to unconditionally use the GPU (or the CPU) via the Windows 10 Settings:

  • Open Settings > System > Display
  • Click "Graphics settings"
  • Click the drop-down menu and select your app, or click "Browse" to navigate to its .exe file
  • Click "Options"
  • Choose "High performance" and click "Save"

If this doesn't help, then the GTX 970 simply wasn't designed for such video.
