
I have a low-power desktop (ASRock H97M motherboard, socket 1150, with integrated graphics, Intel Pentium G3258 CPU, 1920×1200 monitor, Windows 10) which acts as a media server in my home. My TV is a 55-inch full-HD Sony Android TV (KDL-55W805C). I have a Virgin fibre 350 Mbps connection, and both the TV and the desktop are physically connected to the Virgin Wi-Fi router through Cat 6 cable.

I have movies/videos stored on the desktop. On the TV I use Kodi to watch the movies shared from the desktop.

All was good until I recently came across a movie that is 4K HEVC 10-bit. Kodi on my TV just refused to open the file. Since my TV is only full HD and its processor is presumably not up to decoding HEVC, that is understandable. So my next goal is to somehow use the desktop to process the video and stream it to the TV.

When opening the video file, the Windows 10 Movies & TV app asked me to buy an HEVC decoder from the Store, which I did. After this, to my surprise, my relatively weak computer played the movie without any noticeable issues! However, when I tried to cast the movie to the TV, it failed: the TV showed an "unsupported video" error. I tried Windows Media Player, but the result was the same: it can play the movie on the computer, but casting to the TV fails.

This got me a bit confused. I thought that when casting/streaming, the computer decodes the video and sends the decoded frames to the display. So when the TV says "unsupported video", what does that mean?

I tried VLC, but it cannot even play the video on the computer.

I tried Plex, making the computer the Plex server. The TV's Plex app can show all my 1080p videos beautifully, but playing the HEVC file failed. I then played the HEVC file from the Plex server on my Samsung S8 smartphone (which is 4K), and it played without any issues. My question here: which device is decoding the HEVC file, the Plex Media Server on the desktop or the playing device?

How can I solve this issue? Would getting a dedicated graphics card (with HEVC decoding) solve the problem? I am doubtful, because my system can already play the video in Windows Media Player, so it has the processing power to decode it; yet casting to the TV still failed.

Any insights are much appreciated.

1 Answer


Video streamed across a network is always compressed. Even if you are only streaming what your TV is capable of, with a low film/cinema frame rate of 24 fps, that's 1920 pixels wide * 1080 pixels high * 24 bits per pixel * 24 frames per second = about 1.2 gigabits per second, which is more bandwidth than most consumers have on their home networks, wired or wireless.

Uncompressed 4K UHD (2160p) with 10 bits-per-RGB-color-channel at a high frame rate like 60fps approaches 15 gigabits per second.
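To see where those figures come from, here is a quick back-of-the-envelope calculation (a small Python sketch; the resolutions, bit depths, and frame rates are exactly the ones quoted above):

```python
def uncompressed_bitrate_gbps(width, height, bits_per_pixel, fps):
    """Raw (uncompressed) video bitrate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

# 1080p, 8 bits per RGB channel (24 bits/pixel), 24 fps film rate
print(round(uncompressed_bitrate_gbps(1920, 1080, 24, 24), 2))  # prints 1.19

# 4K UHD, 10 bits per RGB channel (30 bits/pixel), 60 fps
print(round(uncompressed_bitrate_gbps(3840, 2160, 30, 60), 2))  # prints 14.93
```

Both figures dwarf a typical home network link, which is why streamed video is always sent compressed.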

So you need a casting solution that will decode HEVC, downsample/convert from 2160p 10-bit color to 1080p 8-bit color, recompress the result as something like H.264 AVC that your TV probably understands, and cast that to the TV. That is a lot of work, and not something I would expect any free video player to automatically do on the fly.
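Media servers such as Plex or Emby do this transcode for you; it can also be done by hand. As an illustration only, assuming ffmpeg is installed and the source file is named `input.mkv` (a hypothetical name), the decode/downscale/recompress step described above would look roughly like:

```shell
# Decode the 4K HEVC 10-bit source, scale to 1080p, convert to 8-bit 4:2:0,
# and re-encode as H.264/AVC with AAC audio -- a combination the TV should accept.
ffmpeg -i input.mkv \
       -vf "scale=1920:-2,format=yuv420p" \
       -c:v libx264 -preset fast -crf 20 \
       -c:a aac -b:a 192k \
       output.mp4
```

Note that software-only transcoding like this is very CPU-heavy; on a weak CPU it may not keep up in real time, which is why media servers offer hardware-accelerated transcoding on GPUs with HEVC decode support.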

  • Actually Emby server (free) does that - transcoding on demand - when properly configured.
    – user931000
    Commented Dec 12, 2018 at 20:10
