Forum Discussion

kmayur
Visitor
10 years ago

Observing tearing artifacts on Roku-4 with 4k @ 60fps HEVC stream

I am playing a 4K@60fps HEVC stream on a Roku-4 device and observing tearing artifacts.

I am using the "customvideoplayer" sample application from the Roku SDK package (Link: http://wwwimg.roku.com/static/sdk/RokuSDK.zip). The "customvideoplayer" uses "roVideoPlayer" for playback and "roImageCanvas" for the user interface. I have connected the Roku-4 to a 4K@60fps TV.
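For context, my player setup is essentially the stock pattern from the sample; a minimal sketch (the stream URL here is a placeholder, not my actual test stream):

```brightscript
' Minimal roVideoPlayer setup, roughly what "customvideoplayer" does.
' The stream URL below is a placeholder.
player = CreateObject("roVideoPlayer")
port = CreateObject("roMessagePort")
player.SetMessagePort(port)
player.SetContentList([{
    Stream: { url: "http://example.com/test_4k60_hevc.m3u8" }
    StreamFormat: "hls"
}])
player.Play()
```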

I have some questions.

1) How does video rendering happen on Roku? Does it use double-buffering logic? Is there any difference in the rendering logic for 4K or 60fps compared to other resolutions/frame rates?

2) Do I need to provide any extra parameters/flags to the player if the stream is 4K or 60fps?

3) As I mentioned, the sample application uses "roImageCanvas" for the user interface. I am not sure whether the player also uses it internally for rendering. In the SDK documents, I found the "roScreen" component, which is used in gaming applications and has an optional flag to enable double buffering. Can I use "roScreen" with the player? Will replacing "roImageCanvas" with "roScreen" help render @ 60fps?
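For reference, this is how I understand the double-buffering flag from the roScreen docs; a minimal sketch (I have not yet tried combining this with the player):

```brightscript
' roScreen with double buffering: the second CreateObject argument
' enables a back buffer, which is presented with SwapBuffers().
screen = CreateObject("roScreen", true)   ' true = double-buffered
port = CreateObject("roMessagePort")
screen.SetMessagePort(port)
screen.Clear(&h000000FF)                  ' draw to the back buffer (opaque black)
screen.SwapBuffers()                      ' flip buffers to present the frame
```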

20 Replies

  • Hi EnTerr,

    "EnTerr" wrote:
    "It doesn't matter" I mean Vsync-ing is a very low level thing in producing video signal (below roScreen, OGL, video decoding). It's not configurable. And if it's that and there were a way to disable the "adaptive" mode of Vsync - mind you that would be worse overall because it will decrease effective FPS.

    As per my understanding, adaptive V-sync is turned on only if the stream has a variable frame rate. My streams have a fixed 60fps frame rate, so I don't think it is turned on. In any case, as you mentioned, it is not configurable, so I don't think that is the issue I am facing.

    "EnTerr" wrote:
    I understand the annoyance when asked to provide sample for something "obvious" but it may help your case. Metaphorically, you know how "class lawsuits" cover everyone from a group? Well they still need a few real humans as "representatives" in the court proceedings, to represent the plaintiff class. The picture you showed is a simulated("fake") image from wikipedia, not the real effect - it would be good if there is something tangible for a Co' person to look at.

    I am using a customer's streams for testing. I have checked with them, and I am not able to share the streams with you. In my view, the issues should be reproducible with any 4K@60fps stream.


    "EnTerr" wrote:
    That is a good question, though that by itself would not cause tearing.

    I think this is the issue, because the display is configured for 30fps while rendering happens at 60fps. The decoder might be writing to a buffer while that buffer is being read by the display.
  • "kmayur" wrote:
    "RokuMarkn" wrote:
    What do you have selected under Settings > Display Type?

    --Mark

    Thank you, @Mark 🙂. We have tested with both the 4K-8bit and 4K-10bit options under Settings > Display type. Please note, we are using 4K-8bit content for testing.

    To give a brief note on our testing,
    1) We have tested by connecting the Roku to different HDMI ports on the TV, and also with different HDMI cables.
    2) We have played the same 4K@60fps content on the same TV with a different media player device (not a Roku). It plays without any issues, so it is not an issue with the HDMI port or cable.
    3) We have also tried rebooting the Roku while the TV is switched on.

    Please let me know if you think any other tests should be done.

    Hi Mark,
    Any update on this issue?
    As I mentioned, I expected "GetVideoMode" to return "2160p60" for the 8-bit display type and "2160p60b10" for the 10-bit display type. I am not sure what is going wrong here. Also, please let me know if you need any other information.
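    For reference, I am reading the mode with roDeviceInfo; a minimal sketch of my check:

```brightscript
' Print the negotiated display mode as the firmware reports it.
di = CreateObject("roDeviceInfo")
print "video mode: "; di.GetVideoMode()      ' e.g. "2160p60" or "2160p60b10"
print "display type: "; di.GetDisplayType()
print "display size: "; di.GetDisplaySize()  ' associative array with w and h
```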
  • After chatting with RokuMarkn I've filed a bug with the media team on your issue, linking to this thread.

     - Joel
  • Hi RokuJoel, I just wanted to know if there is any update from your side on this issue. Is the Roku media team able to reproduce it? Is it a bug in the system or a configuration issue in my application? Please let me know if you need any information from my side.

    Regards,
    Mayur
  • This bug is still under investigation, and is currently believed to be specific to the Roku 4 platform (i.e., a software bug on that hardware).

     - Joel
  • Hi Joel,
    Thanks for the update!

    I have some questions. 
    1) Are you able to reproduce the issue on your Roku-4 device? If that is the case, I think I can stop debugging my application.
    2) Is there any workaround for this issue?

    Regards,
    Mayur
  • Joel,
    I conducted some more experiments and made the following observations.

    1. As you know, Roku4 has two 4K configurations under Settings > Display type.
      (a) 4K UHD TV (10-bit): In this mode, GetVideoMode() returns 2160p30. Also, a call to CanPlay4K() prints "HDCP version too low". I didn't find a way to check the current HDCP version, so I called IsHdcpActive() with the standard HDCP versions and found that my Roku's HDCP version is 1.4.
      (b) 4K UHD TV: In this mode, GetVideoMode() returns 2160p60. Also call to CanPlay4K() prints "HDCP 2.2 check passed: ".

    2. I have a 4K-UHD-10bit HDR @60fps HLS stream with HEVC video codec stored on a local server. I played this stream using "customvideoplayer" with both the display types.
      (a) 4K UHD TV (10-bit): Video was played at 4K resolution but at around 30fps. Also, the video had tearing issues.       
      (b) 4K UHD TV: Video was played at 4K@60fps, but the video quality was not good. This could be because I played a 10-bit video in an 8-bit configuration.

    3. I converted this 4K-UHD-10bit HDR @60fps HLS stream to MP4 and stored it on the same server. I played it using "customvideoplayer" with both the display types.
      (a) 4K UHD TV (10-bit): Video was played @60fps but in a lower resolution. It appeared as if Roku was re-sizing a 1080p video for 4K display. I set Display Type to 1080p HD TV and played the same video. The video quality appeared to be the same.       
      (b) 4K UHD TV: Video was played at 4K@60fps, but the video quality was not good. This could be because I played a 10-bit video in an 8-bit configuration.

    4. I copied the MP4 file to a pen drive, connected it to the Roku, and played the content using Roku Media Player with both display types. My observations in this experiment are the same as for MP4 playback using "customvideoplayer".
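    For completeness, the HDCP probing mentioned in observation 1(a) was done roughly like this (a sketch; as I understand it, IsHdcpActive() reports whether at least the given HDCP version is active on the HDMI link):

```brightscript
' Probe which HDCP versions the firmware reports as active.
di = CreateObject("roDeviceInfo")
for each ver in ["1.4", "2.2"]
    print "HDCP "; ver; " active: "; di.IsHdcpActive(ver)
end for
```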


    Based on the above observations, I have some questions.

    1. As per my understanding, HDCP relates to the hardware+software media pipeline from the media receiver to rendering. In observation #1, I am not able to understand how and why changing the display type affects the HDCP version.

    2. From observations #2, #3, and #4, my conclusion is that Roku supports 4K-10bit @30fps and 4K-8bit @60fps.
      (a) Is this the case?       
      (b) If yes, are there any reasons for that?       
      (c) If the 4K-10bit@60fps media is in MP4 format, Roku appears to resize the video to 1080p and play it @60fps. If the media is in HLS format, it appears to play at the original resolution and frame rate and fail due to some limitation. Why is there a difference in behavior? Is Roku using different profiles/media engines for MP4 and HLS?

    3. I checked the GetVideoMode() documentation and found that there are strings for 10-bit display modes (e.g. 2160p30b10, 2160p60b10). I don't understand why I am not getting this output even when I set the display type to 4K UHD TV (10-bit).
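    In the meantime, as a possible workaround I am considering selecting the stream variant based on the reported video mode; a hypothetical sketch (the URLs are placeholders):

```brightscript
' Pick an 8-bit or 10-bit variant based on the reported video mode.
' Instr(start, text, substring) returns a 1-based index, or 0 if not found.
di = CreateObject("roDeviceInfo")
mode = di.GetVideoMode()
if Instr(1, mode, "b10") > 0
    url = "http://example.com/stream_10bit.m3u8"   ' placeholder
else
    url = "http://example.com/stream_8bit.m3u8"    ' placeholder
end if
```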


    The following are my Roku4 specifications.
    Model: 4400X - Roku 4
    Software version: version 7.5.0 - build 4096-17

    I am using Samsung flat smart TV.
    Model Code: UA55KS7000

    Please let me know if you need any other information.

    Regards,
    Mayur K