Community Discussions

Connect with other Roku users to learn more about streaming, cord-cutting, finding your favorite content, or talk about the latest entertainment happenings. It's all on Roku!

The Myth of Dolby Vision HDR & How to perform manual display calibration.

For those new to 4k-UHD, all that specification actually means is that you're getting a specific resolution and refresh rate.

The Roku devices, and the apps supported on them that can provide 4k-UHD content, for the most part do a great job of keeping streams in the "rawest" possible form (details about popular apps to follow), and that's a fantastic feature for content delivery.

Full disclosure: some aspects of this post cover complex technical information, and there will be a small amount of math too.  The bottom line is that Dolby Vision isn't necessarily a feature anyone actually "needs," because all that Dolby Vision compatibility really does is automate the "Color Matching" process for content.  When a monitor / display / TV has CMYK adjustment abilities ( "CMYK" means Cyan, Magenta, Yellow, and Key, i.e. black, just like the toner cartridges in a good laser photo printer ) alongside traditional RGB ( "RGB" means Red, Green, and Blue ) color adjustment features, plus the ability to adjust both Brightness and Contrast, achieving Dolby Vision's quality is possible, as long as the display is a high quality 4k-UHD display with those color adjustment features.  Read on for the fundamentals of how "Color" actually works on a monitor, along with other details about displays, apps, how content is delivered over a stream, how streamed content's delivery varies depending on the app / streaming service, and more.

First things first; What do you need to perform color calibration on a high quality 4k-UHD display?

To keep this portion of the post simple, do a web search for "Dolby 4k pattern generator" and you'll see results for many hardware tools that vary significantly in cost, from a few hundred dollars to two or three thousand dollars.

What would a decent pattern generator have?

A decent pattern generator would have the following:

1> An HDMIv2+ connector.

2> The ability to work with HDCP v2.2 content, and the ability to work with content that doesn't feature any HDCP protection.

Some Questions People Will Likely Have, Anticipated and Answered:

Question:  I've heard about software that does the same thing; why wouldn't I use software that's less expensive?

Answer:  Software is an imperfect solution in most cases because it can be interfered with by other software, such as the NVIDIA or Catalyst control panels, which have their own color optimization.  So if you're using a computer display as your TV, not only do you have to work with your display's manual color adjustment, you also have to deal with your video drivers' color optimization features.

Additionally, many displays that feature advanced manual color adjustment also individualize settings based on the source, so if your computer is connected to the HDMI Port 2 connector, and your Roku is connected to the HDMI Port 1 connector, the settings will only be saved by the display for HDMI Port 2, and there will not be any way to make those settings apply to the HDMI port you're using for your Roku.  Additionally, content delivery is also a factor, and a computer will not necessarily deliver content so that its colors display in exactly the same way the Roku will.  This last point applies not just to different computers with different video hardware, but to all devices of any kind that deliver a video signal to a display, and that need to standardize color across all displays and devices is the reason Dolby Vision was created in the first place.

EXAMPLE: Picture watching your favorite sci-fi movie, and you're seeing a red engine glow out of the back of a huge enemy cruiser.  Somewhere else, somebody will be watching the exact same movie, but they'll see a purple tint in the color of that same engine blast from that same enemy cruiser at the very same time index in the film that you're watching.  Clearly the authors of that sci-fi movie didn't intend for the two different viewers using different displays to see two different colors, and Dolby Vision avoids that problem entirely because it "standardizes 'color matching'."

Question:  Can manual calibration using a Dolby pattern generator ensure that I'm going to see the colors the author intended 100% of the time?

Answer:  No, because only a display fully compatible with Dolby Vision can make that kind of guarantee.

Question:  What is the likelihood I'm going to see the colors the author intended after using a Dolby pattern generator across content?

Answer:  About 98.5%, and that's still an A+ grade.  Content delivery varies depending on the content itself, and the apps and/or services being used to receive it, but for most mainstream content delivery apps and services, seeing the colors the author intended is going to happen after having performed manual color adjustment using a Dolby pattern generator.

Question:  Is the advantage of a Dolby pattern generator with the features mentioned that I can use it with any display that has advanced color customization and an HDMIv2+ port that can handle HDCP v2.2 content?

Answer:  Yes, and that means it's entirely possible to keep getting use out of a Dolby pattern generator for decades, because backward compatibility is always going to be a part of display hardware design, whether it's a computer monitor or a TV.  Doesn't seem so expensive now, does it?  ;)

Question: Is all content going to be displayed in HDR Color or with Dolby Vision if I use a Dolby pattern generator to calibrate my display?

Answer:  No.  There are two different encoding standards that are essentially the same with little difference between them: "HDR10" and "Dolby Vision."  Authors that want their content presented in "Dolby Vision" have to pay Dolby Laboratories a license fee, but many content authors elect to use "HDR10" because HDR10 is an open, royalty-free standard that serves the same color-matching purpose that Dolby Vision does.  Open standards do not require the payment of a license fee in most cases.  To learn more about "HDR10" and "Dolby Vision," this article from the professionals at "Light Illusion" gives a nicely comprehensive and short presentation of the differences, as well as the features common to both standards.

Question:  You keep talking about content authors, so I was wondering, what is a "Content Author?"

Answer:  A content author can be a film maker, a publisher, a movie studio, pretty much any organization, company, or even just a single person that makes any kind of video.

Now on to more of the fundamentals about display technology, content, and the apps and services used to receive content.  Fair warning: math ahead.  ;)

Specifically, 4k-UHD means that a display is capable of a resolution of 3,840 pixels horizontally (left to right) and 2,160 pixels vertically (top to bottom).  To get the exact number of pixels that represents, simply multiply for area: 3840 x 2160 = 8,294,400 pixels.  Depending on a display's size, 4k-UHD resolution appears seamless because the individual pixels are, for the most part, too small for the human eye to resolve.  The advantage of 4k-UHD is that seamless appearance, but it will not appear as seamless on a 65" display as it would on a 27" or even a 32" display, because there are a finite number of pixels on a 4k-UHD display, and that means the larger the display, the larger each pixel is.  4k-UHD is also called "2160p" , and the "p" means progressive.
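The pixel math above can be checked with a few lines of Python.  This is only a sketch: the 27", 32", and 65" sizes are just the examples from the paragraph, and the pixels-per-inch figure assumes a standard 16:9 panel.

```python
import math

# 4k-UHD resolution: 3,840 pixels across by 2,160 pixels down.
WIDTH_PX, HEIGHT_PX = 3840, 2160

# Multiply for area to get the total pixel count.
print(WIDTH_PX * HEIGHT_PX)  # 8294400

def pixels_per_inch(diagonal_inches: float) -> float:
    """Pixel density of a 16:9 4k-UHD panel with the given diagonal size."""
    diagonal_px = math.hypot(WIDTH_PX, HEIGHT_PX)  # pixels along the diagonal
    return diagonal_px / diagonal_inches

# Same pixel count on a larger panel -> larger (more visible) pixels.
for size in (27, 32, 65):
    print(f'{size}" panel: {pixels_per_inch(size):.0f} pixels per inch')
```

A 65" panel works out to roughly 68 pixels per inch versus about 163 on a 27" panel, which is exactly why the same resolution looks less seamless on a bigger screen.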

For those that didn't grow up with CRT's ( CRT means "Cathode Ray Tube" ), CRT's were designed with magnets and three electron guns in the rear of the tube called "Color Guns," one each for Red, Green, and Blue (RGB).  The fronts (the screens) were coated in phosphor, which glows when struck by the electron beams, while the magnets steer those beams across the screen.  It was the combination of the phosphor glow and the speed at which the Color Guns swept the screen that created both the colors, and what people came to know as a "Refresh Rate."

To save money manufacturing CRT's, and to make sure TV's worked well with TV broadcasters' signals (this was a world without Cable or Satellite TV), the CRT manufacturers standardized displays in two different ways.  The first is "Interlaced" ( EXAMPLE: the last interlaced resolution still widely supported is written as "1080i" ).  The second is "non-Interlaced," now called "Progressive" or "Progressive Scan," and signified by a lowercase "p" after the last number in a resolution.  Example: "2160p" signifies that the display's resolution is 3,840 pixels horizontally by 2,160 pixels vertically, and is non-interlaced, a.k.a. "Progressive."

For computer displays, the standard beginning in the late 1980's became non-interlaced.  Interlaced displays, such as TV's and early computer displays, drew every other horizontal scan line of each image per pass.  EXAMPLE: A single image's signal goes to the display, and the display's color guns draw scan line 1, then line 3, line 5, line 7, etc., until the first pass is finished.  The color guns then draw line 2, then line 4, line 6, line 8, etc., until the second pass is finished.  Interlaced refreshes are what caused TV's and older computer displays to appear to have empty spaces between lines from the top to the bottom of an image.  Non-interlaced displays ( a.k.a. "Progressive" displays ) draw every line of an image in a single pass, and it is for that reason that no gaps appear in the image of a progressive-scan display.
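The two drawing orders described above can be sketched in a few lines of Python.  This is purely illustrative; a real display draws hundreds of scan lines, not eight.

```python
def interlaced_order(num_lines: int) -> list[int]:
    """Scan-line order for an interlaced refresh: odd pass, then even pass."""
    odd_pass = list(range(1, num_lines + 1, 2))   # lines 1, 3, 5, ...
    even_pass = list(range(2, num_lines + 1, 2))  # lines 2, 4, 6, ...
    return odd_pass + even_pass

def progressive_order(num_lines: int) -> list[int]:
    """Scan-line order for a progressive refresh: every line, single pass."""
    return list(range(1, num_lines + 1))

print(interlaced_order(8))   # [1, 3, 5, 7, 2, 4, 6, 8]
print(progressive_order(8))  # [1, 2, 3, 4, 5, 6, 7, 8]
```

The gaps people remember on interlaced sets come from the fact that, at any instant, only one of the two passes has been drawn.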

The number of times an entire image is drawn onto a display came to be known as the "refresh rate," and is now commonly called the "frame rate."  Neither term is wrong; both describe what is happening mechanically just fine.  If a complete image is drawn 30 times within 1 second, the refresh rate / frame rate is considered to be 30hz.  Computers actually have to draw the image in the memory of the computer and the computer's video hardware first, and then send the signal to the display.  Not all computers and/or video hardware can draw images into memory at a particular speed, so if your computer is set up at a high resolution like 2160p, you may have a display that can handle 60hz, perhaps 120hz, 144hz, or 240hz, but the computer and/or its video hardware might only be capable of 30hz.  That same dynamic applies to all video hardware in the digital age, because everything is drawn within the memory of the hardware first before being sent to a display.  And if copy protection is a factor ( i.e. "HDCP" ), then each frame has to be either passed by the hardware to the display undecoded so that the display itself can decode it, decoded by the hardware first and then sent to the display, or decoded by both the hardware and the display (this mutual authentication method is presently how HDCP decoding works).

Living in the digital age is tricky when it comes to display technology, because the most popular display technology at this time is "LCD" ( LCD means "Liquid Crystal Display" ).  LCD's don't actually have any kind of refresh rate / frame rate of their own because, for all intents and purposes, and for lack of a better analogy, each image presented on an LCD is basically "burped" up all at once.  The catch-22 is that, no matter what, there is a signal being sent by the connected video hardware that may vary depending on how many times that hardware can draw the image into its memory, so the signal itself might be 60hz, 120hz, 144hz, or 240hz, but the display hardware has to be able to understand how many frames are in the signal it's receiving.

Most LCD displays can't understand a signal higher than 60hz, and many 4k-UHD displays have two problems.  First, most 4k-UHD displays can't interpret a signal higher than 30hz on most of their HDMI ports.  Second, those 4k-UHD displays that can interpret a 60hz signal are still being sold with only the ability to work with content protected by HDCP v1.4, which has been around for nearly 10 years now, while the current standard is HDCP v2.2.  Those two issues aren't as much a factor on high-end 4k-UHD displays that can recognize signals at 120hz and above, but because of the way 4k-UHD displays are sold, "4k-UHD" on the box only guarantees 2160p resolution, with any other aspects of compatibility with content and video hardware visible only in the fine print, the technical specifications of the display, or both.  Also, "Yes," this does mean that LCD displays don't have an actual frame rate associated with the LCD panel itself; rather, they simply have circuitry (where the HDMI ports are) to deliver the signal to the LCD that either can, or cannot, process signals at certain refresh rates.  When buying a new display, always read the fine print, and look up the technical specs too.

Advanced color configuration that features CMYK on an LCD shows up as the display having the following colors available to adjust:

Red, Green, Blue, Cyan, Magenta, Yellow

About CMYK on LCD's with Questions Anticipated and Answered:

Question:  How is that CMYK?  I thought CMYK meant Cyan, Magenta, Yellow, and Black, like on my photo printer.

Answer:  It still does; however, we're not mixing with paint or ink, we're mixing with liquid crystal.  CRT's only mixed with Red, Green, and Blue (RGB) because CRT's were a light-based technology.  Our eyes perceive color based on the mixture of light arriving at the retina, which is why additive RGB maps so naturally onto light-emitting displays, while subtractive CMYK (just like mixing paint or ink) describes how pigments absorb light.  As for CRT's, the units made in the late 1990's and early 2000's had special coatings and shields inside them to protect people from radiation, because the electron guns that drew the image could also emit small amounts of X-rays.

Liquid crystal behaves differently than electron-excited phosphor, and there are two colors that burn the most brightly on LCD's, the most prominent being Blue and the second most prominent being Green.  For contrast and brightness, the four colors that have the most effect are Green, Cyan, Yellow, and Magenta (in that order).  Yellow governs the black-and-white space of the contrast and is present in most colors when liquid crystal is lit; however, Green light bleeds into other areas of the LCD, and the same is true of Cyan and Blue, because the byproduct colors depend on what color a given pixel is being made to show.  Magenta also has a strong influence, and without being able to adjust the amount of Magenta, you may see Red where you're supposed to be seeing pink, or even orange.  Additionally, dark backgrounds or subtle elements in an image may appear not subtle enough, too subtle, or not at all without being able to adjust Cyan, Yellow, and Magenta; together they essentially make the color Black, hence "CMYK."
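For readers curious how the additive RGB values a display works with relate to subtractive CMYK, here is the textbook naive conversion formula as a Python sketch.  Real calibrated conversions use device color profiles; this is only the common idealized math.

```python
def rgb_to_cmyk(r: float, g: float, b: float) -> tuple[float, float, float, float]:
    """Convert RGB (each 0..1) to CMYK (each 0..1) via the naive formula."""
    k = 1 - max(r, g, b)          # K ("Key", i.e. black) from the brightest channel
    if k == 1.0:                  # pure black: avoid dividing by zero
        return 0.0, 0.0, 0.0, 1.0
    c = (1 - r - k) / (1 - k)     # each subtractive primary is the
    m = (1 - g - k) / (1 - k)     # complement of one additive primary
    y = (1 - b - k) / (1 - k)
    return c, m, y, k

print(rgb_to_cmyk(1.0, 0.0, 0.0))  # pure red  -> (0.0, 1.0, 1.0, 0.0)
print(rgb_to_cmyk(0.0, 1.0, 1.0))  # cyan      -> (1.0, 0.0, 0.0, 0.0)
```

Notice that black (K) falls out of the three chromatic channels, which is the sense in which Cyan, Magenta, and Yellow "together make Black."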

That's the science behind the technology, and now people can understand why using a standardized form of color matching, whether HDR10 or Dolby Vision, is so important to authors.  Oh, but what about content?  Well, the following is all about apps and content.  :)

Movies, TV shows, etc. can be made using equipment that supports anything from 24 frames a second (24hz) for film, to 30 frames a second (30hz) for video.  A lot also depends on whether the content was shot with digital cameras or film cameras, and on how it was mastered or cut in the editing room.  Most master copies are edited by computer, because that is the present standard.  The standard for theatrical film has been 24 frames a second since the sound era, while broadcast video in North America has generally been 30 frames a second.  While this information provides a decent perspective on how films, TV shows, and videos are made, what it truly means for digital content is that digital content is another form of master copy.  Master copies are simply the original content stored in digital format.  That same content may not have originally been mastered with enough image quality to be effectively presented in 4k-UHD, or even HD for that matter, and as for higher frame rates, the most benefit they provide when streaming is less eye strain; they are otherwise meaningless to the quality of the original content, because the original content, quite simply, is what it is.

Netflix in general presents streamed 4k-UHD video at a rate of 24 frames a second.  Anything beyond that is up to your video hardware to upscale, or leave alone, when running the app.  The Roku platform simply upscales all the content it runs to whatever you've set in the system settings (e.g. 4k-UHD at 30hz, at which Netflix will not stream 4k-UHD at all, or 4k-UHD at 60hz, which Netflix requires for 4k-UHD content).
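As an aside on why 60hz pairs comfortably with 24-frames-a-second content: a common technique called "3:2 pulldown" holds alternating film frames for 3 and then 2 display refreshes, so 24 film frames fill exactly 60 refreshes.  The Python sketch below is illustrative only; whether the player or the display performs this step depends on the hardware.

```python
def pulldown_32(film_frames: int) -> list[int]:
    """Return the film-frame index shown on each 60hz display refresh."""
    refreshes = []
    for i in range(film_frames):
        hold = 3 if i % 2 == 0 else 2  # hold for 3 refreshes, then 2, alternating
        refreshes.extend([i] * hold)
    return refreshes

schedule = pulldown_32(24)
print(len(schedule))  # 60 refreshes for 24 film frames, i.e. one second
print(schedule[:10])  # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
```

The uneven 3-then-2 hold is also why fast pans in 24fps content can look slightly juddery on a 60hz display.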

Altogether, Netflix's default for content is pretty optimal, because 24 frames a second sits comfortably within the range that movies, TV shows, and video in general are mastered at.  It's the copy protection in HDCP v2.2, and the fact that Netflix checks for a 60hz refresh rate as the hardware's video setting, that "gets ya!"  For Netflix to present content available in 4k-UHD, the must-haves are a display that can handle 60hz and decode HDCP v2.2.  Without those settings on your Roku, and without a display capable of those features, "No 4k-UHD for you!"

Compression.  Arrrgh, the least fun topic when it comes to streaming.  Some services use heavy compression, and others barely use it at all.

Have you ever played a music CD in your computer or laptop with a media player that shows you the bit rate?  If you haven't, try it sometime.  You'll see a bitrate of about 1,400+ Kb/s when playing a CD, but if you play that same song ripped from that CD as an MP3 at 128 Kb/s, or even 320 Kb/s, that's what you'll see instead.  That's compression in a nutshell, and streaming video isn't any different.  Even at 320 Kb/s for a "high quality" MP3 (*snickers* because we actually like CD's in our family), you're only getting around 22% of the original master's bitrate, and at 128 Kb/s you're only getting about 9%.  That's a massive quality drop.
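The MP3-versus-CD comparison above can be worked out exactly: CD audio is 44,100 samples a second, 16 bits per sample, in stereo.  A quick Python sketch:

```python
# CD audio bitrate: 44,100 samples/s x 16 bits x 2 channels = 1,411.2 Kb/s.
CD_KBPS = 44_100 * 16 * 2 / 1000

# Common MP3 bitrates as a share of the CD bitrate.
for mp3_kbps in (128, 320):
    share = mp3_kbps / CD_KBPS * 100
    print(f"{mp3_kbps} Kb/s MP3 keeps about {share:.1f}% of the CD bitrate")
```

Either way you slice it, even a "high quality" MP3 keeps well under a quarter of the original data rate, and streaming video services make the same kind of trade-off.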

Roku supports FandangoNow natively on its devices sold at retail, while Roku devices sold by Walmart push Walmart's VUDU platform natively.  It's easy to see why Roku was willing to broaden its market share by making an agreement with Walmart to push the VUDU platform, because both platforms are excellent and use very little compression, at an average rate of 25% or less.  Further, they both use more advanced streaming protocols that allow for excellent speeds without the need for heavy compression.

YouTube is a strange bird because it's owned by Google, so if you use the same account for YouTube and your Google Play purchases, you can make playlists out of anything you own using the YouTube app.  YouTube actually doesn't use a lot of compression on its content; however, it does use some of the heavier copy protection based on Widevine (the DRM that many internet browsers, mostly Chrome-based ones, use for protected content like videos), paired with codecs such as "VP9."

The methods Google uses to protect content from unauthorized use are not fast, and as far as performance goes they're pretty lousy, because they can't verify the packets in a stream as authorized with any degree of efficiency.  We'd agree with Google that these heavier protections are necessary, given that Google has to worry about Android smartphones, Android tablets, computers running Linux, Windows, and other operating systems, numerous internet browsers of different makes, Google's own devices (recently expanded with the Google Home Hub), and of course Chromecast.  Google does a lot to prevent unauthorized use of content, and it does a decent job for content authors, but the lengths Google has to go to, while necessary, slow things down considerably.  In our opinion, though, nothing beats being able to make your own playlists out of a favorite film series on YouTube, customized so that some of the films in the playlist are the Director's Cuts while others are the originals.  It's great when it works, and when there's a high enough speed internet connection available to handle the YouTube app.  In short, it's not a lack of compression that causes the YouTube app's slow performance; the actual cause is the copy protection used by YouTube in general, which is the root of its sluggishness on slow ISP's.

An additional note regarding Google apps on Roku:  There's a very nice Google Play app available on Roku that we think very highly of, because that particular variant of Google Play operates rather flawlessly on Roku.  Playing movies on the Google Play app does reduce the quality, because its compression rate approaches half (50%), a lot higher than most of the other services.  If you've ever wondered, "Who's that actor that I recognize and can't place?", the Google Play app on Roku has a feature that will show you who the actor on screen is.  Some older movies and shows don't support that feature, but well over 99% of them do, and it's a joy to use.  Out of all the apps made by Google, the Google Play app on Roku is honestly Google's best one, and the irony is that it runs best on a device that isn't even made by Google.  Frankly, we like it that way, and with the exception of playlists as a feature we'd like to see in the Google Play app for Roku, and maybe a lower rate of compression, we really don't see any need for it to change.

Amazon Prime Video.  'nough said!  :D  Amazon as a video provider also works with the United States Air Force and NASA to beam 4k-UHD video back and forth through space!  Amazon uses almost no compression on video streamed to devices intended for that purpose, such as its FireTV hardware and the Roku player hardware, and because most of Amazon's copy protection is handled at the level of the Amazon platform itself (a very different situation from browsing and watching Prime Video in an internet browser, where Widevine is the main form of copy protection, putting it on par with Google), Amazon is undoubtedly one of the most efficient and proficient video streaming service providers out there, if not the most efficient and proficient of them all.

Amazon Prime Video's audio is heavily compressed unlike their video quality, and that is something to keep in mind for audiophiles.

There is a bug in the Amazon Prime app for Roku that affects how Captions and Subtitles present on Amazon Prime Video.  The bug is a rather simple one in that the Caption / Subtitle styles for the Amazon Prime app are determined by the settings in the Roku's Accessibility settings.  If the Caption / Subtitle style that is setup in Roku's Accessibility is set to a value of either "Off" or "Default" for the "Background opacity", the Amazon Prime Video app will display a solid/opaque black border around Captions / Subtitles.  We've reported the bug to Amazon three times in the past 28 months, and they've done nothing about it.  For those seeking a workaround to this issue, set the Caption / Subtitle style under the "Background opacity" value in Roku's Accessibility Settings to "25%".  It's not a perfect solution, but it's a good enough one to use temporarily while placing palms together that Amazon will some day get its act together to fix this one and only obnoxious bug that the Amazon Prime Video app for the Roku has.

If trying to get 4k-UHD content, Amazon Prime Video is on par with Netflix, so if you're connected to a display via the Roku that isn't operating at 60hz, you will not get 4k-UHD content.

Every other application we've used, including VUDU and FandangoNow, does not limit the availability of 4k-UHD content when using a display that handles 4k-UHD and HDCP v2.2 while operating at 30hz.  Amazon Prime Video and Netflix are the only ones that seem to insist on a 60hz refresh rate in order to detect that they should be offering 4k-UHD content.

We hope this has been educational, and that people have learned a lot about how to get the most out of their Roku and their streaming services of choice, and in the process how not to worry so much about Dolby Vision, because at the end of the day a high quality monitor, some calibration tools, and a little patience are all anyone needs to get the most out of their Roku.

Happy Streaming, and Happy Holidays to All.

Thank you for reading.  :)
- Bravo Family

Re: The Myth of Dolby Vision HDR & How to perform manual display calibration.

This article is extremely misleading in two ways:

1) The reality is that effectively no consumers even tinker with their TV's settings to turn on Movie mode and turn off all of the garbage image "enhancement" settings, let alone spend hundreds of dollars to calibrate, or to have their sets calibrated. If Dolby Vision does this automatically, then that's a huge win for consumers.

2) It ignores the biggest difference between static HDR standards like HDR10, and dynamic standards like Dolby Vision, HDR10+, and Advanced HDR. HDR10 uses a static setting for the entire playback of a video, which will at best be a compromise for most of it, while Dolby Vision and its competitors constantly vary the display to optimize the picture quality at any given moment, even down to a frame-by-frame basis.

I'll bet that the real reason Roku doesn't provide dynamic HDR is that they've set the price of most of their products down to such a low point, so as to maximize their market share, that they can't afford to license Dolby Vision or even implement the royalty-free HDR10+ standard. In effect they're banking on their customers' ignorance. And it's probably the right decision, since most consumers won't even notice that their Netflix or Amazon video wasn't streaming in 4K, let alone that the color was a little less than the best it could be.

What is Dolby Vision? The dynamic HDR format fully explained 
