I just installed my Roku Ultra yesterday (today is March 2, 2021; the latest firmware as of this date is installed), and I notice that the default 4K HDR mode the Roku detects is 4K HDR 30Hz. The TV is a new Sceptre (manufactured in September 2020) 55" 4K HDR10 set. The TV is very nice--I'm embarrassed to say how little I paid for it via Walmart online. (By the way, Walmart online ships all of its big-screen TVs free via two-day FedEx. I bought my first one four years ago, also FedEx, free. The TVs arrive in flawless, dent-free packaging! Just thought I'd mention it; shipping for these sets is very important to me.)
The TV is a 4K 60Hz HDR10 set, so I am somewhat baffled as to where this 30Hz mode comes from and why it is the default. However, I note that by using the "Force 4K HDR" option offered in the display setup, the Roku Ultra then detects the correct 4K 60Hz HDR10 mode and allows me to set it.
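For what it's worth, I did some back-of-the-envelope math on why 30Hz might be the cautious default. My rough guess: 4K 30Hz HDR fits comfortably within even an older HDMI 1.4-class link, while 4K 60Hz HDR10 needs a solid HDMI 2.0 connection, so 30Hz may be the one mode the Roku can guarantee will work on any port or cable. Here is a little Python sketch of the numbers (the payload figures ignore blanking and audio, and the link capacities are the commonly quoted effective video rates, so treat this as my estimate, not Roku's actual logic):

# Rough HDMI bandwidth check: why 4K30 HDR may be the "safe" default.
# Payload = raw video bits only (no blanking/audio); link capacities are
# the usual effective video rates after TMDS 8b/10b encoding.

BITS_PER_PIXEL = {   # samples per pixel for each chroma subsampling
    "4:4:4": 3.0,    # full chroma
    "4:2:2": 2.0,    # chroma halved horizontally
    "4:2:0": 1.5,    # chroma quartered (4:2:0 signaling itself needs HDMI 2.0)
}

LINK_GBPS = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4}

def payload_gbps(width, height, fps, depth, chroma):
    """Raw video payload in Gbps for a given mode."""
    return width * height * fps * depth * BITS_PER_PIXEL[chroma] / 1e9

modes = [
    ("4K30 10-bit 4:4:4 (HDR)",   3840, 2160, 30, 10, "4:4:4"),
    ("4K60 10-bit 4:2:0 (HDR10)", 3840, 2160, 60, 10, "4:2:0"),
    ("4K60 10-bit 4:4:4",         3840, 2160, 60, 10, "4:4:4"),
]

for name, w, h, fps, depth, chroma in modes:
    rate = payload_gbps(w, h, fps, depth, chroma)
    fits = [link for link, cap in LINK_GBPS.items() if rate <= cap]
    print(f"{name}: {rate:.2f} Gbps -> fits {fits}")

The takeaway for me is that 4K60 HDR10 only fits by dropping to chroma subsampling on an HDMI 2.0 link (full 4:4:4 at 4K60 10-bit exceeds even 18Gbps-class HDMI 2.0 payload), whereas 4K30 HDR fits on just about anything, which would make it the conservative choice when the Roku isn't sure about the TV, the port, or the cable.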
This has me wondering whether 30Hz 4K HDR playback is actually better quality than the 60Hz 4K HDR10 mode. I'm also wondering what aspect of HDR the Roku detects at 4K 30Hz that it apparently doesn't see at 4K 60Hz: at 30Hz the Roku says "HDR," but at 60Hz it says "HDR10."
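My guess (and it is only a guess) is that these labels come from what the TV advertises in its EDID. Per the CTA-861 spec, a TV's HDR Static Metadata Data Block contains a byte of flags listing which transfer functions (EOTFs) it supports, and SMPTE ST 2084 is the one that specifically means HDR10. Here is a sketch of how a source device might decode that byte; the bit layout is from the spec, but the function and the sample value are just mine for illustration:

# Sketch: decoding the EOTF-support byte from a TV EDID's CTA-861
# HDR Static Metadata Data Block. A source (like the Roku) reads this
# to decide which HDR modes/labels to offer.

EOTF_BITS = {
    0: "Traditional gamma - SDR",
    1: "Traditional gamma - HDR",
    2: "SMPTE ST 2084 (PQ) - the HDR10 transfer function",
    3: "Hybrid Log-Gamma (HLG)",
}

def supported_eotfs(eotf_byte):
    """List the EOTFs a display's HDR metadata block advertises."""
    return [name for bit, name in EOTF_BITS.items() if eotf_byte & (1 << bit)]

# Hypothetical example: a set advertising SDR plus ST 2084,
# which is what a typical HDR10 TV reports.
print(supported_eotfs(0b0101))
# -> ['Traditional gamma - SDR', 'SMPTE ST 2084 (PQ) - the HDR10 transfer function']

So if the Roku only pairs the ST 2084 flag with the TV's 60Hz timing after "Force 4K HDR," maybe the EDID handshake is flaky on this set and the forced mode skips part of the auto-detection.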
It would just be interesting to know why the Roku assigns 4K HDR 30Hz by default but offers 4K HDR10 60Hz as an available option--why not 60Hz by default and 30Hz as the option? Just curious!
--BTW, I love this device! Very nice. I know the TV is a 4K 60Hz HDR10 TV, so the 30Hz option is intriguing. I'm using a TV, not a projector.