"Miami Son" wrote:
Refresh rate and frame rate are not the same and are not tied to one another. The refresh rate in the US is 60hz, whereas in the UK it is 50hz. Using the wrong one will result in a distorted or flickering picture.
Just want to respond to this, because refresh rate and frame rate are in fact linked: you generally want them matched, or one to be a whole multiple of the other.
For instance, if you display 23.976Hz content at a 59.94Hz refresh rate, as is typical for movies on NTSC-compliant displays, you get judder. This comes down to frame time: each refresh at 59.94Hz lasts about 16.7ms, but 59.94 is not a whole multiple of 23.976 (the ratio is 2.5), and each content frame lasts about 41.7ms. As a result, you end up displaying one frame for 33.4ms (two refreshes) and the next for 50.1ms (three refreshes), which is the familiar 3:2 pulldown judder.
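The 3:2 pulldown numbers above can be checked with a quick calculation. This is just a sketch of the arithmetic, using the exact NTSC rational rates (60000/1001 and 24000/1001) rather than the rounded 59.94/23.976 figures:

```python
# 23.976 fps content shown on a 59.94 Hz display (3:2 pulldown).
refresh_hz = 60000 / 1001       # ~59.94 Hz (exact NTSC rate)
content_fps = 24000 / 1001      # ~23.976 fps (exact NTSC film rate)

refresh_ms = 1000 / refresh_hz  # ~16.68 ms per refresh
frame_ms = 1000 / content_fps   # ~41.71 ms per content frame

# The ratio is 2.5, not a whole number, so frames must alternate
# between being held for 2 refreshes and 3 refreshes.
ratio = refresh_hz / content_fps
short_hold = 2 * refresh_ms     # ~33.37 ms
long_hold = 3 * refresh_ms      # ~50.05 ms

print(f"ratio: {ratio}")
print(f"holds: {short_hold:.2f} ms / {long_hold:.2f} ms")
```

Note that the two holds average out to 41.7ms, so the timing is correct over time; it is the frame-to-frame alternation that your eye picks up as judder.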
To avoid these effects, you want to match your refresh rate to the frame rate of the source, or use a whole multiple of it, such as a 50Hz refresh rate for 25Hz content (see old BBC shows for 25Hz material). The goal is to display every frame for the same amount of time. Likewise, a small mismatch such as 23.976Hz content on a 24Hz refresh gives a frame skip (or repeat) roughly every 41.7 seconds, since that is how long it takes the two rates to drift out of alignment by one full frame (a beat frequency of about 0.024Hz).
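The beat-frequency figure works out like this. A small sketch of the math for 23.976 fps content on a 24Hz display:

```python
# Beat frequency between 23.976 fps content and a 24 Hz refresh.
content_fps = 24000 / 1001   # ~23.976 fps
refresh_hz = 24.0

# The rates drift apart at the beat frequency; after one beat period
# they are a full frame out of alignment, forcing a skip or repeat.
beat_hz = refresh_hz - content_fps   # ~0.024 Hz
slip_period_s = 1 / beat_hz          # ~41.7 s between frame slips

print(f"beat: {beat_hz:.4f} Hz, one frame slip every {slip_period_s:.1f} s")
```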
Sadly there is a lot of confusion on this topic, often boiling down to "higher is better". Unless that higher rate is something like 120Hz or 144Hz (over DisplayPort or HDMI), higher is not better; what we need is a whole multiple: 30*2=60, 23.976*5=119.88, etc. (A higher refresh rate also makes any residual beat-frequency error much harder to see, since a slipped frame only costs about +8.3ms at 120Hz, but you still want a whole multiple to start with.)