A lot of people in the (Western) world seem preoccupied with high refresh rates and frame rates, but consider this passage from the widely used textbook 21st Century Astronomy (4th edition), page 163, emphasis mine:
Quote:
Integration time is the limited time interval over which the eye can add up photons. The brain “reads out” the information gathered by the eye about every 100 milliseconds (ms). Anything faster appears to happen all at once. If two images on a computer screen appear 30 ms apart, you will see them as a single image because your eyes will sum (or integrate) whatever they see over an interval of 100 ms or less. But if the images occur 200 ms apart, you will see them as separate images. This relatively brief integration time is the biggest factor limiting your nighttime vision. Stars too faint to be seen with the unaided eye are those from which you receive too few photons for your eyes to process in 100 ms.

Quantum efficiency is the likelihood that a particular photon landing on the retina will, in fact, produce a response. For the human eye, 10 photons must strike a cone within 100 ms to activate a single response. So the quantum efficiency of our eyes is about 10 percent: for every 10 events, the eye sends one signal to the brain. Together, integration time and quantum efficiency determine the rate at which photons must arrive at the retina before the brain says, “Aha, I see something.”
And, so, are 60 Hz or higher monitors/TVs a rip-off? Ditto for any frame rate above 30 fps?
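To put the numbers side by side, here is a quick sketch comparing the frame interval at some common refresh rates against the ~100 ms integration window the textbook cites. The 100 ms figure comes from the quote above; the refresh rates chosen are just common examples, and the arithmetic is purely illustrative:

```python
# Frame interval (ms) at a given refresh rate vs. the ~100 ms
# integration window quoted from 21st Century Astronomy.
INTEGRATION_MS = 100.0  # the eye's "read-out" interval per the quote

for hz in (24, 30, 60, 120, 144):
    interval_ms = 1000.0 / hz  # time between successive frames
    inside = interval_ms < INTEGRATION_MS
    print(f"{hz:>4} Hz -> {interval_ms:6.2f} ms per frame "
          f"({'within' if inside else 'outside'} the 100 ms window)")
```

Note that even 24 fps (~41.7 ms per frame) falls comfortably inside the quoted 100 ms window, which is presumably why the question arises at all, though the quote's integration-time argument says nothing about motion smoothness, flicker, or latency.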