Tsar Hoikas wrote:Of course, those are the displays, which tend to be pretty well tied to the AC current frequency of 60 Hz. Film is still generally shot at 24fps. It was a minor controversy when The Hobbit was filmed at 48fps in 2012! You can find pages and pages and pages about "judder" in movies relating to playing the 24 or 48fps source material on a display that refreshes at 60 Hz. Facebook even got in on the action by inventing a new time unit called the flick so the math is all integer-based.
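For anyone curious about the integer math behind the flick: it's defined as 1/705,600,000 of a second, a denominator chosen so that the common video and audio rates all divide it evenly, leaving no fractional frame durations. A quick sketch in Python to check this:

```python
# One flick = 1/705,600,000 second. Common frame and sample rates
# divide this number evenly, so per-frame durations are exact integers.
FLICKS_PER_SECOND = 705_600_000

for rate in (24, 25, 30, 48, 50, 60, 90, 120, 44100, 48000):
    flicks, remainder = divmod(FLICKS_PER_SECOND, rate)
    print(f"{rate:>6} Hz -> {flicks} flicks per frame/sample (remainder {remainder})")
```

Every rate in that list comes out with remainder 0, e.g. 24 fps is exactly 29,400,000 flicks per frame, which is the whole point of the unit.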
Ah, that's what I was wondering about. So even action movies with lots of motion are still mostly shot at 24 fps, then? That sounds like a good way to get headaches, IMHO.
Yeah, now I understand better why people claim 48 fps on a 60 Hz monitor is not as good as, say, 30 on 60: since 48 doesn't divide evenly into 60, you get worse frame pacing. That's also why some console games still lock the framerate to 30 instead of letting it fluctuate between 30 and 60, I guess. Well, if you ask me, it's still picking your favorite poison; both are pretty bad compared to a stable 60 fps.
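You can actually see the judder in the numbers. Here's a small sketch (assuming the simple model where each frame snaps to the next display refresh) that counts how many 60 Hz refreshes each source frame stays on screen:

```python
def refreshes_per_frame(source_fps, display_hz=60, frames=8):
    # Refresh tick at which each source frame first appears,
    # using floor snapping (integer division keeps the math exact).
    ticks = [(i * display_hz) // source_fps for i in range(frames + 1)]
    # Duration of each frame, measured in display refreshes.
    return [b - a for a, b in zip(ticks, ticks[1:])]

print(refreshes_per_frame(30))  # [2, 2, 2, 2, 2, 2, 2, 2] -> perfectly even cadence
print(refreshes_per_frame(24))  # [2, 3, 2, 3, 2, 3, 2, 3] -> classic 3:2 pulldown
print(refreshes_per_frame(48))  # [1, 1, 1, 2, 1, 1, 1, 2] -> uneven hold times, judder
```

30 fps maps cleanly onto 60 Hz (every frame held for exactly 2 refreshes), while 24 and 48 fps alternate between hold times, which is exactly the judder people complain about.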
On a side note, I have a 144 Hz computer monitor, and I can say it makes a huge difference in 3D apps or when scrolling webpages. Much less eyestrain, which also means fewer headaches when using it for prolonged periods of time.
I know Facebook introduced the flick for VR systems, since VR requires higher refresh rates and precise frame timing. But I doubt other headsets use this unit at all, so I wonder how much of it actually made it into drivers and how much is just marketing.
Emor D'ni Lap wrote:And with higher-data-rate pipelines available, we're now starting to see 4K digital screens and projectors available as well as 120fps rates...but getting both at once may still take awhile!
Yeah, it's impressive: it's still very hard to find 4K monitors with a high refresh rate. Usually higher-end technology is just insanely expensive, but for a long while this type of monitor simply didn't exist at all, because it was so complex to produce.